Cannot load onnx checkpoint from huggingface repository #553
Comments
Thanks for the quick fix. It works for "multilingual-e5-base"; however, "multilingual-e5-large", which has a different directory structure, now fails with a different error:
Oh yes, this will be an issue for models >2GB. Could you use this instead:
Unfortunately, that fails.
I was thinking something along the lines of:

```python
import os
from pathlib import Path
from typing import Optional, Union

from huggingface_hub import snapshot_download


@staticmethod
def _cached_file(
    model_path: Union[Path, str],
    use_auth_token: Optional[Union[bool, str]] = None,
    revision: Optional[str] = None,
    force_download: bool = False,
    cache_dir: Optional[str] = None,
    file_name: Optional[str] = None,
    subfolder: str = "",
    local_files_only: bool = False,
):
    # Locates a file in a local folder or remote repo, downloading and
    # caching it if necessary.
    model_path = Path(model_path)
    if model_path.is_dir():
        model_cache_path = model_path / file_name
    else:
        model_cache_path = snapshot_download(
            repo_id=model_path.as_posix(),
            # Download the whole subfolder so that the external-data files
            # of >2GB ONNX models are fetched alongside model.onnx.
            allow_patterns=None if not subfolder else subfolder + "/**",
            use_auth_token=use_auth_token,
            revision=revision,
            cache_dir=cache_dir,
            force_download=force_download,
            local_files_only=local_files_only,
            # local_dir="/tmp/openvinoconverter/",
            # local_dir_use_symlinks="auto",
        )
        if subfolder:
            model_cache_path = os.path.join(model_cache_path, subfolder)
        model_cache_path = os.path.join(model_cache_path, file_name)
        model_cache_path = Path(model_cache_path)
    return model_cache_path
```

might solve the issue. At least it downloads all the right files. However, I suspect there is another bug in OpenVINO itself that prevents this from working unless I use the `local_dir="/tmp/openvinoconverter"` line, because of all the symlinks: openvinotoolkit/openvino#22736. I'm not sure how to sensibly set this to a local directory; I'd much rather work only on the symlinks to the Hugging Face cache.
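For context on why `allow_patterns=subfolder + "/**"` helps here: `huggingface_hub` filters repository files with fnmatch-style glob patterns, and >2GB ONNX checkpoints typically keep their weights in a separate external-data file next to `model.onnx`, so the whole subfolder has to come down. A minimal offline sketch with hypothetical filenames (the exact listing is an assumption, not taken from the repo):

```python
from fnmatch import fnmatch

# Hypothetical repo listing; large ONNX checkpoints ship weights in a
# separate external-data file alongside model.onnx.
repo_files = [
    "config.json",
    "onnx/model.onnx",
    "onnx/model.onnx_data",
]

pattern = "onnx" + "/**"  # what allow_patterns would receive for subfolder="onnx"
matched = [f for f in repo_files if fnmatch(f, pattern)]
print(matched)  # ['onnx/model.onnx', 'onnx/model.onnx_data']
```

Both files under `onnx/` match, while top-level files like `config.json` are skipped, which keeps the download limited to the checkpoint actually needed.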
The OpenVINO export of sentence-transformers models should be fixed in optimum-intel v1.15.0. Could you try:

```python
model = OVModelForFeatureExtraction.from_pretrained("intfloat/multilingual-e5-large", export=True)
```
Nice, now it works. Thanks!
I am currently facing an issue while attempting to load an OVModelForFeatureExtraction from a pretrained checkpoint that is available as an ONNX file. The checkpoint is located in a subfolder within the Hugging Face repository.
I have tried to load the model using the following code snippet:
However, this approach fails with an error stating that `model.bin` is not available. It seems that the loading process does not consider the ONNX file (`model.onnx`) as it should. I have also attempted to explicitly set the `file_name` parameter to "model.onnx", but the issue persists.

What works: interestingly, when I clone the Hugging Face repository and load the model from the local onnx folder path, everything works correctly.
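The local-folder case that works corresponds to the simple branch of the file-resolution logic: when the given path is an existing directory, the checkpoint file is looked up directly beneath it, with no hub lookup involved. A rough offline sketch of that behaviour (the helper name and the temp-directory layout are illustrative assumptions, not optimum-intel's actual API):

```python
import tempfile
from pathlib import Path

def resolve_checkpoint(model_path, file_name: str) -> Path:
    # Mirrors the local-directory branch of the resolution logic: a
    # directory path is resolved by joining the requested file name.
    path = Path(model_path)
    if not path.is_dir():
        raise FileNotFoundError(f"{path} is not a local directory")
    return path / file_name

# Hypothetical cloned-repo layout: <repo>/onnx/model.onnx
with tempfile.TemporaryDirectory() as tmp:
    onnx_dir = Path(tmp) / "onnx"
    onnx_dir.mkdir()
    (onnx_dir / "model.onnx").touch()
    found = resolve_checkpoint(onnx_dir, "model.onnx").is_file()

print(found)  # True
```

This is why pointing `from_pretrained` at the cloned onnx folder succeeds: the file is found immediately, whereas the remote path goes through the hub-download branch where the subfolder/file-name handling was broken.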
Expected Behavior: I expect the `from_pretrained` method to load ONNX files from Hugging Face repositories successfully, allowing me to use the pretrained checkpoint directly without needing to clone the repository locally.

Traceback: