CoreML: MLModel of type mlProgram cannot be loaded just from the model spec object. #1495
[UPDATE] Even though the conversion completed, it seems that the files are empty:

% ls -l models/coreml-encoder-large-v3.mlpackage
total 8
drwxr-xr-x  3 loretoparisi  staff   96 Nov 16 11:45 Data
-rw-r--r--  1 loretoparisi  staff  617 Nov 16 11:45 Manifest.json

% ls -l models/coreml-encoder-tiny.en.mlpackage
total 8
drwxr-xr-x  3 loretoparisi  staff   96 Nov 16 00:43 Data
-rw-r--r--  1 loretoparisi  staff  617 Nov 16 00:43 Manifest.json
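A quick way to check whether such a package is actually usable is to try loading it with coremltools (a minimal sketch, assuming coremltools is installed; the path matches the listing above):

```python
# Sanity check: try to load an exported .mlpackage and print its model type.
import coremltools as ct

try:
    m = ct.models.MLModel("models/coreml-encoder-tiny.en.mlpackage")
    print("loaded OK, model type:", m.get_spec().WhichOneof("Type"))  # e.g. 'mlProgram' or 'neuralNetwork'
except Exception as e:
    print("failed to load:", e)
```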
@ggerganov any idea?
Not sure why it fails - I have a very basic understanding of the CoreML stuff. Probably somebody with more expertise can help out.
Having the same issue still today.
I ran into this error today using a CoreML export script for my own model. The script used to run fine on an older version of coremltools. The issue in my case was that in the old version the CoreML model type was "neuralnetwork" (without setting it explicitly), but in the new version the model type was "mlprogram". Setting the model type explicitly to "neuralnetwork" fixed it for me.
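For reference, this is roughly what forcing the older "neuralnetwork" backend looks like with coremltools; the model and input shape below are placeholders, not the actual whisper.cpp conversion script:

```python
# Hedged sketch: forcing the "neuralnetwork" backend during conversion.
# TinyEncoder and the input shape are stand-ins for your own model.
import torch
import coremltools as ct

class TinyEncoder(torch.nn.Module):
    def forward(self, x):
        return x * 2.0

example_input = torch.rand(1, 80, 3000)
traced = torch.jit.trace(TinyEncoder().eval(), example_input)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="mel", shape=example_input.shape)],
    convert_to="neuralnetwork",     # newer coremltools versions default to "mlprogram"
)
mlmodel.save("encoder-nn.mlmodel")  # "neuralnetwork" models are saved as .mlmodel
```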
@mgrachten Thank you so much! This solved my problem after at least 40 hours trying various methods to solve it. I wouldn't have nearly as good of a master's capstone project without this comment!!!
I get this CoreML error when running conversion with quantization:
Stacktrace:
whereas if I run it without quantization it works fine:
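If the quantization step uses coremltools' weight quantization utilities, note that quantize_weights operates on the spec of a "neuralnetwork" model, so running it against an "mlprogram" package may be what triggers the error in the title. A minimal sketch of that step, assuming a "neuralnetwork"-type model on disk (the path is a placeholder):

```python
# Hedged sketch of 16-bit weight quantization with coremltools.
# quantize_weights works on "neuralnetwork" models; applied to an
# "mlprogram" package it would likely fail at the spec-loading step.
import coremltools as ct
from coremltools.models.neural_network import quantization_utils

mlmodel = ct.models.MLModel("encoder-nn.mlmodel")  # placeholder path
quantized = quantization_utils.quantize_weights(mlmodel, nbits=16)
quantized.save("encoder-nn-f16.mlmodel")
```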