Replies: 1 comment
-
Sorry for the late reply. Do you have a minimal script to reproduce?
-
Hello, I'm getting an error when loading a compiled model with
torch.jit.load
Interestingly, it works from within the Python codebase that was used to train the model, but it fails both from a standalone Python interpreter and from C++ code, and both report the same error.
Ultimately I need the model loaded in C++ for inference.
Could anyone suggest how to solve this?
Thanks
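
For reference, a minimal sketch of what loading the exported archive in C++ with libtorch usually looks like. The archive path and the dummy input shape below are assumptions, and the file is assumed to have been produced by torch.jit.save; if the model depends on custom ops or classes registered by the training code, those would also need to be linked/registered before torch::jit::load is called.

```cpp
#include <torch/script.h>  // main TorchScript header for libtorch
#include <iostream>
#include <vector>

int main(int argc, const char* argv[]) {
  if (argc != 2) {
    std::cerr << "usage: load_model <path-to-scripted-module.pt>\n";
    return -1;
  }

  torch::jit::script::Module module;
  try {
    // Deserialize the ScriptModule saved from Python with torch.jit.save.
    module = torch::jit::load(argv[1]);
  } catch (const c10::Error& e) {
    std::cerr << "error loading the model: " << e.what() << "\n";
    return -1;
  }

  module.eval();

  // Run a forward pass with a dummy input (shape is an assumption here).
  std::vector<torch::jit::IValue> inputs;
  inputs.push_back(torch::ones({1, 3, 224, 224}));
  at::Tensor output = module.forward(inputs).toTensor();
  std::cout << output.sizes() << "\n";
  return 0;
}
```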