Replies: 2 comments
-
Is there any way to save it as a full PyTorch model so that I can convert it to ONNX?
-
Unfortunately, ONNX export is not supported at this point. This is tracked in #224.
-
I would like to save my pre-trained adapters together with the model so that I can later load them in an ONNX Runtime environment (as I don't think it is currently possible to load the adapters with transformers.js).
Is this possible somehow?
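For context on why a merged "full" model is what's wanted here: a LoRA-style adapter adds a low-rank update on top of a frozen base weight, and merging folds that update into the weight itself (W' = W + (alpha/r) * B @ A), after which the model is a plain PyTorch model that standard export paths can handle. A minimal NumPy sketch of that fold, purely illustrative (the names, shapes, and scaling here are assumptions, not this library's API):

```python
import numpy as np

# Illustrative sizes: base weight W (out x in), LoRA factors B (out x r), A (r x in).
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 4, 2, 16

W = rng.standard_normal((d_out, d_in))   # frozen base weight
A = rng.standard_normal((r, d_in))       # adapter down-projection
B = rng.standard_normal((d_out, r))      # adapter up-projection
x = rng.standard_normal(d_in)            # sample input
scale = alpha / r                        # common LoRA scaling convention

# Adapter applied at runtime: y = W x + scale * B (A x)
y_adapter = W @ x + scale * (B @ (A @ x))

# Merged weight: fold the low-rank update into W once, drop the adapter.
W_merged = W + scale * (B @ A)
y_merged = W_merged @ x

# Both paths produce the same output.
assert np.allclose(y_adapter, y_merged)
```

With a merged weight like `W_merged`, no adapter machinery is needed at inference time, which is why merging first is the usual route toward ONNX conversion when a library's own export is unavailable.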