I have developed some Colab notebooks around PaliGemma 2 that do the following:
Convert and quantize the PaliGemma 2 models (including the latest paligemma2-3b-mix models) into ONNX format, compatible with Transformers.js (link).
Run inference with Transformers.js for different tasks (zero-shot object detection, image captioning, OCR, and visual Q&A) using the converted model weights (link).
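For the zero-shot detection task, PaliGemma models respond to "detect" prompts with location tokens rather than structured output: each box is encoded as four `<locNNNN>` tokens (coordinates binned to 0–1023, in y_min, x_min, y_max, x_max order) followed by a label, with multiple detections separated by `;`. As a minimal sketch (the helper name `parseDetections` is hypothetical, not taken from the notebooks), the raw string can be decoded into normalized boxes like this:

```javascript
// Sketch: decode PaliGemma zero-shot detection output into boxes.
// Example model output: "<loc0256><loc0128><loc0768><loc0896> cat"
// Each <locNNNN> token is a coordinate binned to 0-1023, emitted in
// the order y_min, x_min, y_max, x_max; dividing by 1024 yields
// coordinates normalized to the image size.
function parseDetections(text) {
  const detections = [];
  for (const segment of text.split(';')) {
    const match = segment.match(
      /<loc(\d{4})><loc(\d{4})><loc(\d{4})><loc(\d{4})>\s*(.+)/
    );
    if (!match) continue; // skip segments without a full box
    const [yMin, xMin, yMax, xMax] = match
      .slice(1, 5)
      .map((n) => Number(n) / 1024);
    detections.push({ label: match[5].trim(), box: { xMin, yMin, xMax, yMax } });
  }
  return detections;
}

console.log(parseDetections('<loc0256><loc0128><loc0768><loc0896> cat'));
// → [{ label: 'cat', box: { xMin: 0.125, yMin: 0.25, xMax: 0.875, yMax: 0.75 } }]
```

Multiplying the normalized coordinates by the image width/height recovers pixel boxes for drawing.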
Can these be added to the Cookbook?
What problem are you trying to solve with this feature?
Add new examples and scripts for running inference with PaliGemma 2.
Any other information you'd like to share?
No response