Conversation

@simondanielsson simondanielsson commented Sep 2, 2025

What does this PR do?

This PR adds supported tasks and a normalized config for Gemma3, enabling export of Gemma3 models.

Accompanying PR here.

Fixes huggingface/optimum-onnx#49 (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Who can review?

@echarlaix, @JingyaHuang, @michaelbenayoun, @IlyasMoutawwakil

@simondanielsson force-pushed the feature/add-gemma3-export branch from 262a819 to e9f5bdd on September 2, 2025 at 17:47
"gpt_neo": NormalizedTextConfig.with_args(num_attention_heads="num_heads"),
"gpt_neox": NormalizedTextConfig,
"gptj": GPT2LikeNormalizedTextConfig,
"granite": NormalizedTextConfigWithGQA,
@simondanielsson (Author) commented:

Explanation: this one was out of place
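
For context, the mapping above pairs each model type with a normalized config class, and `with_args` remaps standard attribute names to model-specific field names (e.g. `num_attention_heads="num_heads"` for gpt_neo). The sketch below is an illustrative re-implementation of that pattern in plain Python; it is not optimum's actual code, just a minimal recreation of the idea:

```python
# Minimal sketch of the normalized-config pattern, assuming `with_args`
# builds a subclass whose attribute lookups are remapped to the model
# config's own field names. NOT optimum's implementation.
from types import SimpleNamespace


class NormalizedTextConfig:
    # Default mapping: the standard name matches the model config's field name.
    ATTRIBUTE_MAP = {}

    def __init__(self, config):
        self.config = config

    @classmethod
    def with_args(cls, **attribute_map):
        # Derive a subclass that resolves lookups through the given remapping,
        # e.g. num_attention_heads="num_heads" for gpt_neo-style configs.
        merged = {**cls.ATTRIBUTE_MAP, **attribute_map}
        return type(cls.__name__ + "WithArgs", (cls,), {"ATTRIBUTE_MAP": merged})

    def __getattr__(self, name):
        # Translate the standard name to the model-specific field, then read it.
        return getattr(self.config, self.ATTRIBUTE_MAP.get(name, name))


# A gpt_neo-like config stores its head count under `num_heads`.
GPTNeoNormalized = NormalizedTextConfig.with_args(num_attention_heads="num_heads")
normalized = GPTNeoNormalized(SimpleNamespace(num_heads=16, hidden_size=2048))
print(normalized.num_attention_heads)  # 16
print(normalized.hidden_size)          # 2048
```

This also shows why the mapping entries are one-liners: each model type only needs to declare how its config field names differ from the standard ones.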


Gemma3NormalizedTextAndVisionConfig = create_normalized_text_and_vision_config(
text_config_cls=Gemma3NormalizedTextConfigWithGQA
).with_args(text_config="text_config", vision_config="vision_config")
@simondanielsson (Author) commented:

Note: Happy to hear if you have a cleaner way of doing this :)
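
For readers unfamiliar with the snippet above: `create_normalized_text_and_vision_config` builds a composite config class, and the `with_args` call points it at the model config's `text_config` and `vision_config` attributes. The following is a hypothetical, self-contained recreation of that pattern; the class bodies are assumptions for illustration, not optimum's implementation:

```python
# Hypothetical sketch of a composite text+vision normalized config.
# Names mirror the PR snippet, but the bodies are assumptions.
from types import SimpleNamespace


class NormalizedTextConfigWithGQA:
    """Minimal stand-in for a GQA-aware normalized text config."""

    def __init__(self, config):
        self.config = config

    def __getattr__(self, name):
        return getattr(self.config, name)


def create_normalized_text_and_vision_config(text_config_cls):
    # Return a composite class that wraps the text sub-config with
    # text_config_cls and exposes the vision sub-config as-is.
    class NormalizedTextAndVisionConfig:
        TEXT_CONFIG = "text_config"
        VISION_CONFIG = "vision_config"

        def __init__(self, config):
            self.text_config = text_config_cls(getattr(config, self.TEXT_CONFIG))
            self.vision_config = getattr(config, self.VISION_CONFIG)

        @classmethod
        def with_args(cls, text_config, vision_config):
            # Record which attributes of the model config hold the sub-configs.
            return type(cls.__name__, (cls,),
                        {"TEXT_CONFIG": text_config, "VISION_CONFIG": vision_config})

    return NormalizedTextAndVisionConfig


Gemma3Normalized = create_normalized_text_and_vision_config(
    text_config_cls=NormalizedTextConfigWithGQA
).with_args(text_config="text_config", vision_config="vision_config")

model_config = SimpleNamespace(
    text_config=SimpleNamespace(num_key_value_heads=4, num_attention_heads=8),
    vision_config=SimpleNamespace(image_size=896),
)
normalized = Gemma3Normalized(model_config)
print(normalized.text_config.num_key_value_heads)  # 4
print(normalized.vision_config.image_size)         # 896
```

The indirection matters because composite model configs (like Gemma3's) nest their text and vision settings under sub-attributes, so the normalized config has to know where to find each one.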

@simondanielsson changed the title from "Add supported tasks for Gemma3" to "[Gemma3] Add supported tasks and normalized config" on September 3, 2025
@simondanielsson force-pushed the feature/add-gemma3-export branch from 7afb702 to f82dafd on September 3, 2025 at 14:47
@simondanielsson marked this pull request as ready for review on September 3, 2025 at 14:55
@IlyasMoutawwakil (Member) commented:

Hi, adding model types to the tasks manager is no longer necessary, and the same goes for normalized configs, since we are trying to make optimum subpackages less dependent on optimum during development.
You can try adding gemma3 only in optimum-onnx, though I expect that will be difficult for one specific reason: gemma3 is a VLM, and we have not yet implemented an API for VLM export in optimum-onnx.

I will close this PR; let's discuss the caveats in your PR over there 🤗

@IlyasMoutawwakil (Member) commented:

And sorry for the confusion: we are still preparing for v2 of optimum and removing unnecessary parts; see #2343 and #2346.

Development

Successfully merging this pull request may close these issues: Gemma 3 support.