This is an issue tracker for transformers v5.

## Known issues

Feel free to report any issue you find that isn't listed below.
### Monkey patch

Errors when running:

```shell
python3 -m pytest test/transformers/test_monkey_patch.py
```

See details in #960 (comment).
- `AttributeError` for `'language_model'` in transformers v5 (Fix missing property access for multimodal models #966)
- `KeyError: 'mrope'` (rope parameters for transformers v5 #1013)
- Mixtral: `'MixtralDecoderLayer' object has no attribute 'block_sparse_moe'`
- `'Llama4VisionConfig' object has no attribute 'rope_theta'` (rope parameters for transformers v5 #1013)
- `'Gemma3ForConditionalGeneration' object has no attribute 'vision_tower'`
- MoE breaking change, see also: Moe kernel for latest transformers v5 #958
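Several of the failures above share one shape: v5 renamed or re-nested a submodule that a monkey patch looks up by attribute (`language_model`, `vision_tower`, `block_sparse_moe`). A minimal sketch of a version-tolerant lookup, where `resolve_submodule` and the candidate paths are hypothetical illustrations, not Liger or transformers APIs:

```python
from types import SimpleNamespace

def resolve_submodule(model, *candidates):
    """Return the first attribute path in `candidates` that exists on `model`.

    Hypothetical helper: lets a monkey patch tolerate submodule renames
    between transformers v4 and v5 (e.g. `language_model` moving under a
    nested `model` attribute). Dotted paths are supported.
    """
    for name in candidates:
        obj = model
        found = True
        for part in name.split("."):
            if not hasattr(obj, part):
                found = False
                break
            obj = getattr(obj, part)
        if found:
            return obj
    raise AttributeError(
        f"none of {candidates} found on {type(model).__name__}"
    )

# Toy stand-in for a v5-style model where language_model is nested.
model = SimpleNamespace(model=SimpleNamespace(language_model="lm"))
print(resolve_submodule(model, "language_model", "model.language_model"))
```

A patch written against such a fallback keeps working on both layouts instead of raising `AttributeError` on whichever version renamed the field.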
### Convergence test

Errors when running:

```shell
python3 -m pytest test/convergence/*
```
- Some `XXXTokenizerFast` classes are deprecated: `No module named transformers.models.gemma.tokenization_gemma_fast` (Fix/transformers v5 gemma tokenizer #1030)
- llava: vision_tower generate outputs `hidden_states=None` (Llava vision_tower monkey patch generates `hidden_states=None` #1011)
- qwen2_vl, qwen2_5_vl: `'Qwen2_5_VLConfig' object has no attribute 'hidden_size'` (Qwen2VLConfig and Qwen2_5_VLConfig have no attribute `hidden_size` #1012)
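The `hidden_size` failure is a config-nesting change: multimodal configs keep the text-model fields under a nested text config rather than at the top level. A hedged sketch of a fallback accessor, assuming that layout (`get_hidden_size` is a hypothetical helper, not a Liger or transformers API):

```python
from types import SimpleNamespace

def get_hidden_size(config):
    # Hypothetical helper: prefer the flat v4-style attribute, then fall
    # back to a nested text_config (assumption about the v5 layout --
    # verify against the actual Qwen2_5_VLConfig in your version).
    if hasattr(config, "hidden_size"):
        return config.hidden_size
    return config.text_config.hidden_size

# Toy stand-ins for the two layouts.
v4_style = SimpleNamespace(hidden_size=3584)
v5_style = SimpleNamespace(text_config=SimpleNamespace(hidden_size=3584))
print(get_hidden_size(v4_style), get_hidden_size(v5_style))
```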
### Benchmarking

- qwen2vl_mrope: `Qwen2VLRotaryEmbedding` expects a `config` argument instead of `int(hidden_dim)` (Benchmark `benchmark_qwen2vl_mrope.py` fails with new transformers #1015)
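This benchmark failure is a call-convention change: the rotary-embedding module now takes a config object rather than a bare head dimension. One way to adapt a legacy call site is a small shim; `make_rope_config` and its field names are illustrative assumptions (check the real `Qwen2VLRotaryEmbedding.__init__` in your transformers version), not the actual fix in #1015:

```python
from types import SimpleNamespace

def make_rope_config(head_dim, rope_theta=10000.0):
    """Wrap a legacy int head_dim into a minimal config-like object.

    Illustrative only: the attribute names a real rotary embedding reads
    from its config are an assumption here.
    """
    return SimpleNamespace(head_dim=head_dim, rope_theta=rope_theta)

def rope_args(config_or_dim):
    """Normalize a benchmark's input to the new config-based convention."""
    if isinstance(config_or_dim, int):
        return make_rope_config(config_or_dim)
    return config_or_dim

# Old benchmark style passed an int; the shim upgrades it in place.
config = rope_args(128)
print(config.head_dim, config.rope_theta)
```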
## For Contributors

### Environment

Install the latest release candidate in your local environment. See #994 for the release candidate version.

```shell
uv pip install transformers==5.0.0rc3
```

Check the installed transformers version:

```shell
uv pip show transformers
```

### Pull Requests

When opening a pull request, choose https://github.com/linkedin/Liger-Kernel/tree/transformers-5.0.0rc1 as your base branch.
