
Refactor Bamba tests to inherit from CausalLMModelTester base classes #38587


Open: wants to merge 5 commits into main
Conversation

@Rocketknight1 Rocketknight1 (Member) commented Jun 4, 2025

Another OpenHands + Opus 4:

What does this PR do?

This PR refactors the Bamba model tests to inherit from the CausalLMModelTester and CausalLMModelTest base classes, following the pattern established in other causal LM models like Gemma. This reduces code duplication and makes the tests more maintainable.

Changes made:

  • Updated BambaModelTester to inherit from CausalLMModelTester
  • Updated BambaModelTest to inherit from CausalLMModelTest
  • Removed duplicate methods that are already implemented in the base classes:
    • create_and_check_model
    • create_and_check_for_causal_lm
    • Redundant test methods in BambaModelTest
  • Fixed method signatures and attribute names for compatibility with base classes
  • Maintained all Bamba-specific functionality while leveraging base class code
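The inheritance pattern described above can be sketched roughly as follows. This is a toy illustration, not the actual transformers API: the class names match the ones mentioned in this PR, but the attributes and method bodies are stand-ins.

```python
# Toy sketch of the tester-inheritance pattern: a shared base tester
# implements the generic checks once, and a model-specific tester only
# overrides configuration attributes instead of duplicating methods.

class CausalLMModelTester:
    # Generic defaults shared across causal LM testers (values invented).
    batch_size = 13
    seq_length = 7

    # Subclasses point this at their model class.
    base_model_class = None

    def create_and_check_model(self):
        # Generic check implemented once in the base class.
        return (
            f"checked {self.base_model_class.__name__} "
            f"(batch={self.batch_size}, seq={self.seq_length})"
        )


class BambaModel:
    """Stand-in for the real model class."""


class BambaModelTester(CausalLMModelTester):
    # Only Bamba-specific configuration remains; the duplicated
    # create_and_check_* methods are inherited, not re-implemented.
    base_model_class = BambaModel


print(BambaModelTester().create_and_check_model())
```

The point of the refactor is that `create_and_check_model` and its siblings live in one place, so fixes to the shared checks propagate to every model tester that inherits them.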

Testing:

All tests pass successfully:

  • Ran pytest tests/models/bamba/test_modeling_bamba.py::BambaModelTest - 93 passed, 130 skipped
  • Verified dependent models (GraniteMoeHybrid) still work correctly
  • Applied style fixes using make fixup

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

EDIT: Told the bot not to ping Arthur (sorry Arthur)

- Update BambaModelTester to inherit from CausalLMModelTester
- Update BambaModelTest to inherit from CausalLMModelTest
- Remove duplicate methods already implemented in base classes
- Fix method signatures and attribute names for compatibility
- Maintain Bamba-specific functionality while leveraging base class code

This change reduces code duplication and follows the pattern established
in other causal LM models like Gemma.
@Rocketknight1 Rocketknight1 marked this pull request as ready for review June 4, 2025 16:41
Rocketknight1 and others added 2 commits June 4, 2025 17:41
GraniteMoeHybrid inherits from Bamba tests but doesn't have mlp_bias
in its config. This adds mlp_bias=False to the test config to match
Bamba's default value and fix the AttributeError.
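A toy illustration of the failure mode this commit fixes (the class and attribute names are stand-ins, not the real transformers configs): a tester that reads `mlp_bias` from its config raises `AttributeError` for an inheriting model whose config never sets it, unless the test config supplies the default explicitly.

```python
# Toy illustration: accessing a config attribute that one model's
# config defines but an inheriting model's config omits.

class BambaTestConfig:
    mlp_bias = False  # Bamba's default, set explicitly in the test config


class GraniteMoeHybridTestConfig:
    pass  # has no mlp_bias attribute of its own


def build_mlp(config):
    # Reading a missing attribute raises AttributeError at test time.
    return {"bias": config.mlp_bias}


print(build_mlp(BambaTestConfig()))  # works, returns {'bias': False}

try:
    build_mlp(GraniteMoeHybridTestConfig())
except AttributeError as err:
    print("AttributeError:", err)
```

Adding `mlp_bias=False` to the shared test config, as the commit does, gives the inheriting tester the same default Bamba relies on.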
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Comment on lines +98 to +99
batch_size=batch_size,
seq_length=seq_length,
Collaborator:

As per my comment in #38475, we can avoid passing batch_size and seq_length here and in the arguments above, since the values are the same as the defaults in CausalLMModelTester.

Maybe the same applies to several (many) other attributes/arguments.
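The reviewer's point can be sketched with a toy example (the class name matches the PR; the parameter defaults are invented): a keyword argument that merely repeats the base class's default can simply be dropped.

```python
# Toy sketch: re-passing base-class defaults is redundant.

class CausalLMModelTester:
    def __init__(self, batch_size=13, seq_length=7):
        self.batch_size = batch_size
        self.seq_length = seq_length


class VerboseTester(CausalLMModelTester):
    # Redundant: repeats values that already match the base defaults.
    def __init__(self):
        super().__init__(batch_size=13, seq_length=7)


class LeanTester(CausalLMModelTester):
    # Equivalent and shorter: just inherit the defaults.
    pass


assert VerboseTester().batch_size == LeanTester().batch_size == 13
assert VerboseTester().seq_length == LeanTester().seq_length == 7
```

Beyond brevity, dropping the redundant arguments means a future change to the base-class defaults is picked up automatically instead of being silently overridden.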

base_model_class = BioGptModel
for_causal_lm_class = BioGptForCausalLM
causal_lm_class = BioGptForCausalLM

Collaborator:

same comment as above for the init below

Comment on lines +85 to +92
mamba_d_state=16,
mamba_d_conv=4,
mamba_expand=2,
mamba_dt_rank="auto",
mamba_conv_bias=True,
mamba_proj_bias=False,
mamba_inner_layernorms=True,
n_mamba_heads=2,
Collaborator:

I am not sure why we have mamba here while this is the GraniteMoeHybrid test file

Collaborator:

ah ok, i see we have

class GraniteMoeHybridModelTester(BambaModelTester)

sorry

Comment on lines +299 to +301
@unittest.skip(reason="GraniteMoeHybrid has different cache handling")
def test_decoder_model_past_with_large_inputs(self):
pass
Collaborator:

From the code, I don't see that this is skipped on current main?
