
modeling_deformable_detr.py: DeformableDetrMultiheadAttention.forward raises an error for "hidden_states_original" if position_embeddings is None #36378

Open
susanbao opened this issue Feb 24, 2025 · 1 comment

Comments

@susanbao

System Info

src/transformers/models/deformable_detr/modeling_deformable_detr.py

class DeformableDetrMultiheadAttention(nn.Module):

def forward(
    self,
    hidden_states: torch.Tensor,
    attention_mask: Optional[torch.Tensor] = None,
    position_embeddings: Optional[torch.Tensor] = None,
    output_attentions: bool = False,
) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
    """Input shape: Batch x Time x Channel"""

    batch_size, target_len, embed_dim = hidden_states.size()
    # add position embeddings to the hidden states before projecting to queries and keys
    if position_embeddings is not None:
        hidden_states_original = hidden_states
        hidden_states = self.with_pos_embed(hidden_states, position_embeddings)

    # get queries, keys and values
    query_states = self.q_proj(hidden_states) * self.scaling
    key_states = self._shape(self.k_proj(hidden_states), -1, batch_size)
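    # fails here when position_embeddings is None, since hidden_states_original was never assigned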
    value_states = self._shape(self.v_proj(hidden_states_original), -1, batch_size)
    ...

If position_embeddings is None, "hidden_states_original" is never assigned, so the later line "value_states = self._shape(self.v_proj(hidden_states_original), -1, batch_size)" raises "UnboundLocalError: local variable 'hidden_states_original' referenced before assignment".
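
A minimal sketch of one way to avoid the error (an assumption about the intended fix, not an official patch): assign "hidden_states_original" before the optional positional-embedding step, so the value projection always has a defined input.

    batch_size, target_len, embed_dim = hidden_states.size()
    # keep a reference to the original tensor for the value projection,
    # whether or not position embeddings are added to the queries/keys
    hidden_states_original = hidden_states
    if position_embeddings is not None:
        hidden_states = self.with_pos_embed(hidden_states, position_embeddings)

    # get queries, keys and values
    query_states = self.q_proj(hidden_states) * self.scaling
    key_states = self._shape(self.k_proj(hidden_states), -1, batch_size)
    value_states = self._shape(self.v_proj(hidden_states_original), -1, batch_size)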

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Set the input position_embeddings of DeformableDetrMultiheadAttention.forward to None.
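
For example, a minimal reproduction sketch (the constructor arguments embed_dim/num_heads/dropout are assumptions about the module's signature, and the tensor sizes are arbitrary):

import torch
from transformers.models.deformable_detr.modeling_deformable_detr import (
    DeformableDetrMultiheadAttention,
)

# hypothetical sizes; any consistent embed_dim/num_heads pair should reproduce the issue
attn = DeformableDetrMultiheadAttention(embed_dim=256, num_heads=8, dropout=0.0)
hidden_states = torch.randn(2, 10, 256)  # (batch, queries, embed_dim)

# position_embeddings is None here, which triggers the UnboundLocalError
attn(hidden_states=hidden_states, position_embeddings=None)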

Expected behavior

Currently the call raises "UnboundLocalError: local variable 'hidden_states_original' referenced before assignment". The forward pass is expected to run without error when position_embeddings is None, using the unmodified hidden_states for the value projection.

@susanbao susanbao added the bug label Feb 24, 2025
@Rocketknight1
Member

cc @NielsRogge for DeformableDetr!
