How to provide overrides needed for model compatibility #816
Labels
design — This is a largish feature / design
interpreter
program-coverage — Requests for model and program coverage
While transformers BERT is not yet fully working, we are getting closer.
However, we need to disable some data-dependent control flow (typically checks) to get it to work.
Transformers itself hides some, but not all, of these behind a check for compiling, e.g.
https://github.com/huggingface/transformers/blob/0fdea8607d7e01eb0e38a1ebeb7feee30a22f0cf/src/transformers/modeling_attn_mask_utils.py#L256-L260
So here is a candidate for how we might currently be able to run BERT after fixing #805 (and subsequent bugs):
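A minimal sketch of such a candidate, assuming `thunder.jit` as the entry point (this is not the issue's original snippet, and `bert-base-uncased` is just an illustrative checkpoint): monkeypatch the data-dependent check into a no-op before jitting, so tracing doesn't hit it.

```python
from transformers import BertModel, BertTokenizer
from transformers.modeling_utils import PreTrainedModel

import thunder

# Disable the data-dependent warning check that breaks tracing.
# The method only emits a warning, so replacing it with a no-op is safe.
PreTrainedModel.warn_if_padding_and_no_attention_mask = (
    lambda self, input_ids, attention_mask: None
)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

jitted = thunder.jit(model)
inputs = tokenizer("hello world", return_tensors="pt")
out = jitted(**inputs)
```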
We might submit an issue to transformers to check for is_tracing in
transformers.modeling_utils.PreTrainedModel.warn_if_padding_and_no_attention_mask,
but in general I wonder how to provide such compatibility lookasides to users.
Possible variants:
- a `torch._dynamo.is_compiling` lookaside? ... (see the sketch below)
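For concreteness, here is a hypothetical sketch of what an `is_compiling` lookaside could look like. The `register_lookaside` registry below is an assumed illustration, not an existing Thunder API; the actual interpreter registration mechanism may differ.

```python
import torch

# Hypothetical lookaside registry: maps an opaque function to a replacement
# that the interpreter calls instead.
_LOOKASIDES = {}

def register_lookaside(target):
    def decorator(replacement):
        _LOOKASIDES[target] = replacement
        return replacement
    return decorator

@register_lookaside(torch._dynamo.is_compiling)
def is_compiling_lookaside():
    # Inside the interpreter we behave as if we are compiling, so
    # transformers' compile/tracing checks take the trace-friendly path.
    return True

# The interpreter would consult the registry before calling a function:
fn = torch._dynamo.is_compiling
fn = _LOOKASIDES.get(fn, fn)
assert fn() is True
```

The design question remains how users would extend such a registry themselves when a model needs a compatibility override that we don't ship.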