
Support transformers 4.42 #789

Closed

Conversation

helena-intel (Collaborator)

OpenVINO tests pass for me locally with 4.42. Would be great if we can support this version.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@helena-intel helena-intel requested a review from echarlaix June 28, 2024 09:19
@helena-intel helena-intel added the openvino-test Trigger OpenVINO slow tests label Jun 28, 2024
helena-intel (Collaborator, Author)

For the OpenVINO tests, the regular test run has only one unrelated error:

Repository Not Found for url: https://huggingface.co/api/models/optimum-internal-testing/tiny-random-phi-private/tree/main?recursive=True&expand=False.

With the slow tests there is this error:

```
FAILED tests/openvino/test_modeling.py::OVModelForCausalLMIntegrationTest::test_beam_search_13_llama - AssertionError: False is not true : generation config : GenerationConfig {
  "do_sample": true,
  "max_new_tokens": 10,
  "min_new_tokens": 10,
  "num_beams": 4,
  "top_k": 1
}
, transformers output tensor([[    1, 20628,   338,   263,  7575,  2462,   322,   306,   626,  5520,
         20211, 18061, 13144, 29944, 20372, 29249, 18428, 29572, 31252,  1404],
        [    2,     2,     2,     2,     2,     2,     1,   910,   338,   592,
         27264, 14848,  2618, 20610, 24351, 28333, 17245, 12944, 19234, 21336]]), ov_model_stateful output tensor([[    1, 20628,   338,   263,  7575,  2462,   322,   306,   626,  5520,
         20211, 18061,  1404,  2376, 12454,  1404,  2376, 12454,  1404,  5661],
        [    2,     2,     2,     2,     2,     2,     1,   910,   338,   592,
         27264, 14848,  2618, 20610, 24351, 28333, 17245, 12944, 19234, 21336]])
== 1 failed, 190 passed, 421 deselected, 3066 warnings in 2018.31s (0:33:38) ===
```
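
For reference, the failing beam-search test checks that transformers and the stateful OpenVINO model produce identical outputs for the same generation config. A minimal sketch of that kind of comparison, assuming an illustrative tiny Llama checkpoint, prompts, and seed (not the actual test code):

```python
# Illustrative sketch only: model id, prompts and seed are assumptions,
# not taken from tests/openvino/test_modeling.py.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig, set_seed
from optimum.intel import OVModelForCausalLM

model_id = "hf-internal-testing/tiny-random-LlamaForCausalLM"  # assumed tiny test model
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"  # decoder-only models are left-padded for batched generation
inputs = tokenizer(["Today is a nice day and", "This is"], return_tensors="pt", padding=True)

gen_config = GenerationConfig(
    do_sample=True,
    max_new_tokens=10,
    min_new_tokens=10,
    num_beams=4,
    top_k=1,
    pad_token_id=tokenizer.pad_token_id,
)

pt_model = AutoModelForCausalLM.from_pretrained(model_id)
ov_model = OVModelForCausalLM.from_pretrained(model_id, export=True)  # stateful export is the default

set_seed(42)
pt_out = pt_model.generate(**inputs, generation_config=gen_config)
set_seed(42)
ov_out = ov_model.generate(**inputs, generation_config=gen_config)

# The failing assertion boils down to this equality check between the two outputs.
print(torch.equal(pt_out, ov_out))
```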

IlyasMoutawwakil (Member)

I think we should update the version in the CI as well

transformers-version: ["4.36.0", "4.41.*"]

eaidova (Collaborator) commented Jul 1, 2024

This will probably require some changes for generative models, related to disabling the wrapping of the cache into a Cache class and to getting the model dtype:

    result = model.generate(
  File "/home/ea/work/my_optimum_intel/optimum_env/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/ea/work/my_optimum_intel/optimum_env/lib/python3.8/site-packages/optimum/intel/openvino/modeling_decoder.py", line 659, in generate
    result = super().generate(
  File "/home/ea/work/my_optimum_intel/optimum_env/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/ea/work/my_optimum_intel/optimum_env/lib/python3.8/site-packages/transformers/generation/utils.py", line 1744, in generate
    model_kwargs["past_key_values"] = self._get_cache(
  File "/home/ea/work/my_optimum_intel/optimum_env/lib/python3.8/site-packages/transformers/generation/utils.py", line 1434, in _get_cache
    cache_dtype = self.dtype
AttributeError: 'OVModelForCausalLM' object has no attribute 'dtype'
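
One possible direction, sketched below: give `OVModelForCausalLM` a `dtype` attribute so that the `cache_dtype = self.dtype` lookup added in transformers 4.42 succeeds. This is only an assumption about how a fix could look, not necessarily the change that will land in optimum-intel.

```python
# Sketch only: one way to satisfy the `cache_dtype = self.dtype` lookup in
# transformers 4.42's GenerationMixin._get_cache; the actual fix may differ.
import torch


class OVModelForCausalLM:  # illustrative; the real class lives in optimum/intel/openvino/modeling_decoder.py
    @property
    def dtype(self) -> torch.dtype:
        # The torch tensors exchanged with `generate` are float32 by default, so
        # reporting float32 here is an assumption that should be safe for cache allocation.
        return torch.float32
```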

@helena-intel helena-intel force-pushed the helena/transformers442 branch from 5c07721 to 11b7bcb Compare July 1, 2024 10:25
helena-intel (Collaborator, Author)

> I think we should update the version in the CI as well

Thanks! I was under the impression that the latest transformers version would always be installed. I have updated the workflow file.

> This will probably require some changes for generative models, related to disabling the wrapping of the cache into a Cache class and to getting the model dtype

:-( I'll make this PR a draft. The regular tests worked out of the box for me, so I hoped this would be an easy update.

@helena-intel helena-intel marked this pull request as draft July 1, 2024 10:28
eaidova (Collaborator) commented Jul 1, 2024

@helena-intel how did you test that? There is also a pinned transformers version on the optimum side that may affect which packages are selected when installing optimum-intel: https://github.com/huggingface/optimum/blob/d0a84a94183222a3931adcfca7f234a7086821db/setup.py#L18

Do you install the transformers version after installing optimum-intel?

helena-intel (Collaborator, Author)

Closing this PR; per Ekaterina's comment, this needs more work to be supported.
