I'm having multiple problems testing out the huggingface backend. Here's one example error:
  File "C:\PATH\Scripts\WhisperS2T-batch-process\Lib\site-packages\transformers\tokenization_utils.py", line 391, in added_tokens_encoder
    return {k.content: v for v, k in sorted(self._added_tokens_decoder.items(), key=lambda item: item[0])}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\PATH\Scripts\WhisperS2T-batch-process\Lib\site-packages\transformers\tokenization_utils.py", line 391, in <dictcomp>
    return {k.content: v for v, k in sorted(self._added_tokens_decoder.items(), key=lambda item: item[0])}
            ^^^^
TypeError: 'tokenizers.AddedToken' object does not support the context manager protocol
And here's the script that produced it:

Here's another kind of error that I got:

  File "C:\PATH\Scripts\WhisperS2T-batch-process\Lib\site-packages\transformers\tokenization_utils.py", line 391, in added_tokens_encoder
    return {k.content: v for v, k in sorted(self._added_tokens_decoder.items(), key=lambda item: item[0])}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\PATH\Scripts\WhisperS2T-batch-process\Lib\site-packages\transformers\tokenization_utils.py", line 391, in <dictcomp>
    return {k.content: v for v, k in sorted(self._added_tokens_decoder.items(), key=lambda item: item[0])}
            ^^^^^^^^^
AttributeError: 'list_iterator' object has no attribute 'content'
It DOES NOT make sense to me: the error is completely different, even though the only thing I changed in the above script was "batch_size" from 8 to 16.
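To sanity-check, I reproduced the comprehension from line 391 of the traceback with a stand-in class (the `AddedToken` below is hypothetical, just an object with a `.content` attribute, not the real `tokenizers.AddedToken`) and it runs fine on a consistent dict:

```python
# Hypothetical stand-in for tokenizers.AddedToken: only .content matters here.
class AddedToken:
    def __init__(self, content):
        self.content = content

# Shape of self._added_tokens_decoder in transformers: token id -> AddedToken.
added_tokens_decoder = {
    2: AddedToken("<pad>"),
    0: AddedToken("<s>"),
    1: AddedToken("</s>"),
}

# The same pattern as tokenization_utils.py line 391:
# sort entries by token id, then invert to content -> id.
encoder = {k.content: v for v, k in
           sorted(added_tokens_decoder.items(), key=lambda item: item[0])}
print(encoder)  # {'<s>': 0, '</s>': 1, '<pad>': 2}
```

So the comprehension itself seems fine, which makes me wonder whether the decoder dict is being corrupted underneath it, maybe by several worker threads hitting the tokenizer at once (which could also explain why the error changes with batch_size), or by a tokenizers/transformers version mismatch.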
I'm on Windows and am using a custom flash attention 2 wheel from here:
https://github.com/bdashore3/flash-attention/releases/
I've struggled for hours to get flash attention 2 working on Windows, and with WhisperS2T specifically, using multiple scripts, not just the one above.
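In case it's relevant, here's the quick check I run to confirm the wheel actually installed into the active environment (this assumes the package installs under the module name `flash_attn`, which is what the upstream flash-attention project uses):

```python
# Check whether the flash-attn wheel is importable in this environment.
# Assumes the wheel installs under the module name "flash_attn".
import importlib.util

def flash_attn_available():
    return importlib.util.find_spec("flash_attn") is not None

print("flash_attn importable:", flash_attn_available())
```

It reports True for me, so the wheel itself seems to be installed; the failures only show up once WhisperS2T's huggingface backend starts transcribing.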
Any help would be much appreciated. I'd love to get the huggingface backend working along with ctranslate2's... Here is my pip freeze if it helps...