
python package local-attention release 1.9.15 causes ModuleNotFoundError: No module named 'torch.amp' #524

Open
yitaochen opened this issue Sep 6, 2024 · 5 comments

Comments

@yitaochen

As the title says. Could you pin local-attention==1.9.14 in requirements.txt?

@moured

moured commented Sep 15, 2024

That was helpful!

@Musongwhk
Contributor

Hi, thank you for your issue.
First, the module we use is torch.cuda.amp, not torch.amp.
Second, local-attention is a dependency of other packages we require. If you configure the environment according to requirements.txt, the appropriate version should be downloaded automatically, with no explicit pinning needed.
If your installed version of local-attention causes the ModuleNotFoundError, you may reconfigure to the appropriate version as you suggest. Please feel free to continue the discussion if you have any further questions.

@yitaochen
Author

Yes, for some reason the author of the local-attention package made the following change between releases 1.9.14 and 1.9.15:

- from torch.cuda.amp import autocast
+ from torch.amp import autocast

I was just thinking it would be helpful if you could mention this somewhere in the docs, so people don't have to spend time on this issue.
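For projects that need to run against both old and new torch builds, one version-agnostic workaround (a minimal sketch, not from this thread; `resolve_attr` is a hypothetical helper) is to try each candidate module path in order and take the first one that exposes the name:

```python
import importlib


def resolve_attr(candidates, name):
    """Return `name` from the first module path in `candidates` that
    imports successfully and exposes that attribute."""
    for mod_name in candidates:
        try:
            mod = importlib.import_module(mod_name)
        except ImportError:
            # Module path does not exist in this installation; try the next.
            continue
        if hasattr(mod, name):
            return getattr(mod, name)
    raise ImportError(f"{name!r} not found in any of {candidates}")


# In a project affected by this issue, the call would look like:
#   autocast = resolve_attr(["torch.amp", "torch.cuda.amp"], "autocast")
```

This avoids hard-coding either import path, so the same code works whether torch exposes `autocast` under `torch.amp` (newer releases) or only under `torch.cuda.amp` (older releases).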

@Musongwhk
Contributor

Thank you for the information! We also reproduced the ModuleNotFoundError: if the environment is configured according to requirements.txt, the local-attention version resolves to 1.9.15.
Your discussion may help others with similar doubts. We will consider mentioning this in requirements.txt or elsewhere.

Musongwhk added a commit to Musongwhk/Time-Series-Library that referenced this issue Sep 19, 2024
wuhaixu2016 added a commit that referenced this issue Sep 19, 2024
fix bugs according to issues #524
@Et6an

Et6an commented Sep 28, 2024

The same applies to CoLT5-attention, which also requires a lower version. I successfully ran it with version 0.10.20.
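Putting both reports together, a requirements.txt pin covering the two packages discussed in this thread (version numbers are those stated by the commenters, not independently verified) would look like:

```
local-attention==1.9.14
CoLT5-attention==0.10.20
```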
