Fix auto-registered torch.special operators #979
Conversation
Hi @t-vi @IvanYashchuk, we could discuss further whether #976 is necessary. This is a bug I found along the way, so I split it out so we could review it first.
Hey @kiya00! Cool stuff; I made some requests for comments to improve readability
…h function; Add torch.fft and torch.linalg ops
@IvanYashchuk would you like to review, too?
Before submitting
What does this PR do?
Background:
This PR addresses a bug related to the handling of `torch.special` operators, discovered during the development of PR #976. `torch.special` operators have a `__name__` in the format `special_opname`, requiring extraction of the actual `opname`. Similar issues occur with `torch.linalg` and `torch.fft` operators.

In this PR:
- Add `_get_torch_function_name` to infer the Python call name from the torch module and function
- Add `torch.linalg` and `torch.fft` operators and the tests
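The name-extraction idea can be sketched as follows. This is a hypothetical standalone illustration (not the PR's actual implementation of `_get_torch_function_name`): given the submodule's dotted name and a function's `__name__`, it strips the submodule prefix such as `special_` to recover the name used to call the function.

```python
def infer_call_name(module_name: str, function_name: str) -> str:
    """Infer the Python call name of a torch function (illustrative sketch).

    Functions under submodules like torch.special can carry a prefixed
    __name__ such as "special_erf"; the actual call name is "erf".
    """
    # Take the last component of the dotted module path, e.g. "special"
    # from "torch.special", and build the expected prefix "special_".
    submodule = module_name.rsplit(".", 1)[-1]
    prefix = submodule + "_"
    if function_name.startswith(prefix):
        return function_name[len(prefix):]
    return function_name


print(infer_call_name("torch.special", "special_erf"))  # erf
print(infer_call_name("torch.linalg", "linalg_svd"))    # svd
print(infer_call_name("torch", "sin"))                  # sin
```

The same stripping applies uniformly to `torch.linalg` and `torch.fft`, which is why the PR handles all three submodules together.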
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃