Add output node if it does not exist in the split module #1480
Conversation
Thank you for the quick fix!
An instance check is missing: the recursion can currently call `getattr` and then access `.graph` on a leaf submodule (such as `Embedding`) that is not a `GraphModule`:

  File "/workspace/lightning-thunder/thunder/dynamo/splitter.py", line 150, in _splitter
    add_output(original_split_gm)
  File "/workspace/lightning-thunder/thunder/dynamo/splitter.py", line 143, in add_output
    add_output(getattr(m, node.target))
  File "/workspace/lightning-thunder/thunder/dynamo/splitter.py", line 143, in add_output
    add_output(getattr(m, node.target))
  File "/workspace/lightning-thunder/thunder/dynamo/splitter.py", line 141, in add_output
    for node in m.graph.nodes:
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1728, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
torch._dynamo.exc.BackendCompilerFailed: backend='<thunder.dynamo.compiler.ThunderCompiler object at 0x7f6ed5737280>' raised:
AttributeError: 'Embedding' object has no attribute 'graph'
With this change, the AttributeError is fixed.
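Based on the traceback above, the missing guard can be sketched roughly as follows. The names (`add_output`, the recursion over `call_module` nodes) follow `thunder/dynamo/splitter.py`, but the body, in particular `m.graph.output(None)`, is an illustrative placeholder and not the actual patch:

```python
import torch
from torch import fx


def add_output(m: torch.nn.Module) -> None:
    # Guard: only fx.GraphModule instances carry a `.graph` attribute.
    # Plain leaf modules such as nn.Embedding would otherwise raise
    # AttributeError, exactly as in the traceback above.
    if not isinstance(m, fx.GraphModule):
        return
    # If split_module (on older PyTorch) produced a graph without an
    # output node, append one so the module stays a valid GraphModule.
    if not any(node.op == "output" for node in m.graph.nodes):
        m.graph.output(None)  # placeholder; the real fix may wire up outputs
        m.recompile()
    # Recurse into submodules referenced by call_module nodes.
    for node in m.graph.nodes:
        if node.op == "call_module":
            add_output(getattr(m, node.target))
```

With the `isinstance` check in place, recursing into a leaf module like `Embedding` is a no-op instead of an error.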
LGTM, thanks @kiya00
Should we add a test that explicitly checks this, instead of relying on test_thundercompiler_optim_step testing it implicitly?
@t-vi, could you please merge this one?
Hi @t-vi, I think it's ready to merge.
Thank you @kiya00 @IvanYashchuk @kshitij12345
Before submitting
What does this PR do?
Fixes #1476 .
Adds a workaround to make ThunderFX work with older versions of PyTorch: it goes through all submodules produced by split_module and adds an output node to each one that is missing it.
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃