
Fix issue with reused layers in torch #1217

Open
wants to merge 4 commits into main

Conversation

reuvenperetz (Collaborator) commented Sep 15, 2024

Pull Request Description:

Fix the issue of shared parameters being ignored in reused layers in torch models.
The PR also fixes an issue with handling reused nodes during Hessian computation, by replacing reused nodes with their representative 'base' nodes.
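A minimal sketch of the kind of torch model the fix targets (module and layer names are illustrative assumptions, not taken from the PR): the same conv module is called twice in forward(), so the converted graph contains a reused layer whose single set of parameters is shared between both call sites and must not be handled twice.

import torch
import torch.nn as nn

class ReusedConvModel(nn.Module):
    def __init__(self):
        super().__init__()
        # conv1 is called twice in forward(), so it is a "reused" layer whose
        # parameters are shared between both call sites.
        self.conv1 = nn.Conv2d(3, 3, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        x = self.conv1(x)      # first call to conv1
        x = self.conv2(x)      # conv2 is called once
        return self.conv1(x)   # second call: conv1 is reused

model = ReusedConvModel()
print(model(torch.randn(1, 3, 16, 16)).shape)  # torch.Size([1, 3, 16, 16])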

Checklist before requesting a review:

  • I set the appropriate labels on the pull request.
  • I have added/updated the release note draft (if necessary).
  • I have updated the documentation to reflect my changes (if necessary).
  • All functions and files are well documented.
  • All functions and classes have type hints.
  • There is a license in all files.
  • The function and variable names are informative.
  • I have checked for code duplications.
  • I have added new unit tests (if necessary).

reuvenperetz requested review from ofirgo and irenaby and removed the request for ofirgo on September 16, 2024 13:21

# Verify that 'conv1' is called twice (thus reused) and 'conv2' is called once
layer_calls = {}
You can use a defaultdict here.
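As a hedged sketch of this suggestion (model and layer names are assumptions, not the PR's test code), the per-layer call counts can be collected with collections.defaultdict via forward hooks, so no explicit key initialization is needed:

from collections import defaultdict

import torch
import torch.nn as nn

layer_calls = defaultdict(int)  # missing keys default to 0

def make_hook(name):
    def hook(module, args, output):
        layer_calls[name] += 1  # increments work even for an unseen layer name
    return hook

model = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1), nn.ReLU())
for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        module.register_forward_hook(make_hook(name))

model(torch.randn(1, 3, 8, 8))
print(dict(layer_calls))  # e.g. {'0': 1}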


# Verify that the shared parameters have identical memory addresses
self.unit_test.assertEqual([p.data_ptr() for p in quant_model.conv1.parameters()],

Consider adding another shared conv and validating that its parameters are different from the first one's.
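A hedged sketch of what that extra check could look like (class and attribute names are illustrative assumptions, not the PR's test code): with two distinct reused convs, each layer's parameters are shared across its own call sites, but the two layers must not alias each other.

import torch.nn as nn

class TwoReusedConvs(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 3, 3, padding=1)  # reused (called twice)
        self.conv2 = nn.Conv2d(3, 3, 3, padding=1)  # also reused

    def forward(self, x):
        return self.conv2(self.conv2(self.conv1(self.conv1(x))))

model = TwoReusedConvs()
ptrs1 = [p.data_ptr() for p in model.conv1.parameters()]
ptrs2 = [p.data_ptr() for p in model.conv2.parameters()]
assert ptrs1 != ptrs2  # the two shared convs must not share memory addresses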
