Finetune all the parameters #556

Closed
VarunGumma opened this issue Jun 6, 2023 · 1 comment
Labels
question Further information is requested

Comments

@VarunGumma

Environment info

  • adapter-transformers version: latest
  • Platform: Ubuntu 20.04.6
  • Python version: 3.10.11
  • PyTorch version (GPU?): PyTorch 2.0.1 - Nvidia A100

Details

I have a BERT model, and I have added a pre-trained adapter to it. Is there any way to finetune all the parameters, i.e., both the BERT model and the pre-trained adapter? I don't want to manually set p.requires_grad=True for every parameter. Also, I know this defeats the purpose of adapters, but I need it.

@VarunGumma VarunGumma added the question Further information is requested label Jun 6, 2023
@armin-zd

According to the docs you can call freeze_model(False), which essentially loops over all base-model parameters and sets requires_grad accordingly.
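
For reference, a minimal sketch of how that could look with adapter-transformers, assuming a pre-trained adapter loaded from disk (the checkpoint name and adapter path below are placeholders; adjust them to your setup):

```python
from transformers import BertAdapterModel  # adapter-transformers build of transformers

# Placeholder checkpoint and adapter path -- replace with your own.
model = BertAdapterModel.from_pretrained("bert-base-uncased")
model.load_adapter("path/to/pretrained-adapter", load_as="my_adapter")
model.set_active_adapters("my_adapter")

# Note: don't call model.train_adapter(...) here, since that freezes the base model.
# freeze_model(False) sets requires_grad=True on all base-model parameters,
# so both the BERT weights and the adapter weights are updated during training.
model.freeze_model(False)

# Sanity check: count trainable parameters.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```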
