
Increasing the Number of Trainable Adapter Parameters #530

Closed
ghost opened this issue Apr 3, 2023 · 4 comments
Labels
question Further information is requested Stale

Comments

@ghost

ghost commented Apr 3, 2023

How can I increase the number of trainable adapter parameters using AdapterHub while training an adapter module for GPT2 text? Despite trying various methods, I have been unable to increase the number of trainable adapter parameters. Can someone provide a code example or suggest a solution to this issue? I unfortunately could not find anything on this while navigating the documentation.

@ghost ghost added the question Further information is requested label Apr 3, 2023
@hSterz
Member

hSterz commented Apr 6, 2023

Hey @skylersterling, how you can increase the number of parameters depends on the adapter architecture you are using. If you are using a bottleneck adapter (e.g. with the PfeifferConfig or HoulsbyConfig), you can change the reduction factor. A smaller reduction factor makes the bottleneck wider, and hence the number of trainable adapter parameters increases.
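To make the effect of the reduction factor concrete, here is a rough back-of-the-envelope sketch of how a bottleneck adapter's parameter count scales with it. The assumptions are GPT-2 small's dimensions (hidden size 768, 12 layers) and one down-projection plus one up-projection (each with a bias) per layer; real configs may add layer norms or insert adapters at more than one location per layer, so the exact counts from AdapterHub will differ. In adapter-transformers itself you would pass the factor via the config, e.g. `model.add_adapter("my_adapter", config=PfeifferConfig(reduction_factor=2))`.

```python
# Approximate parameter count of a bottleneck adapter as a function of
# the reduction factor. Assumes one down-projection and one up-projection
# (each with a bias) per transformer layer; real adapter configs may add
# more parameters (layer norms, multiple insertion points), so treat
# these numbers as estimates only.

def adapter_params(hidden_size: int, reduction_factor: int, num_layers: int) -> int:
    bottleneck = hidden_size // reduction_factor
    down = hidden_size * bottleneck + bottleneck   # down-projection + bias
    up = bottleneck * hidden_size + hidden_size    # up-projection + bias
    return (down + up) * num_layers

# GPT-2 small: hidden size 768, 12 layers
for rf in (64, 16, 2):
    print(f"reduction_factor={rf:>2}: ~{adapter_params(768, rf, 12):,} trainable params")
```

Halving the reduction factor roughly doubles the bottleneck dimension, and the projection matrices dominate the count, so the trainable parameter count grows roughly inversely with the reduction factor.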

@ghost
Author

ghost commented Apr 11, 2023

Thank you! It took a bit of experimenting, but for some reason increasing the reduction factor initially increased the number of parameters from the default of ~42,000 to 5 billion haha :-:

@adapter-hub-bert
Member

This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.

@adapter-hub-bert
Member

This issue was closed because it was stale for 14 days without any activity.

@adapter-hub-bert adapter-hub-bert closed this as not planned (won't fix, can't repro, duplicate, stale) on Jul 26, 2023

2 participants