
Adapter layers in BART #561

Closed
J4nn4 opened this issue Jun 21, 2023 · 2 comments
Labels
question Further information is requested

Comments


J4nn4 commented Jun 21, 2023

Hello,

I have a question regarding integration of adapters in models like BART which consist of encoder and decoder components.
Are the adapter layers inserted in the BART model both in the encoder and decoder or only in the encoder?

Thanks!

J4nn4 added the question label on Jun 21, 2023

julianpollmann commented Aug 16, 2023

Hey,

each transformer block of both the encoder and the decoder contains an AdapterLayer. For facebook/bart-base, 6 adapter layers are added to the encoder and 6 to the decoder.

See the MAD-X paper.
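As a toy sketch of this structure (hypothetical class names, not the actual adapters library API): a bottleneck adapter is inserted into every transformer block of both stacks, so a BART-like model with 6 encoder and 6 decoder layers ends up with 12 adapter layers in total.

```python
# Toy sketch of adapter placement in an encoder-decoder model.
# Class names here are illustrative, not the real library's API.

class AdapterLayer:
    """Bottleneck adapter: down-projection -> nonlinearity -> up-projection."""


class TransformerBlock:
    def __init__(self):
        # One adapter layer per transformer block.
        self.adapter = AdapterLayer()


class BartLikeModel:
    def __init__(self, num_encoder_layers=6, num_decoder_layers=6):
        # facebook/bart-base has 6 encoder and 6 decoder layers.
        self.encoder = [TransformerBlock() for _ in range(num_encoder_layers)]
        self.decoder = [TransformerBlock() for _ in range(num_decoder_layers)]

    def count_adapters(self):
        # Every block in both stacks carries exactly one adapter.
        return len(self.encoder) + len(self.decoder)


model = BartLikeModel()
print(model.count_adapters())  # 12 for a bart-base-sized model
```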

adapter-hub-bert (Member) commented

This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.
