
On Parallel Adapters & Prefix Tuning #495

Closed
hchoi-moveworks opened this issue Feb 15, 2023 · 3 comments
Labels: question (Further information is requested), Stale

Comments

@hchoi-moveworks

Environment info


  • adapter-transformers version: 3.1.0
  • Platform: Linux GPU
  • Python version: 3.8
  • PyTorch version (GPU?): 1.13

Details

@hchoi-moveworks hchoi-moveworks added the question Further information is requested label Feb 15, 2023
@hSterz
Member

hSterz commented Mar 9, 2023

Hi @hchoi-moveworks ,
with our new release of adapter-transformers v3.2, we addressed some of your questions:

  • We support parallel composition for prefix tuning but not for IA3.
  • We added support for generation with parallel composition with the generate method.
  • Unfortunately, we don't have statistics on latency, GPU memory, or GPU utilization. If you gather any, feel free to share them with us :) Note that using parallel composition is not equivalent to a batch size of N: the batch size is only expanded at the first parallel block (see this illustration); up to that block, it remains the original batch size.
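To illustrate the batch-size behavior described above, here is a minimal conceptual sketch (not the library's actual implementation; `parallel_forward` and the adapter functions are hypothetical). Shared layers before the first parallel block see the original batch of size B; at the first parallel block, the batch is replicated once per adapter, so all layers after it see B * N rows:

```python
import numpy as np

def parallel_forward(hidden, adapter_fns):
    """Hypothetical sketch of a first Parallel block.

    hidden:      (B, d) hidden states from the shared layers.
    adapter_fns: N per-adapter transformations.
    Returns a (B * N, d) array: the batch is replicated once per
    adapter, and each adapter processes its own copy.
    """
    batch = hidden.shape[0]
    n = len(adapter_fns)
    # Replicate the batch N times only at this point in the network.
    expanded = np.tile(hidden, (n, 1))  # shape (B * N, d)
    outputs = []
    for i, fn in enumerate(adapter_fns):
        chunk = expanded[i * batch:(i + 1) * batch]
        outputs.append(fn(chunk))
    return np.concatenate(outputs, axis=0)

# Original batch size B = 2; two "adapters" as toy functions.
hidden = np.ones((2, 4))
out = parallel_forward(hidden, [lambda h: h + 1.0, lambda h: h * 3.0])
# Before the block the batch size is 2; after it, 2 * 2 = 4.
```

The key point of the sketch: the extra memory cost of parallel composition is incurred only from the first parallel block onward, so it is cheaper than simply running the full model with batch size B * N.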

@adapter-hub-bert
Member

This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.

@adapter-hub-bert
Member

This issue was closed because it was stale for 14 days without any activity.

@adapter-hub-bert closed this as not planned Jun 22, 2023
3 participants