Fine-Tuned Token Locations #1
Comments
Hello MrWan001,
We will shortly be releasing the tokens for the three datasets we evaluated on, which you can download and place in the specified location.

Best,
Hello @brandontrabucco, I'm dealing with a similar issue. Could you please explain the difference between […]? Also, I followed your instructions here and now have […].

Best,
Hello jlsaint,

Thanks for following up on this issue! The […] parameter serves two purposes.

First, caching the images to disk means they don't have to be stored in memory, and for datasets with many images / classes this can be crucial if there are too many augmented images. Note, based on this point, that we generated the augmented images only once at the beginning of training in our example training scripts using the […].

Second, having the images cached means you can inspect the augmented images for tuning hyperparameters and for confirming whether DA-Fusion is working as expected.

For your last point, take a look at this script: https://github.com/brandontrabucco/da-fusion/blob/main/aggregate_embeddings.py

After we do textual inversion and have several class-specific tokens, we merge them together into a single dictionary containing all the tokens using the above script. This produces a .pt file.

Let me know if you have other questions!

Best,
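Conceptually, the merging step described above can be sketched as follows. This is a minimal illustration, not the actual `aggregate_embeddings.py` code: the token names and list-valued embeddings below are hypothetical stand-ins (in the real script the embeddings are torch tensors handled with `torch.load` / `torch.save`).

```python
# Sketch of the aggregation step: merge several class-specific
# {token: embedding} dictionaries into one dictionary of all tokens.

def aggregate_tokens(per_class_dicts):
    """Merge per-class token dicts into a single dict, rejecting duplicates."""
    merged = {}
    for token_dict in per_class_dicts:
        for token, embedding in token_dict.items():
            if token in merged:
                raise ValueError(f"duplicate token: {token}")
            merged[token] = embedding
    return merged

# Each textual-inversion run yields one token for one class (names hypothetical).
cat_tokens = {"<cat>": [0.1, 0.2, 0.3]}  # stand-in for a torch tensor
dog_tokens = {"<dog>": [0.4, 0.5, 0.6]}

all_tokens = aggregate_tokens([cat_tokens, dog_tokens])
print(sorted(all_tokens))  # ['<cat>', '<dog>']
```

In practice the merged dictionary would then be written to a single .pt file, which is what the training scripts load.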
Hello @brandontrabucco, I am hoping to run it for ImageNet.
Sure! We have uploaded the current set of tokens here: |
Thank you very much. |
DEFAULT_EMBED_PATH = "/root/downloads/da-fusion/{dataset}-tokens/{dataset}-{seed}-{examples_per_class}.pt"
Hello, the .pt file cannot be found. What effect does this have on the program?
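For reference, the template quoted above uses Python `str.format`-style placeholders, so a concrete token path is produced by filling them in. A short sketch (the dataset name, seed, and examples-per-class values below are illustrative, not taken from the thread):

```python
# Template with {dataset}, {seed}, {examples_per_class} placeholders,
# as in the path quoted above.
DEFAULT_EMBED_PATH = ("/root/downloads/da-fusion/"
                      "{dataset}-tokens/{dataset}-{seed}-{examples_per_class}.pt")

# Hypothetical values for illustration.
path = DEFAULT_EMBED_PATH.format(dataset="pascal", seed=0, examples_per_class=4)
print(path)  # /root/downloads/da-fusion/pascal-tokens/pascal-0-4.pt
```

If no file exists at the resulting path, the downloaded token file has presumably not been placed at the location the script expects.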