
Pretrained weights #2

Open
AWbosman opened this issue Mar 20, 2024 · 1 comment

Comments

@AWbosman

Dear authors,

I have some pretrained networks that I would like to retrain with your method.
In your README you state this would be added soon.
Do you have any tips or examples on how to use your repository?

Kindly,
Annelot

@nvedant07
Owner

Hi @AWbosman ,

Thanks for checking out our repo! It's been a while since I worked on this, but I believe what you're looking for is the regularized loss, which reduces robustness bias by adding a regularization term during training.

I added this under code/regularized_loss.py. It contains a class called RegularizedLoss, which you can use in your training loop just as you would use nn.CrossEntropyLoss in PyTorch.
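A minimal sketch of what "drop-in replacement" means here, assuming RegularizedLoss follows the standard PyTorch criterion interface (callable on logits and targets, returns a scalar loss). The constructor arguments of RegularizedLoss are not shown in this thread, so the sketch below uses nn.CrossEntropyLoss as a runnable stand-in; check code/regularized_loss.py for the actual signature.

```python
import torch
import torch.nn as nn

# Tiny model and optimizer for illustration only.
model = nn.Linear(10, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Drop-in point: swap in the repo's loss here, e.g.
#   from regularized_loss import RegularizedLoss
#   criterion = RegularizedLoss(...)  # constructor args unknown; see the file
criterion = nn.CrossEntropyLoss()  # stand-in so this sketch runs as-is

# Dummy batch: 8 samples, 10 features, 3 classes.
x = torch.randn(8, 10)
y = torch.randint(0, 3, (8,))

for _ in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)  # same call shape as any PyTorch criterion
    loss.backward()
    optimizer.step()
```

Because both losses expose the same `criterion(logits, targets)` interface, the rest of the loop stays unchanged when switching between them.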

Let me know if you run into any issues!
