Support full fine-tuning auxiliaries part ii #202
Conversation
Could this be opened against the #192 branch so that all changes could be merged together in one go?
LGTM. But maybe let's open the PR against a common branch and merge it together with @ArEnSc's PR.
finetrainers/trainer.py (outdated)
f"Loading adapter weights from state_dict led to unexpected keys not found in the model: " | ||
f" {unexpected_keys}. " | ||
if self.args.training_type == "lora": | ||
transformer_ = transformer_cls_.__class__.from_pretrained( |
Maybe missing some context here, but do we need `__class__` on the `transformer_cls_` class ref? No problem even if not required, since you've tested this to work already. Asking because, in the other branch, just `transformer_cls_.from_pretrained` is used.
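For illustration, a minimal standalone sketch of the language-level point (dummy names, not the actual finetrainers code): a classmethod like `from_pretrained` resolves its defining class whether it is called on the class reference or on an instance, so the explicit `__class__` hop adds nothing.

```python
class DummyTransformer:
    """Stand-in for a diffusers-style model class; purely illustrative."""

    @classmethod
    def from_pretrained(cls, path: str) -> "DummyTransformer":
        # `cls` is always the class, regardless of how the classmethod was reached.
        print(f"Loading {cls.__name__} weights from {path}")
        return cls()


transformer_cls_ = DummyTransformer  # a reference to the class itself

# Calling the classmethod directly on the class reference is enough.
transformer_ = transformer_cls_.from_pretrained("path/to/checkpoint")

# It also resolves correctly when called on an instance.
also_transformer_ = transformer_.from_pretrained("path/to/checkpoint")

# On a class, `__class__` is the metaclass (`type` here), which has no
# `from_pretrained` at all.
assert transformer_cls_.__class__ is type
```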
You're right. Thanks for catching that. I have removed it.
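For reference, with that change the resume path in the excerpt above reduces to roughly the following shape. This is a hedged sketch modeled on common diffusers training scripts; the helper name, arguments, and the peft call are illustrative assumptions, not the verbatim trainer code.

```python
import logging

from peft import set_peft_model_state_dict

logger = logging.getLogger(__name__)


def load_transformer_from_checkpoint(args, transformer_, transformer_cls_, input_dir, lora_state_dict):
    # Hypothetical helper: branch on the training type when resuming from a checkpoint.
    if args.training_type == "lora":
        # LoRA resume: load only the saved adapter weights and surface any
        # keys that did not match the model.
        load_result = set_peft_model_state_dict(transformer_, lora_state_dict)
        unexpected_keys = getattr(load_result, "unexpected_keys", None)
        if unexpected_keys:
            logger.warning(
                f"Loading adapter weights from state_dict led to unexpected keys not found in the model: "
                f" {unexpected_keys}. "
            )
    else:
        # Full fine-tuning resume: rebuild the transformer from the saved weights,
        # calling `from_pretrained` on the class reference directly (no `__class__`).
        transformer_ = transformer_cls_.from_pretrained(input_dir, subfolder="transformer")
    return transformer_
```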
Closing as I pushed my updates to #192.
Tested with all three currently supported models.