Hi authors,
Is lr = trainer.lr_scheduler.step(global_step) for LOMO implemented in the Trainer?
If so, how can I enable it?
Thanks!
Sure, just passing the lr_scheduler to the trainer will enable it. Example: https://github.com/OpenLMLab/collie/blob/c9cc0055a52b96d156450b5734a0a1d0dbde4562/examples/finetune_chatglm2_for_summary.py#L83
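For reference, here is a minimal sketch of what that looks like, modeled loosely on the linked example (the Trainer keyword arguments and import path are assumptions based on that script; model and config stand in for objects built earlier in it):

```python
import torch
from collie import Trainer  # assumed import path, as in the linked example

# Build a standard PyTorch lr_scheduler on top of the optimizer...
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=1000)

# ...and hand it to the Trainer; passing it here is what enables it.
trainer = Trainer(
    model=model,
    optimizer=optimizer,
    lr_scheduler=lr_scheduler,
    config=config,
)
trainer.train()
```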
I mean the lr_scheduler for LOMO, not AdamW. The lr_scheduler for LOMO requires the lr_scheduler.step function to accept a global_step parameter.
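To illustrate the signature mismatch (a minimal sketch; the LOMO-style call is just the pattern described in this thread, not actual library code):

```python
import torch

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

# PyTorch convention: step() takes no arguments.
optimizer.step()
scheduler.step()
lr = optimizer.param_groups[0]["lr"]

# LOMO convention (per this thread): step() takes the current global step
# and returns the new learning rate directly:
#     lr = scheduler.step(global_step)
```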
Sorry, I get you now. This is a bug, and we are planning to fix it to align the API call with PyTorch's lr_scheduler. For now, you can pass this lr_scheduler to the trainer when training with LOMO: https://github.com/OpenLMLab/LOMO/blob/24cde8e91feac437809bf7790f4727623dce6a76/src/utils.py#L207
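As a rough sketch of what such a global_step-based scheduler looks like (the class name and schedule here are hypothetical illustrations, not the actual implementation in src/utils.py):

```python
class WarmupLinearLR:
    """Hypothetical scheduler matching lr = scheduler.step(global_step)."""

    def __init__(self, learning_rate, warmup_steps, total_steps):
        self.learning_rate = learning_rate
        self.warmup_steps = warmup_steps
        self.total_steps = total_steps

    def step(self, global_step):
        # Linear warmup followed by linear decay; returns the lr to use
        # at this step instead of mutating optimizer state in place.
        if global_step < self.warmup_steps:
            return self.learning_rate * (global_step + 1) / self.warmup_steps
        progress = (global_step - self.warmup_steps) / max(
            1, self.total_steps - self.warmup_steps
        )
        return self.learning_rate * max(0.0, 1.0 - progress)
```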
Hi, the lr_scheduler for LOMO is now implemented and merged into the dev branch. FYI: 3b69924