total_steps for using OneCycleLR lr scheduler #13236
exiawsh asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
@exiawsh It should be 3000*24, i.e. only the training steps. Also, there's a useful property in the Trainer you might be interested in:

```python
def configure_optimizers(self):
    optimizer = ...
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, max_lr=1e-3, total_steps=self.trainer.estimated_stepping_batches
    )
    return [optimizer], [scheduler]
```
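As far as I understand, self.trainer.estimated_stepping_batches estimates the total number of optimizer steps for the whole run from max_epochs (or max_steps), the length of the training dataloader, gradient accumulation, and the number of devices, so it counts training steps only and is exactly the value OneCycleLR expects for total_steps.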
Hello, I have a question about the official PyTorch OneCycleLR scheduler when using PyTorch Lightning.
When using the OneCycleLR scheduler, I need to provide the parameter total_steps.
Does total_steps include the validation steps, or only the training steps?
That is, if I have 3000 training steps per epoch, 2000 validation steps per epoch, and 24 epochs in total, should I pass 3000*24 or 5000*24?
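For illustration, here is a minimal sketch of configure_optimizers with the numbers from this question hard-coded (3000 training steps per epoch, 24 epochs); the optimizer choice and learning rates are placeholders. The lr_scheduler dict tells Lightning to step the scheduler every training batch ("interval": "step") rather than once per epoch, which is what OneCycleLR assumes when total_steps counts batches:

```python
import torch

def configure_optimizers(self):
    optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3)
    # total_steps = training steps per epoch * number of epochs.
    # Validation steps are not counted: the scheduler only advances
    # when the optimizer steps, which happens during training only.
    total_steps = 3000 * 24
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, max_lr=1e-3, total_steps=total_steps
    )
    return {
        "optimizer": optimizer,
        # Step the scheduler after every training batch, not every epoch.
        "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
    }
```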