Unfreezing layers during training? #5814
Answered by williamFalcon
s-rog asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Freezing layers at the beginning of training works; however, unfreezing during training does not. I'm using DDP + Apex O2, and the loss scale keeps decreasing toward 0, at which point it hits a division by zero and crashes. Is unfreezing during training not possible in pytorch/lightning, or am I missing something?
Answered by williamFalcon, Jan 21, 2020
Replies: 1 comment
You can unfreeze whenever you like. If gradients explode, it's for another reason.
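To illustrate the answer, here is a minimal sketch in plain PyTorch (not tied to any Lightning hook) of freezing a submodule at the start of training and unfreezing it at a chosen epoch. The model, the `UNFREEZE_EPOCH` constant, and the training loop are illustrative assumptions, not taken from the original discussion; the key detail is that newly unfrozen parameters must be handed to the optimizer via `add_param_group` so they actually receive updates.

```python
import torch
import torch.nn as nn

# Hypothetical epoch at which to unfreeze (illustrative assumption).
UNFREEZE_EPOCH = 2

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
backbone = model[0]

# Freeze the backbone before training starts.
for p in backbone.parameters():
    p.requires_grad = False

# Only pass trainable parameters to the optimizer initially.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.1
)

for epoch in range(4):
    if epoch == UNFREEZE_EPOCH:
        # Unfreeze mid-training and register the parameters with the
        # optimizer so they are updated from this point on.
        for p in backbone.parameters():
            p.requires_grad = True
        optimizer.add_param_group({"params": list(backbone.parameters())})

    x = torch.randn(16, 4)
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After UNFREEZE_EPOCH, every parameter is trainable again.
print(all(p.requires_grad for p in model.parameters()))
```

In Lightning, the same logic can live in an epoch-level hook or a callback; the optimizer bookkeeping is the part people most often miss when gradients behave unexpectedly after unfreezing.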
Answer selected by Borda