Are the pretrained weights frozen while training? #15

Open
prateekmalhotra-hover opened this issue Jul 23, 2019 · 3 comments

@prateekmalhotra-hover

Hi! Great work! I just wanted to ask in more detail whether you freeze the pretrained weights while training. Thanks!

@iperov

iperov commented Aug 31, 2019

Of course not.

@godsmokescrack

I think the pretrained weights should be frozen (at least at first), or some of the advantage they give you will be erased. For instance, the loss curves listed below show training/validation loss on a small segmentation dataset for three different scenarios; the best performance came from freezing the pretrained weights. A sketch of how such freezing can be done follows the list.

  1. Frozen weights on a pretrained vgg11 encoder (image: ternaus_frozen)

  2. Unfrozen weights on the pretrained encoder (image: DRIVE_ternausnet_loss_curves)

  3. Randomly initialized, pretrained=False (image: DRIVE_ternausnet_loss_curves_pt_false)

@rrryan2016

Has the author already provided pretrained weights?
