How to train models with datasets which have more than 1000 classes? #518
Unanswered
linhduongtuan asked this question in Q&A
Replies: 1 comment
@linhduongtuan not sure I understand the question, you just change the num_classes passed to the model on creation, or afterwards via reset_classifier or manual model slicing and dicing. There isn't much else to it for classification. The layer before the classifier doesn't need to have more feature dims than the number of classes. Look at the ResNetV2 in21k bitm models as an example: they are just standard pre-activation resnets (aside from the groupnorm/std conv bits) w/ 2048 pooled features feeding a 21k-class head. You could experiment with more complex heads and add an MLP stack, but that will likely just increase your chances of overfitting unless you add sufficient extra regularization.
-
Hi Ross and folks,
Can you give guidance on modifying your models, like the EfficientNet or NFNet families, to train on datasets with more than 1000 classes?
Especially for the case where the number of classes is larger than the feature dimension coming out of the last pooling layer.
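For concreteness, the setup I mean is a head whose output is wider than its input, e.g. (the 2048/5000 sizes are illustrative):

```python
import torch
import torch.nn as nn

# 2048 pooled features feeding a 5000-class classifier head.
pooled = torch.randn(8, 2048)       # batch of pooled embeddings
classifier = nn.Linear(2048, 5000)  # out_features > in_features
logits = classifier(pooled)
print(logits.shape)  # torch.Size([8, 5000])
```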
Best regards.
Linh