No eval loss and eval accuracy when calling trainer.evaluate() #542
Hey @songmh99, one possible reason for this could be that the trainer does not get the labels in the format it expects. For example, if the labels are not named `labels`, you have to specify the label column name explicitly.
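A sketch of that advice: the Hugging Face `Trainer` looks for a field named `labels` by default, so a dataset whose column is called `label` needs renaming (with Hugging Face `datasets` one would typically call `dataset.rename_column("label", "labels")`). The plain-Python illustration below uses hypothetical example dicts, not the user's actual data:

```python
def rename_label_column(examples, old="label", new="labels"):
    """Return copies of the examples with the label field renamed."""
    renamed = []
    for ex in examples:
        ex = dict(ex)          # copy so the input is left untouched
        ex[new] = ex.pop(old)  # move the value under the key Trainer expects
        renamed.append(ex)
    return renamed

examples = [{"text": "good movie", "label": 1},
            {"text": "bad movie", "label": 0}]
examples = rename_label_column(examples)
print(examples[0])  # {'text': 'good movie', 'labels': 1}
```

Once the labels sit under the expected key, the trainer can compute a loss during evaluation.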
Thanks a lot! It does work! :)
However, when I only train an adapter DistilBERT model (without distillation), the training report shows that:
The number of trainable parameters (without distillation) is the same as in the distillation training process. Details: the teacher model and student model are as follows:
The code for training only the adapter DistilBERT model (without distillation) is as follows:
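The model printouts and code did not survive in this archived thread, but equal trainable-parameter counts are expected here: in adapter-transformers, `train_adapter()` freezes the backbone weights, and a frozen teacher contributes no trainable parameters, so distillation and adapter-only training report the same count. A minimal sketch of the check, using a stand-in parameter class instead of real torch tensors (in PyTorch the equivalent is `sum(p.numel() for p in model.parameters() if p.requires_grad)`):

```python
class Param:
    """Minimal stand-in for a tensor parameter (illustration only)."""
    def __init__(self, n, requires_grad):
        self.n = n
        self.requires_grad = requires_grad
    def numel(self):
        return self.n

def count_trainable(params):
    """Sum the element counts of parameters that receive gradients."""
    return sum(p.numel() for p in params if p.requires_grad)

# Hypothetical sizes: train_adapter() freezes the backbone, so only the
# adapter weights stay trainable; the teacher is fully frozen.
student = [Param(1000, False), Param(50, True)]   # frozen backbone + adapter
teacher = [Param(1000, False), Param(50, False)]  # fully frozen teacher
print(count_trainable(student + teacher))  # → 50, same as student alone
```

Adding the frozen teacher changes the total parameter count but not the trainable count, which matches the reported numbers.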
This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.
This issue was closed because it was stale for 14 days without any activity.
Environment info
adapter-transformers version:

Details
I want to use knowledge distillation between two adapter models.
The problem occurs when the teacher model and the student model are both adapter models.
When the teacher model is an adapter model and the student model is a Hugging Face model, it can train and evaluate normally.
When I run the code, it only reports the training loss (no validation loss). And when I run `trainer3.evaluate()`, there are no `eval_loss` and `eval_accuracy` in the result, only `eval_runtime` and so on.
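In general, `eval_loss` only appears when the evaluation batches actually contain labels, and `eval_accuracy` only appears when a `compute_metrics` function is passed to the `Trainer`. A minimal, pure-Python sketch of such a function (the real one receives an `EvalPrediction` holding NumPy arrays; here it is unpacked as plain lists):

```python
def compute_metrics(eval_pred):
    """Accuracy from (logits, labels), Trainer-style compute_metrics sketch."""
    logits, labels = eval_pred
    # argmax over the class dimension of each row of logits
    preds = [max(range(len(row)), key=row.__getitem__) for row in logits]
    correct = sum(int(p == l) for p, l in zip(preds, labels))
    return {"accuracy": correct / len(labels)}

# toy check: 2 of 3 predictions correct → accuracy 2/3
logits = [[0.1, 0.9], [0.8, 0.2], [0.3, 0.7]]
labels = [1, 0, 0]
print(compute_metrics((logits, labels)))
```

Passed as `Trainer(..., compute_metrics=compute_metrics)`, the returned keys are prefixed with `eval_` in the evaluation report, yielding `eval_accuracy`.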
When I add the commented lines of code:
It reports the following error:
Thanks for your help!