Validation batch size 2x that of training batch size #473
-
I am going through this tutorial ( https://colab.research.google.com/github/fepegar/torchio-notebooks/blob/main/notebooks/TorchIO_tutorial.ipynb#scrollTo=zzVqpgRf5eFN ). Is there a specific reason why the validation batch size is 2x the training batch size?
Replies: 1 comment
-
Yes, there is a reason. In validation, you don't need to store the activations for backpropagation. Therefore, more memory is available and we can use a larger batch size. I have added a little comment about that.
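To illustrate the point: during training, autograd keeps intermediate activations around for the backward pass, while wrapping validation in `torch.no_grad()` discards them, freeing memory for a larger batch. Here is a minimal sketch (the batch sizes and the toy linear model are made up for illustration, not taken from the tutorial):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical sizes; the tutorial's actual values may differ.
training_batch_size = 16
validation_batch_size = 2 * training_batch_size  # activations aren't stored, so more memory is free

dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
training_loader = DataLoader(dataset, batch_size=training_batch_size, shuffle=True)
validation_loader = DataLoader(dataset, batch_size=validation_batch_size)

model = nn.Linear(10, 1)

# Training step: autograd records the graph, keeping activations for backprop.
for inputs, targets in training_loader:
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    break

# Validation step: no_grad() skips building the graph entirely,
# which is why a larger batch fits in the same GPU memory.
with torch.no_grad():
    for inputs, targets in validation_loader:
        outputs = model(inputs)
        break
```

Note that `outputs.requires_grad` is `False` inside the `no_grad()` block, confirming no graph (and no activation storage) is created.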