Is your feature request related to a problem?
There are many different optimizers for neural networks, and their performance can vary depending on the data and the specific network. Therefore, we should give the user the option to change the optimizer.
Desired solution
There are two different options we could implement.
The first (simple) option would be to add a new parameter to the fit method of the neural network that lets the user choose which optimizer should be used. The parameter should be a literal.
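A minimal sketch of what this could look like, assuming a hypothetical NeuralNetwork class backed by torch (all names and signatures here are illustrative, not the actual API):

```python
from typing import Literal

import torch


class NeuralNetwork:
    """Illustrative stand-in for the library's NN class, not the real API."""

    def __init__(self) -> None:
        self._model = torch.nn.Linear(4, 1)  # placeholder internal model

    def fit(
        self,
        train_data,
        learning_rate: float = 0.05,
        optimizer: Literal["sgd", "adam", "rmsprop"] = "sgd",
    ) -> "NeuralNetwork":
        # The literal is mapped to the matching torch optimizer internally.
        optimizers = {
            "sgd": torch.optim.SGD,
            "adam": torch.optim.Adam,
            "rmsprop": torch.optim.RMSprop,
        }
        torch_optimizer = optimizers[optimizer](self._model.parameters(), lr=learning_rate)
        # ... training loop using torch_optimizer ...
        return self
```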
The second option would be to require the user to create an optimizer instance from a selection of newly implemented optimizer classes (one for each optimizer). The advantage would be that we could let the user modify the parameters of each optimizer. In this case, it would be logical to move the learning_rate parameter from the fit method to the optimizer classes, because different optimizers can benefit from different default learning rates.
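A rough sketch of such an optimizer hierarchy, with hypothetical class names and torch assumed as the backend (again not the actual API). Note how each class can carry its own default learning rate:

```python
from abc import ABC, abstractmethod

import torch


class Optimizer(ABC):
    """Base class of the hypothetical optimizer hierarchy."""

    @abstractmethod
    def _to_torch(self, parameters) -> torch.optim.Optimizer:
        """Create the corresponding torch optimizer for the given parameters."""


class Sgd(Optimizer):
    def __init__(self, learning_rate: float = 0.01, momentum: float = 0.0) -> None:
        self.learning_rate = learning_rate
        self.momentum = momentum

    def _to_torch(self, parameters) -> torch.optim.Optimizer:
        return torch.optim.SGD(parameters, lr=self.learning_rate, momentum=self.momentum)


class Adam(Optimizer):
    # Adam commonly uses a smaller default learning rate than SGD.
    def __init__(self, learning_rate: float = 0.001) -> None:
        self.learning_rate = learning_rate

    def _to_torch(self, parameters) -> torch.optim.Optimizer:
        return torch.optim.Adam(parameters, lr=self.learning_rate)
```

The fit method would then accept such an instance, e.g. model.fit(train_data, optimizer=Adam(learning_rate=0.0005)).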
Possible alternatives (optional)
No response
Screenshots (optional)
No response
Additional Context (optional)
No response
We could implement this similarly to the kernels of an SVM, so basically your second suggestion.
I'm just wondering whether we should really keep this on the fit method or add it to the constructor of the NN classes. This would IMO be more consistent with the classical models, which also might depend on a learning rate, like gradient boosting. It would also support hyperparameter search.
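For illustration, a constructor-based variant might look like this (purely hypothetical names, reusing the Optimizer classes sketched above). It mirrors how classical models already take hyperparameters at construction time and would let a hyperparameter search vary the optimizer like any other constructor argument:

```python
class NeuralNetworkClassifier:
    """Hypothetical NN class that takes the optimizer at construction time."""

    def __init__(self, layers: list[int], optimizer: Optimizer | None = None) -> None:
        self._layers = layers
        # A default keeps the optimizer optional, analogous to a default SVM kernel.
        self._optimizer = optimizer if optimizer is not None else Sgd(learning_rate=0.05)

    def fit(self, train_data) -> "NeuralNetworkClassifier":
        # The stored optimizer configures the training loop here.
        ...
        return self
```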
If you moved the optimizer to the constructor, how would you handle pretrained models? It would be irritating if you had to give the method for pretrained models an optimizer, because most of the time you probably don't want to train the model further; but if you do, you need an optimizer.