Add additional options for optimizer in NNs #939

Open
Marsmaennchen221 opened this issue Oct 2, 2024 · 3 comments
@Marsmaennchen221 (Contributor)
Is your feature request related to a problem?

There are many different optimizers for neural networks, and their performance can be better or worse depending on the data and the specific network. Therefore, we should give the user the option to choose the optimizer.

Desired solution

There are two different options we could implement.

The first (simple) option would be to add a new parameter to the fit method of the neural network for choosing which optimizer should be used. The parameter should be a literal.
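A minimal sketch of what option 1 could look like, assuming PyTorch under the hood; the literal values and the helper `_create_optimizer` are placeholders for illustration, not existing API:

```python
from typing import Literal

import torch

OptimizerLiteral = Literal["sgd", "adam", "adamw", "rmsprop"]


def _create_optimizer(
    name: OptimizerLiteral,
    parameters,
    learning_rate: float,
) -> torch.optim.Optimizer:
    # Map the literal value passed to fit onto the underlying torch optimizer.
    optimizers = {
        "sgd": torch.optim.SGD,
        "adam": torch.optim.Adam,
        "adamw": torch.optim.AdamW,
        "rmsprop": torch.optim.RMSprop,
    }
    return optimizers[name](parameters, lr=learning_rate)
```

fit would then gain a parameter like `optimizer: OptimizerLiteral = "adam"` and call this helper with its existing `learning_rate` argument.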

The second option would be to have the user create an optimizer instance from a set of newly implemented optimizer classes (one per optimizer). The advantage is that we could let the user modify the parameters of each optimizer. In this case, it would be logical to move the learning_rate parameter from the fit method to the optimizer classes, because different optimizers benefit from different default learning rates.
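A rough sketch of option 2, again assuming PyTorch under the hood; all class and parameter names here are made up for illustration:

```python
from abc import ABC, abstractmethod

import torch


class Optimizer(ABC):
    """Base class of the new optimizer hierarchy."""

    @abstractmethod
    def _create(self, parameters) -> torch.optim.Optimizer:
        """Build the underlying torch optimizer for the given model parameters."""


class Adam(Optimizer):
    # learning_rate lives on the optimizer, with an Adam-specific default
    def __init__(self, learning_rate: float = 0.001, beta_1: float = 0.9, beta_2: float = 0.999) -> None:
        self._learning_rate = learning_rate
        self._betas = (beta_1, beta_2)

    def _create(self, parameters) -> torch.optim.Optimizer:
        return torch.optim.Adam(parameters, lr=self._learning_rate, betas=self._betas)


class StochasticGradientDescent(Optimizer):
    # SGD typically uses a larger default learning rate than Adam
    def __init__(self, learning_rate: float = 0.01, momentum: float = 0.0) -> None:
        self._learning_rate = learning_rate
        self._momentum = momentum

    def _create(self, parameters) -> torch.optim.Optimizer:
        return torch.optim.SGD(parameters, lr=self._learning_rate, momentum=self._momentum)
```

The user would then pass an instance, e.g. `model.fit(train_data, optimizer=Adam(learning_rate=0.0005))`.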


@Marsmaennchen221 (Contributor, Author)

@lars-reimann which option would you prefer?

@lars-reimann (Member)

We could implement this similarly to the kernels of an SVM, so basically your second suggestion.

I'm just wondering whether we should really keep this on the fit method or add it to the constructor of the NN classes. IMO that would be more consistent with the classical models, some of which also depend on a learning rate (e.g. gradient boosting), and it would also support hyperparameter search.
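A sketch of that constructor variant, reusing the hypothetical Optimizer classes from the sketch above; this signature is invented for illustration and does not reflect the actual classes:

```python
class NeuralNetworkRegressor:
    def __init__(self, input_conversion, layers, optimizer: Optimizer | None = None) -> None:
        self._input_conversion = input_conversion
        self._layers = layers
        # Fall back to a sensible default, like the SVM classes do for their kernel.
        self._optimizer = optimizer if optimizer is not None else Adam()

    def fit(self, train_data, epoch_size: int = 25) -> "NeuralNetworkRegressor":
        # self._model would be built as it is today; only the optimizer creation changes.
        torch_optimizer = self._optimizer._create(self._model.parameters())
        ...  # training loop otherwise unchanged
```

Since the optimizer would then be a constructor argument, a hyperparameter search could vary it like any other hyperparameter.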

@Marsmaennchen221 (Contributor, Author)

If we moved the optimizer to the constructor, how would we handle pretrained models? It would be irritating to have to pass an optimizer when loading a pretrained model: most of the time you probably don't want to train it further, but if you do, you need an optimizer.
