Write Tune step #92

Open · kordc opened this issue Sep 29, 2023 · 5 comments
kordc (Collaborator) commented Sep 29, 2023

from typing import Dict
from pytorch_lightning import Trainer  # assuming Lightning's Trainer, given trainer.tune below

def do(self, previous_states: Dict):
    trainer = Trainer()  # Here we should use a different object instead of a plain Trainer.
    # TODO: how to solve this?
    trainer.tune(model=self.model, datamodule=self.datamodule)
SebChw added the v0.3 label on Sep 30, 2023
kordc changed the title from "Solve steps.py Tune class do method" to "Solve steps.py Tune class do method - write whole Tune capability" on Nov 7, 2023
kordc changed the title from "Solve steps.py Tune class do method - write whole Tune capability" to "Write Tune step" on Nov 22, 2023
kordc mentioned this issue on Nov 22, 2023
kordc (Collaborator, author) commented Nov 22, 2023

Tune capability:

  • random over grid search. For simultaneously tuning multiple hyperparameters it may sound tempting to use grid search to ensure coverage of all settings, but keep in mind that it is best to use random search instead. Intuitively, this is because neural nets are often much more sensitive to some parameters than others. In the limit, if a parameter a matters but changing b has no effect, then you'd rather sample a more thoroughly than at a few fixed points multiple times; see the sketch after this list.
  • hyper-parameter optimization. There are a large number of fancy Bayesian hyper-parameter optimization toolboxes around and a few of my friends have also reported success with them, but my personal experience is that the state-of-the-art approach to exploring a nice and wide space of models and hyperparameters is to use an intern :). Just kidding.
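
A minimal sketch of the random-search idea, assuming a hypothetical train_and_score callable that trains one configuration and returns a validation metric (higher is better); only the sampling logic is the point here:

import random
from typing import Callable, Dict, Tuple

def random_search(train_and_score: Callable[[Dict], float], n_trials: int, seed: int = 0) -> Tuple[Dict, float]:
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample every hyperparameter independently instead of walking a fixed grid,
        # so the parameter that actually matters gets n_trials distinct values.
        config = {
            "lr": 10 ** rng.uniform(-5, -2),
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        score = train_and_score(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score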

SebChw (Owner) commented Nov 22, 2023

I think the first version of Tune should just support random search over parameter/data/model modifiers.
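
A rough sketch of that direction, assuming a hypothetical step interface where each modifier is a callable that takes and returns a state dict; all names here are illustrative, not the project's actual API:

import random
from typing import Callable, Dict, List

class RandomSearchStep:
    """Hypothetical step: each trial samples one modifier per group and applies it."""

    def __init__(self, modifiers: Dict[str, List[Callable]], n_trials: int, seed: int = 0):
        self.modifiers = modifiers  # e.g. {"parameters": [...], "data": [...], "model": [...]}
        self.n_trials = n_trials
        self.rng = random.Random(seed)

    def do(self, previous_states: Dict):
        trials = []
        for _ in range(self.n_trials):
            # Pick one modifier per group at random for this trial.
            chosen = {group: self.rng.choice(options) for group, options in self.modifiers.items()}
            state = dict(previous_states)
            for modifier in chosen.values():
                state = modifier(state)  # each modifier takes and returns a state dict
            trials.append({"modifiers": chosen, "state": state})
        return trials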

kordc (Collaborator, author) commented Nov 22, 2023

I'd then name it a RandomSearch step. I agree.

SebChw (Owner) commented Nov 22, 2023

You're probably right: we shouldn't have one Tune step with many options, but rather a few simpler ones.

trebacz626 (Collaborator) commented:
Also, we need to remember that trainer.tune only tunes the learning_rate.
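
For context, a minimal usage sketch assuming PyTorch Lightning 1.x, where trainer.tune with auto_lr_find enabled runs only the learning-rate finder (newer Lightning versions moved this into a separate Tuner class); model and datamodule are assumed to be an existing LightningModule and LightningDataModule:

import pytorch_lightning as pl

trainer = pl.Trainer(auto_lr_find=True, max_epochs=1)
# Runs the learning-rate finder and writes the suggested value back to the
# model's lr / learning_rate attribute; nothing else is tuned here.
trainer.tune(model, datamodule=datamodule)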

kordc added the v0.4 label and removed the v0.3 label on Jan 10, 2024