Finalizing eScience contributions #1977

Closed · APJansen opened this issue Mar 4, 2024 · 2 comments

APJansen (Collaborator) commented Mar 4, 2024

Since our time is running out, we thought it would be useful to have an overview of our remaining PRs, separated into essentials and optionals.

I think our main priority should be to get the essentials merged and have a tag from which to start the final runs for the paper.

Essentials

| name | PR | purpose | status |
| --- | --- | --- | --- |
| FK refactor | #1936 | ~3.5x speedup | merged |
| Parallel hyperoptimization with MongoDB (see sketch below) | #1921 | parallelisation across trials -> ~3.5x speedup | merged |
| Hyperopt loss | #1726 | implementation of multiple losses | merged |
| Make weight initialization reproducible | #1923 | testing, consistency | merged |
| Hyperopt runcard | #1986 | not an eScience contribution, but essential to start the runs | merged |
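Of these, the MongoDB-based parallel hyperoptimization has the most moving parts. As a rough illustration of the underlying pattern, here is the generic hyperopt + MongoDB workflow, not the actual code from #1921; the database name, `exp_key`, and toy objective are all invented:

```python
# Illustrative sketch only: the generic hyperopt + MongoDB pattern,
# not the implementation merged in #1921.
from hyperopt import fmin, hp, tpe
from hyperopt.mongoexp import MongoTrials

def objective(x):
    # Toy loss; a real objective would run a fit and return its hyperopt loss.
    return (x - 2.0) ** 2

# Trials live in MongoDB, so several `hyperopt-mongo-worker` processes can
# pull queued trials and evaluate them concurrently; that is where the
# parallel speedup comes from.
trials = MongoTrials("mongo://localhost:27017/hyperopt_db/jobs", exp_key="demo")

best = fmin(
    fn=objective,  # in a real setup this must be importable by the workers
    space=hp.uniform("x", -10.0, 10.0),
    algo=tpe.suggest,
    max_evals=50,
    trials=trials,
)
print(best)
```

With a plain `Trials` object this loop runs serially; swapping in `MongoTrials` plus worker processes is what turns each trial into an independently schedulable job.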

Optional, if time allows

| name | PR | purpose | status |
| --- | --- | --- | --- |
| Avoid idle GPU | #1939 | ~25% speedup | ready for review |
| Avoiding duplicated computations via a single observable model (see sketch below) | #1855 | ~30% speedup | needs feedback |
| Implementation of hyperopt model selection | #1976 | automate and integrate the final selection | in progress |
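The "single observable model" idea behind #1855 is essentially computation sharing. A minimal sketch of that pattern in the Keras functional API (layer names and shapes are invented for the example and are not taken from the PR):

```python
# Hedged illustration of sharing one backbone across observables,
# instead of rebuilding (and re-running) it inside every observable's model.
import tensorflow as tf

inputs = tf.keras.Input(shape=(16,), name="xgrid")
shared = tf.keras.layers.Dense(64, activation="tanh", name="shared_backbone")(inputs)

# Both heads reuse the same `shared` tensor, so the backbone is evaluated
# only once per forward pass.
obs_a = tf.keras.layers.Dense(1, name="observable_a")(shared)
obs_b = tf.keras.layers.Dense(1, name="observable_b")(shared)

model = tf.keras.Model(inputs=inputs, outputs=[obs_a, obs_b])
model.summary()
```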
scarlehoff (Member) commented
I would honestly skip all the optionals (other than the reproducible weight initialisation maybe) and focus instead on polishing the stuff that's already there*. For instance, these problems with TF 2.16 / Python 3.12 will only grow as Keras 3 becomes the standard. Making sure that things like multidense / multireplica fits are robust and that we don't "lose them" when Keras 3.1 comes out is important.

(and, like anything that touches under-the-hood TensorFlow, such as all the Meta-whatever stuff, it will have many chances to break)

*Actually, I would say the weight initialization is part of this polishing.
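A minimal guard in the spirit of this comment might look like the sketch below; the version check and the error message are assumptions for illustration, not code from the repository:

```python
# Hypothetical robustness guard: refuse to run when the installed Keras major
# version is not the one the multidense / multireplica code paths were
# validated against, so an upgrade fails loudly rather than breaking silently.
import keras

keras_major = int(keras.__version__.split(".")[0])

if keras_major >= 3:
    # Keras 3 is multi-backend and changes several internals that custom
    # layers can rely on.
    raise RuntimeError(
        f"Keras {keras.__version__} detected; these code paths are only "
        "validated against Keras 2.x."
    )
```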

scarlehoff (Member) commented
This is now finished, with only #1976 still to be merged (but otherwise complete).
