
optimize the machine learning/deep learning model creation #2

Open
jero98772 opened this issue Sep 1, 2024 · 1 comment

@jero98772
Member

We know that with more data, time, and compute we can make more accurate predictions, and we could pre-compute one powerful model, but the behavior of each sensor is different.
We are thinking about whether to build one super model, or multiple models, to reduce training time and make more accurate predictions.

@jero98772
Member Author

jero98772 commented Sep 8, 2024

The problem is the following:

we need a way to optimize the process of building "strong models" with "strong algorithms" like LSTM, RNN, TCN, and more.
How can we create, or use, something so that these networks do not have to be trained from zero?

I am thinking of something like this:

Agentic Retrieval-Augmented Generation for Time Series Analysis
https://arxiv.org/pdf/2408.14484

But this is a common problem associated with other things, so there may be a simpler solution.
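
One simpler direction could be plain transfer learning: pre-compute a single base model on pooled data from all sensors, then for each sensor copy its weights and fine-tune only a small head, so no sensor's network is trained from zero. Below is a minimal sketch of that idea, assuming PyTorch; the `SensorLSTM` module, the `fit` helper, and the random placeholder data are only illustrative, not anything that exists in this repo or in the paper linked above.

```python
# Minimal sketch: warm-start per-sensor models from one shared base LSTM
# instead of training each one from zero. Placeholder data, PyTorch only.
import torch
import torch.nn as nn

class SensorLSTM(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # per-sensor prediction head

    def forward(self, x):
        out, _ = self.lstm(x)           # (batch, seq_len, hidden)
        return self.head(out[:, -1])    # predict next value from last step

def fit(model, x, y, params, epochs=5, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

# 1) Pre-compute one base model on pooled data from all sensors.
x_all, y_all = torch.randn(256, 20, 1), torch.randn(256, 1)  # placeholder
base = SensorLSTM()
fit(base, x_all, y_all, base.parameters(), epochs=20)

# 2) For each sensor, copy the base weights and fine-tune only the head,
#    so per-sensor training stays cheap but adapts to that sensor's behavior.
x_s, y_s = torch.randn(32, 20, 1), torch.randn(32, 1)  # one sensor's data
sensor_model = SensorLSTM()
sensor_model.load_state_dict(base.state_dict())
for p in sensor_model.lstm.parameters():
    p.requires_grad = False  # freeze the shared feature extractor
fit(sensor_model, x_s, y_s, sensor_model.head.parameters(), epochs=5)
```

Freezing the shared backbone keeps per-sensor training fast; the same pattern would apply to an RNN or TCN backbone if an LSTM does not fit the sensors well.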
