Lazy loading and discarding of unused models #22
If you do not keep the model in the classifier itself as a member variable, it should do what you want, shouldn't it? Just load models when you need the weights, then save them e.g. on disk.
If I see it correctly, the spaCy classifiers, for example, load the model in the constructor of the classifier, and the constructor is called when the classifier is added to the server. So every worker keeps the spaCy models in memory, even if the classifier is seldom called during actual use.
Yes, but nothing prevents you from doing it differently. I read this issue as you wanting your own recommender to be able to be lazy, not as changing the existing ones.
I was suggesting to implement something that extracts this aspect of lazy endpoint management from the classifiers into the framework, so that implementors of classifiers do not need to worry about it, or can easily add it, e.g. by adding a decorator, wrapping the classifier in some kind of LazyClassifier, or extending some kind of base class. Just an idea. Feel free to discard it.
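To make the suggestion concrete, here is a minimal sketch of what such a wrapper could look like. Everything here is hypothetical: the `LazyClassifier` name, the `loader` callable, the `idle_seconds` parameter, and the assumption that wrapped classifiers expose a `predict` method are all illustrative, not part of the framework's actual API.

```python
import threading
import time


class LazyClassifier:
    """Hypothetical wrapper that defers loading the wrapped classifier's model
    until the first prediction, and drops it again after an idle timeout."""

    def __init__(self, loader, idle_seconds=600):
        self._loader = loader            # callable that builds/loads the real classifier
        self._idle_seconds = idle_seconds
        self._lock = threading.Lock()
        self._classifier = None          # not loaded until first use
        self._last_used = 0.0

    def _ensure_loaded(self):
        # Load on demand and remember when the model was last used.
        with self._lock:
            if self._classifier is None:
                self._classifier = self._loader()
            self._last_used = time.monotonic()
            return self._classifier

    def predict(self, document):
        return self._ensure_loaded().predict(document)

    def maybe_unload(self):
        # Call periodically (e.g. from a background timer) to free memory.
        with self._lock:
            idle = time.monotonic() - self._last_used
            if self._classifier is not None and idle > self._idle_seconds:
                self._classifier = None  # model becomes garbage-collectable
```

The framework could install such a wrapper automatically when a classifier is registered, so individual classifier implementations stay unaware of the lazy behaviour.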
It might be interesting if there were an option to initialize classifiers lazily and to free their resources when unused for a while.
Not sure if this is something that should/could be implemented as part of this framework or if it would rather be something to handle at the level of the production server (e.g. gunicorn).
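As a sketch of the server-level alternative: gunicorn can recycle workers after a number of requests, which indirectly frees any models a worker has loaded, at the cost of reloading them in the fresh worker. The flags below are standard gunicorn options; `app:app` is a placeholder for the actual WSGI module path.

```shell
# Recycle each worker after ~500 requests (with jitter so workers don't
# all restart at once); a restarted worker starts with no models in memory.
gunicorn --workers 2 --max-requests 500 --max-requests-jitter 50 app:app
```

This does not give fine-grained per-model eviction, but it bounds memory growth without any changes to the framework.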