How much memory is "enough"? #330
Calculate the amount of VRAM you need for inference: https://huggingface.co/spaces/NyxKrage/LLM-Model-VRAM-Calculator
My question is: how much GPU memory is needed to test Grok-1?
I am assuming you are not running any quantization algorithms. The above is an estimated ballpark; I can't give you exact numbers because the config info for Grok is missing from Hugging Face. If you want to find the actual memory requirement, I would suggest https://huggingface.co/blog/Andyrasika/memory-consumption-estimation
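A minimal sketch of how such a ballpark is usually computed: model weights times bytes per parameter, plus an overhead factor for activations and the KV cache. The 314B parameter count for Grok-1 and the 1.2× overhead factor are assumptions for illustration, not figures from this thread.

```python
def estimate_vram_gb(n_params_billion: float,
                     bytes_per_param: float = 2.0,
                     overhead: float = 1.2) -> float:
    """Rough inference VRAM estimate in GB.

    weights (params x dtype size) scaled by an assumed overhead
    factor for activations / KV cache. Heuristic only.
    """
    return n_params_billion * bytes_per_param * overhead

# Grok-1 is widely reported as ~314B parameters (assumption, not
# confirmed by a published config). In bf16 (2 bytes/param) this
# lands in the hundreds of GB, i.e. multiple 80 GB GPUs.
print(f"{estimate_vram_gb(314):.1f} GB")
```

With 8-bit or 4-bit quantization you would pass `bytes_per_param=1.0` or `0.5` instead, which is why the quantization assumption matters so much for the answer.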
Thank you so much!
No description provided.