Does the type of RAM matter and how much do you need? #935
Replies: 4 comments 2 replies
-
Running larger models requires a powerful CPU, and even a 65B model is demanding for top processors. Higher memory speeds do help, because the model is read from memory for every computation, but first check what your motherboard supports. 64 GB of RAM will let you run a 65B-parameter LLM, but it's still a hard task for the CPU.
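To put rough numbers on the "model is read from memory for every computation" point, here's a back-of-envelope sketch. If generation is memory-bandwidth-bound, each generated token streams the full weights from RAM, so bandwidth divided by model size gives a ceiling on tokens per second. The model size and bandwidth figures below are illustrative assumptions, not benchmarks:

```python
# Rough upper bound on CPU token generation speed, assuming inference is
# memory-bandwidth-bound: every token reads the full set of weights from RAM.

def tokens_per_second(model_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Theoretical ceiling: tokens/s = bandwidth / bytes read per token."""
    return bandwidth_bytes_per_s / model_bytes

GiB = 1024 ** 3

# Assumed figures: a 65B model quantized to ~4 bits is roughly 40 GiB;
# dual-channel DDR4-3200 peaks around 51 GB/s, DDR5-5600 around 90 GB/s.
model = 40 * GiB
for name, bw in [("DDR4-3200 dual channel", 51e9), ("DDR5-5600 dual channel", 90e9)]:
    print(f"{name}: ~{tokens_per_second(model, bw):.1f} tokens/s ceiling")
```

Real throughput will be lower, but the ratio is the useful part: doubling bandwidth roughly doubles the ceiling, which is why memory speed matters more than raw CPU clocks here.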
-
My two cents: a larger amount of RAM will only allow you to load larger models.
-
I can also confirm that with large models, memory becomes a bottleneck.
-
I'm having trouble running a 20B model with 32 GB of DDR4 RAM. Even if I close everything else, Windows (plus Edge as an interface) uses 4 to 6 GB, and while the model itself typically only takes up about 20 GB after that, processing a prompt rapidly consumes the rest of my memory and starts hammering my swap. It does this whether I have Does anybody know if there are any other tricks I can try? I'm not exactly in a position to upgrade my RAM right now.
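One plausible reason prompt processing eats RAM beyond the weights is the KV cache, which grows linearly with context length. A sketch with assumed (hypothetical) model dimensions, so the absolute numbers are illustrative only; check your model's actual config:

```python
# Why a long prompt eats RAM on top of the weights: the key/value cache
# stores two tensors (K and V) per layer, each context_length x embedding_dim.
# Model dimensions below are assumptions for a ~20B model, not real config values.

def kv_cache_bytes(n_layers: int, n_ctx: int, n_embd: int, bytes_per_elem: int = 2) -> int:
    # 2 tensors (K and V) per layer, each n_ctx x n_embd elements
    return 2 * n_layers * n_ctx * n_embd * bytes_per_elem

# Assumed: 60 layers, 6144-wide embeddings, fp16 (2-byte) cache entries.
for ctx in (512, 2048, 8192):
    gib = kv_cache_bytes(60, ctx, 6144) / 1024 ** 3
    print(f"context {ctx}: KV cache ~ {gib:.2f} GiB")
```

If your runner supports it, reducing the context window is the lever this math suggests: under these assumptions the cache at 8192 tokens is 16x the size of the cache at 512.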
-
I currently have about 16GB of plain old RAM (not VRAM) and am planning to upgrade to higher-capacity sticks to use larger models. My question is twofold.
First, is the type of RAM I get going to have a significant effect? Should I, for example, splash out for DDR5 over DDR4 and get the highest MHz rating I can find?
Secondly, how much RAM should I get? I can relatively easily get to 64GB; upgrading beyond that would necessitate building a new system. What would I miss by sticking to 64GB? I see the short list of memory requirements on the front page, which I assume is for llama models, but I wanted to get more detailed opinions.
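For sizing the "what fits in 64GB" question, a back-of-envelope rule is weight footprint ≈ parameter count × bits per weight / 8, plus a few GB of overhead for context and the OS. The bits-per-weight figures below are rough assumptions for common quantization schemes, not exact format sizes:

```python
# Approximate in-memory weight footprint for a few model sizes and
# quantization levels. Bits-per-weight values are rough assumptions.

def model_gib(n_params: float, bits_per_weight: float) -> float:
    return n_params * bits_per_weight / 8 / 1024 ** 3

for params, label in [(13e9, "13B"), (33e9, "33B"), (65e9, "65B")]:
    for bits, scheme in [(4.5, "~4-bit"), (8.5, "~8-bit"), (16.0, "fp16")]:
        print(f"{label} at {scheme}: ~{model_gib(params, bits):.0f} GiB")
```

Under these assumptions, the main thing 64GB rules out is running a 65B model at fp16 (well over 100 GiB), while a ~4-bit 65B model fits with room to spare.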