Replies: 2 comments 1 reply
-
Just to note, the supplied command for testing does not work: based on some googling, the listed CUDA image has apparently been removed. Using the following variation works and produces the expected output:
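Something along these lines (the exact image tag is from memory, so treat it as approximate; any currently published `nvidia/cuda` base tag should do):

```bash
# Sanity-check that Docker can see the GPU. The image tag below is an
# assumption -- the tag referenced in the docs has been removed, so
# substitute any currently published nvidia/cuda base tag.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If the GPU is visible from inside the container, the usual `nvidia-smi` table with the driver and CUDA versions is printed.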
Therefore I think everything that needs to be installed is, and LocalAI should be able to pick up the GPU.
-
I think it's working. Continuing to dig, I realized the image being pulled in the original command was an old one. I updated it to:
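Roughly the following (the exact tag is from memory, so it's an approximation; the point is pulling a current cublas/CUDA build of the LocalAI image rather than the stale one):

```bash
# Run LocalAI with a CUDA-enabled (cublas) image. The tag and flags are
# assumptions -- check the LocalAI releases for the current cublas/CUDA tag.
docker run -p 8080:8080 --gpus all \
  -v $PWD/models:/models \
  quay.io/go-skynet/local-ai:master-cublas-cuda12 \
  --models-path /models --threads 4
```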
Based on that, I believe everything is set up correctly and LocalAI will use the GPU once a model that requires it is loaded. Summarizing:
-
Operating system: Ubuntu 23.10
NVIDIA/CUDA:
NVIDIA Card:
Using the command provided in the documentation for getting LocalAI running with cublas:
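It was roughly of this shape (I'm paraphrasing; the tag and flags below are an approximation, not the verbatim command from the docs):

```bash
# Approximation of the documented cublas invocation (image tag and flags
# are assumptions, not the exact command from the LocalAI docs).
docker run -p 8080:8080 --gpus all \
  -v $PWD/models:/models \
  quay.io/go-skynet/local-ai:v1.40.0-cublas-cuda12 \
  --models-path /models --threads 4
```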
It is not picking up the GPU and leveraging it for acceleration. According to the documentation, I should be seeing a line similar to the following, but it's not there:
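From memory it's the llama.cpp CUDA initialization line; something like this (exact wording is an approximation and varies by version):

```bash
# Check the container logs for the llama.cpp cublas init line.
# "local-ai" is an assumed container name; the expected text is approximate.
docker logs local-ai 2>&1 | grep -i cublas
# expected, roughly:
#   ggml_init_cublas: found 1 CUDA devices:
#     Device 0: <GPU name>, compute capability <x.y>
```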
What is missing / what did I do wrong? Thank you! (Side note: I've previously had this running under macOS using Metal without any issues. This is a first attempt under Linux and NVIDIA.)