Open
Description
I get a "Failed to load shared library" error when trying to run a local model.
----- New Chat -----
Prompt: create snow shovel
Loading Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf...
Error loading Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf:
Traceback (most recent call last):
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\_ctypes_extensions.py", line 67, in load_shared_library
return ctypes.CDLL(str(lib_path), **cdll_args) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Blender Foundation\Blender 4.3\4.3\python\Lib\ctypes\__init__.py", line 376, in __init__
self._handle = _dlopen(self._name, mode)
^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: Could not find module 'C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\lib\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\user_default\meshgen\operators.py", line 91, in execute
backend.load()
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\user_default\meshgen\backend.py", line 125, in load
self._load_local_model()
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\user_default\meshgen\backend.py", line 62, in _load_local_model
self.model = LlamaCppModel(
^^^^^^^^^^^^^^
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\user_default\meshgen\utils.py", line 31, in __init__
import llama_cpp
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\__init__.py", line 1, in <module>
from .llama_cpp import *
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\llama_cpp.py", line 38, in <module>
_lib = load_shared_library(_lib_base_name, _base_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\_ctypes_extensions.py", line 69, in load_shared_library
raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library 'C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\lib\llama.dll': Could not find module 'C:\Users\phara\AppData\Roaming\Blender Foundation\Blender\4.3\extensions\.local\lib\python3.11\site-packages\llama_cpp\lib\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.
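For anyone debugging the same thing: the error message says llama.dll "or one of its dependencies" could not be found, so the file itself may be present while a DLL it depends on (commonly the Microsoft Visual C++ runtime) fails to resolve. Below is a minimal diagnostic sketch, not part of meshgen, that can be pasted into Blender's Python console; the site-packages path is copied from the traceback above and is an assumption about your install layout.

```python
# Hypothetical diagnostic: check whether llama.dll exists and whether ctypes
# can load it directly, so the raw OSError (missing dependency vs. missing
# file) becomes visible. Adjust lib_dir if your extension path differs.
import ctypes
import os
from pathlib import Path

# Path taken from the traceback above (assumption: same layout on your machine).
lib_dir = Path(os.environ["APPDATA"]) / (
    r"Blender Foundation\Blender\4.3\extensions\.local"
    r"\lib\python3.11\site-packages\llama_cpp\lib"
)
dll = lib_dir / "llama.dll"

print("llama.dll exists:", dll.exists())

if dll.exists():
    # On Windows, dependent DLLs must be resolvable from the DLL search path,
    # so add the lib folder before attempting the load.
    os.add_dll_directory(str(lib_dir))
    try:
        ctypes.CDLL(str(dll))
        print("llama.dll loaded OK")
    except OSError as e:
        print("load failed:", e)
```

If the file exists but the direct load still fails, installing the latest Microsoft Visual C++ Redistributable is a common fix for this class of error, though I can't confirm that is the cause here.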