Description
----- New Chat -----
Prompt: Create a cubea1
Loading gemma3...
Error loading gemma3:
Traceback (most recent call last):
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/main.py", line 2869, in completion
    generator = ollama_chat.get_ollama_response(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 313, in get_ollama_response
    response = sync_client.post(
               ^^^^^^^^^^^^^^^^^
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/llms/custom_httpx/http_handler.py", line 576, in post
    raise e
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/llms/custom_httpx/http_handler.py", line 558, in post
    response.raise_for_status()
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/httpx/_models.py", line 829, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '404 Not Found' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/kiie/.config/blender/4.3/extensions/user_default/meshgen/operators.py", line 91, in execute
    backend.load()
  File "/home/kiie/.config/blender/4.3/extensions/user_default/meshgen/backend.py", line 134, in load
    self._load_litellm_model()
  File "/home/kiie/.config/blender/4.3/extensions/user_default/meshgen/backend.py", line 120, in _load_litellm_model
    raise e
  File "/home/kiie/.config/blender/4.3/extensions/user_default/meshgen/backend.py", line 117, in _load_litellm_model
    self.model(input_messages)
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/smolagents/models.py", line 910, in __call__
    response = self.client.completion(**completion_kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/utils.py", line 1247, in wrapper
    raise e
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/utils.py", line 1125, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/main.py", line 3148, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2214, in exception_type
    raise e
  File "/home/kiie/.config/blender/4.3/extensions/.local/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2183, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Ollama_chatException - Client error '404 Not Found' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404
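
A 404 from `http://localhost:11434/api/chat` usually means the Ollama server is reachable but does not have the requested model tag pulled (for example `gemma3` vs. `gemma3:4b`). Here is a minimal sketch to check that outside Blender; it assumes Ollama on the default port and uses the model name `gemma3` taken from the log above (adjust both to match your setup):

```python
import httpx

OLLAMA = "http://localhost:11434"  # default Ollama endpoint, as seen in the traceback
MODEL = "gemma3"                   # assumed tag; adjust to whatever meshgen is configured with

# 1) Ask Ollama which model tags are actually installed.
tags = httpx.get(f"{OLLAMA}/api/tags", timeout=10).json()
installed = [m["name"] for m in tags.get("models", [])]
print("Installed models:", installed)

# 2) Replay the same kind of chat request that litellm's ollama_chat path sends.
#    A tag that is not installed typically comes back as the 404 shown above.
resp = httpx.post(
    f"{OLLAMA}/api/chat",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Create a cube"}],
        "stream": False,
    },
    timeout=60,
)
print(resp.status_code, resp.text[:200])
```

If `/api/tags` does not list a matching tag, pulling the model (`ollama pull gemma3`) or changing the model name configured in the add-on to one that is installed should make the same request return 200 instead of 404.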