
Unable to run #19

Open
Rishisghostwillhuntyoudown opened this issue Dec 26, 2024 · 1 comment

Comments

@Rishisghostwillhuntyoudown

Followed the tutorial, and it keeps giving this error:
(screenshot of the error attached)

@zeeblo
Collaborator

zeeblo commented Dec 26, 2024

Which model do you have selected and does this happen immediately when you try starting a chat? Or does it happen after a couple seconds/minutes?

What's interesting to me is that on line 994

    import ollama
    import httpx

    ai_list = []
    try:
        lst = ollama.list() # line 994

        for i in lst["models"]:
            ai_list.append(i["name"])
    except httpx.ConnectError:
        ai_list = "off"  # Ollama server isn't reachable

it should simply get a list of all your available models; if there aren't any, ai_list just stays an empty list. If Ollama isn't running, it should tell you that Ollama isn't running. The fact that it does neither and instead throws "ResponseError" is odd to me...

I currently can't replicate this error on my end, but I have seen somewhat similar issues here haotian-liu/LLaVA#1666 and here ollama/ollama#2384; in both instances they're using quite unusual models.

But this one seems most relevant ollama/ollama-python#88

If you're using a VPN, it may be interfering with the connection.

If you're not, I'm not 100% sure; I'll need to try to replicate it on my end to see if I can figure out the cause. It could be that the default host location for Ollama is being occupied by another application/service that takes priority over Ollama.
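One quick way to test that last hypothesis is to probe the default Ollama host directly. This is a hypothetical diagnostic snippet, not part of the project's code; it assumes Ollama's default host of `http://localhost:11434`, whose root endpoint a healthy server answers with "Ollama is running". If some other service owns that port, the response body will differ, which could explain a ResponseError instead of a ConnectError.

```python
import urllib.request
import urllib.error

def probe(host="http://localhost:11434"):
    """Return the response body from the given host, or None if
    nothing is reachable there. Assumes Ollama's default host."""
    try:
        with urllib.request.urlopen(host, timeout=3) as r:
            return r.read().decode()
    except (urllib.error.URLError, OSError):
        return None  # nothing is listening on this host

body = probe()
if body is None:
    print("Nothing is answering on the default Ollama port.")
elif "Ollama is running" in body:
    print("Ollama owns the port and is healthy.")
else:
    print("Some other service is answering on the port:", body[:80])
```

If the last branch fires, freeing the port or pointing the client at a different host via the OLLAMA_HOST environment variable would be the next thing to try.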
