Support for Ollama servers? #64
Hello,

Thanks very much. I'm sorry I didn't check that before; I'm not as familiar as I ought to be with GitHub. As always, I really appreciate your speed.
On 2/14/2024 4:39 PM, André-Abush Clause wrote:

Hello,
Yes, it is in progress, in #62. The next release should include this :)
Thanks
Hi all,

I continue to be astonished at the speed and quality of development here; thanks to everyone involved for something that has made life better and easier. I thought I would ask, if I may, for a feature that might become more useful as time passes. Ollama
https://ollama.com/
is a way to run LLMs on a local machine or intranet. There is an accessible client for these, though at an early stage, at
https://github.com/chigkim/VOLlama/

Would it be possible for the add-on to support sending data to an Ollama server/model? It would be particularly nice to be able to send NVDA's objects to the model as well, and, of course, to work with the local models through the central dialogue. Thanks for looking into whether this would be possible.
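For reference, a minimal sketch of what "sending data to an Ollama server" might look like from Python (the language NVDA add-ons are written in). It assumes Ollama's documented REST endpoint, `POST /api/generate` on the default port 11434; the function names and the `OLLAMA_URL` constant are illustrative, not part of the add-on.

```python
import json
import urllib.request

# Default local Ollama endpoint (an assumption; configurable in practice).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for the Ollama REST API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")


def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # A non-streaming response carries the full reply in "response".
        return json.loads(resp.read())["response"]
```

Calling `ask_ollama("llama3", "Hello")` would return the model's reply, assuming a server is running locally with that model pulled.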