Enabling API-keyless usage & locally-run Ollama integration #151
Replies: 3 comments 8 replies
-
Yes, Josh has put together something very useful. I added the option for AWS Bedrock, which doesn't require an API key. It uses the local credentials file that the AWS CLI or boto3 already use, or, if you run the container in AWS, it picks up an IAM role.
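For anyone curious, here's a rough sketch of how that credential resolution works with plain boto3 (the model ID and prompt are just placeholders, not the toolkit's actual configuration):

```python
# Minimal sketch: boto3 resolves credentials from ~/.aws/credentials,
# environment variables, or an attached IAM role -- no API key in code.
# The model ID below is only a placeholder example.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello from Bedrock"}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```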
-
Thanks Pierre! I'd be very happy to have local Ollama support added; the use case makes total sense. The lack of streaming support for that model is OK too, as long as it doesn't totally break things. This also supports my long-term goal for the project: letting people run their own agent stack end to end without relying on a third party. The recent PR adding DeepSeek shows the places I think you'd need to update for this: https://github.com/JoshuaC215/agent-service-toolkit/pull/134/files
Sounds like @madtank may already have it in a fork, so if someone wants to post a working PR I'm happy to help as needed!
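For anyone who picks this up, the Ollama side with LangChain's langchain_ollama package might look roughly like this (the model name and base URL are example values; the actual wiring should follow the pattern in the DeepSeek PR linked above):

```python
# Rough sketch only -- not the toolkit's actual integration.
# Model name and base URL are example values for a local Ollama server.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",                   # any model pulled via `ollama pull`
    base_url="http://localhost:11434",  # default local Ollama endpoint
    temperature=0.5,
)

print(llm.invoke("Say hello in one sentence.").content)
```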
-
I just added experimental Ollama support in #160. I would love feedback on how it works for you. Instructions are here.
-
Hello,
Fantastic repo, and I really appreciate the YT video as well. I just wanted to check in and see if you'd be interested in letting users run this entire app without API keys. Speaking as both a user and a developer: for privacy reasons, some organizations may want to avoid querying external APIs and instead favour locally-run open-source models.
Would you be interested, and if so, how much work do you think this kind of integration would represent?