Is this document still valid: https://github.com/BerriAI/litellm/blob/main/docs/my-website/docs/tutorials/github_copilot_integration.md
I've tried it with GitHub Copilot for IntelliJ, and it does not work. The first issue was that Copilot uses `CONNECT` HTTP proxy requests, which LiteLLM doesn't handle. Then, after sticking a real HTTP proxy in the mix to have it unencapsulate the proxy requests and redirect `api.individual.githubcopilot.com` into LiteLLM, it started failing because Copilot requests paths like `/agents`, which LiteLLM doesn't handle.
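To make the `CONNECT` failure concrete: any HTTPS URL sent through a forward proxy makes curl open a `CONNECT` tunnel to the proxy before sending the real request. A minimal sketch of that, assuming a LiteLLM proxy listening on localhost:4000:

```bash
# An HTTPS request through a forward proxy forces curl to issue
# "CONNECT api.individual.githubcopilot.com:443" to the proxy first.
# LiteLLM serves an OpenAI-compatible API rather than acting as a
# CONNECT-capable forward proxy, so the tunnel is never established.
https_proxy=http://localhost:4000 \
  curl -v https://api.individual.githubcopilot.com/models
```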
Even when I do a manual curl, such as `http_proxy=http://localhost:4000 curl http://api.individual.githubcopilot.com/models`, which is how it would work if Copilot were using LiteLLM as a simple HTTP proxy (which is what the document shows), it responds with a 404.

When I search the LiteLLM code for "copilot", all I see is about using Copilot as a provider. I don't see anything about this special proxy emulation.
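For contrast, the routes LiteLLM does answer are its own OpenAI-compatible endpoints, addressed directly rather than through proxy-style requests. A rough check, assuming the proxy runs on localhost:4000 with a placeholder master key `sk-1234`:

```bash
# Hitting the LiteLLM proxy on one of its own routes works, because
# /v1/models is an endpoint it actually registers.
curl http://localhost:4000/v1/models \
  -H "Authorization: Bearer sk-1234"

# A proxied request for another hostname sends an absolute-URI target
# (GET http://api.individual.githubcopilot.com/models), which matches
# no registered route -- presumably the source of the 404 above.
http_proxy=http://localhost:4000 \
  curl http://api.individual.githubcopilot.com/models
```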
So is this document still valid?