
[Bug]: Mismatch in uvicorn dependency for litellm[proxy] on PyPI #7768

Closed
javiermrcom opened this issue Jan 14, 2025 · 0 comments · Fixed by #7773
Labels
bug Something isn't working

Comments

@javiermrcom

What happened?

Hello, and thank you for your work on litellm!

I've noticed that recent releases of litellm[proxy] on PyPI still pin uvicorn to >=0.22.0,<0.23.0 (based on the package METADATA), while the GitHub repository shows an updated requirement for uvicorn==0.29.0.

This discrepancy causes dependency conflicts in projects that use FastAPI (>=0.111),
because FastAPI requires uvicorn>=0.26.0, while litellm[proxy] enforces uvicorn<0.23.0.
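For reference, the pin that actually ships on PyPI can be inspected from the installed distribution's metadata. A minimal sketch, assuming litellm has been installed from PyPI into the current environment:

```python
# Minimal check: print the uvicorn constraint declared in the litellm
# wheel METADATA (assumes litellm is installed from PyPI locally).
from importlib.metadata import requires

# requires() returns the raw Requires-Dist strings from the package metadata.
for req in requires("litellm") or []:
    if req.startswith("uvicorn"):
        print(req)  # expected to show the >=0.22.0,<0.23.0 pin described above
```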

Could you please review and publish a new release on PyPI that reflects the correct uvicorn dependency?
This would greatly help avoid installation issues for downstream projects.

Thank you in advance! Let me know if there's anything else I can do to assist.

Relevant log output

No response

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.58.1

Twitter / LinkedIn details

No response

javiermrcom added the bug (Something isn't working) label on Jan 14, 2025
rajatvig pushed a commit to rajatvig/litellm that referenced this issue Jan 16, 2025
* build(pyproject.toml): bump uvicorn dependency requirement

Fixes BerriAI#7768

* fix(anthropic/chat/transformation.py): fix is_vertex_request check to actually use optional param passed in

Fixes BerriAI#6898 (comment)

* fix(o1_transformation.py): fix azure o1 'is_o1_model' check to just check for o1 in model string

BerriAI#7743

* test: load vertex creds