What happened?
Hello, and thank you for your work on litellm!
I've noticed that recent releases of `litellm[proxy]` on PyPI still pin `uvicorn` to `>=0.22.0,<0.23.0` (based on the package METADATA), while the GitHub repository shows an updated requirement of `uvicorn==0.29.0`.

This discrepancy causes dependency conflicts in projects that use FastAPI (`>=0.111`), because FastAPI requires `uvicorn>=0.26.0` while `litellm[proxy]` enforces `uvicorn<0.23.0`.

Could you please review and publish a new release on PyPI that reflects the correct `uvicorn` dependency? This would greatly help avoid installation issues for downstream projects.
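For reference, here is a minimal way to reproduce the conflict (a hypothetical command, assuming the PyPI metadata still carries the old pin): asking pip for both `litellm[proxy]` and a `uvicorn` new enough for recent FastAPI should make the resolver fail.

```bash
# Hypothetical reproduction sketch: if litellm[proxy] v1.58.1 on PyPI pins
# uvicorn>=0.22.0,<0.23.0, then requesting uvicorn>=0.26.0 alongside it
# should trigger a pip dependency-resolution conflict.
pip install "litellm[proxy]==1.58.1" "uvicorn>=0.26.0"
```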
Thank you in advance! Let me know if there's anything else I can do to assist.
Relevant log output
No response
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.58.1
Twitter / LinkedIn details
No response