Releases: BerriAI/litellm

v1.52.6

13 Nov 05:14
73c7b73

What's Changed

  • LiteLLM Minor Fixes & Improvements (11/12/2024) by @krrishdholakia in #6705
  • (feat) helm hook to sync db schema by @ishaan-jaff in #6715
  • (fix proxy redis) Add redis sentinel support by @ishaan-jaff in #6154
  • Fix: Update gpt-4o costs to those of gpt-4o-2024-08-06 by @klieret in #6714
  • (fix) using Anthropic response_format={"type": "json_object"} by @ishaan-jaff in #6721
  • (feat) Add cost tracking for Azure Dall-e-3 Image Generation + use base class to ensure basic image generation tests pass by @ishaan-jaff in #6716

Full Changelog: v1.52.5...v1.52.6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.6
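
With `STORE_MODEL_IN_DB=True`, models are registered at runtime via the proxy UI or API. As an alternative sketch, models can also be declared statically in a mounted config file (the model name and key reference below are illustrative):

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

Mount it with `-v $(pwd)/config.yaml:/app/config.yaml` and append `--config /app/config.yaml` to the `docker run` command above.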

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 240.0 | 266.22 | 6.13 | 0.0 | 1833 | 0 | 215.80 | 2902.97 |
| Aggregated | Passed ✅ | 240.0 | 266.22 | 6.13 | 0.0 | 1833 | 0 | 215.80 | 2902.97 |
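
The load test above exercises the proxy's OpenAI-compatible `/chat/completions` route. A minimal sketch of the request shape it sends (model name is illustrative; the virtual key comes from your proxy setup):

```python
import json

# OpenAI-format chat completion payload, as sent to the LiteLLM proxy.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)

# POST `body` to http://localhost:4000/chat/completions with headers:
#   Content-Type: application/json
#   Authorization: Bearer <your-litellm-virtual-key>
print(body)
```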

v1.52.5

12 Nov 06:15

What's Changed

Full Changelog: v1.52.4...v1.52.5

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.5

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 200.0 | 216.13 | 6.22 | 0.0 | 1859 | 0 | 166.98 | 1726.18 |
| Aggregated | Passed ✅ | 200.0 | 216.13 | 6.22 | 0.0 | 1859 | 0 | 166.98 | 1726.18 |

v1.52.4

11 Nov 21:01

What's Changed

  • (feat) Add support for logging to GCS Buckets with folder paths by @ishaan-jaff in #6675
  • (feat) add bedrock image gen async support by @ishaan-jaff in #6672
  • (feat) Add Bedrock Stability.ai Stable Diffusion 3 Image Generation models by @ishaan-jaff in #6673
  • (Feat) 273% improvement GCS Bucket Logger - use Batched Logging by @ishaan-jaff in #6679

Full Changelog: v1.52.3...v1.52.4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 260.0 | 290.15 | 6.10 | 0.0 | 1826 | 0 | 221.48 | 3998.87 |
| Aggregated | Passed ✅ | 260.0 | 290.15 | 6.10 | 0.0 | 1826 | 0 | 221.48 | 3998.87 |

v1.52.3

08 Nov 18:52

What's Changed

Full Changelog: v1.52.2...v1.52.3

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.3

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 210.0 | 236.60 | 6.23 | 0.0 | 1866 | 0 | 180.62 | 3424.58 |
| Aggregated | Passed ✅ | 210.0 | 236.60 | 6.23 | 0.0 | 1866 | 0 | 180.62 | 3424.58 |

v1.52.2-dev1

08 Nov 19:04

Full Changelog: v1.52.3...v1.52.2-dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.2-dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 200.0 | 220.40 | 6.23 | 0.0 | 1863 | 0 | 180.67 | 2967.15 |
| Aggregated | Passed ✅ | 200.0 | 220.40 | 6.23 | 0.0 | 1863 | 0 | 180.67 | 2967.15 |

v1.52.2

08 Nov 15:12
1bef645

What's Changed

  • chore: comment for maritalk by @nobu007 in #6607
  • Update gpt-4o-2024-08-06, and o1-preview, o1-mini models in model cost map by @emerzon in #6654
  • (QOL improvement) add unit testing for all static_methods in litellm_logging.py by @ishaan-jaff in #6640
  • (feat) log error class, function_name on prometheus service failure hook + only log DB related failures on DB service hook by @ishaan-jaff in #6650
  • Update several Azure AI models in model cost map by @emerzon in #6655
  • ci(conftest.py): reset conftest.py for local_testing/ by @krrishdholakia in #6657
  • Litellm dev 11 07 2024 by @krrishdholakia in #6649

Full Changelog: v1.52.1...v1.52.2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 251.09 | 6.09 | 0.0 | 1822 | 0 | 198.73 | 1667.41 |
| Aggregated | Passed ✅ | 230.0 | 251.09 | 6.09 | 0.0 | 1822 | 0 | 198.73 | 1667.41 |

v1.52.1

07 Nov 20:53
27e1835

What's Changed

Full Changelog: v1.52.0...v1.52.1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 260.0 | 292.83 | 6.11 | 0.0 | 1828 | 0 | 230.12 | 2643.36 |
| Aggregated | Passed ✅ | 260.0 | 292.83 | 6.11 | 0.0 | 1828 | 0 | 230.12 | 2643.36 |

v1.52.0-stable

09 Nov 02:05
695f48a

What's Changed

Full Changelog: v1.51.3...v1.52.0-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.0-stable

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 240.0 | 270.30 | 6.12 | 0.0 | 1833 | 0 | 212.83 | 1622.24 |
| Aggregated | Passed ✅ | 240.0 | 270.30 | 6.12 | 0.0 | 1833 | 0 | 212.83 | 1622.24 |

v1.52.0

05 Nov 18:16
695f48a

What's Changed

Full Changelog: v1.51.3...v1.52.0

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.0

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 210.0 | 231.07 | 6.29 | 0.0 | 1883 | 0 | 180.75 | 2618.39 |
| Aggregated | Passed ✅ | 210.0 | 231.07 | 6.29 | 0.0 | 1883 | 0 | 180.75 | 2618.39 |

v1.51.3-dev1

04 Nov 11:17

What's Changed

Full Changelog: v1.51.3...v1.51.3-dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.51.3-dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 292.37 | 6.16 | 0.0 | 1844 | 0 | 226.11 | 2207.69 |
| Aggregated | Passed ✅ | 250.0 | 292.37 | 6.16 | 0.0 | 1844 | 0 | 226.11 | 2207.69 |