Skip to content

Releases: BerriAI/litellm

v1.31.14

15 Mar 15:44

Choose a tag to compare

What's Changed

New Contributors

Full Changelog: v1.31.13...v1.31.14

Load Test LiteLLM Proxy Results

Name Status Median Response Time (ms) Average Response Time (ms) Requests/s Failures/s Request Count Failure Count Min Response Time (ms) Max Response Time (ms)
/chat/completions Passed ✅ 88 97.8719389956135 1.5238205426081837 0.0 456 0 80.47983800003067 1042.1901939999998
/health/liveliness Passed ✅ 62 64.17129267745518 15.178054615189408 0.0 4542 0 59.447291000026325 1284.6712140000136
/health/readiness Passed ✅ 180.0 180.3285143601241 14.967526777065908 0.0 4479 0 123.18138000000545 1374.4275610000045
Aggregated Passed ✅ 90 120.690833738736 31.6694019348635 0.0 9477 0 59.447291000026325 1374.4275610000045

v1.31.13

15 Mar 03:43

Choose a tag to compare

What's Changed

Full Changelog: v1.31.12...v1.31.13

v1.31.12

14 Mar 21:07

Choose a tag to compare

What's Changed

Full Changelog: v1.31.10...v1.31.12

v1.31.10

14 Mar 05:37

Choose a tag to compare

Full Changelog: v1.31.9...v1.31.10

Stable Release: v1.30.2 👈 Recommended stable version of proxy.

v1.31.9

14 Mar 04:57

Choose a tag to compare

Stable Release: v1.30.2 👈 Recommended stable version of proxy.

What's Changed

New Contributors

Full Changelog: v1.31.8...v1.31.9

v1.31.8

13 Mar 19:28
3e66b50

Choose a tag to compare

What's Changed

  • feat(prompt_injection_detection.py): support simple heuristic similarity check for prompt injection attacks by @krrishdholakia in #2498

Load Tests Run on every new release

Group 207

Full Changelog: v1.31.7...v1.31.8

v1.31.7

13 Mar 19:06

Choose a tag to compare

What's Changed

New Contributors

Full Changelog: v1.31.6...v1.31.7

v1.31.6

13 Mar 05:06

Choose a tag to compare

What's Changed

Full Changelog: v1.31.5...v1.31.6

v1.31.5

13 Mar 02:46
3922b82

Choose a tag to compare

What's Changed

Full Changelog: v1.31.4...v1.31.5

v1.31.4

12 Mar 20:13
0d18f3c

Choose a tag to compare

[BETA] Thrilled to launch support for Cohere/Command-R on LiteLLM , LiteLLM Proxy Server 👉 Start here https://docs.litellm.ai/docs/providers/cohere

☎️ PR for using cohere tool calling in OpenAI format: #2479

⚡️ LiteLLM Proxy + @langfuse - High Traffic - support 80+/Requests per second with Proxy + Langfuse logging https://docs.litellm.ai/docs/proxy/logging

⚡️ New Models - Azure GPT-Instruct models https://docs.litellm.ai/docs/providers/azure#azure-instruct-models

🛠️ Fix for using DynamoDB + LiteLLM Virtual Keys

What's Changed

Group 5750

Full Changelog: v1.30.2...v1.31.4