Closed as not planned
### The Feature
Sending the `metadata` payload works with the LiteLLM proxy, but the SDK ignores it. The goal is to be able to pass a session ID in the payload for OpenAI API calls:
```python
import litellm

litellm.api_key = "xxx"
litellm._turn_on_debug()

response = litellm.completion(
    model="gpt-4.1-nano",  # Change this to the model deployed on your LiteLLM server
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku about the ocean burning"},
    ],
    metadata={"session_id": "1234"},
)
print(response.choices[0].message["content"])
```
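One way to check whether the metadata actually reaches logging is a custom success callback. The sketch below assumes the payload shape litellm passes to callbacks (a `litellm_params` dict containing `metadata`); the helper name `log_session_id` and the simulated payload are illustrative, not part of the SDK:

```python
def log_session_id(kwargs, completion_response=None, start_time=None, end_time=None):
    """Extract session_id from the metadata litellm passes to callbacks.

    Assumes the callback kwargs contain a "litellm_params" dict with a
    "metadata" key; returns None when the metadata was dropped.
    """
    metadata = (kwargs.get("litellm_params") or {}).get("metadata") or {}
    return metadata.get("session_id")

# Simulated callback invocation with the assumed payload shape.
example_kwargs = {
    "model": "gpt-4.1-nano",
    "litellm_params": {"metadata": {"session_id": "1234"}},
}
print(log_session_id(example_kwargs))  # prints "1234" when metadata survives
```

If the SDK silently drops `metadata` before invoking callbacks, the helper returns `None`, which is exactly the symptom this issue reports.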
### Motivation, pitch
Ensure metadata is available in logging for all integrations.
### LiteLLM is hiring a founding backend engineer, are you interested in joining us and shipping to all our users?
No
### Twitter / LinkedIn details
_No response_