What happened?
When using the `service_tier` parameter with an OpenAI model (Responses endpoint), only the prompt is logged in Langfuse; the answer is not logged.
The answer is correctly logged in LiteLLM itself.
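A minimal reproduction sketch (the model name, prompt, and `service_tier` value are illustrative, and Langfuse credentials are assumed to be set via environment variables):

```python
import litellm

# Assumes LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY are set in the environment.
litellm.success_callback = ["langfuse"]

# Call the OpenAI Responses endpoint with service_tier set.
# Expected: both the prompt and the answer appear in the Langfuse trace.
# Actual: only the prompt is logged; the answer is missing.
response = litellm.responses(
    model="openai/gpt-4o",            # illustrative model
    input="What is the capital of France?",  # illustrative prompt
    service_tier="default",
)

# The answer is present in LiteLLM's own response object.
print(response)
```

Removing `service_tier` from the call makes the answer appear in Langfuse as expected.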
Relevant log output
Are you a ML Ops Team?
Yes
What LiteLLM version are you on?
Latest main
Twitter / LinkedIn details
No response