What happened?
I decided to split this out of #11742 into a new issue.
I am on 1.75.7, and with:
litellm.success_callback = ["langfuse_otel"]
litellm.failure_callback = ["langfuse_otel"]
I get:
2025-08-16 10:21:41,617 - opentelemetry.trace - WARNING - Overriding of current TracerProvider is not allowed
10:02:06 - LiteLLM:WARNING: opentelemetry.py:171 - Proxy Server is not installed. Skipping OpenTelemetry initialization.
10:02:07 - LiteLLM:ERROR: _utils.py:283 - [Arize/Phoenix] Failed to set OpenInference span attributes: 'CompletionUsage' object has no attribute 'get'
Removing langfuse_otel from the callbacks makes these errors go away. This is with the latest LiteLLM and Langfuse v3.2.7. My guess is that enabling langfuse_otel unintentionally turns on other Arize/Phoenix and OpenTelemetry integrations, and also tries to initialize the tracer a second time (hence the first warning).
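For reference, a minimal sketch of how I trigger this (the model name and prompt are just illustrative; it assumes the usual Langfuse environment variables and an OpenAI API key are set):

```python
# Minimal reproduction sketch (illustrative, not the exact script I run).
# Assumes OPENAI_API_KEY plus LANGFUSE_* env vars are configured.
import litellm

# Enabling the langfuse_otel callback is what produces the warnings/errors above.
litellm.success_callback = ["langfuse_otel"]
litellm.failure_callback = ["langfuse_otel"]

# Any completion call triggers the callback initialization and the
# OpenTelemetry / Arize-Phoenix log lines shown in the output above.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```

With the two callback lines removed (or set to a different logger), the same completion call runs without any of the warnings.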
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
1.75.7
Twitter / LinkedIn details
No response