[Bug]: When using service_tier, responses are not logged in Langfuse (OTel) #13869

@superpoussin22

Description

What happened?

When using the service_tier parameter with an OpenAI model (Responses endpoint), only the prompt is logged in Langfuse; the answer is not logged.

The answer itself is correctly logged by LiteLLM.
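For context, a minimal reproduction sketch (not from the issue itself): it assumes the langfuse_otel callback name from LiteLLM's Langfuse OTel integration, and the model name, Langfuse keys, and service_tier value are placeholders.

```python
import os
import litellm

# Placeholder Langfuse credentials for the OTel exporter.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."

# Assumed setup: enable the Langfuse OTel logging integration.
litellm.callbacks = ["langfuse_otel"]

# Call the Responses endpoint with service_tier set.
response = litellm.responses(
    model="openai/gpt-4o",   # placeholder model
    input="Say hello",
    service_tier="default",  # per the report, this parameter triggers the bug
)

# The answer is returned (and logged by LiteLLM); only the Langfuse
# trace is missing the output when service_tier is present.
print(response)
```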

Relevant log output

Are you an ML Ops Team?

Yes

What LiteLLM version are you on?

Latest main

Twitter / LinkedIn details

No response
