
[Bug]: Amazon Titan stops sequences filtering only works when litellm.drop_params = True #13730

@aunt-bitsy

Description


What happened?

When using Amazon Titan models with standard stop sequences, LiteLLM returns a BadRequestError. The stop sequence filtering for Titan's required format only happens when litellm.drop_params is set to True, but it should happen regardless of this setting.
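
For reference, a minimal reproduction along these lines triggers the error (the stop string is illustrative; any stop sequence that does not match Titan's allowed pattern fails the same way, assuming AWS credentials are configured):

```python
import litellm

# litellm.drop_params defaults to False, so the non-matching stop
# sequence is forwarded to Bedrock unchanged and the request is rejected.
response = litellm.completion(
    model="bedrock/amazon.titan-text-express-v1",
    messages=[{"role": "user", "content": "Say hello"}],
    stop=["</function"],  # does not match ^(\|+|User:)$ -> Bedrock 400
)
```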

code: https://github.com/BerriAI/litellm/blob/c99277c51736a331d1deeb49339adba997fa1b42/litellm/llms/bedrock/chat/invoke_transformations/amazon_titan_transformation.py#L73C51-L73C62

context: Amazon Titan models have a strict requirement that stop sequences match the pattern ^(\|+|User:)$. The filtering should always happen regardless of the drop_params setting, otherwise requests fail when standard stop sequences are passed.
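
A possible direction for the fix, sketched below (this is not the actual transformation code; the function and constant names are illustrative), is to filter the stop sequences against Titan's pattern unconditionally rather than only when drop_params is True:

```python
import re
from typing import List, Union

# Titan only accepts stop sequences of one-or-more "|" characters or "User:".
TITAN_STOP_PATTERN = re.compile(r"^(\|+|User:)$")

def filter_titan_stop_sequences(stop: Union[str, List[str]]) -> List[str]:
    """Keep only stop sequences Titan accepts, regardless of drop_params."""
    if isinstance(stop, str):
        stop = [stop]
    return [s for s in stop if TITAN_STOP_PATTERN.match(s)]

# e.g. filter_titan_stop_sequences(["</function", "User:"]) -> ["User:"]
```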

Relevant log output

BadRequestError: litellm.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.BadRequestError: BedrockException - {"message":"Malformed input request: string [</function] does not match pattern ^(\\|+|User:)$, please reformat your input and try again."}. Received Model Group=amazon.titan-text-express-v1\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}

Are you a ML Ops Team?

No

What LiteLLM version are you on?

v1.74.4

Twitter / LinkedIn details

No response
