
Problem with integration with LM Studio local server #2

@ibalampanis

Description


Hello,

I am getting the following error when prompting an LM Studio local server:

```
[GIN] 2024/03/05 - 17:49:50 | 200 | 130.618µs | 10.0.8.215 | PUT "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/prompt"
[GIN] 2024/03/05 - 17:49:50 | 200 | 186.652µs | 10.0.8.215 | PUT "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/prompt"
[GIN] 2024/03/05 - 17:49:52 | 200 | 108.349µs | 10.0.8.215 | POST "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/response"
[GIN] 2024/03/05 - 17:49:52 | 200 | 146.219µs | 10.0.8.215 | POST "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/response"
2024/03/05 17:49:52 engine worker: enqueue 0xc0000f12c0
2024/03/05 17:49:52 OpenAiEngineBackend process(): Starting request
2024/03/05 17:49:52 OpenAiEngineBackend process(): ChatCompletionStream error: error, status code: 400, message:
2024/03/05 17:49:52 engine worker: compute done
```
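The 400 arrives with an empty error message, which makes it hard to diagnose from the log alone. One common cause of a 400 from an OpenAI-compatible endpoint such as LM Studio's is a missing or empty `model` field in the chat-completions request — that is only a guess here, not something the log confirms. The sketch below illustrates that failure mode in Go with a mock server (`newMockServer`, `postChat`, and the model name `"local-model"` are all hypothetical names for this example, not part of the project):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

// chatRequest is the minimal OpenAI-compatible chat-completions payload.
type chatRequest struct {
	Model    string              `json:"model"`
	Messages []map[string]string `json:"messages"`
	Stream   bool                `json:"stream"`
}

// newMockServer stands in for an OpenAI-compatible endpoint that rejects
// requests with an empty "model" field, returning 400 as in the log above.
// (Assumption: the real LM Studio server may reject for other reasons.)
func newMockServer() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		var req chatRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil || req.Model == "" {
			http.Error(w, `{"error":"'model' field is required"}`, http.StatusBadRequest)
			return
		}
		w.WriteHeader(http.StatusOK)
	}))
}

// postChat sends a chat-completions request and returns the HTTP status code.
func postChat(baseURL, model string) int {
	body, _ := json.Marshal(chatRequest{
		Model:    model,
		Messages: []map[string]string{{"role": "user", "content": "hello"}},
		Stream:   true,
	})
	resp, err := http.Post(baseURL+"/v1/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	return resp.StatusCode
}

func main() {
	srv := newMockServer()
	defer srv.Close()
	fmt.Println(postChat(srv.URL, ""))            // 400: empty model rejected
	fmt.Println(postChat(srv.URL, "local-model")) // 200: request accepted
}
```

If this is the cause, checking what `OpenAiEngineBackend` puts in the `model` field (and comparing it against the model identifier LM Studio actually loaded) would be the next step; logging the full 400 response body would also make future reports easier to debug.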
