Description
Is there currently any option for io.quarkiverse.langchain4j.azure.openai.AzureOpenAiChatModel to support structured JSON output?
Based on my testing, the jsonSchema set on the ResponseFormat is not utilized by AzureOpenAiChatModel:
```java
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.request.ResponseFormat;
import dev.langchain4j.model.chat.request.ResponseFormatType;
import dev.langchain4j.model.chat.response.ChatResponse;

ResponseFormat format = ResponseFormat.builder()
        .type(ResponseFormatType.JSON)
        .jsonSchema(getSchema()) // my helper that builds the JsonSchema
        .build();

ChatRequest chatRequest = ChatRequest.builder()
        .responseFormat(format)
        .messages(messages)
        .build();

ChatResponse response = model.chat(chatRequest); // will not use the schema
```
However, I have been able to receive structured JSON outputs for my gpt-4o model using the dev.langchain4j.model.azure.AzureOpenAiChatModel implementation.
https://github.com/langchain4j/langchain4j/blob/main/docs/docs/integrations/language-models/azure-open-ai.md#structured-outputs
Perhaps a similar supportedCapabilities(Set.of(RESPONSE_FORMAT_JSON_SCHEMA)) option is needed for the io.quarkiverse model? Or has anyone successfully received structured JSON output from an Azure OpenAI model deployment with the current io.quarkiverse implementation?
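For comparison, here is a minimal sketch of the upstream dev.langchain4j configuration that produced structured output for me, following the linked doc. The endpoint, key, and deployment name are placeholders, and the builder method names are taken from that doc rather than verified against the quarkiverse extension:

```java
import java.util.Set;

import dev.langchain4j.model.azure.AzureOpenAiChatModel;
import static dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEMA;

// Sketch based on the linked langchain4j doc; endpoint/key/deployment are placeholders.
AzureOpenAiChatModel model = AzureOpenAiChatModel.builder()
        .endpoint("https://<resource>.openai.azure.com/")      // placeholder
        .apiKey(System.getenv("AZURE_OPENAI_KEY"))             // placeholder env var
        .deploymentName("gpt-4o")
        // Advertising JSON-schema support is what makes the model honor
        // ResponseFormat.jsonSchema(...) on a ChatRequest.
        .supportedCapabilities(Set.of(RESPONSE_FORMAT_JSON_SCHEMA))
        .strictJsonSchema(true)
        .build();
```

With this configuration, the same ChatRequest from my snippet above returned schema-conforming JSON. Something equivalent exposed through the quarkiverse configuration (or applied automatically when a jsonSchema is present on the request) is what I am asking about.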