Commit 5226ed1

[Serverless] Update preconfigured connectors (#245445)
## Summary

Related issue: #243094
Follow-up to #242791 (the naming request was updated).

This PR is part of the work to support multiple managed LLMs. It moves away from generic naming and updates the names of the preconfigured connectors.

Because these configurations live in YAML files, the connector id cannot contain a period without escaping it. There seems to be a way to escape special characters using something like `preconfigured.["sonnet-3.7"].actionTypeId`, but that pattern hasn't been used in Kibana so far and would need to be tested. Given the time-sensitive nature of this change, I updated the ids and replaced each period with a dash, e.g. `Anthropic-Claude-Sonnet-3.7` becomes `Anthropic-Claude-Sonnet-3-7`. From what I can see in Kibana, no logic depends on the id matching the name exactly, so this solution should be fine.

### Checklist

Check that the PR satisfies the following conditions. Reviewers should verify this PR satisfies this list as well.

- [ ] Any text added follows [EUI's writing guidelines](https://elastic.github.io/eui/#/guidelines/writing), uses sentence case text and includes [i18n support](https://github.com/elastic/kibana/blob/main/src/platform/packages/shared/kbn-i18n/README.md)
- [ ] [Documentation](https://www.elastic.co/guide/en/kibana/master/development-documentation.html) was added for features that require explanation or tutorials
- [ ] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [ ] If a plugin configuration key changed, check if it needs to be allowlisted in the cloud and added to the [docker list](https://github.com/elastic/kibana/blob/main/src/dev/build/tasks/os_packages/docker_generator/resources/base/bin/kibana-docker)
- [ ] This was checked for breaking HTTP API changes, and any breaking changes have been approved by the breaking-change committee. The `release_note:breaking` label should be applied in these situations.
- [ ] [Flaky Test Runner](https://ci-stats.kibana.dev/trigger_flaky_test_runner/1) was used on any tests changed
- [ ] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [ ] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.

---------

Co-authored-by: Elastic Machine <[email protected]>
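To make the YAML constraint concrete, here is a minimal sketch (not taken from the changed files) contrasting the hypothetical escaped-key syntax mentioned above, which is untested in Kibana, with the dash substitution this PR actually applies:

```yaml
# Hypothetical escaped-key form mentioned above -- untested in Kibana:
# preconfigured.["sonnet-3.7"].actionTypeId: .inference

# Form used by this PR: the period in the connector id becomes a dash,
# while the human-readable display name keeps the period.
xpack.actions.preconfigured:
  Anthropic-Claude-Sonnet-3-7:
    name: Anthropic Claude Sonnet 3.7
    actionTypeId: .inference
```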
1 parent: cb12cc5

8 files changed (+48, -47 lines)

config/serverless.es.yml (8 additions & 8 deletions)

```diff
@@ -116,8 +116,8 @@ xpack.contentConnectors.ui.enabled: false
 
 # Elastic Managed LLMs
 xpack.actions.preconfigured:
-  General-Purpose-LLM-v1:
-    name: General Purpose LLM v1
+  Anthropic-Claude-Sonnet-3-7:
+    name: Anthropic Claude Sonnet 3.7
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -126,8 +126,8 @@ xpack.actions.preconfigured:
       inferenceId: ".rainbow-sprinkles-elastic"
       providerConfig:
         model_id: "rainbow-sprinkles"
-  General-Purpose-LLM-v2:
-    name: General Purpose LLM v2
+  Anthropic-Claude-Sonnet-4-5:
+    name: Anthropic Claude Sonnet 4.5
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -136,13 +136,13 @@ xpack.actions.preconfigured:
       inferenceId: ".gp-llm-v2-chat_completion"
       providerConfig:
         model_id: "gp-llm-v2"
-  General-Purpose-LLM-v3:
-    name: General Purpose LLM v3
+  OpenAI-GPT-OSS-120B:
+    name: OpenAI GPT-OSS 120B
     actionTypeId: .inference
     exposeConfig: true
     config:
       provider: "elastic"
       taskType: "chat_completion"
-      inferenceId: ".gp-llm-v3-chat_completion"
+      inferenceId: ".openai-gpt-oss-120b-chat_completion"
       providerConfig:
-        model_id: "gp-llm-v3"
+        model_id: "openai-gpt-oss-120b"
```

config/serverless.oblt.complete.yml (9 additions & 8 deletions)

```diff
@@ -25,8 +25,8 @@ xpack.features.overrides:
 
 # Elastic Managed LLMs
 xpack.actions.preconfigured:
-  General-Purpose-LLM-v1:
-    name: General Purpose LLM v1
+  Anthropic-Claude-Sonnet-3-7:
+    name: Anthropic Claude Sonnet 3.7
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -35,8 +35,8 @@ xpack.actions.preconfigured:
       inferenceId: ".rainbow-sprinkles-elastic"
       providerConfig:
         model_id: "rainbow-sprinkles"
-  General-Purpose-LLM-v2:
-    name: General Purpose LLM v2
+  Anthropic-Claude-Sonnet-4-5:
+    name: Anthropic Claude Sonnet 4.5
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -45,13 +45,14 @@ xpack.actions.preconfigured:
       inferenceId: ".gp-llm-v2-chat_completion"
       providerConfig:
         model_id: "gp-llm-v2"
-  General-Purpose-LLM-v3:
-    name: General Purpose LLM v3
+  OpenAI-GPT-OSS-120B:
+    name: OpenAI GPT-OSS 120B
     actionTypeId: .inference
     exposeConfig: true
     config:
       provider: "elastic"
       taskType: "chat_completion"
-      inferenceId: ".gp-llm-v3-chat_completion"
+      inferenceId: ".openai-gpt-oss-120b-chat_completion"
       providerConfig:
-        model_id: "gp-llm-v3"
+        model_id: "openai-gpt-oss-120b"
+
```

config/serverless.security.complete.yml (8 additions & 8 deletions)

```diff
@@ -9,8 +9,8 @@ xpack.features.overrides:
 
 # Elastic Managed LLMs
 xpack.actions.preconfigured:
-  General-Purpose-LLM-v1:
-    name: General Purpose LLM v1
+  Anthropic-Claude-Sonnet-3-7:
+    name: Anthropic Claude Sonnet 3.7
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -19,8 +19,8 @@ xpack.actions.preconfigured:
       inferenceId: ".rainbow-sprinkles-elastic"
       providerConfig:
         model_id: "rainbow-sprinkles"
-  General-Purpose-LLM-v2:
-    name: General Purpose LLM v2
+  Anthropic-Claude-Sonnet-4-5:
+    name: Anthropic Claude Sonnet 4.5
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -29,13 +29,13 @@ xpack.actions.preconfigured:
       inferenceId: ".gp-llm-v2-chat_completion"
       providerConfig:
         model_id: "gp-llm-v2"
-  General-Purpose-LLM-v3:
-    name: General Purpose LLM v3
+  OpenAI-GPT-OSS-120B:
+    name: OpenAI GPT-OSS 120B
     actionTypeId: .inference
     exposeConfig: true
     config:
       provider: "elastic"
       taskType: "chat_completion"
-      inferenceId: ".gp-llm-v3-chat_completion"
+      inferenceId: ".openai-gpt-oss-120b-chat_completion"
       providerConfig:
-        model_id: "gp-llm-v3"
+        model_id: "openai-gpt-oss-120b"
```

config/serverless.security.search_ai_lake.yml (8 additions & 8 deletions)

```diff
@@ -96,8 +96,8 @@ xpack.fleet.internal.registry.searchAiLakePackageAllowlistEnabled: true
 
 # Elastic Managed LLMs
 xpack.actions.preconfigured:
-  General-Purpose-LLM-v1:
-    name: General Purpose LLM v1
+  Anthropic-Claude-Sonnet-3-7:
+    name: Anthropic Claude Sonnet 3.7
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -106,8 +106,8 @@ xpack.actions.preconfigured:
       inferenceId: ".rainbow-sprinkles-elastic"
       providerConfig:
         model_id: "rainbow-sprinkles"
-  General-Purpose-LLM-v2:
-    name: General Purpose LLM v2
+  Anthropic-Claude-Sonnet-4-5:
+    name: Anthropic Claude Sonnet 4.5
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -116,13 +116,13 @@ xpack.actions.preconfigured:
       inferenceId: ".gp-llm-v2-chat_completion"
       providerConfig:
         model_id: "gp-llm-v2"
-  General-Purpose-LLM-v3:
-    name: General Purpose LLM v3
+  OpenAI-GPT-OSS-120B:
+    name: OpenAI GPT-OSS 120B
     actionTypeId: .inference
     exposeConfig: true
     config:
       provider: "elastic"
       taskType: "chat_completion"
-      inferenceId: ".gp-llm-v3-chat_completion"
+      inferenceId: ".openai-gpt-oss-120b-chat_completion"
       providerConfig:
-        model_id: "gp-llm-v3"
+        model_id: "openai-gpt-oss-120b"
```

config/serverless.workplaceai.yml (8 additions & 8 deletions)

```diff
@@ -31,8 +31,8 @@ xpack.product_intercept.enabled: false
 
 # Elastic Managed LLMs
 xpack.actions.preconfigured:
-  General-Purpose-LLM-v1:
-    name: General Purpose LLM v1
+  Anthropic-Claude-Sonnet-3-7:
+    name: Anthropic Claude Sonnet 3.7
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -41,8 +41,8 @@ xpack.actions.preconfigured:
       inferenceId: ".rainbow-sprinkles-elastic"
      providerConfig:
         model_id: "rainbow-sprinkles"
-  General-Purpose-LLM-v2:
-    name: General Purpose LLM v2
+  Anthropic-Claude-Sonnet-4-5:
+    name: Anthropic Claude Sonnet 4.5
     actionTypeId: .inference
     exposeConfig: true
     config:
@@ -51,13 +51,13 @@ xpack.actions.preconfigured:
       inferenceId: ".gp-llm-v2-chat_completion"
       providerConfig:
         model_id: "gp-llm-v2"
-  General-Purpose-LLM-v3:
-    name: General Purpose LLM v3
+  OpenAI-GPT-OSS-120B:
+    name: OpenAI GPT-OSS 120B
     actionTypeId: .inference
     exposeConfig: true
     config:
       provider: "elastic"
       taskType: "chat_completion"
-      inferenceId: ".gp-llm-v3-chat_completion"
+      inferenceId: ".openai-gpt-oss-120b-chat_completion"
       providerConfig:
-        model_id: "gp-llm-v3"
+        model_id: "openai-gpt-oss-120b"
```

x-pack/platform/packages/shared/kbn-elastic-assistant/impl/assistant/helpers.ts (1 addition & 1 deletion)

```diff
@@ -38,7 +38,7 @@ export const getMessageFromRawResponse = (
   }
 };
 
-const ELASTIC_LLM_CONNECTOR_IDS = ['Elastic-Managed-LLM', 'General-Purpose-LLM-v1'];
+const ELASTIC_LLM_CONNECTOR_IDS = ['Elastic-Managed-LLM', 'Anthropic-Claude-Sonnet-3-7'];
 
 /**
  * Returns a default connector if there is only one connector
```
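As a hypothetical illustration only (this helper is not part of the PR), an id list like `ELASTIC_LLM_CONNECTOR_IDS` is the kind of constant code would consult to decide whether a connector is one of the Elastic-managed LLMs:

```typescript
// Hypothetical usage sketch, not code from this PR.
const ELASTIC_LLM_CONNECTOR_IDS = ['Elastic-Managed-LLM', 'Anthropic-Claude-Sonnet-3-7'];

// Returns true when the given connector id names one of the
// preconfigured Elastic-managed LLM connectors listed above.
const isElasticLlmConnector = (connectorId: string): boolean =>
  ELASTIC_LLM_CONNECTOR_IDS.includes(connectorId);

// Example: the renamed id from this PR matches.
isElasticLlmConnector('Anthropic-Claude-Sonnet-3-7'); // true
```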

x-pack/platform/plugins/shared/fleet/cypress/tasks/api_calls/connectors.ts (3 additions & 3 deletions)

```diff
@@ -27,9 +27,9 @@ export const request = <T = unknown>({
 };
 export const INTERNAL_INFERENCE_CONNECTORS = [
   'Elastic-Managed-LLM',
-  'General-Purpose-LLM-v1',
-  'General-Purpose-LLM-v2',
-  'General-Purpose-LLM-v3',
+  'Anthropic-Claude-Sonnet-3-7',
+  'Anthropic-Claude-Sonnet-4-5',
+  'OpenAI-GPT-OSS-120B',
 ];
 export const INTERNAL_CLOUD_CONNECTORS = ['Elastic-Cloud-SMTP'];
 
```

x-pack/solutions/security/plugins/security_solution/public/management/cypress/tasks/insights.ts (3 additions & 3 deletions)

```diff
@@ -19,9 +19,9 @@ import {
 const INTERNAL_CLOUD_CONNECTORS = ['Elastic-Cloud-SMTP'];
 const INTERNAL_INFERENCE_CONNECTORS = [
   'Elastic-Managed-LLM',
-  'General-Purpose-LLM-v1',
-  'General-Purpose-LLM-v2',
-  'General-Purpose-LLM-v3',
+  'Anthropic-Claude-Sonnet-3-7',
+  'Anthropic-Claude-Sonnet-4-5',
+  'OpenAI-GPT-OSS-120B',
 ];
 const INTERNAL_CONNECTORS = [...INTERNAL_CLOUD_CONNECTORS, ...INTERNAL_INFERENCE_CONNECTORS];
 
```
