# Publish an integration via Pull Request [_publish_an_integration]
When your integration is done, it’s time to open a PR to include it in the integrations repository.
Before opening your PR, make sure you have:
1. Passed all checks
Run:
```bash
elastic-package check
```
This command validates that your package is built correctly, formatted properly, and aligned with the specification. Passing this `check` is required before submitting your integration.
2. Added a new entry to `changelog.yml`
Update the package’s `changelog.yml` with a clear description of your changes for the new version.
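For example, a new entry might look like the following (the version, description, and PR link are placeholders, not real values):

```yaml
# changelog.yml (hypothetical entry; newest versions go first)
- version: "1.2.1"
  changes:
    - description: Add TLS support to the access data stream.
      type: enhancement   # one of: enhancement, bugfix, breaking-change
      link: https://github.com/elastic/integrations/pull/1   # placeholder PR link
```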
3. Bumped the package version
If you are releasing new changes, increment the version in your `manifest.yml` file. This is required for the package to be published.
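As a sketch (all file names and contents below are hypothetical), the bumped version in `manifest.yml` should match the newest entry in `changelog.yml`; a quick way to sanity-check this before pushing:

```shell
# Hypothetical package files, created in a scratch directory for illustration.
demo=$(mktemp -d)
cat > "$demo/manifest.yml" <<'EOF'
name: acme_logs
version: "1.2.1"
EOF
cat > "$demo/changelog.yml" <<'EOF'
- version: "1.2.1"
  changes:
    - description: Add TLS support to the access data stream.
      type: enhancement
      link: https://github.com/elastic/integrations/pull/1
EOF
# Extract both versions and compare them.
manifest_ver=$(sed -n 's/^version: "\([^"]*\)".*/\1/p' "$demo/manifest.yml")
changelog_ver=$(sed -n 's/^- version: "\([^"]*\)".*/\1/p' "$demo/changelog.yml" | head -n1)
[ "$manifest_ver" = "$changelog_ver" ] && echo "versions match: $manifest_ver"
# prints: versions match: 1.2.1
```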
4. Written a clear PR title and description
- Use a concise, descriptive title (e.g., `[New Integration] Add Acme Logs integration`).
- In the PR description, summarize what your integration or change does, list key features or fixes, reference related issues, and provide testing instructions.
- Ensure your documentation, sample events, and tests are included and up to date.
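A minimal sketch of what such a PR description might contain (the title, package name, and checklist items are illustrative, not the repository's actual PR template):

```
[New Integration] Add Acme Logs integration

What does this PR do?
- Adds a new integration for Acme application logs with one data stream.
- Includes documentation, sample events, and pipeline tests.

Checklist
- Added an entry to changelog.yml
- Bumped the package version in manifest.yml
- elastic-package check passes locally
```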
::::{tip}
A well-written PR with clear documentation, versioning, and testing instructions will speed up the review and publishing process!
::::
When CI is happy, merge your PR into the integrations repository.
Once the PR with the new version of the package is merged, the required CI pipelines are triggered to release the new version into Package Storage V2 and make it available at https://epr.elastic.co.
::::{tip}
When you are ready for your changes in the integration to be released, remember to bump up the package version. It is up to you, as the package developer, to decide how many changes you want to release in a single version. For example, you could implement a change in a PR and bump up the package version in the same PR. Or you could implement several changes across multiple pull requests and then bump up the package version in the last of these pull requests or in a separate follow up PR.

::::
<!-- docs/extend/add-data-stream.md -->
A data stream defines multiple {{es}} assets, like index templates, ingest pipelines, and field definitions. These assets are loaded into {{es}} when a user installs an integration using the {{fleet}} UI in {{kib}}.
A data stream also defines a policy template. Policy templates include variables that allow users to configure the data stream using the {{fleet}} UI in {{kib}}. Then, the {{agent}} interprets the resulting policy to collect relevant information from the product or service being observed. Policy templates can also define an integration’s supported [`deployment_modes`](/extend/define-deployment-modes.md#set-deployment-modes).
See [data streams](docs-content://reference/fleet/data-streams.md) for more information.
::::
## How to add a data stream [how-to]
1. Bootstrap a new data stream
In your package directory, run:
```bash
elastic-package create data-stream
```
Follow the prompts to set the name, title, and type (logs, metrics, etc.) for the data stream. Repeat this command for each new data stream you want to add.
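The wizard generates a skeleton under `data_stream/` in your package. The exact layout can vary by `elastic-package` version, but it typically looks something like this (using a hypothetical `access` stream):

```
data_stream/
└── access/
    ├── agent/
    │   └── stream/          # Elastic Agent stream configuration template(s)
    ├── elasticsearch/
    │   └── ingest_pipeline/ # ingest pipeline definitions
    ├── fields/              # field definitions
    └── manifest.yml         # data stream manifest: title, type, variables
```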
2. Configure the data stream
After bootstrapping, manually adjust the generated files to suit your use case:
* Define required variables:
In the policy template, specify variables that users can configure (e.g., paths, ports, log levels).
* Define used fields:
Edit the `fields/` files to describe the structure and types of data your stream will ingest.
* Define ingest pipelines:
If needed, create or update ingest pipelines to parse, enrich, or transform incoming data before it’s indexed.
* Update the {{agent}} stream configuration:
Ensure the {{agent}}’s stream configuration matches your data collection requirements and references the correct variables and pipelines.
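To make the variable-definition step concrete, here is a hedged sketch of what a data stream manifest excerpt might look like (the `access` stream name, `acme` paths, and all values are hypothetical, for illustration only):

```yaml
# data_stream/access/manifest.yml (hypothetical excerpt)
title: Acme access logs
type: logs
streams:
  - input: logfile
    title: Acme access logs
    vars:
      - name: paths          # variable the user can set in the {{fleet}} UI
        type: text
        multi: true
        required: true
        show_user: true
        default:
          - /var/log/acme/access.log*
```

The fields files and ingest pipeline definitions follow a similar declarative YAML style, under `fields/` and `elasticsearch/ingest_pipeline/` respectively.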
* When the integration is installed, each data stream is registered in {{es}} as a managed, time-based resource.
* Data sent to the data stream is automatically routed to the correct backing indices, with lifecycle management (rollover, retention) handled by {{es}}.
* Users can query, visualize, and analyze data from each stream in {{kib}}, using the single data stream name (e.g., `logs-apache.access`).