What's the issue?
We're using Dagster to run jobs that execute database operations with parameters supplied at launch time. For example, we use these jobs to correct data in specific database tables.
Our Use Case: We need the ability to run the same job concurrently with different parameters. Until recently, this worked as expected.
The Issue: We're experiencing what appears to be a bug with concurrent job executions. When we:
- Start Job1 with param_set_1 (e.g., targeting table_A)
- Simultaneously start the same Job1 with param_set_2 (e.g., targeting table_B)
The second run (with param_set_2) seems to interfere with the first run. Specifically, Run 2's data is being incorrectly ingested into Run 1's target table.
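To make the setup concrete, here is a minimal sketch of the job shape involved. The names (`CorrectionConfig`, `write_rows`, `correction_job`) and the op body are illustrative placeholders rather than our production code, and the sketch assumes Dagster's pythonic `Config` API:

```python
from dagster import Config, OpExecutionContext, job, op


class CorrectionConfig(Config):
    # Both values are supplied per run via run config.
    target_table: str
    dataset: str


@op
def write_rows(context: OpExecutionContext, config: CorrectionConfig) -> None:
    # Stand-in for the real database write; each run is expected to
    # touch only the table named in its own config.
    context.log.info(f"Writing {config.dataset} into {config.target_table}")


@job
def correction_job():
    write_rows()
```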
Has anyone encountered similar issues with concurrent Dagster job executions using different parameters? Are there known configuration requirements for ensuring proper run isolation?
What did you expect to happen?
Expected Behavior: Each concurrent run should maintain isolation and write to its intended target based on its specific parameters.
Actual Behavior: The runs appear to "collapse" or interfere with each other, with Run 2's data ending up in Run 1's destination.
How to reproduce?
- Create a parameterized job that performs database operations where the target table is determined by a configuration parameter (e.g., a job that writes data to a table specified in the run config)
- Launch the first run with parameters targeting one destination (e.g., target_table: "table_A", data: "dataset_1"); example run configs are sketched below this list
- Immediately launch a second concurrent run of the same job with different parameters targeting a different destination (e.g., target_table: "table_B", data: "dataset_2")
- Check the database tables after both runs complete
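The run configs for the two launches look roughly like this (again a sketch using the placeholder names from the job above; in our case the runs are launched from the Dagster Cloud launchpad rather than in-process):

```python
# Run config for Run 1 (placeholder values)
run_config_1 = {
    "ops": {"write_rows": {"config": {"target_table": "table_A", "dataset": "dataset_1"}}}
}

# Run config for Run 2, launched immediately after Run 1 starts
run_config_2 = {
    "ops": {"write_rows": {"config": {"target_table": "table_B", "dataset": "dataset_2"}}}
}
```

After both runs complete, we expect table_A to contain dataset_1 and table_B to contain dataset_2; instead, dataset_2 ends up in table_A.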
Dagster version
1.11.16
Deployment type
Dagster Cloud
Deployment details
No response
Additional information
No response
Message from the maintainers
Impacted by this issue? Give it a 👍! We factor engagement into prioritization.