
Conditional task execution based on specific upstream failure in Databricks Workflows #4473

@ValeryiaLupanava

Description


Hello Databricks team,

I’m trying to model a conditional execution pattern in Databricks Workflows and ran into a limitation with the available trigger conditions.

Scenario:

  • Task 1 is the main task.
  • If Task 1 fails, I need to execute Task 2.
  • All other downstream tasks should be skipped.
  • With the trigger conditions available today, however, Task 3 is always executed, because there is no way to run it only when a specific upstream task (Task 4) fails.

The currently available run_if conditions, such as ALL_FAILED and AT_LEAST_ONE_FAILED, don't fit this use case, because they evaluate the whole upstream set rather than a specific task's status.
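For concreteness, here is a minimal sketch of the layout I'm describing, written as Databricks Asset Bundle YAML (the job name, task keys, and notebook paths are placeholders I chose for illustration):

```yaml
# Sketch only: run_if is evaluated against the whole depends_on set,
# so task_3 below runs when EITHER upstream fails.
resources:
  jobs:
    conditional_demo:
      name: conditional_demo
      tasks:
        - task_key: task_1
          notebook_task:
            notebook_path: ./src/task_1.py
        - task_key: task_4
          notebook_task:
            notebook_path: ./src/task_4.py
        - task_key: task_3
          depends_on:
            - task_key: task_1
            - task_key: task_4
          run_if: AT_LEAST_ONE_FAILED  # fires if task_1 OR task_4 fails
          notebook_task:
            notebook_path: ./src/task_3.py
```

As far as I can tell, no run_if value narrows the check to a single named upstream.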

Questions:

  1. Is there any trigger condition (existing or planned) that allows running a task only if a specific upstream task fails (for example: “run Task 3 only if Task 4 fails”)?
  2. If not, what is the recommended design pattern for this kind of conditional branching?
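For question 2, the closest pattern I have found so far is to scope dependencies so that the conditional task has exactly one upstream: with a single entry in depends_on, ALL_FAILED degenerates to "run if and only if that task failed". A sketch of that shape (again with placeholder names):

```yaml
# Fragment of a tasks list. Because task_2 depends ONLY on task_1,
# ALL_FAILED here means exactly "task_1 failed".
tasks:
  - task_key: task_1
    notebook_task:
      notebook_path: ./src/task_1.py
  - task_key: task_2
    depends_on:
      - task_key: task_1   # single upstream makes the condition task-specific
    run_if: ALL_FAILED
    notebook_task:
      notebook_path: ./src/task_2.py
```

This breaks down as soon as the conditional task needs a second upstream for other reasons, which is the problem I run into with Task 3.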

Additional challenge:
I considered merging Tasks 2 and 3 into a single task, which would resolve the trigger-condition issue, but I would still need to choose parameters dynamically based on which upstream task actually produced output. For example:

base_parameters:
  trg_table_schema: "{{tasks.task_1.values.trg_table_schema}}" OR "{{tasks.task_4.values.trg_table_schema}}"

However, I didn’t find a way to conditionally reference task outputs when only one of them exists.
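The best workaround I have come up with is to pass both references as separate, distinctly named parameters (the parameter names below are hypothetical) and let the notebook pick whichever one is populated, though I have not verified how a {{tasks.<task_key>.values.<key>}} reference resolves when the referenced task was skipped:

```yaml
# Sketch: the merged task receives both candidate values and decides
# inside the notebook which one to use. ALL_DONE lets it run regardless
# of which upstream branch actually executed.
- task_key: merged_task
  depends_on:
    - task_key: task_1
    - task_key: task_4
  run_if: ALL_DONE
  notebook_task:
    notebook_path: ./src/merged_task.py
    base_parameters:
      trg_table_schema_t1: "{{tasks.task_1.values.trg_table_schema}}"
      trg_table_schema_t4: "{{tasks.task_4.values.trg_table_schema}}"
```

Alternatively, since dbutils.jobs.taskValues.get() accepts a default argument, the merged notebook could read each upstream's value directly and fall back when one is absent, sidestepping the templating question entirely — but I would appreciate confirmation that this is the intended approach.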

Any guidance on how to handle this pattern (either via workflow configuration or best practices) would be very helpful.

Thank you!
Best regards,
Valeryia
