Databricks Pipeline
laktory.models.pipeline.DatabricksPipelineOrchestrator
Bases: Pipeline, PipelineChild
Databricks Pipeline used as an orchestrator to execute a Laktory pipeline.
The DLT orchestrator does not support pipeline nodes with views (as opposed to materialized tables), nor does it support writing to multiple schemas within the same pipeline.
Selecting this orchestrator requires adding the supporting notebook to the stack.
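As a sketch, selecting this orchestrator inside a Laktory pipeline definition might look like the following. Parameter names (`type`, `catalog`, `schema_`, `development`, `configuration`) follow the documentation below; the pipeline name and values are illustrative assumptions, not a verified configuration.

```yaml
# Hypothetical sketch of a Laktory pipeline using the Databricks (DLT)
# orchestrator. Values are illustrative only.
name: pl-stock-prices
orchestrator:
  type: DATABRICKS_PIPELINE   # assumption: orchestrator type identifier
  catalog: dev                # Unity Catalog storing the pipeline tables
  schema_: markets            # default schema (direct publishing mode)
  development: true           # run the pipeline in development mode
  configuration:              # key:value pairs applied to the entire pipeline
    env: dev
```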
| PARAMETER | DESCRIPTION |
|---|---|
| `access_controls` | Pipeline access controls. |
| `allow_duplicate_names` | If `False`, deployment fails when the name conflicts with that of another pipeline. |
| `budget_policy_id` | Optional string specifying the ID of the budget policy for this DLT pipeline. |
| `catalog` | Name of the Unity Catalog storing the pipeline tables. |
| `cause` | |
| `channel` | Name of the release channel for the Spark version used by the DLT pipeline. |
| `cluster_id` | |
| `clusters` | Clusters to run the pipeline. If none is specified, a default cluster configuration is selected automatically for the pipeline. |
| `config_file` | Pipeline configuration (JSON) file deployed to the workspace and used by the job to read and execute the pipeline. |
| `configuration` | List of values to apply to the entire pipeline. Elements must be formatted as key:value pairs. |
| `continuous` | If `True`, the pipeline runs continuously. |
| `creator_user_name` | |
| `deployment` | Deployment type of this pipeline. |
| `development` | If `True`, the pipeline runs in development mode. |
| `edition` | Name of the product edition. |
| `event_log` | Optional block specifying a table where the DLT event log will be stored. |
| `expected_last_modified` | |
| `filters` | Filters on which pipeline packages to include in the deployed graph. |
| `gateway_definition` | Definition of a gateway pipeline to support CDC. |
| `health` | |
| `ingestion_definition` | Lakeflow ingestion pipeline definition. |
| `last_modified` | |
| `latest_updates` | |
| `libraries` | Specifies pipeline code (notebooks) and required artifacts. |
| `name` | Pipeline name. |
| `name_prefix` | Prefix added to the DLT pipeline name. |
| `name_suffix` | Suffix added to the DLT pipeline name. |
| `notifications` | Notifications specifications. |
| `photon` | If `True`, Photon is enabled for this pipeline. |
| `restart_window` | |
| `root_path` | Optional string specifying the root path for this pipeline. It is used as the root directory when editing the pipeline in the Databricks user interface, and it is added to `sys.path` when executing Python sources during pipeline execution. |
| `run_as` | |
| `run_as_user_name` | |
| `schema_` | Default schema (database) where tables are read from or published to. The presence of this attribute implies that the pipeline is in direct publishing mode. |
| `serverless` | If `True`, the pipeline runs on serverless compute. |
| `state` | |
| `storage` | Location on DBFS or cloud storage where output data and metadata required for pipeline execution are stored. By default, tables are stored in a subdirectory of this location. Changing this parameter forces recreation of the pipeline. Conflicts with `catalog`. |
| `tags` | Map of tags associated with the pipeline. These are forwarded to the cluster as cluster tags, and are therefore subject to the same limitations. A maximum of 25 tags can be added to the pipeline. |
| `target` | Name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring `target` allows you to view and query the pipeline output data from the Databricks UI. |
| `trigger` | |
| `type` | Type of orchestrator. |
| `url` | URL of the DLT pipeline on the given workspace. |
| METHOD | DESCRIPTION |
|---|---|
| `to_dab_resource` | Convert to a DABs Python Pipeline resource object for use with `laktory.dab.build_resources`. |
| ATTRIBUTE | DESCRIPTION |
|---|---|
| `additional_core_resources` | Additional resources deployed with this orchestrator. |

additional_core_resources property

Additional resources deployed with this orchestrator:

- configuration workspace file
- configuration workspace file permissions
to_dab_resource()

Convert to a DABs Python Pipeline resource object for use with `laktory.dab.build_resources`.

| RETURNS | DESCRIPTION |
|---|---|
| | A DABs Python Pipeline resource object. |
Source code in laktory/models/pipeline/orchestrators/databrickspipelineorchestrator.py