
Job

laktory.models.resources.databricks.Job ¤

Bases: JobBase

Databricks Job

Examples:

import io

from laktory import models

# Define job
job_yaml = '''
name: job-stock-prices
job_clusters:
  - job_cluster_key: main
    new_cluster:
        spark_version: 16.3.x-scala2.12
        node_type_id: Standard_DS3_v2

tasks:
  - task_key: ingest
    job_cluster_key: main
    notebook_task:
      notebook_path: /jobs/ingest_stock_prices.py
    libraries:
      - pypi:
          package: yfinance

  - task_key: pipeline
    depends_on:
      - task_key: ingest
    pipeline_task:
      pipeline_id: 74900655-3641-49f1-8323-b8507f0e3e3b

access_controls:
  - group_name: account users
    permission_level: CAN_VIEW
  - group_name: role-engineers
    permission_level: CAN_MANAGE_RUN
'''
job = models.resources.databricks.Job.model_validate_yaml(io.StringIO(job_yaml))

# Define a job with a for-each task
job_yaml = '''
name: job-hello
tasks:
  - task_key: hello-loop
    for_each_task:
      inputs: "[{'id':1, 'name': 'olivier'}, {'id':2, 'name': 'kubic'}]"
      task:
        task_key: hello-task
        notebook_task:
          notebook_path: /Workspace/Users/olivier.soucy@okube.ai/hello-world
          base_parameters:
            input: "{{input}}"
'''
job = models.resources.databricks.Job.model_validate_yaml(io.StringIO(job_yaml))
BASE DESCRIPTION
always_running

(Bool) Whether the job should always be running, like a Spark Streaming application: on every update, restart the current active run, or start a new run if none is running. False by default. Job runs are started with the parameters specified in the spark_jar_task, spark_submit_task, spark_python_task or notebook_task blocks

TYPE: bool | None | VariableType DEFAULT: None

budget_policy_id

The ID of the user-specified budget policy to use for this job. If not specified, a default budget policy may be applied when creating or modifying the job

TYPE: str | None | VariableType DEFAULT: None

continuous

TYPE: JobContinuous | None | VariableType DEFAULT: None

control_run_state

(Bool) If true, the Databricks provider will stop and start the job as needed to ensure that the active run for the job reflects the deployed configuration. For continuous jobs, the provider respects the pause_status by stopping the current active run. This flag cannot be set for non-continuous jobs

TYPE: bool | None | VariableType DEFAULT: None

dbt_task

TYPE: JobDbtTask | None | VariableType DEFAULT: None

deployment

TYPE: JobDeployment | None | VariableType DEFAULT: None

description

An optional description for this job

TYPE: str | None | VariableType DEFAULT: None

edit_mode

If 'UI_LOCKED', the user interface for the job will be locked. If 'EDITABLE' (the default), the user interface will be editable

TYPE: str | None | VariableType DEFAULT: None

email_notifications

An optional block to specify a set of email addresses notified when runs of this job begin, complete, or fail. The default behavior is to not send any emails. This block is documented below

TYPE: JobEmailNotifications | None | VariableType DEFAULT: None
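
A minimal sketch of the email_notifications block, using only the fields documented on this page (the addresses are placeholders):

```yaml
email_notifications:
  no_alert_for_skipped_runs: true
  on_failure:
    - data-team@example.com
  on_success:
    - data-team@example.com
```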

environment

TYPE: list[JobEnvironment] | None | VariableType DEFAULT: None

existing_cluster_id

Identifier of the interactive cluster to run job on. Note: running tasks on interactive clusters may lead to increased costs!

TYPE: str | None | VariableType DEFAULT: None

format

TYPE: str | None | VariableType DEFAULT: None

git_source

Specifies a Git repository for task source code. See the git_source Configuration Block below

TYPE: JobGitSource | None | VariableType DEFAULT: None

health

An optional block, described below, that specifies health conditions for the job

TYPE: JobHealth | None | VariableType DEFAULT: None

job_cluster

A list of job databricks_cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings. Multi-task syntax

TYPE: list[JobJobCluster] | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobLibrary] | None | VariableType DEFAULT: None

max_concurrent_runs

(Integer) An optional maximum allowed number of concurrent runs of the job. Defaults to 1

TYPE: int | None | VariableType DEFAULT: None

max_retries

(Integer) An optional maximum number of times to retry an unsuccessful run. A run is considered to be unsuccessful if it completes with a FAILED or INTERNAL_ERROR lifecycle state. The value -1 means to retry indefinitely and the value 0 means to never retry. The default behavior is to never retry. A run can have the following lifecycle state: PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED or INTERNAL_ERROR

TYPE: int | None | VariableType DEFAULT: None

min_retry_interval_millis

(Integer) An optional minimal interval in milliseconds between the start of the failed run and the subsequent retry run. The default behavior is that unsuccessful runs are immediately retried

TYPE: int | None | VariableType DEFAULT: None

name

An optional name for the job

TYPE: str | None | VariableType DEFAULT: None

new_cluster

Block with almost the same set of parameters as the databricks_cluster resource, with a few exceptions (check the REST API documentation for the full list of supported parameters)

TYPE: JobNewCluster | None | VariableType DEFAULT: None

notebook_task

TYPE: JobNotebookTask | None | VariableType DEFAULT: None

notification_settings

An optional block controlling the notification settings on the job level documented below

TYPE: JobNotificationSettings | None | VariableType DEFAULT: None

parameter

Specifies job parameter for the job. See parameter Configuration Block

TYPE: list[JobParameter] | None | VariableType DEFAULT: None
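
Job-level parameters can be sketched as follows, assuming JobParameter exposes the usual name and default fields (the values are illustrative):

```yaml
parameter:
  - name: run_date
    default: "2024-01-01"
  - name: env
    default: dev
```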

performance_target

The performance mode on a serverless job. The performance target determines the level of compute performance or cost-efficiency for the run. Supported values are:
  - PERFORMANCE_OPTIMIZED (default): prioritizes fast startup and execution times through rapid scaling and optimized cluster performance
  - STANDARD: enables cost-efficient execution of serverless workloads

TYPE: str | None | VariableType DEFAULT: None

pipeline_task

TYPE: JobPipelineTask | None | VariableType DEFAULT: None

python_wheel_task

TYPE: JobPythonWheelTask | None | VariableType DEFAULT: None

queue

The queue status for the job. See queue Configuration Block below

TYPE: JobQueue | None | VariableType DEFAULT: None

retry_on_timeout

(Bool) An optional policy to specify whether to retry a job when it times out. The default behavior is to not retry on timeout

TYPE: bool | None | VariableType DEFAULT: None
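
The retry- and timeout-related settings above can be combined at the job level; a minimal sketch using only fields documented on this page:

```yaml
max_retries: 3
min_retry_interval_millis: 60000
retry_on_timeout: true
timeout_seconds: 3600
```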

run_as

The user or the service principal the job runs as. See run_as Configuration Block below

TYPE: JobRunAs | None | VariableType DEFAULT: None

run_job_task

TYPE: JobRunJobTask | None | VariableType DEFAULT: None

schedule

An optional periodic schedule for this job. The default behavior is that the job runs when triggered by clicking Run Now in the Jobs UI or sending an API request to runNow. See schedule Configuration Block below

TYPE: JobSchedule | None | VariableType DEFAULT: None
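
A sketch of a periodic schedule, assuming the JobSchedule block exposes the usual quartz_cron_expression, timezone_id, and pause_status fields (they are not documented on this page):

```yaml
schedule:
  quartz_cron_expression: "0 0 6 * * ?"
  timezone_id: UTC
  pause_status: UNPAUSED
```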

spark_jar_task

TYPE: JobSparkJarTask | None | VariableType DEFAULT: None

spark_python_task

TYPE: JobSparkPythonTask | None | VariableType DEFAULT: None

spark_submit_task

TYPE: JobSparkSubmitTask | None | VariableType DEFAULT: None

tags

An optional map of the tags associated with the job. See tags Configuration Map

TYPE: dict[str, str] | None | VariableType DEFAULT: None

task

Task to run against the inputs list

TYPE: list[JobTask] | None | VariableType DEFAULT: None

timeout_seconds

(Integer) An optional timeout applied to each run of this job. The default behavior is to have no timeout

TYPE: int | None | VariableType DEFAULT: None

timeouts

TYPE: JobTimeouts | None | VariableType DEFAULT: None

trigger

The conditions that trigger the job to start. See the trigger Configuration Block below.
  - continuous - (Optional) Configuration block to configure pause status. See continuous Configuration Block

TYPE: JobTrigger | None | VariableType DEFAULT: None

usage_policy_id

TYPE: str | None | VariableType DEFAULT: None

webhook_notifications

(List) An optional set of system destinations (for example, webhook destinations or Slack) to be notified when runs of this task begin, complete, or fail. The default behavior is to not send any notifications. This field is a block and is documented below

TYPE: JobWebhookNotifications | None | VariableType DEFAULT: None

LAKTORY DESCRIPTION
access_controls

Access controls list

TYPE: list[AccessControl | VariableType] | VariableType DEFAULT: []

name_prefix

Prefix added to the job name

TYPE: str | VariableType DEFAULT: None

name_suffix

Suffix added to the job name

TYPE: str | VariableType DEFAULT: None

ATTRIBUTE DESCRIPTION
additional_core_resources
  • permissions

TYPE: list


laktory.models.resources.databricks.job.JobContinuous ¤

Bases: BaseModel

PARAMETER DESCRIPTION
pause_status

Indicate whether this trigger is paused or not. Either PAUSED or UNPAUSED. When the pause_status field is omitted in the block, the server will default to using UNPAUSED as a value for pause_status

TYPE: str | None | VariableType DEFAULT: None

task_retry_mode

Controls task-level retry behaviour. Allowed values are:
  - NEVER (default): the failed task will not be retried
  - ON_FAILURE: retry a failed task if at least one other task in the job is still running its first attempt. When this condition is no longer met or the retry limit is reached, the job run is cancelled and a new run is started

TYPE: str | None | VariableType DEFAULT: None
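
A continuous job can be sketched with the two fields above:

```yaml
continuous:
  pause_status: UNPAUSED
  task_retry_mode: ON_FAILURE
```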


laktory.models.resources.databricks.job.JobDbtTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
catalog

The name of the catalog to use inside Unity Catalog

TYPE: str | None | VariableType DEFAULT: None

commands

(Array) Series of dbt commands to execute in sequence. Every command must start with 'dbt'

TYPE: list[str] | VariableType

profiles_directory

The relative path to the directory in the repository specified by git_source where dbt should look in for the profiles.yml file. If not specified, defaults to the repository's root directory. Equivalent to passing --profile-dir to a dbt command

TYPE: str | None | VariableType DEFAULT: None

project_directory

The path where dbt should look for dbt_project.yml. Equivalent to passing --project-dir to the dbt CLI.
  - If source is GIT: relative path to the directory in the repository specified in the git_source block. Defaults to the repository's root directory when not specified
  - If source is WORKSPACE: absolute path to the folder in the workspace

TYPE: str | None | VariableType DEFAULT: None

schema_

The name of the schema dbt should run in. Defaults to default

TYPE: str | None | VariableType DEFAULT: None

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless and Pro warehouses are currently supported

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobDeployment ¤

Bases: BaseModel

PARAMETER DESCRIPTION
kind

TYPE: str | VariableType

metadata_file_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobEmailNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
no_alert_for_skipped_runs

(Bool) Don't send an alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None

on_duration_warning_threshold_exceeded

(List) list of notification IDs to call when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[str] | None | VariableType DEFAULT: None

on_failure

(List) list of notification IDs to call when the run fails. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None

on_start

(List) list of notification IDs to call when the run starts. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of notification IDs to call when any streaming backlog thresholds are exceeded for any stream

TYPE: list[str] | None | VariableType DEFAULT: None

on_success

(List) list of notification IDs to call when the run completes successfully. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobEnvironment ¤

Bases: BaseModel

PARAMETER DESCRIPTION
environment_key

A unique identifier of the Environment. It will be referenced from the environment_key attribute of the corresponding task

TYPE: str | VariableType

spec

Block describing the Environment; consists of the attributes described in JobEnvironmentSpec below

TYPE: JobEnvironmentSpec | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobEnvironmentSpec ¤

Bases: BaseModel

PARAMETER DESCRIPTION
base_environment

TYPE: str | None | VariableType DEFAULT: None

client

TYPE: str | None | VariableType DEFAULT: None

dependencies

(list of strings) List of pip dependencies, as supported by the version of pip in this environment. Each dependency is a pip requirement file line. See API docs for more information

TYPE: list[str] | None | VariableType DEFAULT: None

environment_version

The client version used by the environment. Each version comes with a specific Python version and a set of Python packages

TYPE: str | None | VariableType DEFAULT: None

java_dependencies

TYPE: list[str] | None | VariableType DEFAULT: None
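
A serverless environment can be sketched with the fields above (the client version and dependency are illustrative):

```yaml
environment:
  - environment_key: default
    spec:
      client: "2"
      dependencies:
        - yfinance
```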


laktory.models.resources.databricks.job.JobGitSource ¤

Bases: BaseModel

PARAMETER DESCRIPTION
branch

name of the Git branch to use. Conflicts with tag and commit

TYPE: str | None | VariableType DEFAULT: None

commit

hash of Git commit to use. Conflicts with branch and tag

TYPE: str | None | VariableType DEFAULT: None

git_snapshot

TYPE: JobGitSourceGitSnapshot | None | VariableType DEFAULT: None

job_source

TYPE: JobGitSourceJobSource | None | VariableType DEFAULT: None

provider

Case-insensitive name of the Git provider. The following values are currently supported (subject to change; consult the Repos API documentation): gitHub, gitHubEnterprise, bitbucketCloud, bitbucketServer, azureDevOpsServices, gitLab, gitLabEnterpriseEdition

TYPE: str | None | VariableType DEFAULT: None

sparse_checkout

TYPE: JobGitSourceSparseCheckout | None | VariableType DEFAULT: None

tag

Name of the Git tag to use. Conflicts with branch and commit

TYPE: str | None | VariableType DEFAULT: None

url

URL of the Git repository to use

TYPE: str | VariableType
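
A git_source block can be sketched with the fields above (the URL and branch are placeholders):

```yaml
git_source:
  url: https://github.com/my-org/my-repo
  provider: gitHub
  branch: main
```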


laktory.models.resources.databricks.job.JobGitSourceGitSnapshot ¤

Bases: BaseModel

PARAMETER DESCRIPTION
used_commit

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobGitSourceJobSource ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dirty_state

TYPE: str | None | VariableType DEFAULT: None

import_from_git_branch

TYPE: str | VariableType

job_config_path

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobGitSourceSparseCheckout ¤

Bases: BaseModel

PARAMETER DESCRIPTION
patterns

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobHealth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
rules

(List) List of rules represented as objects with the attributes described in JobHealthRules below

TYPE: list[JobHealthRules] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobHealthRules ¤

Bases: BaseModel

PARAMETER DESCRIPTION
metric

string specifying the metric to check, like RUN_DURATION_SECONDS, STREAMING_BACKLOG_FILES, etc. - check the Jobs REST API documentation for the full list of supported metrics

TYPE: str | VariableType

op

string specifying the operation used to evaluate the given metric. The only supported operation is GREATER_THAN

TYPE: str | VariableType

value

integer value used to compare to the given metric

TYPE: int | VariableType
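
A health rule can be sketched with the three fields above; this example flags runs longer than one hour:

```yaml
health:
  rules:
    - metric: RUN_DURATION_SECONDS
      op: GREATER_THAN
      value: 3600
```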


laktory.models.resources.databricks.job.JobJobCluster ¤

Bases: BaseModel

PARAMETER DESCRIPTION
job_cluster_key

Identifier that can be referenced in a task block, so that the cluster is shared between tasks

TYPE: str | VariableType

new_cluster

Block with almost the same set of parameters as the databricks_cluster resource, with a few exceptions (check the REST API documentation for the full list of supported parameters)

TYPE: JobJobClusterNewCluster | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewCluster ¤

Bases: BaseModel

PARAMETER DESCRIPTION
apply_policy_default_values

TYPE: bool | None | VariableType DEFAULT: None

autoscale

TYPE: JobJobClusterNewClusterAutoscale | None | VariableType DEFAULT: None

aws_attributes

TYPE: JobJobClusterNewClusterAwsAttributes | None | VariableType DEFAULT: None

azure_attributes

TYPE: JobJobClusterNewClusterAzureAttributes | None | VariableType DEFAULT: None

cluster_id

TYPE: str | None | VariableType DEFAULT: None

cluster_log_conf

TYPE: JobJobClusterNewClusterClusterLogConf | None | VariableType DEFAULT: None

cluster_mount_info

TYPE: list[JobJobClusterNewClusterClusterMountInfo] | None | VariableType DEFAULT: None

cluster_name

TYPE: str | None | VariableType DEFAULT: None

custom_tags

TYPE: dict[str, str] | None | VariableType DEFAULT: None

data_security_mode

TYPE: str | None | VariableType DEFAULT: None

docker_image

TYPE: JobJobClusterNewClusterDockerImage | None | VariableType DEFAULT: None

driver_instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

driver_node_type_flexibility

TYPE: JobJobClusterNewClusterDriverNodeTypeFlexibility | None | VariableType DEFAULT: None

driver_node_type_id

TYPE: str | None | VariableType DEFAULT: None

enable_elastic_disk

TYPE: bool | None | VariableType DEFAULT: None

enable_local_disk_encryption

TYPE: bool | None | VariableType DEFAULT: None

gcp_attributes

TYPE: JobJobClusterNewClusterGcpAttributes | None | VariableType DEFAULT: None

idempotency_token

TYPE: str | None | VariableType DEFAULT: None

init_scripts

TYPE: list[JobJobClusterNewClusterInitScripts] | None | VariableType DEFAULT: None

instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

is_single_node

TYPE: bool | None | VariableType DEFAULT: None

kind

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobJobClusterNewClusterLibrary] | None | VariableType DEFAULT: None

node_type_id

TYPE: str | None | VariableType DEFAULT: None

num_workers

TYPE: int | None | VariableType DEFAULT: None

policy_id

TYPE: str | None | VariableType DEFAULT: None

remote_disk_throughput

TYPE: int | None | VariableType DEFAULT: None

runtime_engine

TYPE: str | None | VariableType DEFAULT: None

single_user_name

TYPE: str | None | VariableType DEFAULT: None

spark_conf

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_env_vars

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_version

TYPE: str | None | VariableType DEFAULT: None

ssh_public_keys

TYPE: list[str] | None | VariableType DEFAULT: None

total_initial_remote_disk_size

TYPE: int | None | VariableType DEFAULT: None

use_ml_runtime

TYPE: bool | None | VariableType DEFAULT: None

worker_node_type_flexibility

TYPE: JobJobClusterNewClusterWorkerNodeTypeFlexibility | None | VariableType DEFAULT: None

workload_type

Not supported

TYPE: JobJobClusterNewClusterWorkloadType | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterAutoscale ¤

Bases: BaseModel

PARAMETER DESCRIPTION
max_workers

TYPE: int | None | VariableType DEFAULT: None

min_workers

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterAwsAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

ebs_volume_count

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_iops

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_size

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_throughput

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_type

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

instance_profile_arn

TYPE: str | None | VariableType DEFAULT: None

spot_bid_price_percent

TYPE: int | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterAzureAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

log_analytics_info

TYPE: JobJobClusterNewClusterAzureAttributesLogAnalyticsInfo | None | VariableType DEFAULT: None

spot_bid_max_price

TYPE: float | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterAzureAttributesLogAnalyticsInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
log_analytics_primary_key

TYPE: str | None | VariableType DEFAULT: None

log_analytics_workspace_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterLogConf ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbfs

TYPE: JobJobClusterNewClusterClusterLogConfDbfs | None | VariableType DEFAULT: None

s3

TYPE: JobJobClusterNewClusterClusterLogConfS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobJobClusterNewClusterClusterLogConfVolumes | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterLogConfDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterLogConfS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterLogConfVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterMountInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
local_mount_dir_path

TYPE: str | VariableType

network_filesystem_info

TYPE: JobJobClusterNewClusterClusterMountInfoNetworkFilesystemInfo | None | VariableType DEFAULT: None

remote_mount_dir_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterClusterMountInfoNetworkFilesystemInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
mount_options

TYPE: str | None | VariableType DEFAULT: None

server_address

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterDockerImage ¤

Bases: BaseModel

PARAMETER DESCRIPTION
basic_auth

TYPE: JobJobClusterNewClusterDockerImageBasicAuth | None | VariableType DEFAULT: None

url

URL of the Docker image

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterDockerImageBasicAuth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
password

TYPE: str | VariableType

username

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterDriverNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterGcpAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

boot_disk_size

TYPE: int | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

google_service_account

TYPE: str | None | VariableType DEFAULT: None

local_ssd_count

TYPE: int | None | VariableType DEFAULT: None

use_preemptible_executors

TYPE: bool | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScripts ¤

Bases: BaseModel

PARAMETER DESCRIPTION
abfss

TYPE: JobJobClusterNewClusterInitScriptsAbfss | None | VariableType DEFAULT: None

dbfs

TYPE: JobJobClusterNewClusterInitScriptsDbfs | None | VariableType DEFAULT: None

file

Block consisting of a single string field, destination

TYPE: JobJobClusterNewClusterInitScriptsFile | None | VariableType DEFAULT: None

gcs

TYPE: JobJobClusterNewClusterInitScriptsGcs | None | VariableType DEFAULT: None

s3

TYPE: JobJobClusterNewClusterInitScriptsS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobJobClusterNewClusterInitScriptsVolumes | None | VariableType DEFAULT: None

workspace

TYPE: JobJobClusterNewClusterInitScriptsWorkspace | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsAbfss ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsGcs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterInitScriptsWorkspace ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobJobClusterNewClusterLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobJobClusterNewClusterLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobJobClusterNewClusterLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobJobClusterNewClusterLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterWorkerNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterWorkloadType ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clients

TYPE: JobJobClusterNewClusterWorkloadTypeClients | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobJobClusterNewClusterWorkloadTypeClients ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jobs

TYPE: bool | None | VariableType DEFAULT: None

notebooks

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobLookup ¤

Bases: ResourceLookup

PARAMETER DESCRIPTION
id

The ID of the Databricks job

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewCluster ¤

Bases: BaseModel

PARAMETER DESCRIPTION
apply_policy_default_values

TYPE: bool | None | VariableType DEFAULT: None

autoscale

TYPE: JobNewClusterAutoscale | None | VariableType DEFAULT: None

aws_attributes

TYPE: JobNewClusterAwsAttributes | None | VariableType DEFAULT: None

azure_attributes

TYPE: JobNewClusterAzureAttributes | None | VariableType DEFAULT: None

cluster_id

TYPE: str | None | VariableType DEFAULT: None

cluster_log_conf

TYPE: JobNewClusterClusterLogConf | None | VariableType DEFAULT: None

cluster_mount_info

TYPE: list[JobNewClusterClusterMountInfo] | None | VariableType DEFAULT: None

cluster_name

TYPE: str | None | VariableType DEFAULT: None

custom_tags

TYPE: dict[str, str] | None | VariableType DEFAULT: None

data_security_mode

TYPE: str | None | VariableType DEFAULT: None

docker_image

TYPE: JobNewClusterDockerImage | None | VariableType DEFAULT: None

driver_instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

driver_node_type_flexibility

TYPE: JobNewClusterDriverNodeTypeFlexibility | None | VariableType DEFAULT: None

driver_node_type_id

TYPE: str | None | VariableType DEFAULT: None

enable_elastic_disk

TYPE: bool | None | VariableType DEFAULT: None

enable_local_disk_encryption

TYPE: bool | None | VariableType DEFAULT: None

gcp_attributes

TYPE: JobNewClusterGcpAttributes | None | VariableType DEFAULT: None

idempotency_token

TYPE: str | None | VariableType DEFAULT: None

init_scripts

TYPE: list[JobNewClusterInitScripts] | None | VariableType DEFAULT: None

instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

is_single_node

TYPE: bool | None | VariableType DEFAULT: None

kind

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobNewClusterLibrary] | None | VariableType DEFAULT: None

node_type_id

TYPE: str | None | VariableType DEFAULT: None

num_workers

TYPE: int | None | VariableType DEFAULT: None

policy_id

TYPE: str | None | VariableType DEFAULT: None

remote_disk_throughput

TYPE: int | None | VariableType DEFAULT: None

runtime_engine

TYPE: str | None | VariableType DEFAULT: None

single_user_name

TYPE: str | None | VariableType DEFAULT: None

spark_conf

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_env_vars

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_version

TYPE: str | None | VariableType DEFAULT: None

ssh_public_keys

TYPE: list[str] | None | VariableType DEFAULT: None

total_initial_remote_disk_size

TYPE: int | None | VariableType DEFAULT: None

use_ml_runtime

TYPE: bool | None | VariableType DEFAULT: None

worker_node_type_flexibility

TYPE: JobNewClusterWorkerNodeTypeFlexibility | None | VariableType DEFAULT: None

workload_type

This block isn't supported for job clusters

TYPE: JobNewClusterWorkloadType | None | VariableType DEFAULT: None
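
The cluster parameters above map directly to YAML keys. A minimal sketch of a job cluster definition (node type, autoscale bounds and tags are illustrative values, not defaults):

```yaml
job_clusters:
  - job_cluster_key: main
    new_cluster:
      spark_version: 16.3.x-scala2.12
      node_type_id: Standard_DS3_v2
      data_security_mode: USER_ISOLATION
      autoscale:
        min_workers: 1
        max_workers: 4
      spark_conf:
        spark.sql.shuffle.partitions: "200"
      custom_tags:
        env: dev
```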


laktory.models.resources.databricks.job.JobNewClusterAutoscale ¤

Bases: BaseModel

PARAMETER DESCRIPTION
max_workers

TYPE: int | None | VariableType DEFAULT: None

min_workers

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterAwsAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

ebs_volume_count

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_iops

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_size

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_throughput

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_type

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

instance_profile_arn

TYPE: str | None | VariableType DEFAULT: None

spot_bid_price_percent

TYPE: int | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterAzureAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

log_analytics_info

TYPE: JobNewClusterAzureAttributesLogAnalyticsInfo | None | VariableType DEFAULT: None

spot_bid_max_price

TYPE: float | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterAzureAttributesLogAnalyticsInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
log_analytics_primary_key

TYPE: str | None | VariableType DEFAULT: None

log_analytics_workspace_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterClusterLogConf ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbfs

TYPE: JobNewClusterClusterLogConfDbfs | None | VariableType DEFAULT: None

s3

TYPE: JobNewClusterClusterLogConfS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobNewClusterClusterLogConfVolumes | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterClusterLogConfDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterClusterLogConfS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterClusterLogConfVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterClusterMountInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
local_mount_dir_path

TYPE: str | VariableType

network_filesystem_info

TYPE: JobNewClusterClusterMountInfoNetworkFilesystemInfo | None | VariableType DEFAULT: None

remote_mount_dir_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterClusterMountInfoNetworkFilesystemInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
mount_options

TYPE: str | None | VariableType DEFAULT: None

server_address

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterDockerImage ¤

Bases: BaseModel

PARAMETER DESCRIPTION
basic_auth

TYPE: JobNewClusterDockerImageBasicAuth | None | VariableType DEFAULT: None

url

URL of the Docker image

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterDockerImageBasicAuth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
password

TYPE: str | VariableType

username

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterDriverNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterGcpAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

boot_disk_size

TYPE: int | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

google_service_account

TYPE: str | None | VariableType DEFAULT: None

local_ssd_count

TYPE: int | None | VariableType DEFAULT: None

use_preemptible_executors

TYPE: bool | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterInitScripts ¤

Bases: BaseModel

PARAMETER DESCRIPTION
abfss

TYPE: JobNewClusterInitScriptsAbfss | None | VariableType DEFAULT: None

dbfs

TYPE: JobNewClusterInitScriptsDbfs | None | VariableType DEFAULT: None

file

Block consisting of a single string field: destination

TYPE: JobNewClusterInitScriptsFile | None | VariableType DEFAULT: None

gcs

TYPE: JobNewClusterInitScriptsGcs | None | VariableType DEFAULT: None

s3

TYPE: JobNewClusterInitScriptsS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobNewClusterInitScriptsVolumes | None | VariableType DEFAULT: None

workspace

TYPE: JobNewClusterInitScriptsWorkspace | None | VariableType DEFAULT: None
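
Each init script entry selects exactly one source block (abfss, dbfs, file, gcs, s3, volumes or workspace). A sketch combining a workspace script and a Unity Catalog volume script (paths are illustrative):

```yaml
init_scripts:
  - workspace:
      destination: /Shared/init/install_tools.sh
  - volumes:
      destination: /Volumes/main/default/scripts/setup.sh
```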


laktory.models.resources.databricks.job.JobNewClusterInitScriptsAbfss ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterInitScriptsDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterInitScriptsFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterInitScriptsGcs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterInitScriptsS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterInitScriptsVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterInitScriptsWorkspace ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobNewClusterLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobNewClusterLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobNewClusterLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobNewClusterLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None
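
A library entry likewise selects one package source per item. A sketch installing a PyPI package and a Maven coordinate (the Maven coordinate and exclusion are illustrative names):

```yaml
library:
  - pypi:
      package: yfinance
  - maven:
      coordinates: com.example:example-lib:1.0.0
      exclusions:
        - org.slf4j:slf4j-log4j12
```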


laktory.models.resources.databricks.job.JobNewClusterLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterWorkerNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterWorkloadType ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clients

TYPE: JobNewClusterWorkloadTypeClients | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNewClusterWorkloadTypeClients ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jobs

TYPE: bool | None | VariableType DEFAULT: None

notebooks

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobNotebookTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
base_parameters

(Map) Base parameters to be used for each run of this job. If the run is initiated by a call to run-now with parameters specified, the two parameters maps will be merged. If the same key is specified in base_parameters and in run-now, the value from run-now will be used. If the notebook takes a parameter that is not specified in the job's base_parameters or the run-now override parameters, the default value from the notebook will be used. Retrieve these parameters in a notebook using dbutils.widgets.get

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_path

The path of the databricks_notebook to be run in the Databricks workspace or remote repository. For notebooks stored in the Databricks workspace, the path must be absolute and begin with a slash. For notebooks stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None
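
A sketch of a notebook task passing base parameters (the path and values are illustrative). Inside the notebook, each value is retrieved with dbutils.widgets.get:

```yaml
notebook_task:
  notebook_path: /jobs/ingest_stock_prices.py
  source: WORKSPACE
  base_parameters:
    env: dev
    run_date: "2024-01-01"
```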


laktory.models.resources.databricks.job.JobNotificationSettings ¤

Bases: BaseModel

PARAMETER DESCRIPTION
no_alert_for_canceled_runs

(Bool) Don't send an alert for cancelled runs

TYPE: bool | None | VariableType DEFAULT: None

no_alert_for_skipped_runs

(Bool) Don't send an alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobParameter ¤

Bases: BaseModel

PARAMETER DESCRIPTION
default

Default value of the parameter

TYPE: str | VariableType

name

The name of the defined parameter. May only contain alphanumeric characters, _, -, and .

TYPE: str | VariableType
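
Job-level parameters declare a name and a default, and can be overridden at run time. A sketch (names and defaults are illustrative):

```yaml
parameters:
  - name: env
    default: dev
  - name: run_date
    default: "2024-01-01"
```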


laktory.models.resources.databricks.job.JobPipelineTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
full_refresh

(Bool) Specifies whether the pipeline should be fully refreshed

TYPE: bool | None | VariableType DEFAULT: None

pipeline_id

The pipeline's unique ID

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobPythonWheelTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
entry_point

Python function as entry point for the task

TYPE: str | None | VariableType DEFAULT: None

named_parameters

Named parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None

package_name

Name of Python package

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None
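
A sketch of a Python wheel task (package, entry point and parameter names are illustrative). The entry point must be declared in the package's entry points:

```yaml
python_wheel_task:
  package_name: my_package
  entry_point: main
  named_parameters:
    env: dev
```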


laktory.models.resources.databricks.job.JobQueue ¤

Bases: BaseModel

PARAMETER DESCRIPTION
enabled

If true, enable queueing for the job

TYPE: bool | VariableType


laktory.models.resources.databricks.job.JobRunAs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
group_name

TYPE: str | None | VariableType DEFAULT: None

service_principal_name

The application ID of an active service principal. Setting this field requires the servicePrincipal/user role

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobRunJobTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
job_id

(Integer) ID of the job to trigger

TYPE: int | VariableType

job_parameters

(Map) Job parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobSchedule ¤

Bases: BaseModel

PARAMETER DESCRIPTION
pause_status

Indicate whether this trigger is paused or not. Either PAUSED or UNPAUSED. When the pause_status field is omitted in the block, the server will default to using UNPAUSED as a value for pause_status

TYPE: str | None | VariableType DEFAULT: None

quartz_cron_expression

A Cron expression using Quartz syntax that describes the schedule for a job. This field is required

TYPE: str | VariableType

timezone_id

A Java timezone ID. The schedule for a job will be resolved with respect to this timezone. See Java TimeZone for details. This field is required

TYPE: str | VariableType
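
A sketch scheduling a job every weekday at 05:00 UTC, using Quartz cron syntax (fields: seconds, minutes, hours, day-of-month, month, day-of-week):

```yaml
schedule:
  quartz_cron_expression: "0 0 5 ? * MON-FRI"
  timezone_id: UTC
  pause_status: UNPAUSED
```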


laktory.models.resources.databricks.job.JobSparkJarTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jar_uri

TYPE: str | None | VariableType DEFAULT: None

main_class_name

The full name of the class containing the main method to be executed. This class must be contained in a JAR provided as a library. The code should use SparkContext.getOrCreate to obtain a Spark context; otherwise, runs of the job will fail

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobSparkPythonTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

python_file

The URI of the Python file to be executed. Cloud file URIs (e.g. s3://, abfss://, gs://), workspace paths and remote repositories are supported. For Python files stored in the Databricks workspace, the path must be absolute and begin with /. For files stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobSparkSubmitTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_task

TYPE: JobTaskAlertTask | None | VariableType DEFAULT: None

clean_rooms_notebook_task

TYPE: JobTaskCleanRoomsNotebookTask | None | VariableType DEFAULT: None

compute

Task level compute configuration. This block is documented below

TYPE: JobTaskCompute | None | VariableType DEFAULT: None

condition_task

TYPE: JobTaskConditionTask | None | VariableType DEFAULT: None

dashboard_task

TYPE: JobTaskDashboardTask | None | VariableType DEFAULT: None

dbt_cloud_task

TYPE: JobTaskDbtCloudTask | None | VariableType DEFAULT: None

dbt_platform_task

TYPE: JobTaskDbtPlatformTask | None | VariableType DEFAULT: None

dbt_task

TYPE: JobTaskDbtTask | None | VariableType DEFAULT: None

depends_on

Block specifying one or more dependencies for a given task

TYPE: list[JobTaskDependsOn] | None | VariableType DEFAULT: None

description

description for this task

TYPE: str | None | VariableType DEFAULT: None

disable_auto_optimization

A flag to disable auto optimization in serverless tasks

TYPE: bool | None | VariableType DEFAULT: None

disabled

TYPE: bool | None | VariableType DEFAULT: None

email_notifications

An optional block to specify a set of email addresses notified when this task begins, completes or fails. The default behavior is to not send any emails. This block is documented below

TYPE: JobTaskEmailNotifications | None | VariableType DEFAULT: None

environment_key

A unique identifier of the Environment, referenced from the environment_key attribute of the corresponding task

TYPE: str | None | VariableType DEFAULT: None

existing_cluster_id

Identifier of the interactive cluster to run the job on. Note: running tasks on interactive clusters may lead to increased costs!

TYPE: str | None | VariableType DEFAULT: None

for_each_task

TYPE: JobTaskForEachTask | None | VariableType DEFAULT: None

gen_ai_compute_task

TYPE: JobTaskGenAiComputeTask | None | VariableType DEFAULT: None

health

block described below that specifies health conditions for a given task

TYPE: JobTaskHealth | None | VariableType DEFAULT: None

job_cluster_key

Identifier that can be referenced in a task block, so that the cluster is shared between tasks

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobTaskLibrary] | None | VariableType DEFAULT: None

max_retries

(Integer) An optional maximum number of times to retry an unsuccessful run. A run is considered unsuccessful if it completes with a FAILED or INTERNAL_ERROR lifecycle state. The value -1 means retry indefinitely and the value 0 means never retry. The default behavior is to never retry. A run can have the following lifecycle states: PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED or INTERNAL_ERROR

TYPE: int | None | VariableType DEFAULT: None

min_retry_interval_millis

(Integer) An optional minimal interval in milliseconds between the start of the failed run and the subsequent retry run. The default behavior is that unsuccessful runs are immediately retried

TYPE: int | None | VariableType DEFAULT: None

new_cluster

Block with almost the same set of parameters as for the databricks_cluster resource, with a few exceptions (check the REST API documentation for the full list of supported parameters)

TYPE: JobTaskNewCluster | None | VariableType DEFAULT: None

notebook_task

TYPE: JobTaskNotebookTask | None | VariableType DEFAULT: None

notification_settings

An optional block controlling the notification settings on the job level documented below

TYPE: JobTaskNotificationSettings | None | VariableType DEFAULT: None

pipeline_task

TYPE: JobTaskPipelineTask | None | VariableType DEFAULT: None

power_bi_task

TYPE: JobTaskPowerBiTask | None | VariableType DEFAULT: None

python_wheel_task

TYPE: JobTaskPythonWheelTask | None | VariableType DEFAULT: None

retry_on_timeout

(Bool) An optional policy to specify whether to retry a job when it times out. The default behavior is to not retry on timeout

TYPE: bool | None | VariableType DEFAULT: None

run_if

An optional value indicating the condition that determines whether the task should be run once its dependencies have been completed. One of ALL_SUCCESS, AT_LEAST_ONE_SUCCESS, NONE_FAILED, ALL_DONE, AT_LEAST_ONE_FAILED or ALL_FAILED. When omitted, defaults to ALL_SUCCESS

TYPE: str | None | VariableType DEFAULT: None

run_job_task

TYPE: JobTaskRunJobTask | None | VariableType DEFAULT: None

spark_jar_task

TYPE: JobTaskSparkJarTask | None | VariableType DEFAULT: None

spark_python_task

TYPE: JobTaskSparkPythonTask | None | VariableType DEFAULT: None

spark_submit_task

TYPE: JobTaskSparkSubmitTask | None | VariableType DEFAULT: None

sql_task

TYPE: JobTaskSqlTask | None | VariableType DEFAULT: None

task_key

A unique name for the task, referenced by the depends_on block of downstream tasks

TYPE: str | VariableType

timeout_seconds

(Integer) An optional timeout applied to each run of this job. The default behavior is to have no timeout

TYPE: int | None | VariableType DEFAULT: None

webhook_notifications

(List) An optional set of system destinations (for example, webhook destinations or Slack) to be notified when runs of this task begin, complete or fail. The default behavior is to not send any notifications. This field is a block and is documented below

TYPE: JobTaskWebhookNotifications | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskAlertTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_id

(String) identifier of the Databricks Alert (databricks_alert)

TYPE: str | None | VariableType DEFAULT: None

subscribers

TYPE: list[JobTaskAlertTaskSubscribers] | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None

workspace_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskAlertTaskSubscribers ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskCleanRoomsNotebookTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clean_room_name

TYPE: str | VariableType

etag

TYPE: str | None | VariableType DEFAULT: None

notebook_base_parameters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_name

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskCompute ¤

Bases: BaseModel

PARAMETER DESCRIPTION
hardware_accelerator

Hardware accelerator configuration for Serverless GPU workloads. Supported values are GPU_1xA10 (single A10 GPU configuration) and GPU_8xH100 (8x H100 GPU configuration)

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskConditionTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
left

The left operand of the condition task. It could be a string value, job state, or a parameter reference

TYPE: str | VariableType

op

String specifying the operation used to compare the left and right operands. Supported operators include EQUAL_TO, GREATER_THAN, GREATER_THAN_OR_EQUAL, LESS_THAN, LESS_THAN_OR_EQUAL and NOT_EQUAL

TYPE: str | VariableType

right

The right operand of the condition task. It could be a string value, job state, or parameter reference

TYPE: str | VariableType
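
A sketch of a condition task comparing a job parameter against a literal; downstream tasks gate on its result through their depends_on block (the parameter reference syntax and task key are illustrative):

```yaml
- task_key: check_env
  condition_task:
    left: "{{job.parameters.env}}"
    op: EQUAL_TO
    right: prod
```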


laktory.models.resources.databricks.job.JobTaskDashboardTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dashboard_id

(String) identifier of the Databricks SQL Dashboard databricks_sql_dashboard

TYPE: str | None | VariableType DEFAULT: None

filters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

subscription

TYPE: JobTaskDashboardTaskSubscription | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDashboardTaskSubscription ¤

Bases: BaseModel

PARAMETER DESCRIPTION
custom_subject

String specifying a custom subject for the email sent

TYPE: str | None | VariableType DEFAULT: None

paused

TYPE: bool | None | VariableType DEFAULT: None

subscribers

TYPE: list[JobTaskDashboardTaskSubscriptionSubscribers] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDashboardTaskSubscriptionSubscribers ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDbtCloudTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

dbt_cloud_job_id

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDbtPlatformTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

dbt_platform_job_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDbtTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
catalog

The name of the catalog to use inside Unity Catalog

TYPE: str | None | VariableType DEFAULT: None

commands

(Array) Series of dbt commands to execute in sequence. Every command must start with 'dbt'

TYPE: list[str] | VariableType

profiles_directory

The relative path to the directory in the repository specified by git_source where dbt should look for the profiles.yml file. If not specified, defaults to the repository's root directory. Equivalent to passing --profile-dir to a dbt command

TYPE: str | None | VariableType DEFAULT: None

project_directory

The path where dbt should look for dbt_project.yml. Equivalent to passing --project-dir to the dbt CLI. * If source is GIT: Relative path to the directory in the repository specified in the git_source block. Defaults to the repository's root directory when not specified. * If source is WORKSPACE: Absolute path to the folder in the workspace

TYPE: str | None | VariableType DEFAULT: None

schema_

The name of the schema dbt should run in. Defaults to default

TYPE: str | None | VariableType DEFAULT: None

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskDependsOn ¤

Bases: BaseModel

PARAMETER DESCRIPTION
outcome

Can only be specified on condition task dependencies. The outcome of the dependent task that must be met for this task to run. Possible values are 'true' or 'false'

TYPE: str | None | VariableType DEFAULT: None

task_key

The name of the task this task depends on

TYPE: str | VariableType
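
Dependencies are declared per task; the outcome field gates a task on a condition task's result. A sketch (task keys are illustrative):

```yaml
- task_key: deploy
  depends_on:
    - task_key: check_env
      outcome: "true"
  run_if: ALL_SUCCESS
```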


laktory.models.resources.databricks.job.JobTaskEmailNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
no_alert_for_skipped_runs

(Bool) Don't send an alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None

on_duration_warning_threshold_exceeded

(List) list of notification IDs to call when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[str] | None | VariableType DEFAULT: None

on_failure

(List) list of notification IDs to call when the run fails. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None

on_start

(List) list of notification IDs to call when the run starts. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of notification IDs to call when any streaming backlog thresholds are exceeded for any stream

TYPE: list[str] | None | VariableType DEFAULT: None

on_success

(List) list of notification IDs to call when the run completes successfully. A maximum of 3 destinations can be specified

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
concurrency

Controls the number of active iteration task runs. Default is 20, maximum allowed is 100

TYPE: int | None | VariableType DEFAULT: None

inputs

(String) Array for task to iterate on. This can be a JSON string or a reference to an array parameter

TYPE: str | VariableType

task

Task to run against the inputs list

TYPE: JobTaskForEachTaskTask | None | VariableType DEFAULT: None
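
Because inputs is a string rather than a native list, it is convenient to build it from a Python object with json.dumps. A minimal sketch using only the standard library (the task key, notebook path and parameter names are illustrative):

```python
import json

# Build the iteration inputs as a JSON string, as expected by for_each_task
rows = [{"id": 1, "name": "olivier"}, {"id": 2, "name": "kubic"}]
inputs = json.dumps(rows)

# Assemble the for_each_task block as a plain dict; each iteration receives
# one element of `rows` through the {{input}} reference
for_each_task = {
    "inputs": inputs,  # JSON string, not a native list
    "task": {
        "task_key": "hello-task",
        "notebook_task": {
            "notebook_path": "/jobs/hello-world",  # illustrative path
            "base_parameters": {"input": "{{input}}"},
        },
    },
}
print(inputs)
```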


laktory.models.resources.databricks.job.JobTaskForEachTaskTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_task

TYPE: JobTaskForEachTaskTaskAlertTask | None | VariableType DEFAULT: None

clean_rooms_notebook_task

TYPE: JobTaskForEachTaskTaskCleanRoomsNotebookTask | None | VariableType DEFAULT: None

compute

Task level compute configuration. This block is documented below

TYPE: JobTaskForEachTaskTaskCompute | None | VariableType DEFAULT: None

condition_task

TYPE: JobTaskForEachTaskTaskConditionTask | None | VariableType DEFAULT: None

dashboard_task

TYPE: JobTaskForEachTaskTaskDashboardTask | None | VariableType DEFAULT: None

dbt_cloud_task

TYPE: JobTaskForEachTaskTaskDbtCloudTask | None | VariableType DEFAULT: None

dbt_platform_task

TYPE: JobTaskForEachTaskTaskDbtPlatformTask | None | VariableType DEFAULT: None

dbt_task

TYPE: JobTaskForEachTaskTaskDbtTask | None | VariableType DEFAULT: None

depends_on

Block specifying one or more dependencies for a given task

TYPE: list[JobTaskForEachTaskTaskDependsOn] | None | VariableType DEFAULT: None

description

description for this task

TYPE: str | None | VariableType DEFAULT: None

disable_auto_optimization

A flag to disable auto optimization in serverless tasks

TYPE: bool | None | VariableType DEFAULT: None

disabled

TYPE: bool | None | VariableType DEFAULT: None

email_notifications

An optional block to specify a set of email addresses notified when this task begins, completes or fails. The default behavior is to not send any emails. This block is documented below

TYPE: JobTaskForEachTaskTaskEmailNotifications | None | VariableType DEFAULT: None

environment_key

A unique identifier of the Environment, referenced from the environment_key attribute of the corresponding task

TYPE: str | None | VariableType DEFAULT: None

existing_cluster_id

Identifier of the interactive cluster to run the job on. Note: running tasks on interactive clusters may lead to increased costs!

TYPE: str | None | VariableType DEFAULT: None

gen_ai_compute_task

TYPE: JobTaskForEachTaskTaskGenAiComputeTask | None | VariableType DEFAULT: None

health

block described below that specifies health conditions for a given task

TYPE: JobTaskForEachTaskTaskHealth | None | VariableType DEFAULT: None

job_cluster_key

Identifier that can be referenced in a task block, so that the cluster is shared between tasks

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobTaskForEachTaskTaskLibrary] | None | VariableType DEFAULT: None

max_retries

(Integer) An optional maximum number of times to retry an unsuccessful run. A run is considered unsuccessful if it completes with a FAILED or INTERNAL_ERROR lifecycle state. The value -1 means retry indefinitely and the value 0 means never retry. The default behavior is to never retry. A run can have the following lifecycle states: PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED or INTERNAL_ERROR

TYPE: int | None | VariableType DEFAULT: None

min_retry_interval_millis

(Integer) An optional minimal interval in milliseconds between the start of the failed run and the subsequent retry run. The default behavior is that unsuccessful runs are immediately retried

TYPE: int | None | VariableType DEFAULT: None

new_cluster

Block with almost the same set of parameters as for the databricks_cluster resource, except for the following (check the REST API documentation for the full list of supported parameters):

TYPE: JobTaskForEachTaskTaskNewCluster | None | VariableType DEFAULT: None

notebook_task

TYPE: JobTaskForEachTaskTaskNotebookTask | None | VariableType DEFAULT: None

notification_settings

An optional block controlling the notification settings on the job level documented below

TYPE: JobTaskForEachTaskTaskNotificationSettings | None | VariableType DEFAULT: None

pipeline_task

TYPE: JobTaskForEachTaskTaskPipelineTask | None | VariableType DEFAULT: None

power_bi_task

TYPE: JobTaskForEachTaskTaskPowerBiTask | None | VariableType DEFAULT: None

python_wheel_task

TYPE: JobTaskForEachTaskTaskPythonWheelTask | None | VariableType DEFAULT: None

retry_on_timeout

(Bool) An optional policy to specify whether to retry a job when it times out. The default behavior is to not retry on timeout

TYPE: bool | None | VariableType DEFAULT: None

run_if

An optional value indicating the condition that determines whether the task should be run once its dependencies have been completed. One of ALL_SUCCESS, AT_LEAST_ONE_SUCCESS, NONE_FAILED, ALL_DONE, AT_LEAST_ONE_FAILED or ALL_FAILED. When omitted, defaults to ALL_SUCCESS

TYPE: str | None | VariableType DEFAULT: None

run_job_task

TYPE: JobTaskForEachTaskTaskRunJobTask | None | VariableType DEFAULT: None

spark_jar_task

TYPE: JobTaskForEachTaskTaskSparkJarTask | None | VariableType DEFAULT: None

spark_python_task

TYPE: JobTaskForEachTaskTaskSparkPythonTask | None | VariableType DEFAULT: None

spark_submit_task

TYPE: JobTaskForEachTaskTaskSparkSubmitTask | None | VariableType DEFAULT: None

sql_task

TYPE: JobTaskForEachTaskTaskSqlTask | None | VariableType DEFAULT: None

task_key

A string specifying a unique key for this task

TYPE: str | VariableType

timeout_seconds

(Integer) An optional timeout applied to each run of this job. The default behavior is to have no timeout

TYPE: int | None | VariableType DEFAULT: None

webhook_notifications

(List) An optional set of system destinations (for example, webhook destinations or Slack) to be notified when runs of this task begin, complete or fail. The default behavior is to not send any notifications. This field is a block and is documented below

TYPE: JobTaskForEachTaskTaskWebhookNotifications | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskAlertTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_id

(String) identifier of the Databricks Alert (databricks_alert)

TYPE: str | None | VariableType DEFAULT: None

subscribers

TYPE: list[JobTaskForEachTaskTaskAlertTaskSubscribers] | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None

workspace_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskAlertTaskSubscribers ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskCleanRoomsNotebookTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clean_room_name

TYPE: str | VariableType

etag

TYPE: str | None | VariableType DEFAULT: None

notebook_base_parameters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_name

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskCompute ¤

Bases: BaseModel

PARAMETER DESCRIPTION
hardware_accelerator

Hardware accelerator configuration for Serverless GPU workloads. Supported values are: * GPU_1xA10: single A10 GPU configuration. * GPU_8xH100: 8x H100 GPU configuration

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskConditionTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
left

The left operand of the condition task. It could be a string value, job state, or a parameter reference

TYPE: str | VariableType

op

A string specifying the operation used to compare the two operands. Supported operators are EQUAL_TO, GREATER_THAN, GREATER_THAN_OR_EQUAL, LESS_THAN, LESS_THAN_OR_EQUAL and NOT_EQUAL

TYPE: str | VariableType

right

The right operand of the condition task. It could be a string value, job state, or parameter reference

TYPE: str | VariableType
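
A condition task built from the fields above can gate downstream tasks. A minimal sketch in a Laktory job definition; the parameter reference and values are illustrative:

```yaml
tasks:
  - task_key: check-env
    condition_task:
      left: "{{job.parameters.env}}"
      op: EQUAL_TO
      right: prod
```

Downstream tasks can then depend on this task with an outcome of "true" or "false".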


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDashboardTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dashboard_id

(String) identifier of the Databricks SQL Dashboard databricks_sql_dashboard

TYPE: str | None | VariableType DEFAULT: None

filters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

subscription

TYPE: JobTaskForEachTaskTaskDashboardTaskSubscription | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDashboardTaskSubscription ¤

Bases: BaseModel

PARAMETER DESCRIPTION
custom_subject

A string specifying a custom subject for the email sent

TYPE: str | None | VariableType DEFAULT: None

paused

TYPE: bool | None | VariableType DEFAULT: None

subscribers

TYPE: list[JobTaskForEachTaskTaskDashboardTaskSubscriptionSubscribers] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDashboardTaskSubscriptionSubscribers ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDbtCloudTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

dbt_cloud_job_id

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDbtPlatformTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

dbt_platform_job_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDbtTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
catalog

The name of the catalog to use inside Unity Catalog

TYPE: str | None | VariableType DEFAULT: None

commands

(Array) Series of dbt commands to execute in sequence. Every command must start with 'dbt'

TYPE: list[str] | VariableType

profiles_directory

The relative path to the directory in the repository specified by git_source where dbt should look in for the profiles.yml file. If not specified, defaults to the repository's root directory. Equivalent to passing --profile-dir to a dbt command

TYPE: str | None | VariableType DEFAULT: None

project_directory

The path where dbt should look for dbt_project.yml. Equivalent to passing --project-dir to the dbt CLI. * If source is GIT: Relative path to the directory in the repository specified in the git_source block. Defaults to the repository's root directory when not specified. * If source is WORKSPACE: Absolute path to the folder in the workspace

TYPE: str | None | VariableType DEFAULT: None

schema_

The name of the schema dbt should run in. Defaults to default

TYPE: str | None | VariableType DEFAULT: None

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None
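
A minimal sketch of a dbt task using these fields; the project path, warehouse ID and target names are illustrative placeholders (schema_ is the model field name documented above):

```yaml
tasks:
  - task_key: dbt-run
    dbt_task:
      source: WORKSPACE
      project_directory: /Workspace/Repos/me/dbt-project
      catalog: main
      schema_: analytics
      warehouse_id: 4b9b953939869799
      commands:
        - dbt deps
        - dbt run
```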


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskDependsOn ¤

Bases: BaseModel

PARAMETER DESCRIPTION
outcome

Can only be specified on condition task dependencies. The outcome of the dependent task that must be met for this task to run. Possible values are 'true' or 'false'

TYPE: str | None | VariableType DEFAULT: None

task_key

The name of the task this task depends on

TYPE: str | VariableType
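
Dependencies are expressed by referencing the upstream task's task_key; outcome applies only when the upstream task is a condition task. A sketch with illustrative task keys:

```yaml
tasks:
  - task_key: check-env
    condition_task:
      left: "{{job.parameters.env}}"
      op: EQUAL_TO
      right: prod
  - task_key: deploy
    depends_on:
      - task_key: check-env
        outcome: "true"
```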


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskEmailNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
no_alert_for_skipped_runs

(Bool) don't send alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None

on_duration_warning_threshold_exceeded

(List) list of emails to notify when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[str] | None | VariableType DEFAULT: None

on_failure

(List) list of emails to notify when the run fails

TYPE: list[str] | None | VariableType DEFAULT: None

on_start

(List) list of emails to notify when the run starts

TYPE: list[str] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of emails to notify when any streaming backlog thresholds are exceeded for any stream

TYPE: list[str] | None | VariableType DEFAULT: None

on_success

(List) list of emails to notify when the run completes successfully

TYPE: list[str] | None | VariableType DEFAULT: None
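
A sketch of a task-level email notification block; the addresses are illustrative placeholders:

```yaml
tasks:
  - task_key: ingest
    email_notifications:
      on_start:
        - data-team@example.com
      on_failure:
        - oncall@example.com
      no_alert_for_skipped_runs: true
```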


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskGenAiComputeTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
command

TYPE: str | None | VariableType DEFAULT: None

compute

Task level compute configuration. This block is documented below

TYPE: JobTaskForEachTaskTaskGenAiComputeTaskCompute | None | VariableType DEFAULT: None

dl_runtime_image

TYPE: str | VariableType

mlflow_experiment_name

TYPE: str | None | VariableType DEFAULT: None

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

training_script_path

TYPE: str | None | VariableType DEFAULT: None

yaml_parameters

TYPE: str | None | VariableType DEFAULT: None

yaml_parameters_file_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskGenAiComputeTaskCompute ¤

Bases: BaseModel

PARAMETER DESCRIPTION
gpu_node_pool_id

TYPE: str | None | VariableType DEFAULT: None

gpu_type

TYPE: str | None | VariableType DEFAULT: None

num_gpus

TYPE: int | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskHealth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
rules

(List) list of rules that are represented as objects with the following attributes:

TYPE: list[JobTaskForEachTaskTaskHealthRules] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskHealthRules ¤

Bases: BaseModel

PARAMETER DESCRIPTION
metric

string specifying the metric to check, like RUN_DURATION_SECONDS, STREAMING_BACKLOG_FILES, etc. - check the Jobs REST API documentation for the full list of supported metrics

TYPE: str | VariableType

op

string specifying the operation used to evaluate the given metric. The only supported operation is GREATER_THAN

TYPE: str | VariableType

value

integer value used to compare to the given metric

TYPE: int | VariableType
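
A health rule combines a metric, an operator and a threshold. A sketch flagging runs longer than one hour (threshold value illustrative):

```yaml
tasks:
  - task_key: ingest
    health:
      rules:
        - metric: RUN_DURATION_SECONDS
          op: GREATER_THAN
          value: 3600
```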


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobTaskForEachTaskTaskLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobTaskForEachTaskTaskLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobTaskForEachTaskTaskLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None
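
Each library entry uses exactly one of the fields above. A sketch mixing the common sources, following the libraries block from the job example at the top of this page (package names and paths are illustrative):

```yaml
tasks:
  - task_key: ingest
    libraries:
      - pypi:
          package: yfinance
      - maven:
          coordinates: com.databricks:spark-xml_2.12:0.16.0
      - whl: /Volumes/main/libs/my_package-0.1.0-py3-none-any.whl
```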


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewCluster ¤

Bases: BaseModel

PARAMETER DESCRIPTION
apply_policy_default_values

TYPE: bool | None | VariableType DEFAULT: None

autoscale

TYPE: JobTaskForEachTaskTaskNewClusterAutoscale | None | VariableType DEFAULT: None

aws_attributes

TYPE: JobTaskForEachTaskTaskNewClusterAwsAttributes | None | VariableType DEFAULT: None

azure_attributes

TYPE: JobTaskForEachTaskTaskNewClusterAzureAttributes | None | VariableType DEFAULT: None

cluster_id

TYPE: str | None | VariableType DEFAULT: None

cluster_log_conf

TYPE: JobTaskForEachTaskTaskNewClusterClusterLogConf | None | VariableType DEFAULT: None

cluster_mount_info

TYPE: list[JobTaskForEachTaskTaskNewClusterClusterMountInfo] | None | VariableType DEFAULT: None

cluster_name

TYPE: str | None | VariableType DEFAULT: None

custom_tags

TYPE: dict[str, str] | None | VariableType DEFAULT: None

data_security_mode

TYPE: str | None | VariableType DEFAULT: None

docker_image

TYPE: JobTaskForEachTaskTaskNewClusterDockerImage | None | VariableType DEFAULT: None

driver_instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

driver_node_type_flexibility

TYPE: JobTaskForEachTaskTaskNewClusterDriverNodeTypeFlexibility | None | VariableType DEFAULT: None

driver_node_type_id

TYPE: str | None | VariableType DEFAULT: None

enable_elastic_disk

TYPE: bool | None | VariableType DEFAULT: None

enable_local_disk_encryption

TYPE: bool | None | VariableType DEFAULT: None

gcp_attributes

TYPE: JobTaskForEachTaskTaskNewClusterGcpAttributes | None | VariableType DEFAULT: None

idempotency_token

TYPE: str | None | VariableType DEFAULT: None

init_scripts

TYPE: list[JobTaskForEachTaskTaskNewClusterInitScripts] | None | VariableType DEFAULT: None

instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

is_single_node

TYPE: bool | None | VariableType DEFAULT: None

kind

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobTaskForEachTaskTaskNewClusterLibrary] | None | VariableType DEFAULT: None

node_type_id

TYPE: str | None | VariableType DEFAULT: None

num_workers

TYPE: int | None | VariableType DEFAULT: None

policy_id

TYPE: str | None | VariableType DEFAULT: None

remote_disk_throughput

TYPE: int | None | VariableType DEFAULT: None

runtime_engine

TYPE: str | None | VariableType DEFAULT: None

single_user_name

TYPE: str | None | VariableType DEFAULT: None

spark_conf

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_env_vars

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_version

TYPE: str | None | VariableType DEFAULT: None

ssh_public_keys

TYPE: list[str] | None | VariableType DEFAULT: None

total_initial_remote_disk_size

TYPE: int | None | VariableType DEFAULT: None

use_ml_runtime

TYPE: bool | None | VariableType DEFAULT: None

worker_node_type_flexibility

TYPE: JobTaskForEachTaskTaskNewClusterWorkerNodeTypeFlexibility | None | VariableType DEFAULT: None

workload_type

This attribute isn't supported for job clusters

TYPE: JobTaskForEachTaskTaskNewClusterWorkloadType | None | VariableType DEFAULT: None
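
A sketch of a shared job cluster using a few of the new_cluster fields above, mirroring the job example at the top of this page (node type and Spark settings are illustrative):

```yaml
job_clusters:
  - job_cluster_key: main
    new_cluster:
      spark_version: 16.3.x-scala2.12
      node_type_id: Standard_DS3_v2
      autoscale:
        min_workers: 1
        max_workers: 4
      spark_conf:
        spark.sql.shuffle.partitions: "200"
```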


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterAutoscale ¤

Bases: BaseModel

PARAMETER DESCRIPTION
max_workers

TYPE: int | None | VariableType DEFAULT: None

min_workers

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterAwsAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

ebs_volume_count

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_iops

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_size

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_throughput

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_type

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

instance_profile_arn

TYPE: str | None | VariableType DEFAULT: None

spot_bid_price_percent

TYPE: int | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterAzureAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

log_analytics_info

TYPE: JobTaskForEachTaskTaskNewClusterAzureAttributesLogAnalyticsInfo | None | VariableType DEFAULT: None

spot_bid_max_price

TYPE: float | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterAzureAttributesLogAnalyticsInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
log_analytics_primary_key

TYPE: str | None | VariableType DEFAULT: None

log_analytics_workspace_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterLogConf ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbfs

TYPE: JobTaskForEachTaskTaskNewClusterClusterLogConfDbfs | None | VariableType DEFAULT: None

s3

TYPE: JobTaskForEachTaskTaskNewClusterClusterLogConfS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobTaskForEachTaskTaskNewClusterClusterLogConfVolumes | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterLogConfDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterLogConfS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterLogConfVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterMountInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
local_mount_dir_path

TYPE: str | VariableType

network_filesystem_info

TYPE: JobTaskForEachTaskTaskNewClusterClusterMountInfoNetworkFilesystemInfo | None | VariableType DEFAULT: None

remote_mount_dir_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterClusterMountInfoNetworkFilesystemInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
mount_options

TYPE: str | None | VariableType DEFAULT: None

server_address

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterDockerImage ¤

Bases: BaseModel

PARAMETER DESCRIPTION
basic_auth

TYPE: JobTaskForEachTaskTaskNewClusterDockerImageBasicAuth | None | VariableType DEFAULT: None

url

URL of the Docker image

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterDockerImageBasicAuth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
password

TYPE: str | VariableType

username

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterDriverNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterGcpAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

boot_disk_size

TYPE: int | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

google_service_account

TYPE: str | None | VariableType DEFAULT: None

local_ssd_count

TYPE: int | None | VariableType DEFAULT: None

use_preemptible_executors

TYPE: bool | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScripts ¤

Bases: BaseModel

PARAMETER DESCRIPTION
abfss

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsAbfss | None | VariableType DEFAULT: None

dbfs

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsDbfs | None | VariableType DEFAULT: None

file

block consisting of single string fields:

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsFile | None | VariableType DEFAULT: None

gcs

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsGcs | None | VariableType DEFAULT: None

s3

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsVolumes | None | VariableType DEFAULT: None

workspace

TYPE: JobTaskForEachTaskTaskNewClusterInitScriptsWorkspace | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsAbfss ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsGcs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterInitScriptsWorkspace ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobTaskForEachTaskTaskNewClusterLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobTaskForEachTaskTaskNewClusterLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobTaskForEachTaskTaskNewClusterLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterWorkerNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterWorkloadType ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clients

TYPE: JobTaskForEachTaskTaskNewClusterWorkloadTypeClients | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNewClusterWorkloadTypeClients ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jobs

TYPE: bool | None | VariableType DEFAULT: None

notebooks

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNotebookTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
base_parameters

(Map) Base parameters to be used for each run of this job. If the run is initiated by a call to run-now with parameters specified, the two parameters maps will be merged. If the same key is specified in base_parameters and in run-now, the value from run-now will be used. If the notebook takes a parameter that is not specified in the job's base_parameters or the run-now override parameters, the default value from the notebook will be used. Retrieve these parameters in a notebook using dbutils.widgets.get

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_path

The path of the databricks_notebook to be run in the Databricks workspace or remote repository. For notebooks stored in the Databricks workspace, the path must be absolute and begin with a slash. For notebooks stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None
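
A sketch of a notebook task passing base parameters; the path and parameter are illustrative, and the notebook would read the value with dbutils.widgets.get("env"):

```yaml
tasks:
  - task_key: ingest
    notebook_task:
      notebook_path: /jobs/ingest_stock_prices.py
      base_parameters:
        env: prod
```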


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskNotificationSettings ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_on_last_attempt

(Bool) do not send notifications to recipients specified in on_start for the retried runs and do not send notifications to recipients specified in on_failure until the last retry of the run

TYPE: bool | None | VariableType DEFAULT: None

no_alert_for_canceled_runs

(Bool) don't send alert for cancelled runs

TYPE: bool | None | VariableType DEFAULT: None

no_alert_for_skipped_runs

(Bool) don't send alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskPipelineTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
full_refresh

(Bool) Specifies if there should be full refresh of the pipeline

TYPE: bool | None | VariableType DEFAULT: None

pipeline_id

The pipeline's unique ID

TYPE: str | VariableType
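
A sketch of a pipeline task, reusing the pipeline ID from the job example at the top of this page:

```yaml
tasks:
  - task_key: pipeline
    pipeline_task:
      pipeline_id: 74900655-3641-49f1-8323-b8507f0e3e3b
      full_refresh: false
```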


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskPowerBiTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

power_bi_model

TYPE: JobTaskForEachTaskTaskPowerBiTaskPowerBiModel | None | VariableType DEFAULT: None

refresh_after_update

TYPE: bool | None | VariableType DEFAULT: None

tables

TYPE: list[JobTaskForEachTaskTaskPowerBiTaskTables] | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless & Pro warehouses are supported right now

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskPowerBiTaskPowerBiModel ¤

Bases: BaseModel

PARAMETER DESCRIPTION
authentication_method

TYPE: str | None | VariableType DEFAULT: None

model_name

TYPE: str | None | VariableType DEFAULT: None

overwrite_existing

TYPE: bool | None | VariableType DEFAULT: None

storage_mode

TYPE: str | None | VariableType DEFAULT: None

workspace_name

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskPowerBiTaskTables ¤

Bases: BaseModel

PARAMETER DESCRIPTION
catalog

The name of the catalog containing the table

TYPE: str | None | VariableType DEFAULT: None

name

The name of the table

TYPE: str | None | VariableType DEFAULT: None

schema_

The name of the schema containing the table

TYPE: str | None | VariableType DEFAULT: None

storage_mode

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskPythonWheelTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
entry_point

Python function as entry point for the task

TYPE: str | None | VariableType DEFAULT: None

named_parameters

Named parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None

package_name

Name of Python package

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Positional parameters for the task

TYPE: list[str] | None | VariableType DEFAULT: None
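
A sketch of a Python wheel task; the package and entry point names are illustrative placeholders:

```yaml
tasks:
  - task_key: wheel
    python_wheel_task:
      package_name: my_package
      entry_point: main
      named_parameters:
        env: prod
```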


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskRunJobTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbt_commands

TYPE: list[str] | None | VariableType DEFAULT: None

jar_params

TYPE: list[str] | None | VariableType DEFAULT: None

job_id

(String) ID of the job

TYPE: int | VariableType

job_parameters

(Map) Job parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None

pipeline_params

TYPE: JobTaskForEachTaskTaskRunJobTaskPipelineParams | None | VariableType DEFAULT: None

python_named_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None

python_params

TYPE: list[str] | None | VariableType DEFAULT: None

spark_submit_params

TYPE: list[str] | None | VariableType DEFAULT: None

sql_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None
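
A sketch of a run-job task triggering another job; the job ID and parameters are illustrative placeholders:

```yaml
tasks:
  - task_key: trigger-child
    run_job_task:
      job_id: 123456789
      job_parameters:
        env: prod
```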


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskRunJobTaskPipelineParams ¤

Bases: BaseModel

PARAMETER DESCRIPTION
full_refresh

(Bool) Specifies if there should be full refresh of the pipeline

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSparkJarTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jar_uri

TYPE: str | None | VariableType DEFAULT: None

main_class_name

The full name of the class containing the main method to be executed. This class must be contained in a JAR provided as a library. The code should use SparkContext.getOrCreate to obtain a Spark context; otherwise, runs of the job will fail

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

run_as_repl

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSparkPythonTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

python_file

The URI of the Python file to be executed. Cloud file URIs (e.g. s3://, abfss://, gs://), workspace paths, and remote repositories are supported. For Python files stored in the Databricks workspace, the path must be absolute and begin with /. For files stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSparkSubmitTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert

Block consisting of the fields described below

TYPE: JobTaskForEachTaskTaskSqlTaskAlert | None | VariableType DEFAULT: None

dashboard

Block consisting of the fields described below

TYPE: JobTaskForEachTaskTaskSqlTaskDashboard | None | VariableType DEFAULT: None

file

Block consisting of the single string fields path and source

TYPE: JobTaskForEachTaskTaskSqlTaskFile | None | VariableType DEFAULT: None

parameters

(Map) parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

query

Block consisting of a single string field: query_id, the identifier of the Databricks Query (databricks_query)

TYPE: JobTaskForEachTaskTaskSqlTaskQuery | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless and Pro warehouses are currently supported

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskAlert ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_id

(String) identifier of the Databricks Alert (databricks_alert)

TYPE: str | VariableType

pause_subscriptions

Flag that specifies whether subscriptions are paused

TYPE: bool | None | VariableType DEFAULT: None

subscriptions

A list of subscription blocks, each consisting of one of the required fields: user_name for a user email, or destination_id for an alert destination identifier

TYPE: list[JobTaskForEachTaskTaskSqlTaskAlertSubscriptions] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskAlertSubscriptions ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskDashboard ¤

Bases: BaseModel

PARAMETER DESCRIPTION
custom_subject

String specifying a custom subject for the email sent

TYPE: str | None | VariableType DEFAULT: None

dashboard_id

(String) identifier of the Databricks SQL Dashboard databricks_sql_dashboard

TYPE: str | VariableType

pause_subscriptions

Flag that specifies whether subscriptions are paused

TYPE: bool | None | VariableType DEFAULT: None

subscriptions

A list of subscription blocks, each consisting of one of the required fields: user_name for a user email, or destination_id for an alert destination identifier

TYPE: list[JobTaskForEachTaskTaskSqlTaskDashboardSubscriptions] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskDashboardSubscriptions ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
path

If source is GIT: Relative path to the file in the repository specified in the git_source block with SQL commands to execute. If source is WORKSPACE: Absolute path to the file in the workspace with SQL commands to execute

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskSqlTaskQuery ¤

Bases: BaseModel

PARAMETER DESCRIPTION
query_id

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
on_duration_warning_threshold_exceeded

(List) list of notification IDs to call when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[JobTaskForEachTaskTaskWebhookNotificationsOnDurationWarningThresholdExceeded] | None | VariableType DEFAULT: None

on_failure

(List) list of notification IDs to call when the run fails. A maximum of 3 destinations can be specified

TYPE: list[JobTaskForEachTaskTaskWebhookNotificationsOnFailure] | None | VariableType DEFAULT: None

on_start

(List) list of notification IDs to call when the run starts. A maximum of 3 destinations can be specified

TYPE: list[JobTaskForEachTaskTaskWebhookNotificationsOnStart] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of notification IDs to call when any streaming backlog thresholds are exceeded for any stream

TYPE: list[JobTaskForEachTaskTaskWebhookNotificationsOnStreamingBacklogExceeded] | None | VariableType DEFAULT: None

on_success

(List) list of notification IDs to call when the run completes successfully. A maximum of 3 destinations can be specified

TYPE: list[JobTaskForEachTaskTaskWebhookNotificationsOnSuccess] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotificationsOnDurationWarningThresholdExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotificationsOnFailure ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotificationsOnStart ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotificationsOnStreamingBacklogExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskForEachTaskTaskWebhookNotificationsOnSuccess ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskGenAiComputeTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
command

TYPE: str | None | VariableType DEFAULT: None

compute

Task level compute configuration. This block is documented below

TYPE: JobTaskGenAiComputeTaskCompute | None | VariableType DEFAULT: None

dl_runtime_image

TYPE: str | VariableType

mlflow_experiment_name

TYPE: str | None | VariableType DEFAULT: None

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

training_script_path

TYPE: str | None | VariableType DEFAULT: None

yaml_parameters

TYPE: str | None | VariableType DEFAULT: None

yaml_parameters_file_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskGenAiComputeTaskCompute ¤

Bases: BaseModel

PARAMETER DESCRIPTION
gpu_node_pool_id

TYPE: str | None | VariableType DEFAULT: None

gpu_type

TYPE: str | None | VariableType DEFAULT: None

num_gpus

TYPE: int | VariableType


laktory.models.resources.databricks.job.JobTaskHealth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
rules

(List) list of rules that are represented as objects with the following attributes:

TYPE: list[JobTaskHealthRules] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskHealthRules ¤

Bases: BaseModel

PARAMETER DESCRIPTION
metric

string specifying the metric to check, like RUN_DURATION_SECONDS, STREAMING_BACKLOG_FILES, etc. - check the Jobs REST API documentation for the full list of supported metrics

TYPE: str | VariableType

op

string specifying the operation used to evaluate the given metric. The only supported operation is GREATER_THAN

TYPE: str | VariableType

value

integer value used to compare to the given metric

TYPE: int | VariableType

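To make the health block concrete, here is a hedged sketch of a task that flags any run exceeding one hour; the metric and operator follow the attributes above, while the notebook path is a placeholder:

```yaml
tasks:
  - task_key: ingest
    notebook_task:
      notebook_path: /jobs/ingest_stock_prices.py
    health:
      rules:
        - metric: RUN_DURATION_SECONDS
          op: GREATER_THAN
          value: 3600
```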

laktory.models.resources.databricks.job.JobTaskLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobTaskLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobTaskLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobTaskLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None

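A task's libraries list accepts one block per dependency type. The sketch below combines pypi, maven, and whl entries; the Maven coordinates and the volume path are illustrative:

```yaml
tasks:
  - task_key: ingest
    job_cluster_key: main
    notebook_task:
      notebook_path: /jobs/ingest_stock_prices.py
    libraries:
      - pypi:
          package: yfinance
      - maven:
          coordinates: com.databricks:spark-xml_2.12:0.18.0
      - whl: /Volumes/main/default/libs/my_lib-0.1.0-py3-none-any.whl
```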

laktory.models.resources.databricks.job.JobTaskLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewCluster ¤

Bases: BaseModel

PARAMETER DESCRIPTION
apply_policy_default_values

TYPE: bool | None | VariableType DEFAULT: None

autoscale

TYPE: JobTaskNewClusterAutoscale | None | VariableType DEFAULT: None

aws_attributes

TYPE: JobTaskNewClusterAwsAttributes | None | VariableType DEFAULT: None

azure_attributes

TYPE: JobTaskNewClusterAzureAttributes | None | VariableType DEFAULT: None

cluster_id

TYPE: str | None | VariableType DEFAULT: None

cluster_log_conf

TYPE: JobTaskNewClusterClusterLogConf | None | VariableType DEFAULT: None

cluster_mount_info

TYPE: list[JobTaskNewClusterClusterMountInfo] | None | VariableType DEFAULT: None

cluster_name

TYPE: str | None | VariableType DEFAULT: None

custom_tags

TYPE: dict[str, str] | None | VariableType DEFAULT: None

data_security_mode

TYPE: str | None | VariableType DEFAULT: None

docker_image

TYPE: JobTaskNewClusterDockerImage | None | VariableType DEFAULT: None

driver_instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

driver_node_type_flexibility

TYPE: JobTaskNewClusterDriverNodeTypeFlexibility | None | VariableType DEFAULT: None

driver_node_type_id

TYPE: str | None | VariableType DEFAULT: None

enable_elastic_disk

TYPE: bool | None | VariableType DEFAULT: None

enable_local_disk_encryption

TYPE: bool | None | VariableType DEFAULT: None

gcp_attributes

TYPE: JobTaskNewClusterGcpAttributes | None | VariableType DEFAULT: None

idempotency_token

TYPE: str | None | VariableType DEFAULT: None

init_scripts

TYPE: list[JobTaskNewClusterInitScripts] | None | VariableType DEFAULT: None

instance_pool_id

TYPE: str | None | VariableType DEFAULT: None

is_single_node

TYPE: bool | None | VariableType DEFAULT: None

kind

TYPE: str | None | VariableType DEFAULT: None

library

(Set) An optional list of libraries to be installed on the cluster that will execute the job

TYPE: list[JobTaskNewClusterLibrary] | None | VariableType DEFAULT: None

node_type_id

TYPE: str | None | VariableType DEFAULT: None

num_workers

TYPE: int | None | VariableType DEFAULT: None

policy_id

TYPE: str | None | VariableType DEFAULT: None

remote_disk_throughput

TYPE: int | None | VariableType DEFAULT: None

runtime_engine

TYPE: str | None | VariableType DEFAULT: None

single_user_name

TYPE: str | None | VariableType DEFAULT: None

spark_conf

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_env_vars

TYPE: dict[str, str] | None | VariableType DEFAULT: None

spark_version

TYPE: str | None | VariableType DEFAULT: None

ssh_public_keys

TYPE: list[str] | None | VariableType DEFAULT: None

total_initial_remote_disk_size

TYPE: int | None | VariableType DEFAULT: None

use_ml_runtime

TYPE: bool | None | VariableType DEFAULT: None

worker_node_type_flexibility

TYPE: JobTaskNewClusterWorkerNodeTypeFlexibility | None | VariableType DEFAULT: None

workload_type

Per the upstream provider documentation, this block isn't supported for job clusters

TYPE: JobTaskNewClusterWorkloadType | None | VariableType DEFAULT: None

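Building on the job_clusters example at the top of this page, a new_cluster block commonly combines a runtime version, node type, autoscaling range, and Spark configuration; the values below are illustrative:

```yaml
job_clusters:
  - job_cluster_key: main
    new_cluster:
      spark_version: 16.3.x-scala2.12
      node_type_id: Standard_DS3_v2
      data_security_mode: USER_ISOLATION
      autoscale:
        min_workers: 1
        max_workers: 4
      spark_conf:
        spark.sql.session.timeZone: UTC
      custom_tags:
        env: dev
```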

laktory.models.resources.databricks.job.JobTaskNewClusterAutoscale ¤

Bases: BaseModel

PARAMETER DESCRIPTION
max_workers

TYPE: int | None | VariableType DEFAULT: None

min_workers

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterAwsAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

ebs_volume_count

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_iops

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_size

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_throughput

TYPE: int | None | VariableType DEFAULT: None

ebs_volume_type

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

instance_profile_arn

TYPE: str | None | VariableType DEFAULT: None

spot_bid_price_percent

TYPE: int | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterAzureAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

log_analytics_info

TYPE: JobTaskNewClusterAzureAttributesLogAnalyticsInfo | None | VariableType DEFAULT: None

spot_bid_max_price

TYPE: float | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterAzureAttributesLogAnalyticsInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
log_analytics_primary_key

TYPE: str | None | VariableType DEFAULT: None

log_analytics_workspace_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterClusterLogConf ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbfs

TYPE: JobTaskNewClusterClusterLogConfDbfs | None | VariableType DEFAULT: None

s3

TYPE: JobTaskNewClusterClusterLogConfS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobTaskNewClusterClusterLogConfVolumes | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterClusterLogConfDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterClusterLogConfS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterClusterLogConfVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterClusterMountInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
local_mount_dir_path

TYPE: str | VariableType

network_filesystem_info

TYPE: JobTaskNewClusterClusterMountInfoNetworkFilesystemInfo | None | VariableType DEFAULT: None

remote_mount_dir_path

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterClusterMountInfoNetworkFilesystemInfo ¤

Bases: BaseModel

PARAMETER DESCRIPTION
mount_options

TYPE: str | None | VariableType DEFAULT: None

server_address

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterDockerImage ¤

Bases: BaseModel

PARAMETER DESCRIPTION
basic_auth

TYPE: JobTaskNewClusterDockerImageBasicAuth | None | VariableType DEFAULT: None

url

URL of the Docker image

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterDockerImageBasicAuth ¤

Bases: BaseModel

PARAMETER DESCRIPTION
password

TYPE: str | VariableType

username

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterDriverNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterGcpAttributes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
availability

TYPE: str | None | VariableType DEFAULT: None

boot_disk_size

TYPE: int | None | VariableType DEFAULT: None

first_on_demand

TYPE: int | None | VariableType DEFAULT: None

google_service_account

TYPE: str | None | VariableType DEFAULT: None

local_ssd_count

TYPE: int | None | VariableType DEFAULT: None

use_preemptible_executors

TYPE: bool | None | VariableType DEFAULT: None

zone_id

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterInitScripts ¤

Bases: BaseModel

PARAMETER DESCRIPTION
abfss

TYPE: JobTaskNewClusterInitScriptsAbfss | None | VariableType DEFAULT: None

dbfs

TYPE: JobTaskNewClusterInitScriptsDbfs | None | VariableType DEFAULT: None

file

Block consisting of a single string field: destination

TYPE: JobTaskNewClusterInitScriptsFile | None | VariableType DEFAULT: None

gcs

TYPE: JobTaskNewClusterInitScriptsGcs | None | VariableType DEFAULT: None

s3

TYPE: JobTaskNewClusterInitScriptsS3 | None | VariableType DEFAULT: None

volumes

TYPE: JobTaskNewClusterInitScriptsVolumes | None | VariableType DEFAULT: None

workspace

TYPE: JobTaskNewClusterInitScriptsWorkspace | None | VariableType DEFAULT: None

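Each init_scripts entry selects one storage backend through its nested destination field. As a sketch, the script paths below are placeholders:

```yaml
new_cluster:
  spark_version: 16.3.x-scala2.12
  node_type_id: Standard_DS3_v2
  init_scripts:
    - volumes:
        destination: /Volumes/main/default/scripts/install_deps.sh
    - workspace:
        destination: /Shared/init/set_env.sh
```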

laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsAbfss ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsDbfs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsGcs ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsS3 ¤

Bases: BaseModel

PARAMETER DESCRIPTION
canned_acl

TYPE: str | None | VariableType DEFAULT: None

destination

TYPE: str | VariableType

enable_encryption

TYPE: bool | None | VariableType DEFAULT: None

encryption_type

TYPE: str | None | VariableType DEFAULT: None

endpoint

TYPE: str | None | VariableType DEFAULT: None

kms_key

TYPE: str | None | VariableType DEFAULT: None

region

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsVolumes ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterInitScriptsWorkspace ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskNewClusterLibrary ¤

Bases: BaseModel

PARAMETER DESCRIPTION
cran

TYPE: JobTaskNewClusterLibraryCran | None | VariableType DEFAULT: None

egg

TYPE: str | None | VariableType DEFAULT: None

jar

TYPE: str | None | VariableType DEFAULT: None

maven

TYPE: JobTaskNewClusterLibraryMaven | None | VariableType DEFAULT: None

pypi

TYPE: JobTaskNewClusterLibraryPypi | None | VariableType DEFAULT: None

requirements

TYPE: str | None | VariableType DEFAULT: None

whl

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterLibraryCran ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterLibraryMaven ¤

Bases: BaseModel

PARAMETER DESCRIPTION
coordinates

TYPE: str | VariableType

exclusions

TYPE: list[str] | None | VariableType DEFAULT: None

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterLibraryPypi ¤

Bases: BaseModel

PARAMETER DESCRIPTION
package

TYPE: str | VariableType

repo

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterWorkerNodeTypeFlexibility ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alternate_node_type_ids

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterWorkloadType ¤

Bases: BaseModel

PARAMETER DESCRIPTION
clients

TYPE: JobTaskNewClusterWorkloadTypeClients | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNewClusterWorkloadTypeClients ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jobs

TYPE: bool | None | VariableType DEFAULT: None

notebooks

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskNotebookTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
base_parameters

(Map) Base parameters to be used for each run of this job. If the run is initiated by a call to run-now with parameters specified, the two parameters maps will be merged. If the same key is specified in base_parameters and in run-now, the value from run-now will be used. If the notebook takes a parameter that is not specified in the job's base_parameters or the run-now override parameters, the default value from the notebook will be used. Retrieve these parameters in a notebook using dbutils.widgets.get

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_path

The path of the databricks_notebook to be run in the Databricks workspace or remote repository. For notebooks stored in the Databricks workspace, the path must be absolute and begin with a slash. For notebooks stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless and Pro warehouses are currently supported

TYPE: str | None | VariableType DEFAULT: None

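A notebook task resolves base_parameters inside the notebook through dbutils.widgets.get, as described above. A minimal sketch, with an illustrative path and parameters:

```yaml
tasks:
  - task_key: report
    job_cluster_key: main
    notebook_task:
      notebook_path: /jobs/build_report.py
      source: WORKSPACE
      base_parameters:
        env: prod
        mode: full
```

Inside the notebook, dbutils.widgets.get("env") would then return prod unless overridden by a run-now call.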

laktory.models.resources.databricks.job.JobTaskNotificationSettings ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_on_last_attempt

(Bool) Do not send notifications to recipients specified in on_start for retried runs, and do not send notifications to recipients specified in on_failure until the last retry of the run

TYPE: bool | None | VariableType DEFAULT: None

no_alert_for_canceled_runs

(Bool) Do not send an alert for cancelled runs

TYPE: bool | None | VariableType DEFAULT: None

no_alert_for_skipped_runs

(Bool) Do not send an alert for skipped runs

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskPipelineTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
full_refresh

(Bool) Specifies if there should be full refresh of the pipeline

TYPE: bool | None | VariableType DEFAULT: None

pipeline_id

The pipeline's unique ID

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskPowerBiTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
connection_resource_name

TYPE: str | None | VariableType DEFAULT: None

power_bi_model

TYPE: JobTaskPowerBiTaskPowerBiModel | None | VariableType DEFAULT: None

refresh_after_update

TYPE: bool | None | VariableType DEFAULT: None

tables

TYPE: list[JobTaskPowerBiTaskTables] | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless and Pro warehouses are currently supported

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskPowerBiTaskPowerBiModel ¤

Bases: BaseModel

PARAMETER DESCRIPTION
authentication_method

TYPE: str | None | VariableType DEFAULT: None

model_name

TYPE: str | None | VariableType DEFAULT: None

overwrite_existing

TYPE: bool | None | VariableType DEFAULT: None

storage_mode

TYPE: str | None | VariableType DEFAULT: None

workspace_name

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskPowerBiTaskTables ¤

Bases: BaseModel

PARAMETER DESCRIPTION
catalog

The name of the catalog to use inside Unity Catalog

TYPE: str | None | VariableType DEFAULT: None

name

The name of the table

TYPE: str | None | VariableType DEFAULT: None

schema_

The name of the schema inside Unity Catalog

TYPE: str | None | VariableType DEFAULT: None

storage_mode

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskPythonWheelTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
entry_point

Python function as entry point for the task

TYPE: str | None | VariableType DEFAULT: None

named_parameters

Named parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None

package_name

Name of Python package

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

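A Python wheel task references an entry point declared in the package's metadata and passes arguments through named_parameters. The package name, entry point, and wheel path below are placeholders:

```yaml
tasks:
  - task_key: train
    job_cluster_key: main
    python_wheel_task:
      package_name: my_package
      entry_point: main
      named_parameters:
        env: prod
    libraries:
      - whl: /Volumes/main/default/libs/my_package-0.1.0-py3-none-any.whl
```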

laktory.models.resources.databricks.job.JobTaskRunJobTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
dbt_commands

TYPE: list[str] | None | VariableType DEFAULT: None

jar_params

TYPE: list[str] | None | VariableType DEFAULT: None

job_id

(Integer) ID of the job to trigger

TYPE: int | VariableType

job_parameters

(Map) Job parameters for the task

TYPE: dict[str, str] | None | VariableType DEFAULT: None

notebook_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None

pipeline_params

TYPE: JobTaskRunJobTaskPipelineParams | None | VariableType DEFAULT: None

python_named_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None

python_params

TYPE: list[str] | None | VariableType DEFAULT: None

spark_submit_params

TYPE: list[str] | None | VariableType DEFAULT: None

sql_params

TYPE: dict[str, str] | None | VariableType DEFAULT: None

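A run job task triggers another job by its numeric ID and can forward job_parameters to it. A hedged sketch with placeholder values:

```yaml
tasks:
  - task_key: trigger-downstream
    run_job_task:
      job_id: 123456789
      job_parameters:
        run_date: "2024-01-01"
```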

laktory.models.resources.databricks.job.JobTaskRunJobTaskPipelineParams ¤

Bases: BaseModel

PARAMETER DESCRIPTION
full_refresh

(Bool) Specifies if there should be full refresh of the pipeline

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSparkJarTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
jar_uri

TYPE: str | None | VariableType DEFAULT: None

main_class_name

The full name of the class containing the main method to be executed. This class must be contained in a JAR provided as a library. The code should use SparkContext.getOrCreate to obtain a Spark context; otherwise, runs of the job will fail

TYPE: str | None | VariableType DEFAULT: None

parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

run_as_repl

TYPE: bool | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSparkPythonTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None

python_file

The URI of the Python file to be executed. Cloud file URIs (e.g. s3://, abfss://, gs://), workspace paths, and remote repositories are supported. For Python files stored in the Databricks workspace, the path must be absolute and begin with /. For files stored in a remote repository, the path must be relative. This field is required

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None

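A Spark Python task points python_file at an absolute workspace path (or a cloud file URI) and passes command-line style arguments through parameters; the path and arguments below are illustrative:

```yaml
tasks:
  - task_key: etl
    job_cluster_key: main
    spark_python_task:
      python_file: /Workspace/jobs/etl.py
      source: WORKSPACE
      parameters:
        - --env
        - prod
```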

laktory.models.resources.databricks.job.JobTaskSparkSubmitTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
parameters

(List) Parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: list[str] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSqlTask ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert

Block consisting of the fields described below

TYPE: JobTaskSqlTaskAlert | None | VariableType DEFAULT: None

dashboard

Block consisting of the fields described below

TYPE: JobTaskSqlTaskDashboard | None | VariableType DEFAULT: None

file

Block consisting of the single string fields path and source

TYPE: JobTaskSqlTaskFile | None | VariableType DEFAULT: None

parameters

(Map) parameters to be used for each run of this task. The SQL alert task does not support custom parameters

TYPE: dict[str, str] | None | VariableType DEFAULT: None

query

Block consisting of a single string field: query_id, the identifier of the Databricks Query (databricks_query)

TYPE: JobTaskSqlTaskQuery | None | VariableType DEFAULT: None

warehouse_id

ID of the SQL warehouse (databricks_sql_endpoint) that will be used to execute the task. Only Serverless and Pro warehouses are currently supported

TYPE: str | VariableType

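A SQL task pairs a warehouse_id with exactly one of the query, alert, dashboard, or file blocks. The sketch below runs a saved query with parameters; both identifiers are placeholders:

```yaml
tasks:
  - task_key: refresh-metrics
    sql_task:
      warehouse_id: 1a2b3c4d5e6f7g8h
      query:
        query_id: 12345678-1234-1234-1234-123456789012
      parameters:
        run_date: "2024-01-01"
```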

laktory.models.resources.databricks.job.JobTaskSqlTaskAlert ¤

Bases: BaseModel

PARAMETER DESCRIPTION
alert_id

(String) identifier of the Databricks Alert (databricks_alert)

TYPE: str | VariableType

pause_subscriptions

Flag that specifies whether subscriptions are paused

TYPE: bool | None | VariableType DEFAULT: None

subscriptions

A list of subscription blocks, each consisting of one of the required fields: user_name for a user email, or destination_id for an alert destination identifier

TYPE: list[JobTaskSqlTaskAlertSubscriptions] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSqlTaskAlertSubscriptions ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSqlTaskDashboard ¤

Bases: BaseModel

PARAMETER DESCRIPTION
custom_subject

String specifying a custom subject for the email sent

TYPE: str | None | VariableType DEFAULT: None

dashboard_id

(String) identifier of the Databricks SQL Dashboard databricks_sql_dashboard

TYPE: str | VariableType

pause_subscriptions

Flag that specifies whether subscriptions are paused

TYPE: bool | None | VariableType DEFAULT: None

subscriptions

A list of subscription blocks, each consisting of one of the required fields: user_name for a user email, or destination_id for an alert destination identifier

TYPE: list[JobTaskSqlTaskDashboardSubscriptions] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSqlTaskDashboardSubscriptions ¤

Bases: BaseModel

PARAMETER DESCRIPTION
destination_id

TYPE: str | None | VariableType DEFAULT: None

user_name

The email of an active workspace user. Non-admin users can only set this field to their own email

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskSqlTaskFile ¤

Bases: BaseModel

PARAMETER DESCRIPTION
path

If source is GIT: Relative path to the file in the repository specified in the git_source block with SQL commands to execute. If source is WORKSPACE: Absolute path to the file in the workspace with SQL commands to execute

TYPE: str | VariableType

source

The source of the project. Possible values are WORKSPACE and GIT

TYPE: str | None | VariableType DEFAULT: None
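
A sketch of a file-based SQL task executing a workspace file; the path is a placeholder.

```yaml
sql_task:
  warehouse_id: "4b9b953939869799"  # placeholder warehouse ID
  file:
    source: WORKSPACE
    path: /Workspace/queries/daily_report.sql  # placeholder absolute path
```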


laktory.models.resources.databricks.job.JobTaskSqlTaskQuery ¤

Bases: BaseModel

PARAMETER DESCRIPTION
query_id

TYPE: str | VariableType


laktory.models.resources.databricks.job.JobTaskWebhookNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
on_duration_warning_threshold_exceeded

(List) list of notification IDs to call when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[JobTaskWebhookNotificationsOnDurationWarningThresholdExceeded] | None | VariableType DEFAULT: None

on_failure

(List) list of notification IDs to call when the run fails. A maximum of 3 destinations can be specified

TYPE: list[JobTaskWebhookNotificationsOnFailure] | None | VariableType DEFAULT: None

on_start

(List) list of notification IDs to call when the run starts. A maximum of 3 destinations can be specified

TYPE: list[JobTaskWebhookNotificationsOnStart] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of notification IDs to call when any streaming backlog thresholds are exceeded for any stream

TYPE: list[JobTaskWebhookNotificationsOnStreamingBacklogExceeded] | None | VariableType DEFAULT: None

on_success

(List) list of notification IDs to call when the run completes successfully. A maximum of 3 destinations can be specified

TYPE: list[JobTaskWebhookNotificationsOnSuccess] | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTaskWebhookNotificationsOnDurationWarningThresholdExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskWebhookNotificationsOnFailure ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskWebhookNotificationsOnStart ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskWebhookNotificationsOnStreamingBacklogExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTaskWebhookNotificationsOnSuccess ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobTimeouts ¤

Bases: BaseModel

PARAMETER DESCRIPTION
create

TYPE: str | None | VariableType DEFAULT: None

update_

TYPE: str | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTrigger ¤

Bases: BaseModel

PARAMETER DESCRIPTION
file_arrival

Configuration block defining a trigger for File Arrival events, consisting of the following attributes:

TYPE: JobTriggerFileArrival | None | VariableType DEFAULT: None

model

TYPE: JobTriggerModel | None | VariableType DEFAULT: None

pause_status

Indicates whether this trigger is paused. Either PAUSED or UNPAUSED. When pause_status is omitted from the block, the server defaults to UNPAUSED

TYPE: str | None | VariableType DEFAULT: None

periodic

Configuration block defining a periodic trigger, consisting of the following attributes:

TYPE: JobTriggerPeriodic | None | VariableType DEFAULT: None

table_update

Configuration block defining a trigger for Table Updates, consisting of the following attributes:

TYPE: JobTriggerTableUpdate | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTriggerFileArrival ¤

Bases: BaseModel

PARAMETER DESCRIPTION
min_time_between_triggers_seconds

If set, the trigger starts a run only after the specified amount of time passed since the last time the trigger fired. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None

url

URL of the storage location to be monitored for file arrivals

TYPE: str | VariableType

wait_after_last_change_seconds

If set, the trigger starts a run only after no file activity has occurred for the specified amount of time. This makes it possible to wait for a batch of incoming files to arrive before triggering a run. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None
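
A hedged sketch of a file arrival trigger; the storage URL is a placeholder and must point to a location the workspace can monitor.

```yaml
trigger:
  pause_status: UNPAUSED
  file_arrival:
    url: abfss://landing@myaccount.dfs.core.windows.net/stock_prices/  # placeholder URL
    min_time_between_triggers_seconds: 60
    wait_after_last_change_seconds: 60
```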


laktory.models.resources.databricks.job.JobTriggerModel ¤

Bases: BaseModel

PARAMETER DESCRIPTION
aliases

TYPE: list[str] | None | VariableType DEFAULT: None

condition

The table(s) condition based on which to trigger a job run. Possible values are ANY_UPDATED, ALL_UPDATED

TYPE: str | VariableType

min_time_between_triggers_seconds

If set, the trigger starts a run only after the specified amount of time passed since the last time the trigger fired. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None

securable_name

TYPE: str | None | VariableType DEFAULT: None

wait_after_last_change_seconds

If set, the trigger starts a run only after no file activity has occurred for the specified amount of time. This makes it possible to wait for a batch of incoming files to arrive before triggering a run. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None


laktory.models.resources.databricks.job.JobTriggerPeriodic ¤

Bases: BaseModel

PARAMETER DESCRIPTION
interval

Specifies the interval at which the job should run

TYPE: int | VariableType

unit

The unit of time for the interval. Possible values are: DAYS, HOURS, WEEKS

TYPE: str | VariableType
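
For example, a periodic trigger running the job every hour can be sketched as:

```yaml
trigger:
  periodic:
    interval: 1
    unit: HOURS
```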


laktory.models.resources.databricks.job.JobTriggerTableUpdate ¤

Bases: BaseModel

PARAMETER DESCRIPTION
condition

The table(s) condition based on which to trigger a job run. Possible values are ANY_UPDATED, ALL_UPDATED

TYPE: str | None | VariableType DEFAULT: None

min_time_between_triggers_seconds

If set, the trigger starts a run only after the specified amount of time passed since the last time the trigger fired. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None

table_names

A non-empty list of tables to monitor for changes. The table name must be in the format catalog_name.schema_name.table_name

TYPE: list[str] | VariableType

wait_after_last_change_seconds

If set, the trigger starts a run only after no table updates have occurred for the specified amount of time. This makes it possible to wait for a batch of updates before triggering a run. The minimum allowed value is 60 seconds

TYPE: int | None | VariableType DEFAULT: None
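
A sketch of a table update trigger monitoring two Unity Catalog tables; the table names are placeholders.

```yaml
trigger:
  table_update:
    table_names:
      - prod.finance.stock_prices   # placeholder table names
      - prod.finance.stock_metadata
    condition: ANY_UPDATED
    min_time_between_triggers_seconds: 120
```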


laktory.models.resources.databricks.job.JobWebhookNotifications ¤

Bases: BaseModel

PARAMETER DESCRIPTION
on_duration_warning_threshold_exceeded

(List) list of notification IDs to call when the duration of a run exceeds the threshold specified by the RUN_DURATION_SECONDS metric in the health block

TYPE: list[JobWebhookNotificationsOnDurationWarningThresholdExceeded] | None | VariableType DEFAULT: None

on_failure

(List) list of notification IDs to call when the run fails. A maximum of 3 destinations can be specified

TYPE: list[JobWebhookNotificationsOnFailure] | None | VariableType DEFAULT: None

on_start

(List) list of notification IDs to call when the run starts. A maximum of 3 destinations can be specified

TYPE: list[JobWebhookNotificationsOnStart] | None | VariableType DEFAULT: None

on_streaming_backlog_exceeded

(List) list of notification IDs to call when any streaming backlog thresholds are exceeded for any stream

TYPE: list[JobWebhookNotificationsOnStreamingBacklogExceeded] | None | VariableType DEFAULT: None

on_success

(List) list of notification IDs to call when the run completes successfully. A maximum of 3 destinations can be specified

TYPE: list[JobWebhookNotificationsOnSuccess] | None | VariableType DEFAULT: None
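
Each notification entry is typically a block carrying the ID of a pre-configured notification destination. A hedged sketch, assuming each block exposes an id field as in the underlying Databricks provider; the ID is a placeholder.

```yaml
webhook_notifications:
  on_failure:
    - id: "0481e838-0a59-4eff-9541-a4ca87f2d6db"  # placeholder notification destination ID
  on_duration_warning_threshold_exceeded:
    - id: "0481e838-0a59-4eff-9541-a4ca87f2d6db"  # placeholder; pairs with a RUN_DURATION_SECONDS health rule
```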


laktory.models.resources.databricks.job.JobWebhookNotificationsOnDurationWarningThresholdExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobWebhookNotificationsOnFailure ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobWebhookNotificationsOnStart ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobWebhookNotificationsOnStreamingBacklogExceeded ¤

Bases: BaseModel


laktory.models.resources.databricks.job.JobWebhookNotificationsOnSuccess ¤

Bases: BaseModel