Stack

laktory.models.stacks.Stack ¤

Bases: BaseModel

The `Stack` defines a collection of deployable resources, the deployment configuration, variables, and environment-specific settings.

Examples:

from laktory import models

stack = models.Stack(
    name="workspace",
    resources={
        "databricks_pipelines": {
            "pl-stock-prices": {
                "name": "pl-stock-prices",
                "development": "${vars.is_dev}",
                "libraries": [
                    {"notebook": {"path": "/pipelines/dlt_brz_template.py"}},
                ],
            }
        },
        "databricks_jobs": {
            "job-stock-prices": {
                "name": "job-stock-prices",
                "job_clusters": [
                    {
                        "job_cluster_key": "main",
                        "new_cluster": {
                            "spark_version": "16.3.x-scala2.12",
                            "node_type_id": "Standard_DS3_v2",
                        },
                    }
                ],
                "tasks": [
                    {
                        "task_key": "ingest",
                        "job_cluster_key": "main",
                        "notebook_task": {
                            "notebook_path": "/.laktory/jobs/ingest_stock_prices.py",
                        },
                    },
                    {
                        "task_key": "pipeline",
                        "depends_on": [{"task_key": "ingest"}],
                        "pipeline_task": {
                            "pipeline_id": "${resources.dlt-pl-stock-prices.id}",
                        },
                    },
                ],
            }
        },
    },
    variables={
        "org": "okube",
    },
    environments={
        "dev": {
            "variables": {
                "is_dev": True,
            }
        },
        "prod": {
            "variables": {
                "is_dev": False,
            }
        },
    },
)
PARAMETER DESCRIPTION
description

Description of the stack

TYPE: str | VariableType DEFAULT: None

environments

Environment-specific overwrites of the config, resources, or variables arguments.

TYPE: dict[str | VariableType, EnvironmentSettings | VariableType] | VariableType DEFAULT: {}

iac_backend

IaC backend used for deployment.

TYPE: Literal['terraform'] | VariableType DEFAULT: 'terraform'

name

Name of the stack.

TYPE: str | VariableType

organization

Organization

TYPE: str | None | VariableType DEFAULT: None

resources

Dictionary of resources to be deployed. Each key should be a resource type and each value should be a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: StackResources | None | VariableType DEFAULT: StackResources(variables={}, databricks_accesscontrolrulesets={}, databricks_alerts={}, databricks_apps={}, databricks_catalogs={}, databricks_clusterpolicies={}, databricks_clusters={}, databricks_connections={}, databricks_currentusers={}, databricks_dashboards={}, databricks_dbfsfiles={}, databricks_directories={}, databricks_entitlements={}, databricks_pipelines={}, databricks_externallocations={}, databricks_grant={}, databricks_grants={}, databricks_groups={}, databricks_jobs={}, databricks_metastoreassignments={}, databricks_metastoredataaccesses={}, databricks_metastores={}, databricks_mlflowexperiments={}, databricks_mlflowmodels={}, databricks_mlflowwebhooks={}, databricks_networkconnectivityconfig={}, databricks_notebooks={}, databricks_notificationdestinations={}, databricks_obotokens={}, databricks_permissions={}, databricks_qualitymonitors={}, databricks_pythonpackages={}, databricks_queries={}, databricks_recipients={}, databricks_repos={}, databricks_schemas={}, databricks_secrets={}, databricks_secretscopes={}, databricks_serviceprincipals={}, databricks_shares={}, databricks_storagecredentials={}, databricks_tables={}, databricks_users={}, databricks_vectorsearchendpoints={}, databricks_vectorsearchindexes={}, databricks_volumes={}, databricks_warehouses={}, databricks_workspacebindings={}, databricks_workspacefiles={}, databricks_workspacetrees={}, pipelines={}, providers={})

settings

Laktory settings

TYPE: LaktorySettings | VariableType DEFAULT: None

terraform

Terraform-specific settings

TYPE: Terraform | VariableType DEFAULT: Terraform(variables={}, backend=None)
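Placeholders such as `${vars.is_dev}` in the example above are resolved against the stack (or environment) variables before deployment. As a rough sketch of the mechanics only (not Laktory's actual implementation, which also handles resource references like `${resources.dlt-pl-stock-prices.id}`), substitution over a config tree can look like:

```python
import re

def inject_vars(obj, variables):
    """Recursively replace ${vars.<name>} placeholders in a config tree.

    Simplified illustration only; Laktory's real injection also supports
    resource references and richer expressions.
    """
    if isinstance(obj, dict):
        return {k: inject_vars(v, variables) for k, v in obj.items()}
    if isinstance(obj, list):
        return [inject_vars(v, variables) for v in obj]
    if isinstance(obj, str):
        m = re.fullmatch(r"\$\{vars\.(\w+)\}", obj)
        if m and m.group(1) in variables:
            # Whole-string placeholder: preserve the variable's type
            return variables[m.group(1)]
        # Embedded placeholder: interpolate as text
        return re.sub(
            r"\$\{vars\.(\w+)\}",
            lambda m: str(variables.get(m.group(1), m.group(0))),
            obj,
        )
    return obj

config = {"name": "pl-stock-prices", "development": "${vars.is_dev}"}
print(inject_vars(config, {"is_dev": True}))
# {'name': 'pl-stock-prices', 'development': True}
```

Note how a placeholder spanning the whole string keeps the variable's type (`True` stays a boolean), while a placeholder embedded in a longer string is interpolated as text.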

METHOD DESCRIPTION
apply_settings

Required to apply settings before instantiating resources and setting default values

build

Build stack artifacts before preview or deploy.

get_env

Complete definition of the stack for a given environment.

to_terraform

Create a Terraform stack for a given environment `env_name`.

apply_settings(data) classmethod ¤

Required to apply settings before instantiating resources and setting default values

Source code in laktory/models/stacks/stack.py
@model_validator(mode="before")
@classmethod
def apply_settings(cls, data: Any) -> Any:
    """Required to apply settings before instantiating resources and setting default values"""
    settings = data.get("settings", None)
    if settings:
        if not isinstance(settings, dict):
            settings = settings.model_dump(exclude_unset=True)
        LaktorySettings(**settings)

    return data

build(env_name, inject_vars=True) ¤

Build stack artifacts before preview or deploy.

Pipeline config JSON files are written to the location determined by settings.build_root (when set in stack.yaml under settings:) or the default Laktory cache directory. For Databricks Asset Bundles users, set settings.build_root to a project-local path (e.g. .laktory/.resources/) so that DABs can sync the files to the workspace.

PARAMETER DESCRIPTION
env_name

Name of the environment

TYPE: str | None

inject_vars

Inject stack variables

TYPE: bool DEFAULT: True

Source code in laktory/models/stacks/stack.py
def build(self, env_name: str | None, inject_vars: bool = True):
    """
    Build stack artifacts before preview or deploy.

    Pipeline config JSON files are written to the location determined by
    ``settings.build_root`` (when set in ``stack.yaml`` under
    ``settings:``) or the default Laktory cache directory. For Databricks
    Asset Bundles users, set ``settings.build_root`` to a
    project-local path (e.g. ``.laktory/.resources/``) so that DABs can
    sync the files to the workspace.

    Parameters
    ----------
    env_name:
        Name of the environment
    inject_vars:
        Inject stack variables
    """

    logger.info("Building artifacts...")

    env = self.get_env(env_name=env_name)
    if inject_vars:
        env = env.inject_vars()

    if env.resources is None:
        return

    for k, r in env.resources._get_all(providers_excluded=True).items():
        if isinstance(r, PythonPackage):
            r.build()

    logger.info("Writing pipeline config files...")
    for k, r in env.resources._get_all(providers_excluded=True).items():
        if isinstance(r, Pipeline):
            orchestrator = r.orchestrator
            if not orchestrator:
                continue

            config_file = getattr(r.orchestrator, "config_file", None)
            if config_file:
                config_file.build()

    logger.info("Build completed.")

get_env(env_name) ¤

Complete definition of the stack for a given environment. It takes into account both the default stack values and environment-specific overwrites.

PARAMETER DESCRIPTION
env_name

Name of the environment

TYPE: str | None

RETURNS DESCRIPTION

Environment definitions.

Source code in laktory/models/stacks/stack.py
def get_env(self, env_name: str | None):
    """
    Complete definition of the stack for a given environment. It takes into
    account both the default stack values and environment-specific
    overwrites.

    Parameters
    ----------
    env_name:
        Name of the environment

    Returns
    -------
    :
        Environment definitions.
    """

    if env_name is None:
        return self

    if env_name not in self.environments.keys():
        raise ValueError(f"Environment '{env_name}' is not declared in the stack.")

    env = self.model_copy(update={"environments": {}})
    env.update(self.environments[env_name].model_dump(exclude_unset=True))
    return env
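The overwrite behaviour can be pictured with plain dictionaries (an illustrative sketch only; the real method overlays the environment's `model_dump(exclude_unset=True)` onto a copy of the stack model, so the exact merge rules follow the model's `update` implementation):

```python
def merge_env(defaults: dict, overwrites: dict) -> dict:
    """Recursively overlay environment-specific values on stack defaults.

    Hypothetical helper illustrating the overlay; only keys explicitly
    set in the environment replace the defaults.
    """
    merged = dict(defaults)
    for k, v in overwrites.items():
        if isinstance(v, dict) and isinstance(merged.get(k), dict):
            merged[k] = merge_env(merged[k], v)
        else:
            merged[k] = v
    return merged

defaults = {"name": "workspace", "variables": {"org": "okube"}}
dev = {"variables": {"is_dev": True}}
print(merge_env(defaults, dev))
# {'name': 'workspace', 'variables': {'org': 'okube', 'is_dev': True}}
```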

to_terraform(env_name=None) ¤

Create a Terraform stack for a given environment `env_name`.

PARAMETER DESCRIPTION
env_name

Target environment. If None, use default stack values only.

TYPE: Union[str, None] DEFAULT: None

RETURNS DESCRIPTION
TerraformStack

Terraform-specific stack definition

Source code in laktory/models/stacks/stack.py
def to_terraform(self, env_name: Union[str, None] = None):
    """
    Create a terraform stack for a given environment `env_name`.

    Parameters
    ----------
    env_name:
        Target environment. If `None`, use default stack values only.

    Returns
    -------
    : TerraformStack
        Terraform-specific stack definition
    """
    from laktory.models.stacks.terraformstack import TerraformStack

    env = self.get_env(env_name=env_name).inject_vars()
    env.build(env_name=None, inject_vars=False)

    # Providers
    providers = {}
    for r in env.resources._get_all(providers_only=True).values():
        for _r in r.core_resources:
            rname = _r.resource_name
            providers[rname] = _r

    # Resources
    resources = {}
    for r in env.resources._get_all(providers_excluded=True).values():
        for _r in r.core_resources:
            resources[_r.resource_name] = _r

    # Update terraform
    return TerraformStack(
        terraform={"backend": env.terraform.backend},
        providers=providers,
        resources=resources,
    )
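The two collection loops above boil down to flattening nested resource mappings into a single dict keyed by `resource_name`. With hypothetical stand-in objects (real resources are Laktory models exposing `core_resources` and `resource_name`), the pattern is:

```python
class FakeResource:
    """Hypothetical stand-in for a Laktory resource model."""

    def __init__(self, name):
        self.resource_name = name
        # A stack resource may expand into several core resources;
        # here each fake expands only to itself.
        self.core_resources = [self]

stack_resources = {
    "pl-stock-prices": FakeResource("databricks_pipeline.pl-stock-prices"),
    "job-stock-prices": FakeResource("databricks_job.job-stock-prices"),
}

# Mirrors the `to_terraform` loop: index every core resource by its
# unique resource name.
resources = {}
for r in stack_resources.values():
    for _r in r.core_resources:
        resources[_r.resource_name] = _r

print(sorted(resources))
# ['databricks_job.job-stock-prices', 'databricks_pipeline.pl-stock-prices']
```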

laktory.models.stacks.StackResources ¤

Bases: BaseModel

Resources definition for a given stack or stack environment.

PARAMETER DESCRIPTION
databricks_accesscontrolrulesets

TYPE: dict[str | VariableType, AccessControlRuleSet | VariableType] | VariableType DEFAULT: {}

databricks_alerts

TYPE: dict[str | VariableType, Alert | VariableType] | VariableType DEFAULT: {}

databricks_apps

TYPE: dict[str | VariableType, App | VariableType] | VariableType DEFAULT: {}

databricks_catalogs

TYPE: dict[str | VariableType, Catalog | VariableType] | VariableType DEFAULT: {}

databricks_clusterpolicies

TYPE: dict[str | VariableType, ClusterPolicy | VariableType] | VariableType DEFAULT: {}

databricks_clusters

TYPE: dict[str | VariableType, Cluster | VariableType] | VariableType DEFAULT: {}

databricks_connections

TYPE: dict[str | VariableType, Connection | VariableType] | VariableType DEFAULT: {}

databricks_currentusers

TYPE: dict[str | VariableType, CurrentUser | VariableType] | VariableType DEFAULT: {}

databricks_dashboards

TYPE: dict[str | VariableType, Dashboard | VariableType] | VariableType DEFAULT: {}

databricks_dbfsfiles

TYPE: dict[str | VariableType, DbfsFile | VariableType] | VariableType DEFAULT: {}

databricks_directories

TYPE: dict[str | VariableType, Directory | VariableType] | VariableType DEFAULT: {}

databricks_entitlements

TYPE: dict[str | VariableType, Entitlements | VariableType] | VariableType DEFAULT: {}

databricks_externallocations

TYPE: dict[str | VariableType, ExternalLocation | VariableType] | VariableType DEFAULT: {}

databricks_grant

TYPE: dict[str | VariableType, Grant | VariableType] | VariableType DEFAULT: {}

databricks_grants

TYPE: dict[str | VariableType, Grants | VariableType] | VariableType DEFAULT: {}

databricks_groups

TYPE: dict[str | VariableType, Group | VariableType] | VariableType DEFAULT: {}

databricks_jobs

TYPE: dict[str | VariableType, Job | VariableType] | VariableType DEFAULT: {}

databricks_metastoreassignments

TYPE: dict[str | VariableType, MetastoreAssignment | VariableType] | VariableType DEFAULT: {}

databricks_metastoredataaccesses

TYPE: dict[str | VariableType, MetastoreDataAccess | VariableType] | VariableType DEFAULT: {}

databricks_metastores

TYPE: dict[str | VariableType, Metastore | VariableType] | VariableType DEFAULT: {}

databricks_mlflowexperiments

TYPE: dict[str | VariableType, MLflowExperiment | VariableType] | VariableType DEFAULT: {}

databricks_mlflowmodels

TYPE: dict[str | VariableType, MLflowModel | VariableType] | VariableType DEFAULT: {}

databricks_mlflowwebhooks

TYPE: dict[str | VariableType, MLflowWebhook | VariableType] | VariableType DEFAULT: {}

databricks_networkconnectivityconfig

TYPE: dict[str | VariableType, MwsNetworkConnectivityConfig | VariableType] | VariableType DEFAULT: {}

databricks_notebooks

TYPE: dict[str | VariableType, Notebook | VariableType] | VariableType DEFAULT: {}

databricks_notificationdestinations

TYPE: dict[str | VariableType, NotificationDestination | VariableType] | VariableType DEFAULT: {}

databricks_obotokens

TYPE: dict[str | VariableType, OboToken | VariableType] | VariableType DEFAULT: {}

databricks_permissions

TYPE: dict[str | VariableType, Permissions | VariableType] | VariableType DEFAULT: {}

databricks_pipelines

TYPE: dict[str | VariableType, Pipeline | VariableType] | VariableType DEFAULT: {}

databricks_pythonpackages

TYPE: dict[str | VariableType, PythonPackage | VariableType] | VariableType DEFAULT: {}

databricks_qualitymonitors

TYPE: dict[str | VariableType, QualityMonitor | VariableType] | VariableType DEFAULT: {}

databricks_queries

TYPE: dict[str | VariableType, Query | VariableType] | VariableType DEFAULT: {}

databricks_recipients

TYPE: dict[str | VariableType, Recipient | VariableType] | VariableType DEFAULT: {}

databricks_repos

TYPE: dict[str | VariableType, Repo | VariableType] | VariableType DEFAULT: {}

databricks_schemas

TYPE: dict[str | VariableType, Schema | VariableType] | VariableType DEFAULT: {}

databricks_secrets

TYPE: dict[str | VariableType, Secret | VariableType] | VariableType DEFAULT: {}

databricks_secretscopes

TYPE: dict[str | VariableType, SecretScope | VariableType] | VariableType DEFAULT: {}

databricks_serviceprincipals

TYPE: dict[str | VariableType, ServicePrincipal | VariableType] | VariableType DEFAULT: {}

databricks_shares

TYPE: dict[str | VariableType, Share | VariableType] | VariableType DEFAULT: {}

databricks_storagecredentials

TYPE: dict[str | VariableType, StorageCredential | VariableType] | VariableType DEFAULT: {}

databricks_tables

TYPE: dict[str | VariableType, Table | VariableType] | VariableType DEFAULT: {}

databricks_users

TYPE: dict[str | VariableType, User | VariableType] | VariableType DEFAULT: {}

databricks_vectorsearchendpoints

TYPE: dict[str | VariableType, VectorSearchEndpoint | VariableType] | VariableType DEFAULT: {}

databricks_vectorsearchindexes

TYPE: dict[str | VariableType, VectorSearchIndex | VariableType] | VariableType DEFAULT: {}

databricks_volumes

TYPE: dict[str | VariableType, Volume | VariableType] | VariableType DEFAULT: {}

databricks_warehouses

TYPE: dict[str | VariableType, Warehouse | VariableType] | VariableType DEFAULT: {}

databricks_workspacebindings

TYPE: dict[str | VariableType, WorkspaceBinding | VariableType] | VariableType DEFAULT: {}

databricks_workspacefiles

TYPE: dict[str | VariableType, WorkspaceFile | PythonPackage | VariableType] | VariableType DEFAULT: {}

databricks_workspacetrees

TYPE: dict[str | VariableType, WorkspaceTree | PythonPackage | VariableType] | VariableType DEFAULT: {}

pipelines

TYPE: dict[str | VariableType, Pipeline | VariableType] | VariableType DEFAULT: {}

providers

TYPE: dict[str | VariableType, AWSProvider | AzureProvider | DatabricksProvider | VariableType] | VariableType DEFAULT: {}


laktory.models.stacks.stack.LaktorySettings ¤

Bases: BaseModel

Laktory Settings

PARAMETER DESCRIPTION
build_root

Local directory where pipeline config JSON and resource files are written during build. Defaults to the Laktory cache directory. Use when deployment is delegated to third-party tools such as Databricks Asset Bundles.

TYPE: str | VariableType DEFAULT: ''

dataframe_api

TYPE: Literal['NARWHALS', 'NATIVE'] | VariableType DEFAULT: None

dataframe_backend

DataFrame backend

TYPE: str DEFAULT: None

runtime_root

Laktory cache root directory. Used when a pipeline needs to write checkpoint files.

TYPE: str | VariableType DEFAULT: '/laktory/'

workspace_root

Root directory of a Databricks Workspace (excluding `/Workspace`) to which Databricks objects like notebooks and workspace files are deployed.

TYPE: str | VariableType DEFAULT: '/.laktory/'


laktory.models.stacks.stack.EnvironmentSettings ¤

Bases: BaseModel

Settings overwrites for a specific environment.

PARAMETER DESCRIPTION
resources

Dictionary of resources to be deployed. Each key should be a resource type and each value should be a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Any DEFAULT: None

terraform

Terraform-specific settings

TYPE: Terraform | VariableType DEFAULT: Terraform(variables={}, backend=None)


laktory.models.stacks.stack.Terraform ¤

Bases: BaseModel

PARAMETER DESCRIPTION
backend

TYPE: dict[str, Any] | None | VariableType DEFAULT: None