DAB deployments override cluster policies in the UI #2492

Open
swopmichalak opened this issue Mar 14, 2025 · 0 comments
Describe the issue

I created cluster policies for developers to use with DABs. When manually editing the .yml configuration of a workflow, we can deploy a cluster configuration that is forbidden by the assigned cluster policy.

The forbidden cluster configuration is visible in the UI of the deployed workflow. While the workflow is connected to the bundle, no errors are shown in the cluster configuration pane; after disconnecting it from the bundle, the cluster configuration pane does report policy errors. The workflow can nevertheless be run successfully, but the run ignores the cluster configuration shown in the UI and uses a cluster that the policy permits. E.g. despite being deployed successfully with an autoscaling cluster under a single-node policy, the run is assigned a single-node cluster.
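One way to see the mismatch is to compare the job settings created by the bundle with the cluster a triggered run actually received, e.g. via the Jobs commands of the CLI (a sketch; the IDs are placeholders):

  # Job settings as deployed by the bundle (shows the forbidden autoscaling cluster)
  databricks jobs get <job-id>

  # Details of a triggered run (shows the cluster the run actually used)
  databricks jobs get-run <run-id>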

Configuration

Cluster configuration that can be deployed

      job_clusters:
        - job_cluster_key: Job_cluster
          new_cluster:
            cluster_name: ""
            spark_version: 15.4.x-cpu-ml-scala2.12
            azure_attributes:
              first_on_demand: 1
              availability: SPOT_WITH_FALLBACK_AZURE
              spot_bid_max_price: -1
            node_type_id: Standard_D4ds_v5
            driver_node_type_id: Standard_D4ds_v5
            enable_elastic_disk: true
            policy_id: ${var.single_node_policy_id}
            data_security_mode: SINGLE_USER
            runtime_engine: STANDARD
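            # NOTE: the autoscaling workers below conflict with the single-node policy referenced via policy_id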
            autoscale:
              min_workers: 1
              max_workers: 9

Policy it can be deployed with

{
  "cluster_type": {
    "type": "fixed",
    "value": "job"
  },
  "data_security_mode": {
    "type": "fixed",
    "value": "SINGLE_USER"
  },
  "instance_pool_id": {
    "hidden": true,
    "type": "forbidden"
  },
  "node_type_id": {
    "type": "fixed",
    "value": "Standard_D4ds_v5"
  },
  "runtime_engine": {
    "type": "fixed",
    "value": "STANDARD"
  },
  "spark_conf.spark.databricks.cluster.profile": {
    "type": "fixed",
    "value": "singleNode"
  },
  "spark_version": {
    "type": "allowlist",
    "values": [
      "auto:latest-lts-ml",
      "auto:prev-lts-ml"
    ]
  }
}
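
For comparison, a job cluster definition that this single-node policy should accept looks roughly like the sketch below (assumptions: spark_version uses the policy's auto:latest-lts-ml alias, and the spark_conf, custom_tags, and num_workers entries follow the usual single-node cluster shape):

      job_clusters:
        - job_cluster_key: Job_cluster
          new_cluster:
            spark_version: auto:latest-lts-ml
            node_type_id: Standard_D4ds_v5
            policy_id: ${var.single_node_policy_id}
            data_security_mode: SINGLE_USER
            runtime_engine: STANDARD
            spark_conf:
              spark.databricks.cluster.profile: singleNode
              spark.master: "local[*, 4]"
            custom_tags:
              ResourceClass: SingleNode
            num_workers: 0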

Steps to reproduce the behavior

  1. Create a basic cluster policy that restricts the allowed cluster configuration.
  2. Create a .yml workflow assigned to this policy with a cluster configuration that violates it.
  3. Deploy the bundle (see the CLI sketch after this list).
  4. Check whether the deployed cluster configuration shown in the UI overrides the policy.
  5. Check whether a triggered run falls back to a policy-compliant cluster.
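
A minimal reproduction from the CLI might look like this (a sketch; the target name dev and the resource key my_job are assumptions that depend on the bundle's databricks.yml):

  # validate and deploy the bundle containing the policy-violating job cluster
  databricks bundle validate -t dev
  databricks bundle deploy -t dev

  # trigger the job so the run's actual cluster can be compared with the deployed config
  databricks bundle run my_job -t dev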

OS and CLI version

Databricks CLI v0.243.0; the bundle was deployed from the Databricks web terminal.

@swopmichalak added the DABs label on Mar 14, 2025