
Initial implementation of "direct" deployment backend #2926


Open
denik wants to merge 69 commits into base: main
Changes from all commits
Commits (69)
f5f2e06
WIP - deploying without terraform
denik Apr 4, 2025
60c6cf9
shorten env var names to avoid "The directory name is invalid" on wi…
denik May 26, 2025
a4c1618
disable acc/cloud tests that don't work yet
denik May 26, 2025
503003c
clean up
denik May 26, 2025
cd6e82d
WIP: destroy mutator + test (pipelines only); Graph is now generic
denik May 26, 2025
660acb0
destroy for jobs + test
denik May 26, 2025
55e56aa
update github action to use ENVFILTER
denik May 27, 2025
535c064
lint fix
denik May 27, 2025
37b391f
do not create empty resources.json
denik May 27, 2025
cb449c1
clean up env var from tests
denik May 27, 2025
ecaa097
enable test
denik May 27, 2025
3677bc0
disable dashboard tests
denik May 27, 2025
2cff2aa
fix script
denik May 28, 2025
f107fae
enable run-local test
denik May 28, 2025
83e56d5
disable jobs/check-metadata, mlops-stacks
denik May 30, 2025
ee1925c
disable 2 more integration tests
denik May 30, 2025
b5a0b9d
disable volumes
denik May 30, 2025
b502310
comment why test is disabled
denik May 30, 2025
a06fd78
Add resource_types.go + test
denik May 30, 2025
f3d1453
remove switch from GetResourceConfig
denik May 30, 2025
8e8c8d3
rename terranova_{state,resources} to tn{state,resources}
denik May 30, 2025
bc14e59
clean up unused functions from testserver
denik May 30, 2025
86ed498
add map with New functions that records resource types
denik May 30, 2025
26cd78a
update comments
denik May 30, 2025
24fa168
clean up
denik May 30, 2025
b96d08f
clean up
denik May 30, 2025
683077f
lint fix
denik May 30, 2025
dccfb67
resourceConstructors -> supportedResources
denik May 30, 2025
4296258
extract IsDirectDeployment function
denik Jun 2, 2025
bec75c4
dag: maintain insertion order, do not sort and do not require Less()
denik Jun 2, 2025
edd09ac
simplify cycle message
denik Jun 2, 2025
a060936
rename libs/dag to libs/dagrun
denik Jun 2, 2025
a4d6006
clean up; do not ignore errors
denik Jun 2, 2025
e9cd1dd
post-rebase fix of test.toml + update outputs
denik Jun 3, 2025
8e45207
post rebase - restore acceptance/bundle/run/basic/test.toml
denik Jun 3, 2025
7111327
clean up warning
denik Jun 5, 2025
e550078
clean up; add comments; add error wrapping; use atomic
denik Jun 6, 2025
261413a
add comments
denik Jun 6, 2025
dcc1c16
rm dec.DisallowUnknownFields
denik Jun 6, 2025
9d2d7bf
state: use GetResourceEntry to replace both GetResourceID and GetSave…
denik Jun 6, 2025
671007e
rebase fixups
denik Jun 12, 2025
f8e9c11
Add StateFilename and StateLocalPath and use throughout
denik Jun 16, 2025
37d5b8d
Make statemgmt.State{Pull,Push} work with terranova
denik Jun 16, 2025
b6509b7
Improve log messages in statemgmt
denik Jun 16, 2025
85c89be
handle errors from OpenResourceDatabase
denik Jun 17, 2025
8e53d56
disable clusters test
denik Jun 17, 2025
4782f5b
rebase fixup
denik Jun 18, 2025
04c1b9e
split debug test in two
denik Jun 18, 2025
c0a5218
disable bind/unbind
denik Jun 18, 2025
0656b1b
Rename Plan->TerraformPlan and IsEmpty->TerraformIsEmpty
denik Jun 18, 2025
0d6e0cd
wip - plan based approach
denik Jun 18, 2025
350a574
fix storing delete actions
denik Jun 19, 2025
54a415a
rename TerranovaDeploy to TerranovaApply
denik Jun 19, 2025
c3f48a9
rename deploy_mutator.go -> apply (contents already renamed)
denik Jun 19, 2025
0424fdc
add comments to empty methods
denik Jun 19, 2025
9ed2e45
clean up commented out setting
denik Jun 19, 2025
b1f17b7
clean up addition of dummy terraform.exec_path
denik Jun 19, 2025
9310c18
rm commented out block
denik Jun 19, 2025
07e960b
add missing wrapped error
denik Jun 23, 2025
614481a
pass pointer into resource
denik Jun 23, 2025
656d937
small unit test for New() in tnresources
denik Jun 23, 2025
a0d6ead
clean up commented out block
denik Jun 23, 2025
e862d00
add a comment
denik Jun 23, 2025
da7f1c9
clean up ResourceNode; make data -> Data accessible
denik Jun 23, 2025
85a1521
rename TerraformPlan to TerraformPlanPath
denik Jun 23, 2025
c35f959
fix typo
denik Jun 23, 2025
7cba575
clarify comment
denik Jun 23, 2025
8d4c2fe
clean up unnecessary changes
denik Jun 23, 2025
7965758
Use "direct-exp" for direct backend selection
denik Jun 23, 2025
13 changes: 13 additions & 0 deletions .github/workflows/push.yml
@@ -42,15 +42,24 @@ jobs:
- macos-latest
- ubuntu-latest
- windows-latest
deployment:
- "terraform"
- "direct"

steps:
- name: Checkout repository and submodules
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

- name: Create deployment-specific cache identifier
run: echo "${{ matrix.deployment }}" > deployment-type.txt

- name: Setup Go
uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
with:
go-version-file: go.mod
cache-dependency-path: |
go.sum
deployment-type.txt
- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
@@ -72,11 +81,15 @@ jobs:
# and would like to run the tests as fast as possible. We run it on schedule as well, because that is what
# populates the cache and cache may include test results.
if: ${{ github.event_name == 'pull_request' || github.event_name == 'schedule' }}
env:
ENVFILTER: DATABRICKS_CLI_DEPLOYMENT=${{ matrix.deployment }}
run: make test

- name: Run tests with coverage
# Still run 'make cover' on push to main and merge checks to make sure it does not get broken.
if: ${{ github.event_name != 'pull_request' && github.event_name != 'schedule' }}
env:
ENVFILTER: DATABRICKS_CLI_DEPLOYMENT=${{ matrix.deployment }}
run: make cover

- name: Analyze slow tests
17 changes: 16 additions & 1 deletion acceptance/bin/read_id.py
@@ -29,4 +29,19 @@ def print_resource_terraform(section, name):
return


print_resource_terraform(*sys.argv[1:])
def print_resource_terranova(section, name):
filename = ".databricks/bundle/default/resources.json"
raw = open(filename).read()
data = json.loads(raw)
resources = data["resources"].get(section, {})
result = resources.get(name)
if result is None:
print(f"Resource {section=} {name=} not found. Available: {raw}")
return
print(result.get("__id__"))


if os.environ.get("DATABRICKS_CLI_DEPLOYMENT", "").startswith("direct"):
print_resource_terranova(*sys.argv[1:])
else:
print_resource_terraform(*sys.argv[1:])
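
The new print_resource_terranova helper reads the local resources.json written when DATABRICKS_CLI_DEPLOYMENT starts with "direct", instead of terraform.tfstate. As a rough illustration only, a minimal sketch of the file shape implied by the lookups above; the "jobs" section, the "my_job" key, and the id value are hypothetical:

    {
      "resources": {
        "jobs": {
          "my_job": {"__id__": "1234567890"}
        }
      }
    }

Given such a file, read_id.py jobs my_job would print 1234567890.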
26 changes: 21 additions & 5 deletions acceptance/bin/read_state.py
@@ -13,17 +13,15 @@
def print_resource_terraform(section, name, *attrs):
resource_type = "databricks_" + section[:-1]
filename = ".databricks/bundle/default/terraform/terraform.tfstate"
data = json.load(open(filename))
available = []
raw = open(filename).read()
data = json.loads(raw)
found = 0
for r in data["resources"]:
r_type = r["type"]
r_name = r["name"]
if r_type != resource_type:
available.append((r_type, r_name))
continue
if r_name != name:
available.append((r_type, r_name))
continue
for inst in r["instances"]:
attribute_values = inst.get("attributes")
@@ -35,4 +33,22 @@ def print_resource_terraform(section, name, *attrs):
print(f"State not found for {section}.{name}")


print_resource_terraform(*sys.argv[1:])
def print_resource_terranova(section, name, *attrs):
filename = ".databricks/bundle/default/resources.json"
raw = open(filename).read()
data = json.loads(raw)
resources = data["resources"].get(section, {})
result = resources.get(name)
if result is None:
print(f"State not found for {section}.{name}")
return
state = result["state"]
state.setdefault("id", result.get("__id__"))
values = [f"{x}={state.get(x)!r}" for x in attrs]
print(section, name, " ".join(values))


if os.environ.get("DATABRICKS_CLI_DEPLOYMENT", "").startswith("direct"):
print_resource_terranova(*sys.argv[1:])
else:
print_resource_terraform(*sys.argv[1:])
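
The terranova branch of read_state.py parses the same resources.json, but also pulls the per-resource "state" object and falls back to "__id__" when "id" is absent. A sketch of one entry and the resulting output, with all values hypothetical:

    {
      "resources": {
        "jobs": {
          "my_job": {
            "__id__": "1234567890",
            "state": {"name": "my_job"}
          }
        }
      }
    }

With this content, read_state.py jobs my_job id name would print: jobs my_job id='1234567890' name='my_job'.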
7 changes: 4 additions & 3 deletions acceptance/bundle/artifacts/whl_change_version/output.txt
@@ -41,10 +41,11 @@ dist/my_test_code-0.1.0-py3-none-any.whl
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files/repls.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files/script"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files/setup.py"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files/test.toml"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deploy.lock"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deployment.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/terraform.tfstate"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/STATE_FILENAME"

>>> update_file.py my_test_code/__init__.py 0.1.0 0.2.0

@@ -88,7 +89,7 @@ dist/my_test_code-0.2.0-py3-none-any.whl
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deploy.lock"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deployment.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/terraform.tfstate"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/STATE_FILENAME"

=== Restore config to target old wheel
>>> update_file.py databricks.yml ./dist/*.whl ./dist/my*0.1.0*.whl
@@ -135,4 +136,4 @@ dist/my_test_code-0.2.0-py3-none-any.whl
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deploy.lock"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/deployment.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/terraform.tfstate"
"/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/STATE_FILENAME"
3 changes: 3 additions & 0 deletions acceptance/bundle/artifacts/whl_change_version/test.toml
@@ -0,0 +1,3 @@
[[Repls]]
Old = '(resources.json|terraform.tfstate)'
New = 'STATE_FILENAME'
2 changes: 2 additions & 0 deletions acceptance/bundle/artifacts/whl_dynamic/test.toml
@@ -1,3 +1,5 @@
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # need to sort tasks by key like terraform does

[[Repls]]
Old = '\\\\'
New = '/'
@@ -1,4 +1,5 @@
BundleConfig.default_name = ""
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # need to sort tasks by key

[[Repls]]
Old = '\\'
93 changes: 93 additions & 0 deletions acceptance/bundle/debug/direct/out.stderr.txt
@@ -0,0 +1,93 @@

>>> [CLI] bundle validate --debug
10:07:59 Info: start pid=12345 version=[DEV_VERSION] args="[CLI], bundle, validate, --debug"
10:07:59 Debug: Found bundle root at [TEST_TMP_DIR] (file [TEST_TMP_DIR]/databricks.yml) pid=12345
10:07:59 Info: Phase: load pid=12345
10:07:59 Debug: Apply pid=12345 mutator=EntryPoint
10:07:59 Debug: Apply pid=12345 mutator=scripts.preinit
10:07:59 Debug: No script defined for preinit, skipping pid=12345 mutator=scripts.preinit
10:07:59 Debug: Apply pid=12345 mutator=ProcessRootIncludes
10:07:59 Debug: Apply pid=12345 mutator=VerifyCliVersion
10:07:59 Debug: Apply pid=12345 mutator=EnvironmentsToTargets
10:07:59 Debug: Apply pid=12345 mutator=ComputeIdToClusterId
10:07:59 Debug: Apply pid=12345 mutator=InitializeVariables
10:07:59 Debug: Apply pid=12345 mutator=DefineDefaultTarget(default)
10:07:59 Debug: Apply pid=12345 mutator=validate:unique_resource_keys
10:07:59 Debug: Apply pid=12345 mutator=SelectDefaultTarget
10:07:59 Debug: Apply pid=12345 mutator=SelectDefaultTarget mutator=SelectTarget(default)
10:07:59 Debug: Apply pid=12345 mutator=<func>
10:07:59 Info: Phase: initialize pid=12345
10:07:59 Debug: Apply pid=12345 mutator=validate:AllResourcesHaveValues
10:07:59 Debug: Apply pid=12345 mutator=validate:interpolation_in_auth_config
10:07:59 Debug: Apply pid=12345 mutator=RewriteSyncPaths
10:07:59 Debug: Apply pid=12345 mutator=SyncDefaultPath
10:07:59 Debug: Apply pid=12345 mutator=SyncInferRoot
10:07:59 Debug: Apply pid=12345 mutator=PopulateCurrentUser
10:07:59 Debug: GET /api/2.0/preview/scim/v2/Me
< HTTP/1.1 200 OK
< {
< "id": "[USERID]",
< "userName": "[USERNAME]"
< } pid=12345 mutator=PopulateCurrentUser sdk=true
10:07:59 Debug: Apply pid=12345 mutator=LoadGitDetails
10:07:59 Debug: Apply pid=12345 mutator=ApplySourceLinkedDeploymentPreset
10:07:59 Debug: Apply pid=12345 mutator=DefineDefaultWorkspaceRoot
10:07:59 Debug: Apply pid=12345 mutator=ExpandWorkspaceRoot
10:07:59 Debug: Apply pid=12345 mutator=DefaultWorkspacePaths
10:07:59 Debug: Apply pid=12345 mutator=PrependWorkspacePrefix
10:07:59 Debug: Apply pid=12345 mutator=RewriteWorkspacePrefix
10:07:59 Debug: Apply pid=12345 mutator=SetVariables
10:07:59 Debug: Apply pid=12345 mutator=ResolveVariableReferences
10:07:59 Debug: Apply pid=12345 mutator=ResolveResourceReferences
10:07:59 Debug: Apply pid=12345 mutator=ResolveVariableReferences
10:07:59 Debug: Apply pid=12345 mutator=validate:volume-path
10:07:59 Debug: Apply pid=12345 mutator=ApplyTargetMode
10:07:59 Debug: Apply pid=12345 mutator=ConfigureWSFS
10:07:59 Debug: Apply pid=12345 mutator=ProcessStaticResources
10:07:59 Debug: Apply pid=12345 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)
10:07:59 Debug: Apply pid=12345 mutator=ProcessStaticResources mutator=NormalizePaths
10:07:59 Debug: Apply pid=12345 mutator=ProcessStaticResources mutator=TranslatePathsDashboards
10:07:59 Debug: Apply pid=12345 mutator=PythonMutator(load)
10:07:59 Debug: Apply pid=12345 mutator=PythonMutator(init)
10:07:59 Debug: Apply pid=12345 mutator=PythonMutator(load_resources)
10:07:59 Debug: Apply pid=12345 mutator=PythonMutator(apply_mutators)
10:07:59 Debug: Apply pid=12345 mutator=CheckPermissions
10:07:59 Debug: Apply pid=12345 mutator=TranslatePaths
10:07:59 Debug: Apply pid=12345 mutator=PythonWrapperWarning
10:07:59 Debug: Apply pid=12345 mutator=ApplyArtifactsDynamicVersion
10:07:59 Debug: Apply pid=12345 mutator=artifacts.Prepare
10:07:59 Info: No local tasks in databricks.yml config, skipping auto detect pid=12345 mutator=artifacts.Prepare
10:07:59 Debug: Apply pid=12345 mutator=apps.Validate
10:07:59 Debug: Apply pid=12345 mutator=ValidateTargetMode
10:07:59 Debug: Apply pid=12345 mutator=ValidateSharedRootPermissions
10:07:59 Debug: Apply pid=12345 mutator=metadata.AnnotateJobs
10:07:59 Debug: Apply pid=12345 mutator=metadata.AnnotatePipelines
10:07:59 Debug: Apply pid=12345 mutator=scripts.postinit
10:07:59 Debug: No script defined for postinit, skipping pid=12345 mutator=scripts.postinit
10:07:59 Debug: ApplyParallel pid=12345 mutator=fast_validate(readonly)
10:07:59 Debug: ApplyParallel pid=12345 mutator=validate:files_to_sync
10:07:59 Debug: ApplyParallel pid=12345 mutator=validate:folder_permissions
10:07:59 Debug: ApplyParallel pid=12345 mutator=validate:validate_sync_patterns
10:07:59 Debug: ApplyParallel pid=12345 mutator=fast_validate(readonly) mutator=validate:job_cluster_key_defined
10:07:59 Debug: ApplyParallel pid=12345 mutator=fast_validate(readonly) mutator=validate:job_task_cluster_spec
10:07:59 Debug: ApplyParallel pid=12345 mutator=fast_validate(readonly) mutator=validate:artifact_paths
10:07:59 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files
< HTTP/1.1 404 Not Found
< {
< "message": "Workspace path not found"
< } pid=12345 mutator=validate:files_to_sync sdk=true
10:07:59 Debug: non-retriable error: Workspace path not found pid=12345 mutator=validate:files_to_sync sdk=true
10:07:59 Debug: POST /api/2.0/workspace/mkdirs
> {
> "path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files"
> }
< HTTP/1.1 200 OK pid=12345 mutator=validate:files_to_sync sdk=true
10:07:59 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files
< HTTP/1.1 200 OK
< {
< "object_type": "DIRECTORY",
< "path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files"
< } pid=12345 mutator=validate:files_to_sync sdk=true
10:07:59 Debug: Path /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files has type directory (ID: 0) pid=12345 mutator=validate:files_to_sync
10:07:59 Info: completed execution pid=12345 exit_code=0
10:07:59 Debug: no telemetry logs to upload pid=12345
25 changes: 25 additions & 0 deletions acceptance/bundle/debug/direct/output.txt
@@ -0,0 +1,25 @@
Name: test-bundle
Target: default
Workspace:
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/test-bundle/default

Validation OK!

>>> diff.py [TESTROOT]/bundle/debug/direct/../tf/out.stderr.txt out.stderr.txt
--- [TESTROOT]/bundle/debug/direct/../tf/out.stderr.txt
+++ out.stderr.txt
@@ -1,2 +1,4 @@
+
+>>> [CLI] bundle validate --debug
10:07:59 Info: start pid=12345 version=[DEV_VERSION] args="[CLI], bundle, validate, --debug"
10:07:59 Debug: Found bundle root at [TEST_TMP_DIR] (file [TEST_TMP_DIR]/databricks.yml) pid=12345
@@ -61,8 +63,4 @@
10:07:59 Debug: Apply pid=12345 mutator=metadata.AnnotateJobs
10:07:59 Debug: Apply pid=12345 mutator=metadata.AnnotatePipelines
-10:07:59 Debug: Apply pid=12345 mutator=terraform.Initialize
-10:07:59 Debug: Using Terraform from DATABRICKS_TF_EXEC_PATH at [TERRAFORM] pid=12345 mutator=terraform.Initialize
-10:07:59 Debug: Using Terraform CLI config from DATABRICKS_TF_CLI_CONFIG_FILE at [DATABRICKS_TF_CLI_CONFIG_FILE] pid=12345 mutator=terraform.Initialize
-10:07:59 Debug: Environment variables for Terraform: ...redacted... pid=12345 mutator=terraform.Initialize
10:07:59 Debug: Apply pid=12345 mutator=scripts.postinit
10:07:59 Debug: No script defined for postinit, skipping pid=12345 mutator=scripts.postinit
2 changes: 2 additions & 0 deletions acceptance/bundle/debug/direct/script
@@ -0,0 +1,2 @@
trace $CLI bundle validate --debug 2> out.stderr.txt
trace diff.py $TESTDIR/../tf/out.stderr.txt out.stderr.txt
1 change: 1 addition & 0 deletions acceptance/bundle/debug/direct/test.toml
@@ -0,0 +1 @@
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["direct-exp"]
Empty file.
File renamed without changes.
1 change: 1 addition & 0 deletions acceptance/bundle/debug/tf/test.toml
@@ -0,0 +1 @@
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/dashboard/test.toml
@@ -0,0 +1 @@
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # dashboard not supported yet
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/fail-on-active-runs/test.toml
@@ -1,5 +1,8 @@
RecordRequests = true

# --fail-on-active-runs not implemented yet
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]

[[Server]]
Pattern = "GET /api/2.2/jobs/runs/list"
Response.Body = '''
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/jobs/check-metadata/test.toml
@@ -1,6 +1,8 @@
Local = false
Cloud = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # require "bundle summary"

Ignore = [
"databricks.yml",
"a/b/resources.yml",
@@ -1,6 +1,8 @@
Local = true
Cloud = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # needs investigation Error: deploying jobs.foo: Method=Jobs.Create *retries.Err *apierr.APIError StatusCode=400 ErrorCode="INVALID_PARAMETER_VALUE" Message="Missing required field: settings.tasks.task_key."

Ignore = [
"databricks.yml",
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/mlops-stacks/test.toml
@@ -3,6 +3,8 @@ Local=false

Badness = "the newly initialized bundle from the 'mlops-stacks' template contains two validation warnings in the configuration"

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # requires "bundle summary"

Ignore = [
"config.json"
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/pipeline/auto-approve/test.toml
@@ -1,6 +1,8 @@
Local = true
Cloud = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # requires "bundle summary"
Contributor:
Nit: mix of preceding line and suffix comments.
Do you have a list of these missing features, or do you intend to grep for this EnvMatrix to get the accurate list?

Contributor Author (@denik, Jun 23, 2025):
Going to grep.

Ignore = [
"databricks.yml"
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/pipeline/recreate/test.toml
@@ -2,6 +2,8 @@ Local = true
Cloud = true
RequiresUnityCatalog = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]

Ignore = [
"databricks.yml"
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/schema/auto-approve/test.toml
@@ -2,6 +2,8 @@ Local = true
Cloud = true
RequiresUnityCatalog = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # requires "bundle summary"

Ignore = [
"databricks.yml",
"test-file-*.txt",
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/secret-scope/test.toml
@@ -1,6 +1,8 @@
Cloud = true
Local = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]

Ignore = [
"databricks.yml",
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/volume/recreate/test.toml
@@ -2,6 +2,8 @@ Local = false
Cloud = true
RequiresUnityCatalog = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # volumes are not supported

Ignore = [
"databricks.yml",
]
2 changes: 2 additions & 0 deletions acceptance/bundle/deployment/test.toml
@@ -1 +1,3 @@
Cloud = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # summary,bind,unbind not implemented
2 changes: 2 additions & 0 deletions acceptance/bundle/destroy/jobs-and-pipeline/test.toml
@@ -1,6 +1,8 @@
Local = false
Cloud = true

EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # requires "bundle summary"

Ignore = [
"databricks.yml",
"resources.yml",
2 changes: 2 additions & 0 deletions acceptance/bundle/generate/dashboard-inplace/test.toml
@@ -1,3 +1,5 @@
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"] # dashboards not supported yet

[[Repls]]
Old = "[0-9a-f]{32}"
New = "[DASHBOARD_ID]"
5 changes: 1 addition & 4 deletions acceptance/bundle/includes/yml_outside_root/output.txt
@@ -44,8 +44,5 @@ Validation OK!
"bundle_root_path": "."
},
"name": "yml_outside_root",
"target": "default",
"terraform": {
"exec_path": "[TERRAFORM]"
}
"target": "default"
}
2 changes: 1 addition & 1 deletion acceptance/bundle/includes/yml_outside_root/script
@@ -2,4 +2,4 @@ cd root
trace $CLI bundle validate
trace $CLI bundle validate -o json | jq '.resources.jobs[] | select(.name == "include_outside_root")'
trace $CLI bundle validate -o json | jq '.sync'
trace $CLI bundle validate -o json | jq '.bundle'
trace $CLI bundle validate -o json | jq '.bundle' | jq 'del(.terraform)'
2 changes: 2 additions & 0 deletions acceptance/bundle/python/restricted-execution/test.toml
@@ -0,0 +1,2 @@
# "bundle summary" is not implemented
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]