* upgrade to serverless v4, upgrade databricks cli and uv
* explain databricks-connect usage
* change pinning
* rename dependencies group
* use all extras in cicd
* use proper connect version
* fix dbc tests
README.md: 14 additions, 6 deletions
@@ -23,7 +23,7 @@ For this example we use a Databricks Free Edition workspace https://www.databric
 Groups and Service Principals are not necessary, but are used in this project to showcase handling permissions on resources such as catalogs or workflows.
-* **Serverless environment version 3**, which matches [Databricks Runtime 16.3](https://docs.databricks.com/aws/en/release-notes/serverless/#version-163)
+* **Serverless environment**: [Version 4](https://docs.databricks.com/aws/en/release-notes/serverless/environment-version/four), which roughly corresponds to Databricks Runtime 17.x
 * **Catalogs**: `lake_dev`, `lake_test` and `lake_prod`
 * **Service principals** (for CI/CD and Workflow runners)
   * `sp_etl_dev` (for dev and test) and `sp_etl_prod` (for prod)
@@ -45,9 +45,9 @@ A script exists to set up the (Free) Workspace as described in [scripts/setup_works
 ### Setup environment
-Sync entire `uv` environment with dev dependencies:
+Sync entire `uv` environment with all optional dependency groups:
 ```bash
-uv sync --extra dev
+uv sync --all-extras
 ```
 > **Note:** we install Databricks Connect in a follow-up step
@@ -70,16 +70,24 @@ Install `databricks-connect` in the active environment. This requires authentication
 ```bash
 uv pip uninstall pyspark
-uv pip install databricks-connect==16.3.5
+uv pip install databricks-connect==17.2.*
 ```
-> **Note:** For Databricks Runtime 16.3
+
+**Option 2: Run with temporary dependency**
+
+```bash
+uv run --with databricks-connect==17.2.* pytest
+```
+
+> **Note:** For Databricks Runtime Serverless v4
 
 See https://docs.databricks.com/aws/en/dev-tools/vscode-ext/ for using the Databricks Connect extension in VS Code.
 
 ### Unit-Tests
 
 ```bash
-uv run pytest -v
+# in case databricks-connect is installed, --no-sync prevents reinstalling pyspark
+uv run --no-sync pytest -v
 ```
 
 Based on whether Databricks Connect is enabled or not, the unit tests try to use a Databricks cluster or start a local Spark session with Delta support.
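The backend selection described in that last line could be detected roughly as follows, e.g. in a `conftest.py`. This is an illustrative sketch under the assumption that the presence of the `databricks.connect` module is what distinguishes the two modes; the repo's actual detection logic is not shown in this diff, and `spark_mode` is a hypothetical helper name:

```python
from importlib import util

def spark_mode() -> str:
    """Report which Spark backend the current environment provides.

    `databricks-connect` replaces the local `pyspark` distribution, so if
    `databricks.connect` is importable we assume a remote Databricks cluster;
    otherwise the tests would fall back to a local Spark session.
    """
    # Check the parent package first: find_spec("databricks.connect") would
    # raise ModuleNotFoundError if "databricks" itself is not installed.
    if util.find_spec("databricks") and util.find_spec("databricks.connect"):
        return "databricks-connect"
    return "local-spark"

print(spark_mode())
```

This also explains the `--no-sync` flag above: a plain `uv run` would re-sync the environment and reinstall local `pyspark`, silently flipping the tests back to local mode.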