This repository was archived by the owner on Apr 8, 2024. It is now read-only.

Commit 935cb17

feat(dbt-fal): adapt to new 1.5 dbt changes (#814)

* update poetry deps
* chore: adapt to new flags reading and introduce AdapterConfig
* tests: fix dbt version to test
* change OVERRIDE_PROPERTIES for easier logic
* send flags to environment-run models
* remove deprecated target-path
* set target path for integration tests based on temp dir
* make _isolated_runner receive only named args
* remove redshift for now
* use 1.5 release candidate
* formatting
* adapt to new model.refs format and new model versioning
* use 1.5.0
* Poetry lock updated

Co-authored-by: chamini2 <chamini2@users.noreply.github.com>

1 parent 1963216 commit 935cb17
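The "adapt to new model.refs format" item refers to dbt 1.5 changing how a model's refs are represented: earlier versions used lists such as `["model"]` or `["package", "model"]`, while 1.5 uses dicts with `name`, `package`, and `version` keys. A minimal version-tolerant sketch (the helper name `ref_names` is illustrative, not fal's actual code):

```python
from typing import Dict, List, Union


def ref_names(refs: List[Union[list, Dict]]) -> List[str]:
    """Extract referenced model names across dbt ref formats.

    dbt < 1.5: each ref is a list, last element is the model name.
    dbt >= 1.5: each ref is a dict like
        {"name": "model", "package": None, "version": None}.
    """
    names = []
    for ref in refs:
        if isinstance(ref, dict):  # dbt >= 1.5 format
            names.append(ref["name"])
        else:  # dbt < 1.5 list format
            names.append(ref[-1])
    return names
```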

File tree

16 files changed: +246 −291 lines changed

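The "new flags reading" bullet refers to dbt 1.5 replacing module-level attributes on `dbt.flags` with a `get_flags()` accessor. A hedged, standalone fallback sketch (the helper name `read_flag` and the fallback shape are assumptions, not fal's actual code):

```python
def read_flag(name: str, default=None):
    """Read a dbt flag across versions; returns `default` when unavailable."""
    try:
        from dbt.flags import get_flags  # dbt >= 1.5 accessor
    except ImportError:
        get_flags = None
    if get_flags is not None:
        try:
            return getattr(get_flags(), name, default)
        except Exception:
            return default  # flags not initialized yet
    try:
        import dbt.flags as flags  # dbt < 1.5 module-level flags
    except ImportError:
        return default  # dbt not installed
    return getattr(flags, name, default)
```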

.github/workflows/test_integration_adapter.yml

Lines changed: 10 additions & 8 deletions

@@ -28,14 +28,16 @@ jobs:
       profile:
         - postgres
         - bigquery
-        # - duckdb
         - snowflake
-        - redshift
-        - athena
-        - trino
-        # - sqlserver # does not support dbt 1.4 yet
+        # TODO: redshift had a bigger change between 1.4 and 1.5 and will be addressed separately
+        # - redshift
+        # TODO: enable as 1.5 becomes available for following adapters
+        # - athena
+        # - trino
+        # - duckdb
+        # - sqlserver
       dbt_version:
-        - "1.4.*"
+        - "1.5.0"
       python:
         # - "3.7"
         - "3.8"
@@ -46,8 +48,8 @@ jobs:
         - profile: snowflake
           teleport: true
           cloud: true
-        - profile: redshift
-          cloud: true
+        # - profile: redshift
+        #   cloud: true
         - profile: bigquery
           cloud: true

projects/adapter/integration_tests/features/steps/fal_adapter_steps.py

Lines changed: 8 additions & 8 deletions

@@ -10,6 +10,10 @@
 import re
 
 
+def target_path(context):
+    return str(Path(context.temp_dir.name) / "target")
+
+
 @when("the following shell command is invoked")
 def run_command_step(context):
     context.exc = None
@@ -46,7 +50,7 @@ def set_project_folder(context, project: str):
     context.base_dir = str(project_path)
     context.temp_dir = tempfile.TemporaryDirectory()
     context.project_name = _load_dbt_project_file(context)["name"]
-    os.environ["temp_dir"] = context.temp_dir.name
+    os.environ["DBT_TARGET_PATH"] = target_path(context)
 
 
 @then("the following models are calculated in order")
@@ -93,9 +97,7 @@ def _get_dated_dbt_models(context):
 
 
 def _load_dbt_result_file(context):
-    with open(
-        os.path.join(context.temp_dir.name, "target", "run_results.json")
-    ) as stream:
+    with open(os.path.join(target_path(context), "run_results.json")) as stream:
         return json.load(stream)["results"]
 
 
@@ -105,10 +107,9 @@ def _load_dbt_project_file(context):
 
 
 def _load_target_run_model(context, model_name: str, file_ext: str):
-
     # TODO: we should use fal to find these files from fal reading the dbt_project.yml and making it easily available
     models_dir: Path = (
-        Path(context.temp_dir.name) / "target" / "run" / context.project_name / "models"
+        Path(target_path(context)) / "run" / context.project_name / "models"
     )
 
     found_model_files = list(models_dir.rglob(f"{model_name}.{file_ext}"))
@@ -121,7 +122,6 @@ def _load_target_run_model(context, model_name: str, file_ext: str):
 def _replace_vars(context, msg):
     return (
         msg.replace("$baseDir", context.base_dir)
-        .replace("$tempDir", str(context.temp_dir.name))
         .replace("$profilesDir", _get_profiles_dir(context))
         .replace("$profile", _get_profile(context))
     )
@@ -140,7 +140,7 @@ def _get_profile(context) -> str:
         "duckdb",
         "athena",
         "trino",
-        "sqlserver"
+        "sqlserver",
     ]
     profile = context.config.userdata.get("profile", "postgres")
     if profile not in available_profiles:
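The diff above swaps the ad-hoc `temp_dir` environment variable for `DBT_TARGET_PATH`, which dbt 1.5 reads in place of the now-deprecated `target-path` project setting. A standalone sketch of that pattern (the paths are illustrative):

```python
import os
import tempfile
from pathlib import Path

# Create a per-run temp dir and point dbt's artifact output at it via the
# DBT_TARGET_PATH environment variable (honored by dbt >= 1.5) instead of
# templating target-path inside dbt_project.yml.
temp_dir = tempfile.TemporaryDirectory()
target = str(Path(temp_dir.name) / "target")
os.environ["DBT_TARGET_PATH"] = target

# A dbt subprocess inheriting this environment would now write artifacts
# such as run_results.json under `target`.
run_results = Path(os.environ["DBT_TARGET_PATH"]) / "run_results.json"
```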

projects/adapter/integration_tests/projects/env_project/dbt_project.yml

Lines changed: 0 additions & 2 deletions

@@ -8,8 +8,6 @@ seed-paths: ["data"]
 macro-paths: ["macros"]
 snapshot-paths: ["snapshots"]
 
-target-path: "{{ env_var('temp_dir') }}/target"
-
 vars:
   fal-scripts-path: "fal_scripts"
projects/adapter/integration_tests/projects/simple_project/dbt_project.yml

Lines changed: 2 additions & 2 deletions

@@ -8,10 +8,10 @@ seed-paths: ["data"]
 macro-paths: ["macros"]
 snapshot-paths: ["snapshots"]
 
-target-path: "{{ env_var('temp_dir') }}/target"
-
 models:
   +schema: custom
+  # All serverless should run on M machines by default
+  +fal_machine: "M"
   simple_test:
     python:
       +materialized: table

projects/adapter/integration_tests/projects/source_freshness/dbt_project.yml

Lines changed: 0 additions & 2 deletions

@@ -8,8 +8,6 @@ seed-paths: ["data"]
 macro-paths: ["macros"]
 snapshot-paths: ["snapshots"]
 
-target-path: "{{ env_var('temp_dir') }}/target"
-
 vars:
   fal-scripts-path: "fal_scripts"
