Issues with replace flow causing cyclical errors #997

Open
geo-martino opened this issue Apr 16, 2025 · 4 comments · May be fixed by #1004
Labels
bug Something isn't working

Comments

@geo-martino
Contributor

geo-martino commented Apr 16, 2025

Describe the bug

When attempting to rerun models that materialize as views using the materialization V2 replace flow, several errors are thrown. When creating the required intermediate view, the create-or-replace query uses only the bare identifier, without the catalog or schema. This causes the view to be created in the hive_metastore. If the view references any resources in the Unity Catalog, this throws an error like the one below.

Similarly, if the model is configured to use a unique name for the temporary intermediate view (unique_tmp_table_suffix), the name of the intermediate view that is created will not match the name used in the later rename query of the 'safely replace' flow, causing an error like the one below.

During each of these failed runs the original view is still dropped, so the following run finds no existing view and creates it as expected. The run after that, however, throws one of the above errors again, leading to cyclical errors.
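A minimal sketch of the qualification problem (hypothetical helper, not the adapter's actual code): Databricks resolves a bare identifier against the session defaults rather than the model's target catalog and schema, which is why the intermediate view lands in hive_metastore.

```python
def render(catalog=None, schema=None, identifier=""):
    """Render a Databricks relation name, backtick-quoting each part present."""
    parts = [p for p in (catalog, schema, identifier) if p]
    return ".".join(f"`{p}`" for p in parts)

# What the buggy flow emits: only the identifier, resolved against session defaults.
bare = render(identifier="test__dbt_tmp")
# What the replace flow should emit: the fully qualified target relation.
qualified = render("gold", "default", "test__dbt_tmp")

print(bare)       # `test__dbt_tmp`
print(qualified)  # `gold`.`default`.`test__dbt_tmp`
```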

Steps To Reproduce

  1. Create and materialize a view with unique_tmp_table_suffix enabled and without any references to other resources in the Unity Catalog.
  2. Create and materialize a view with references to other resources in the Unity Catalog.
  3. Attempt to recreate both views; each of the errors above should be thrown.
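The naming mismatch in step 1 can be sketched as follows (a hypothetical reconstruction mirroring the logs, not the adapter's actual code): if a fresh unique suffix is generated each time the intermediate name is requested, the CREATE and the later RENAME refer to two different relations.

```python
import uuid

def tmp_name(base):
    # A fresh UUID on every call, as unique_tmp_table_suffix appears to do here.
    return f"{base}__dbt_tmp_{uuid.uuid4().hex}"

created = tmp_name("test")  # name used by CREATE OR REPLACE VIEW
renamed = tmp_name("test")  # name used by ALTER VIEW ... RENAME

# The two names never match, so the rename targets a view that does not exist.
print(created != renamed)  # True
```

This matches the logs above, where the view is created as `test__dbt_tmp_2fc09784_...` but the rename targets `test__dbt_tmp_307bbf42_...`.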

Expected behavior

The view is recreated according to the expected replace-flow logic.
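For reference, the expected sequence can be sketched as below (an assumed ordering based on the log messages above, with placeholder names): every relation is fully qualified, and the intermediate name is computed once and reused by both the create and the rename.

```python
target = "`gold`.`default`.`test`"
backup = "`gold`.`default`.`test__dbt_backup`"
intermediate = "`gold`.`default`.`test__dbt_tmp`"  # one name, used throughout

statements = [
    f"DROP VIEW IF EXISTS {backup}",
    f"ALTER VIEW {target} RENAME TO {backup}",
    f"CREATE OR REPLACE VIEW {intermediate} AS SELECT ...",  # placeholder body
    f"ALTER VIEW {intermediate} RENAME TO {target}",
    f"DROP VIEW IF EXISTS {backup}",
]
for sql in statements:
    print(sql)
```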

Screenshots and log output

Logs when attempting to create the intermediate view in hive_metastore:

�[0m16:37:50.261151 [debug] [Thread-1 (]: Began running node model.dbt.test
�[0m16:37:50.261518 [info ] [Thread-1 (]: 1 of 1 START sql view model gold.default.test ..................... [RUN]
�[0m16:37:50.261841 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_dbt_gmarino_silver_reporting, now model.dbt.test)
�[0m16:37:50.262112 [debug] [Thread-1 (]: Databricks adapter: DatabricksDBTConnection(session-id=85704dde-2afe-42e8-839a-06b8406324c2, name=model.dbt.test, idle-time=0.024745941162109375s, language=None, compute-name=) - Reusing connection previously named list_dbt_gmarino_silver_reporting
�[0m16:37:50.262349 [debug] [Thread-1 (]: Began compiling node model.dbt.test
�[0m16:37:50.265461 [debug] [Thread-1 (]: Writing injected SQL for node "model.dbt.test"
�[0m16:37:50.266370 [debug] [Thread-1 (]: Began executing node model.dbt.test
�[0m16:37:50.274881 [debug] [Thread-1 (]: MATERIALIZING VIEW
�[0m16:37:50.279202 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:50.279410 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        CREATE SCHEMA IF NOT EXISTS `gold`.`default`
      
�[0m16:37:50.494651 [debug] [Thread-1 (]: SQL status: OK in 0.210 seconds
�[0m16:37:50.496418 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=f0d87aa8-260e-4f79-a1e5-c9783b85efd1) - Closing
�[0m16:37:50.595333 [debug] [Thread-1 (]: On "model.dbt.test": cache miss for schema "hive_metastore.default", this is inefficient
�[0m16:37:50.595559 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:50.595710 [debug] [Thread-1 (]: On model.dbt.test: GetTables(database=hive_metastore, schema=default)
�[0m16:37:50.819662 [debug] [Thread-1 (]: SQL status: OK in 0.220 seconds
�[0m16:37:50.821914 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=f96881a4-99d5-4306-b1f4-6ae5ece40175) - Closing
�[0m16:37:50.836413 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:50.836903 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

    
SELECT current_catalog()

  
�[0m16:37:51.041363 [debug] [Thread-1 (]: SQL status: OK in 0.200 seconds
�[0m16:37:51.045091 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=96a87fae-b5f9-4b6e-95fd-17f4355a71b3) - Closing
�[0m16:37:51.058695 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:51.059229 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

    
SHOW VIEWS IN `hive_metastore`.`default`

  
�[0m16:37:51.401422 [debug] [Thread-1 (]: SQL status: OK in 0.340 seconds
�[0m16:37:51.405980 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=8fd127cc-c0c7-4e1d-943d-11fae9027e91) - Closing
�[0m16:37:51.407740 [debug] [Thread-1 (]: While listing relations in database=hive_metastore, schema=default, found: calendar_date_series, test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147
�[0m16:37:51.409392 [debug] [Thread-1 (]: On "model.dbt.test": cache miss for schema "hive_metastore.dbt_test__audit", this is inefficient
�[0m16:37:51.410345 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:51.411130 [debug] [Thread-1 (]: On model.dbt.test: GetTables(database=hive_metastore, schema=dbt_test__audit)
�[0m16:37:51.587470 [debug] [Thread-1 (]: SQL status: OK in 0.180 seconds
�[0m16:37:51.589751 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=7a6bc460-f56a-49ec-9fe5-a64934102ee3) - Closing
�[0m16:37:51.590425 [debug] [Thread-1 (]: While listing relations in database=hive_metastore, schema=dbt_test__audit, found: 
�[0m16:37:51.614885 [debug] [Thread-1 (]: Applying REPLACE to: `gold`.`default`.`test`
�[0m16:37:51.625420 [debug] [Thread-1 (]: Applying DROP to: `gold`.`default`.`test__dbt_backup`
�[0m16:37:51.627661 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:51.627940 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
DROP VIEW IF EXISTS `gold`.`default`.`test__dbt_backup`
�[0m16:37:51.816606 [debug] [Thread-1 (]: SQL status: OK in 0.190 seconds
�[0m16:37:51.818054 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=60820bb8-fcca-470f-aa9a-d47fee191306) - Closing
�[0m16:37:51.826496 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:51.826909 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        alter view `gold`.`default`.`test` rename to `gold`.`default`.`test__dbt_backup`
    
�[0m16:37:52.507448 [debug] [Thread-1 (]: SQL status: OK in 0.680 seconds
�[0m16:37:52.511037 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=4b2d69da-80d8-4b44-8baa-10b06fb773f2) - Closing
�[0m16:37:52.514465 [debug] [Thread-1 (]: Applying CREATE INTERMEDIATE to: `gold`.`default`.`test`
�[0m16:37:52.527065 [debug] [Thread-1 (]: Applying DROP to: `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
�[0m16:37:52.527748 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:52.528088 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
DROP VIEW IF EXISTS `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
�[0m16:37:52.696467 [debug] [Thread-1 (]: SQL status: OK in 0.170 seconds
�[0m16:37:52.698420 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=43f92f64-5256-4cba-a990-054ece610779) - Closing
�[0m16:37:52.702066 [debug] [Thread-1 (]: Applying CREATE to: `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
�[0m16:37:52.716411 [debug] [Thread-1 (]: Creating view `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
�[0m16:37:52.721826 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:52.722106 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
select * from (
        SELECT * FROM `bronze`.`sftp`.`flowback`
    ) as __dbt_sbq
    where false
    limit 0

    
�[0m16:37:53.026573 [debug] [Thread-1 (]: SQL status: OK in 0.300 seconds
�[0m16:37:53.031482 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=85704dde-2afe-42e8-839a-06b8406324c2, command-id=1005e2e9-e39f-40d8-87b0-6832829adbfe) - Closing
�[0m16:37:53.044771 [debug] [Thread-1 (]: Applying RENAME INTERMEDIATE to: `gold`.`default`.`test`
�[0m16:37:53.047440 [debug] [Thread-1 (]: Applying RENAME to: `test__dbt_tmp_9326ec63_aeaa_411c_ace4_cded30f9831b`
�[0m16:37:53.052126 [debug] [Thread-1 (]: Applying DROP BACKUP to: `gold`.`default`.`test`
�[0m16:37:53.053408 [debug] [Thread-1 (]: Applying DROP to: `gold`.`default`.`test__dbt_backup`
�[0m16:37:53.056154 [debug] [Thread-1 (]: Writing runtime sql for node "model.dbt.test"
�[0m16:37:53.057481 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:37:53.057716 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        
  
  create or replace view `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
  (
    `flowback_month`,
  `flowback_day`,
  `flowback_year`,
  `reference_number`,
  `customer`,
  `flowback_category`,
  `flowback_reason`,
  `flowback_response`,
  `funder`,
  `screener`,
  `investigated_by`,
  `comments`,
  `_path`,
  `_row_id`,
  `_modified_at`,
  `_updated_at`,
  `_process_id`
  )
  
    tblproperties ('delta.minReaderVersion' = '2' , 'delta.minWriterVersion' = '5' , 'delta.columnMapping.mode' = 'name' 
    )
  as (
    SELECT * FROM `bronze`.`sftp`.`flowback`
  )

      
�[0m16:37:53.431588 [debug] [Thread-1 (]: Databricks adapter: Exception while trying to execute query
/* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        
  
  create or replace view `test__dbt_tmp_52d8d301_a609_483c_aed2_bf8f50319254`
  (
    `flowback_month`,
  `flowback_day`,
  `flowback_year`,
  `reference_number`,
  `customer`,
  `flowback_category`,
  `flowback_reason`,
  `flowback_response`,
  `funder`,
  `screener`,
  `investigated_by`,
  `comments`,
  `_path`,
  `_row_id`,
  `_modified_at`,
  `_updated_at`,
  `_process_id`
  )
  
    tblproperties ('delta.minReaderVersion' = '2' , 'delta.minWriterVersion' = '5' , 'delta.columnMapping.mode' = 'name' 
    )
  as (
    SELECT * FROM `bronze`.`sftp`.`flowback`
  )

      
: [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s): Creating a persistent view that references both Unity Catalog and Hive Metastore objects are not supported in Unity Catalog.  SQLSTATE: 0AKUC
Error properties: diagnostic-info=org.apache.hive.service.cli.HiveSQLException: Error running query: [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] org.apache.spark.sql.catalyst.ExtendedAnalysisException: [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s): Creating a persistent view that references both Unity Catalog and Hive Metastore objects are not supported in Unity Catalog.  SQLSTATE: 0AKUC
  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:49)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:929)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
  at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:727)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$5(SparkExecuteStatementOperation.scala:544)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at org.apache.spark.sql.execution.SQLExecution$.withRootExecution(SQLExecution.scala:671)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:544)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
  at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)
  at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
  at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
  at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:30)
  at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
  at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
  at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:30)
  at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:91)
  at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:195)
  at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:617)
  at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:729)
  at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:738)
  at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:617)
  at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:615)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:72)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$12(ThriftLocalProperties.scala:234)
  at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:229)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:89)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:72)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:521)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:507)
  at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
  at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:557)
  at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
  at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: org.apache.spark.sql.catalyst.ExtendedAnalysisException: [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s): Creating a persistent view that references both Unity Catalog and Hive Metastore objects are not supported in Unity Catalog.  SQLSTATE: 0AKUC
  at org.apache.spark.sql.catalyst.ExtendedAnalysisException.copyPlan(ExtendedAnalysisException.scala:91)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:899)
  ... 43 more
, operation-id=bafc6250-dbf8-4b05-815d-78d76b99a1ed
�[0m16:37:53.453586 [debug] [Thread-1 (]: Database Error in model test (models/gold/test.sql)
  [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s): Creating a persistent view that references both Unity Catalog and Hive Metastore objects are not supported in Unity Catalog.  SQLSTATE: 0AKUC
  compiled code at target/run/dbt/models/gold/test.sql

Logs when attempting to rename the intermediate view with a unique identifier:

�[0m16:34:41.278556 [debug] [Thread-1 (]: Began running node model.dbt.test
�[0m16:34:41.278903 [info ] [Thread-1 (]: 1 of 1 START sql view model gold.default.test ..................... [RUN]
�[0m16:34:41.279212 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_bronze_b2b, now model.dbt.test)
�[0m16:34:41.279470 [debug] [Thread-1 (]: Databricks adapter: DatabricksDBTConnection(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, name=model.dbt.test, idle-time=0.016257047653198242s, language=None, compute-name=) - Reusing connection previously named list_bronze_b2b
�[0m16:34:41.279688 [debug] [Thread-1 (]: Began compiling node model.dbt.test
�[0m16:34:41.282228 [debug] [Thread-1 (]: Writing injected SQL for node "model.dbt.test"
�[0m16:34:41.283483 [debug] [Thread-1 (]: Began executing node model.dbt.test
�[0m16:34:41.291880 [debug] [Thread-1 (]: MATERIALIZING VIEW
�[0m16:34:41.296148 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:41.296349 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        CREATE SCHEMA IF NOT EXISTS `gold`.`default`
      
�[0m16:34:41.536226 [debug] [Thread-1 (]: SQL status: OK in 0.240 seconds
�[0m16:34:41.537371 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=5dc5ab16-34ee-4231-98b2-7ce7f7c16146) - Closing
�[0m16:34:41.627567 [debug] [Thread-1 (]: On "model.dbt.test": cache miss for schema "hive_metastore.default", this is inefficient
�[0m16:34:41.627855 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:41.628029 [debug] [Thread-1 (]: On model.dbt.test: GetTables(database=hive_metastore, schema=default)
�[0m16:34:41.819443 [debug] [Thread-1 (]: SQL status: OK in 0.190 seconds
�[0m16:34:41.820809 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=b2a97c4e-9821-4108-b498-eab759bba179) - Closing
�[0m16:34:41.827277 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:41.827570 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

    
SELECT current_catalog()

  
�[0m16:34:42.143252 [debug] [Thread-1 (]: SQL status: OK in 0.320 seconds
�[0m16:34:42.146471 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=7bca5456-0ea0-4c9d-8274-5a46678d54cb) - Closing
�[0m16:34:42.158504 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:42.158976 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

    
SHOW VIEWS IN `hive_metastore`.`default`

  
�[0m16:34:42.400874 [debug] [Thread-1 (]: SQL status: OK in 0.240 seconds
�[0m16:34:42.404430 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=823fcb59-e83a-49e3-8aa2-7355a44e26d6) - Closing
�[0m16:34:42.406028 [debug] [Thread-1 (]: While listing relations in database=hive_metastore, schema=default, found: calendar_date_series
�[0m16:34:42.407125 [debug] [Thread-1 (]: On "model.dbt.test": cache miss for schema "hive_metastore.dbt_test__audit", this is inefficient
�[0m16:34:42.407644 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:42.408266 [debug] [Thread-1 (]: On model.dbt.test: GetTables(database=hive_metastore, schema=dbt_test__audit)
�[0m16:34:42.589693 [debug] [Thread-1 (]: SQL status: OK in 0.180 seconds
�[0m16:34:42.591873 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=7a644cc4-7414-4d8c-a0b0-c07959bcfd81) - Closing
�[0m16:34:42.592461 [debug] [Thread-1 (]: While listing relations in database=hive_metastore, schema=dbt_test__audit, found: 
�[0m16:34:42.614673 [debug] [Thread-1 (]: Applying REPLACE to: `gold`.`default`.`test`
�[0m16:34:42.625305 [debug] [Thread-1 (]: Applying DROP to: `gold`.`default`.`test__dbt_backup`
�[0m16:34:42.627544 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:42.627846 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
DROP VIEW IF EXISTS `gold`.`default`.`test__dbt_backup`
�[0m16:34:42.960908 [debug] [Thread-1 (]: SQL status: OK in 0.330 seconds
�[0m16:34:42.962685 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=5e2006e6-2467-43c1-95f0-a873f8fd5a42) - Closing
�[0m16:34:42.974087 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:42.974558 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        alter view `gold`.`default`.`test` rename to `gold`.`default`.`test__dbt_backup`
    
�[0m16:34:43.473528 [debug] [Thread-1 (]: SQL status: OK in 0.500 seconds
�[0m16:34:43.477063 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=5dd72531-4096-45cf-af4e-22e8267f3aa7) - Closing
�[0m16:34:43.480218 [debug] [Thread-1 (]: Applying CREATE INTERMEDIATE to: `gold`.`default`.`test`
�[0m16:34:43.492419 [debug] [Thread-1 (]: Applying DROP to: `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
�[0m16:34:43.493147 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:43.493490 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
DROP VIEW IF EXISTS `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
�[0m16:34:43.690940 [debug] [Thread-1 (]: SQL status: OK in 0.200 seconds
�[0m16:34:43.692798 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=46ad71fc-c3a4-4a2a-b062-b1701f30a532) - Closing
�[0m16:34:43.696393 [debug] [Thread-1 (]: Applying CREATE to: `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
�[0m16:34:43.709689 [debug] [Thread-1 (]: Creating view `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
�[0m16:34:43.715264 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:43.715555 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */
select * from (
        SELECT * FROM VALUES (1,2,3)
    ) as __dbt_sbq
    where false
    limit 0

    
�[0m16:34:43.894412 [debug] [Thread-1 (]: SQL status: OK in 0.180 seconds
�[0m16:34:43.898545 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=c0b56599-3318-4ac1-9561-3579867df207) - Closing
�[0m16:34:43.913199 [debug] [Thread-1 (]: Applying RENAME INTERMEDIATE to: `gold`.`default`.`test`
�[0m16:34:43.916483 [debug] [Thread-1 (]: Applying RENAME to: `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0`
�[0m16:34:43.922669 [debug] [Thread-1 (]: Applying DROP BACKUP to: `gold`.`default`.`test`
�[0m16:34:43.924017 [debug] [Thread-1 (]: Applying DROP to: `gold`.`default`.`test__dbt_backup`
�[0m16:34:43.926069 [debug] [Thread-1 (]: Writing runtime sql for node "model.dbt.test"
�[0m16:34:43.927499 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:43.927759 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        
  
  create or replace view `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
  (
    `col1`,
  `col2`,
  `col3`
  )
  
    tblproperties ('delta.minReaderVersion' = '2' , 'delta.minWriterVersion' = '5' , 'delta.columnMapping.mode' = 'name' 
    )
  as (
    SELECT * FROM VALUES (1,2,3)
  )

      
�[0m16:34:45.317745 [debug] [Thread-1 (]: SQL status: OK in 1.390 seconds
�[0m16:34:45.320474 [debug] [Thread-1 (]: Databricks adapter: Cursor(session-id=65eea05f-b9a4-4eda-81e5-224336233d02, command-id=66714783-69c0-4cb1-a8a6-ae688bcc173a) - Closing
�[0m16:34:45.323000 [debug] [Thread-1 (]: Writing runtime sql for node "model.dbt.test"
�[0m16:34:45.325956 [debug] [Thread-1 (]: Using databricks connection "model.dbt.test"
�[0m16:34:45.326892 [debug] [Thread-1 (]: On model.dbt.test: /* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        -- get the standard intermediate name
    

    
  ALTER VIEW `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` RENAME TO `test`

      
�[0m16:34:45.507415 [debug] [Thread-1 (]: Databricks adapter: Exception while trying to execute query
/* {"app": "dbt", "dbt_version": "1.9.4", "dbt_databricks_version": "1.10.1", "databricks_sql_connector_version": "4.0.2", "profile_name": "dbt", "target_name": "local", "node_id": "model.dbt.test"} */

        -- get the standard intermediate name
    

    
  ALTER VIEW `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` RENAME TO `test`

      
: [TABLE_OR_VIEW_NOT_FOUND] The table or view `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 7 pos 13
Error properties: diagnostic-info=org.apache.hive.service.cli.HiveSQLException: Error running query: [TABLE_OR_VIEW_NOT_FOUND] org.apache.spark.sql.catalyst.ExtendedAnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 7 pos 13
  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:49)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:929)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
  at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:727)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$5(SparkExecuteStatementOperation.scala:544)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at org.apache.spark.sql.execution.SQLExecution$.withRootExecution(SQLExecution.scala:671)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:544)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
  at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)
  at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
  at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
  at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:30)
  at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
  at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
  at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:30)
  at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:91)
  at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:195)
  at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:617)
  at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:729)
  at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:738)
  at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:617)
  at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:615)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:72)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$12(ThriftLocalProperties.scala:234)
  at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:229)
  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:89)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:72)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:521)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:507)
  at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
  at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:557)
  at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
  at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: org.apache.spark.sql.catalyst.ExtendedAnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 7 pos 13
  at org.apache.spark.sql.catalyst.ExtendedAnalysisException.copyPlan(ExtendedAnalysisException.scala:91)
  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:899)
  ... 43 more
, operation-id=6dde4c94-1ede-4e44-b650-da994329df16
16:34:45.529655 [debug] [Thread-1 (]: Database Error in model test (models/gold/test.sql)
  [TABLE_OR_VIEW_NOT_FOUND] The table or view `test__dbt_tmp_307bbf42_a1a8_48b8_9143_c688f3c4f4b0` cannot be found. Verify the spelling and correctness of the schema and catalog.
  If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
  To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 7 pos 13
  compiled code at target/run/dbt/models/gold/test.sql

System information

The output of dbt --version:

Core:
  - installed: 1.9.4
  - latest:    1.9.4 - Up to date!

Plugins:
  - spark:      1.9.2  - Up to date!

Using the latest commit of dbt-databricks from the main branch.

The operating system you're using:
MacOS 15.3.2

The output of python --version:
Python 3.12.9

@geo-martino geo-martino added the bug Something isn't working label Apr 16, 2025
@geo-martino geo-martino mentioned this issue Apr 16, 2025
@benc-db
Collaborator

benc-db commented Apr 16, 2025

Something has really gone wrong here if 'CREATE SCHEMA IF NOT EXISTS gold.default' is being run against HMS. Here the catalog is being specified as 'gold', but HMS only has the catalog 'hive_metastore'.
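For illustration only (this is a hedged sketch in plain SQL, not the materialization's actual code; `gold` and `default` stand in for the target catalog and schema from the log above), the difference in name resolution looks like this:

```sql
-- Unqualified: resolves against the session's current catalog,
-- which on Databricks is often hive_metastore.
create or replace view `test__dbt_tmp` as select 1 as col1;

-- Fully qualified: pins the view to the intended Unity Catalog
-- location, so references to other UC resources resolve correctly.
create or replace view `gold`.`default`.`test__dbt_tmp` as select 1 as col1;
```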

@benc-db
Collaborator

benc-db commented Apr 16, 2025

This is the line that looks wrong:

create or replace view `test__dbt_tmp_2fc09784_af77_488f_a830_72161f294147`
  (
    `col1`,
  `col2`,
  `col3`
  )
  
    tblproperties ('delta.minReaderVersion' = '2' , 'delta.minWriterVersion' = '5' , 'delta.columnMapping.mode' = 'name' 
    )
  as (
    SELECT * FROM VALUES (1,2,3)
  )

This is trying to create a real view but using the tmp name.
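As a rough sketch of what the safely-replace flow presumably intends (the `<suffix>` placeholder stands in for the generated UUID, and the qualified names are assumptions based on the log above), the create and the later rename have to agree on the same temporary name:

```sql
-- 1. Create the intermediate view under its fully qualified
--    temporary name.
create or replace view `gold`.`default`.`test__dbt_tmp_<suffix>` as (
  SELECT * FROM VALUES (1, 2, 3)
);

-- 2. Swap it into place. The rename must reference the exact same
--    suffixed name that step 1 created; otherwise it fails with
--    TABLE_OR_VIEW_NOT_FOUND, as in the log above.
alter view `gold`.`default`.`test__dbt_tmp_<suffix>`
  rename to `test`;
```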

@geo-martino
Contributor Author

That makes sense. Is this then an issue with the materialization V2 logic?

@benc-db
Collaborator

benc-db commented Apr 16, 2025

Yes, I have been able to repro. I don't know why the temp view name is being used as though it's the target; that's what we need to figure out.

@benc-db benc-db linked a pull request Apr 24, 2025 that will close this issue