bug: insert into an existing table errors out with "table or view not found" with pyspark backend #11184

Open
@pareamit

Description

What happened?

Below is my code snippet. I create an empty table first and then insert data into it.

con.create_table("table", database="warehouse.schema", schema=df.schema())
con.insert("table", obj=df, database="warehouse.schema")
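The error below suggests the generated INSERT statement references the bare table name without the database qualifier, so Spark resolves it against the session's current schema. Until the backend is fixed, a workaround is to qualify the name yourself before handing it to Spark SQL. A minimal sketch of such a helper (hypothetical, for illustration only; not an ibis API):

```python
def qualify(table, database=None):
    """Build a fully qualified, backtick-quoted Spark SQL identifier.

    `database` may be a dotted "catalog.schema" string, as used in this
    report. Hypothetical helper for illustration; not part of ibis.
    """
    parts = (database.split(".") if database else []) + [table]
    return ".".join(f"`{p}`" for p in parts)
```

A name built this way, e.g. qualify("table", "warehouse.schema"), yields `warehouse`.`schema`.`table`, which Spark SQL can resolve regardless of the session's current_schema().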

What version of ibis are you using?

ibis-framework 9.5.0

What backend(s) are you using, if any?

pyspark

Relevant log output

[TABLE_OR_VIEW_NOT_FOUND] The table or view `table` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01
File <command-6653096731292639>, line 1
----> 1 con.insert("table",  obj=df, database="warehouse.schema")
File /databricks/spark/python/pyspark/errors/exceptions/captured.py:230, in capture_sql_exception.<locals>.deco(*a, **kw)
    226 converted = convert_exception(e.java_exception)
    227 if not isinstance(converted, UnknownException):
    228     # Hide where the exception came from that shows a non-Pythonic
    229     # JVM exception message.
--> 230     raise converted from None
    231 else:
    232     raise

Metadata

Labels: bug (Incorrect behavior inside of ibis)
Status: backlog