Commit fc1d503

chore: set streaming opt-in

Signed-off-by: Ion Koutsouris <15728914+ion-elgreco@users.noreply.github.com>
1 parent: 5a7678c

File tree: 2 files changed (+26, -119 lines)


python/deltalake/table.py: 2 additions & 3 deletions
@@ -980,6 +980,7 @@ def merge(
         error_on_type_mismatch: bool = True,
         writer_properties: Optional[WriterProperties] = None,
         large_dtypes: Optional[bool] = None,
+        streaming: bool = False,
         custom_metadata: Optional[Dict[str, str]] = None,
         post_commithook_properties: Optional[PostCommitHookProperties] = None,
         commit_properties: Optional[CommitProperties] = None,
@@ -997,6 +998,7 @@ def merge(
             error_on_type_mismatch: specify if merge will return error if data types are mismatching :default = True
             writer_properties: Pass writer properties to the Rust parquet writer
             large_dtypes: Deprecated, will be removed in 1.0
+            streaming: Will execute MERGE using a LazyMemoryExec plan
             arrow_schema_conversion_mode: Large converts all types of data schema into Large Arrow types, passthrough keeps string/binary/list types untouched
             custom_metadata: Deprecated and will be removed in future versions. Use commit_properties instead.
             post_commithook_properties: properties for the post commit hook. If None, default values are used.
@@ -1035,17 +1037,14 @@ def merge(
             convert_pyarrow_table,
         )

-        streaming = False
         if isinstance(source, pyarrow.RecordBatchReader):
             source = convert_pyarrow_recordbatchreader(source, conversion_mode)
-            streaming = True
         elif isinstance(source, pyarrow.RecordBatch):
             source = convert_pyarrow_recordbatch(source, conversion_mode)
         elif isinstance(source, pyarrow.Table):
             source = convert_pyarrow_table(source, conversion_mode)
         elif isinstance(source, ds.Dataset):
             source = convert_pyarrow_dataset(source, conversion_mode)
-            streaming = True
         elif _has_pandas and isinstance(source, pd.DataFrame):
             source = convert_pyarrow_table(
                 pyarrow.Table.from_pandas(source), conversion_mode
