# Releases: Open-EO/openeo-python-client
## openEO Python Client v0.40.0

### Added

- `sar_backscatter`: try to retrieve coefficient options from backend (#693)
- Improve error message when OIDC provider is unavailable (#751)
- Added `on_response_headers` argument to `DataCube.download()` and related to handle (e.g. `print`) the response headers (#560)
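An illustrative sketch of the new `on_response_headers` hook. The backend URL and collection id below are placeholders, and the calls that need a live backend are commented out:

```python
# Placeholder setup (requires a live backend):
# import openeo
# connection = openeo.connect("https://openeo.example.org")
# cube = connection.load_collection("SENTINEL2_L2A", bands=["B04"])

# The simplest handler is `print` itself:
# cube.download("result.nc", on_response_headers=print)

# Or capture the headers for later inspection; the handler is just a
# callable that receives the response headers as a mapping:
captured = {}

def remember_headers(headers):
    captured.update(headers)

# cube.download("result.nc", on_response_headers=remember_headers)
```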
### Changed

- When the bands provided to `Connection.load_stac(..., bands=[...])` do not fully match the bands the client extracted from the STAC metadata, a warning will be triggered, but the provided band names will still be used during the client-side preparation of the process graph. This is a pragmatic approach to bridge the gap between differing interpretations of band detection in STAC. Note that this might produce process graphs that are technically invalid and might not work on other backends or future versions of the backend you currently use. It is recommended to consult with the provider of the STAC metadata and openEO backend on the correct and future-proof band names. (#752)
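A hedged sketch of this scenario; the STAC URL and band names are made up, and the `load_stac` call is commented out since it needs a live backend:

```python
stac_url = "https://example.org/stac/collection.json"  # hypothetical STAC URL

# connection = openeo.connect("https://openeo.example.org")
# cube = connection.load_stac(stac_url, bands=["B04", "B08"])
#
# If the STAC metadata advertises different band names (e.g. "red"/"nir"),
# a warning is triggered, but the names you passed still end up in the
# process graph arguments, roughly like:
expected_arguments = {"url": stac_url, "bands": ["B04", "B08"]}
```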
### Fixed

- `STACAPIJobDatabase.get_by_status()` now always returns a `pandas.DataFrame` with an index compatible with `MultiBackendJobManager`. (#707)
## openEO Python Client v0.39.1

### Fixed

- Fix legacy usage pattern to append `export_workspace` to `save_result` with the generic `process()` helper method (#742)
## openEO Python Client v0.39.0

### Added

- Add support for the `export_workspace` process (#720)
- Add support for the processing parameter extension (e.g. default job options) in `build_process_dict` (#731)
### Changed

- `DataCube.save_result()` (and related methods) now return a `SaveResult`/`StacResource` object instead of another `DataCube` object, to be more in line with the official `save_result` specification (#402, #720)
- Deprecate `BatchJob.run_synchronous` in favor of `BatchJob.start_and_wait` (#570)
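A migration sketch covering both changes; the connection setup is a placeholder and the backend-dependent calls are commented out:

```python
# connection = openeo.connect("https://openeo.example.org")  # placeholder URL
# cube = connection.load_collection("SENTINEL2_L2A", bands=["B04"])

# save_result() now returns a SaveResult/StacResource rather than a DataCube:
# result = cube.save_result(format="GTiff")

# Deprecated pattern:             # Preferred since v0.39.0:
# job = result.create_job()       # job = result.create_job()
# job.run_synchronous()           # job.start_and_wait()

# Either way, the process graph still ends in a save_result node, roughly:
final_node = {"process_id": "save_result", "arguments": {"format": "GTiff"}}
```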
### Fixed

- Fix incompatibility problem when combining `load_stac` and `resample_spatial` (#737)
## openEO Python Client v0.38.0

### Added

- Add initial support for accessing Federation Extension related metadata (#668)

### Changed

- Improved tracking of metadata changes with `resample_spatial` and `resample_cube_spatial` (#690)
- Move `ComparableVersion` to `openeo.utils.version` (related to #611)
- Deprecate `openeo.rest.rest_capabilities.RESTCapabilities` and introduce replacement `openeo.rest.capabilities.OpenEoCapabilities` (#611, #610)
- `MultiBackendJobManager`: start new jobs before downloading the results of finished jobs to use time more efficiently (#633)
### Removed

- Remove unnecessary base class `openeo.capabilities.Capabilities` (#611)

### Fixed

- `CsvJobDatabase`: work around GeoPandas issue (on Python > 3.9) when there is a column named "crs" (#714)
## openEO Python Client v0.37.0

### Added

- Added `show_error_logs` argument to `cube.execute_batch()`/`job.start_and_wait()`/... to toggle the automatic printing of error logs on failure (#505)
- Added `Connection.web_editor()` to build a link to the openEO backend in the openEO Web Editor
- Add support for `log_level` in `create_job()` and `execute_job()` (#704)
- Add initial support for the "geometry" dimension type in `CubeMetadata` (#705)
- Add support for a parameterized `bands` argument in `load_stac()`
- Argument `spatial_extent` in `load_collection()`/`load_stac()`: add support for Shapely objects, loading GeoJSON from a local path and loading geometry from a GeoJSON/GeoParquet URL. (#678)
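A sketch of the new `spatial_extent` options; coordinates and filenames are made up, and the calls needing a live backend are commented out:

```python
import json

# A small GeoJSON polygon written to a local file:
aoi = {
    "type": "Polygon",
    "coordinates": [[[5.0, 51.0], [5.1, 51.0], [5.1, 51.1], [5.0, 51.1], [5.0, 51.0]]],
}
with open("aoi.geojson", "w") as f:
    json.dump(aoi, f)

# With a live backend, all of these now work as spatial_extent:
# cube = connection.load_collection("SENTINEL2_L2A", spatial_extent="aoi.geojson")
# cube = connection.load_collection("SENTINEL2_L2A", spatial_extent=aoi_shapely)
#     (where aoi_shapely is e.g. shapely.geometry.box(5.0, 51.0, 5.1, 51.1))
# cube = connection.load_stac(stac_url, spatial_extent="https://example.org/aoi.geojson")
```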
### Changed

- Raise exception when providing empty bands array to `load_collection`/`load_stac` (#424, Open-EO/openeo-processes#372)
- Start showing deprecation warnings on usage of GeoJSON "GeometryCollection" (in `filter_spatial`, `aggregate_spatial`, `chunk_polygon`, `mask_polygon`). Use a GeoJSON FeatureCollection instead. (#706, Open-EO/openeo-processes#389)
- The `context` parameter is now used in `execute_local_udf` (#556)
### Fixed

- Clear capabilities cache on login (#254)
## openEO Python Client v0.36.0

### Added

- Automatically use `load_url` when providing a URL as geometries to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc. (#104, #457)
- Allow specifying `limit` when listing batch jobs with `Connection.list_jobs()` (#677)
- Add `additional` and `job_options` arguments to `Connection.download()`, `DataCube.download()` and related (#681)
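A sketch combining these additions; the URL and the job option name are hypothetical, and backend-dependent calls are commented out:

```python
# Passing a public vector-data URL as geometries now goes through load_url:
# zones_url = "https://example.org/zones.geojson"
# means = cube.aggregate_spatial(geometries=zones_url, reducer="mean")

# Backend-specific job options can now also be passed on download
# ("memory" is a hypothetical option name):
job_options = {"memory": "4g"}
# cube.download("result.nc", job_options=job_options)

# And listing only the most recent batch jobs:
# jobs = connection.list_jobs(limit=10)
```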
### Changed

- `MultiBackendJobManager`: `costs` has been added as a column in tracking databases (#588)
- When passing a path/string as `geometry` to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc.: this is no longer automatically translated to deprecated, non-standard `read_vector` usage. Instead, if it is a local GeoJSON file, the GeoJSON data will be loaded directly client-side. (#104, #457)
- Move the `read()` method from the general `JobDatabaseInterface` to the more specific `FullDataFrameJobDatabase` (#680)
- Align `additional` and `job_options` arguments in `Connection.create_job()`, `DataCube.create_job()` and related. Also, follow the official spec more closely. (#683, Open-EO/openeo-api#276)
## openEO Python Client v0.35.0

### Added

- Added `MultiResult` helper class to build process graphs with multiple result nodes (#391)
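A sketch of building a multi-result graph with this helper, as understood from the release notes; the module path and usage are assumptions, and the calls are commented out because they need a live backend:

```python
# from openeo.rest.multiresult import MultiResult
# cube = connection.load_collection("SENTINEL2_L2A", bands=["B04"])
# netcdf = cube.save_result(format="netCDF")
# gtiff = cube.save_result(format="GTiff")
# job = connection.create_job(MultiResult([netcdf, gtiff]))

# Both save_result nodes end up as result nodes of a single process graph:
result_formats = ["netCDF", "GTiff"]
```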
### Fixed

- `MultiBackendJobManager`: Fix issue with duplicate job starting across multiple backends (#654)
- `MultiBackendJobManager`: Fix encoding issue of job metadata in `on_job_done` (#657)
- `MultiBackendJobManager`: Avoid `SettingWithCopyWarning` (#641)
- Avoid creating empty file if asset download request failed
- `MultiBackendJobManager`: avoid dtype loading mistakes in `CsvJobDatabase` on empty columns (#656)
- `MultiBackendJobManager`: restore logging of job status histogram during `run_jobs` (#655)
## openEO Python Client v0.34.0
## openEO Python Client v0.33.0

### Added

- Added `DataCube.load_stac()` to also support creating a `load_stac` based cube without a connection (#638)
- `MultiBackendJobManager`: Added `initialize_from_df(df)` (to `CsvJobDatabase` and `ParquetJobDatabase`) to initialize (and persist) the job database from a given DataFrame. Also added a `create_job_db()` factory to easily create a job database from a given dataframe, with its type guessed from the filename extension. (#635)
- `MultiBackendJobManager.run_jobs()` now returns a dictionary with counters/stats about various events during the full run of the job manager (#645)
- Added (experimental) `ProcessBasedJobCreator` to be used as `start_job` callable with `MultiBackendJobManager` to create multiple jobs from a single parameterized process (e.g. a UDP or remote process definition) (#604)
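A sketch of the connection-less `load_stac` usage; the STAC URL is a placeholder, and the openEO calls are commented out since fetching the STAC metadata requires network access:

```python
# from openeo.rest.datacube import DataCube
# cube = DataCube.load_stac("https://example.org/stac/collection.json")
# print(cube.flat_graph())  # inspect the client-side process graph

# The resulting graph contains a load_stac node, roughly:
expected_node = {
    "process_id": "load_stac",
    "arguments": {"url": "https://example.org/stac/collection.json"},
}
```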
### Fixed

- When using `DataCube.load_collection()` without a connection, it is not necessary anymore to also explicitly set `fetch_metadata=False` (#638)
## openEO Python Client v0.32.0

### Added

- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590)
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- Wrap OIDC token request failure in more descriptive `OidcException` (related to #624)
- Added `auto_add_save_result` option (on by default) to disable automatic addition of `save_result` node on `download`/`create_job`/`execute_batch` (#513)
- Add support for `apply_vectorcube` UDF signature in `run_udf_code` (Open-EO/openeo-geopyspark-driver#881, Open-EO/openeo-geopyspark-driver#811)
- `MultiBackendJobManager`: add API to run the update loop in a separate thread, allowing controlled interruption.
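A sketch of the `Parameter` helper in a UDP definition; the UDP id and process graph are hypothetical, and saving the UDP needs a live backend, so those calls are commented out:

```python
# from openeo.api.process import Parameter
# spatial_extent = Parameter.spatial_extent()
# connection.save_user_defined_process(
#     "my_udp",            # hypothetical UDP id
#     my_process_graph,    # hypothetical process graph using the parameter
#     parameters=[spatial_extent],
# )

# Such a parameter is conventionally named "spatial_extent":
param_name = "spatial_extent"
```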
### Changed

- `MultiBackendJobManager`: changed job metadata storage API, to enable working with large databases
- `DataCube.apply_polygon()`: rename `polygons` argument to `geometries`, but keep support for legacy `polygons` for now (#592, #511)
- Disallow ambiguous single string argument in `DataCube.filter_temporal()` (#628)
- Automatic adding of `save_result` from `download()` or `create_job()`: inspect whole process graph for pre-existing `save_result` nodes (related to #623, #401, #583)
- Disallow ambiguity of combining explicit `save_result` nodes and implicit `save_result` addition from `download()`/`create_job()` calls with `format` (related to #623, #401, #583)
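The `filter_temporal` change in practice; the dates are illustrative, and the calls are commented out since they need a cube from a live backend:

```python
# Ambiguous single string, now disallowed:
# cube.filter_temporal("2024")  # a year shorthand, or a start date?

# Be explicit instead:
extent = ("2024-01-01", "2025-01-01")
# cube.filter_temporal(extent[0], extent[1])
# cube.filter_temporal(start_date="2024-01-01", end_date="2025-01-01")
```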
### Fixed

- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)