Releases: Open-EO/openeo-python-client
openEO Python Client v0.40.0
Added
- `sar_backscatter`: try to retrieve coefficient options from backend (#693)
- Improve error message when OIDC provider is unavailable (#751)
- Added `on_response_headers` argument to `DataCube.download()` and related to handle (e.g. `print`) the response headers (#560)
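The `on_response_headers` hook is a plain callable that receives the response headers. The following standalone sketch illustrates the callback pattern only; `download_with_header_hook` and its `fetch` stand-in are invented for illustration and are not part of the client's API.

```python
from typing import Callable, Dict, Optional, Tuple

def download_with_header_hook(
    fetch: Callable[[], Tuple[Dict[str, str], bytes]],
    on_response_headers: Optional[Callable[[Dict[str, str]], None]] = None,
) -> bytes:
    """Run a download and optionally pass the response headers to a callback."""
    headers, payload = fetch()
    if on_response_headers:
        # Invoke the user-supplied hook (e.g. `print`) before returning the payload.
        on_response_headers(headers)
    return payload

# Usage sketch with a fake fetch function standing in for the real HTTP request:
seen = []
payload = download_with_header_hook(
    lambda: ({"Content-Type": "image/tiff"}, b"GTiff..."),
    on_response_headers=seen.append,
)
```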
Changed
- When the bands provided to `Connection.load_stac(..., bands=[...])` do not fully match the bands the client extracted from the STAC metadata, a warning will be triggered, but the provided band names will still be used during the client-side preparation of the process graph. This is a pragmatic approach to bridge the gap between differing interpretations of band detection in STAC. Note that this might produce process graphs that are technically invalid and might not work on other backends or future versions of the backend you currently use. It is recommended to consult with the provider of the STAC metadata and openEO backend on the correct and future-proof band names. (#752)
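The reconciliation logic described above can be sketched in isolation: warn when the user-provided bands disagree with the STAC-derived ones, but keep the user's names. `resolve_bands` is a hypothetical helper for illustration, not the client's actual function.

```python
import warnings
from typing import List

def resolve_bands(provided: List[str], detected: List[str]) -> List[str]:
    """Warn on a band-name mismatch, but keep the user-provided names
    for the client-side process graph (sketch of the behavior above)."""
    if set(provided) != set(detected):
        warnings.warn(
            f"Bands {provided} do not fully match STAC-derived bands {detected}; "
            "using the provided names anyway.",
            UserWarning,
        )
    return provided

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    bands = resolve_bands(["B04", "B08"], ["red", "nir"])
```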
Fixed
- `STACAPIJobDatabase.get_by_status()` now always returns a `pandas.DataFrame` with an index compatible with `MultiBackendJobManager`. (#707)
openEO Python Client v0.39.1
Fixed
- Fix legacy usage pattern to append `export_workspace` to `save_result` with the generic `process()` helper method (#742)
openEO Python Client v0.39.0
Added
- Add support for `export_workspace` process (#720)
- Add support for processing parameter extension (e.g. default job options) in `build_process_dict` (#731)
Changed
- `DataCube.save_result()` (and related methods) now return a `SaveResult`/`StacResource` object instead of another `DataCube` object, to be more in line with the official `save_result` specification (#402, #720)
- Deprecate `BatchJob.run_synchronous` in favor of `BatchJob.start_and_wait` (#570)
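The return-type change above can be illustrated with dummy stand-ins for the real classes (these minimal `DataCube`/`SaveResult` definitions are sketches, not the client's implementation): code that assumed `save_result()` returns another cube now receives a result wrapper instead.

```python
class SaveResult:
    """Stand-in for the result wrapper returned since v0.39.0."""
    def __init__(self, source, format):
        self.source = source
        self.format = format

class DataCube:
    """Stand-in cube: save_result() no longer returns a new DataCube."""
    def save_result(self, format: str = "GTiff") -> SaveResult:
        return SaveResult(source=self, format=format)

cube = DataCube()
result = cube.save_result(format="netCDF")
```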
Fixed
- Fix incompatibility problem when combining `load_stac` and `resample_spatial` (#737)
openEO Python Client v0.38.0
Added
- Add initial support for accessing Federation Extension related metadata (#668)
Changed
- Improved tracking of metadata changes with `resample_spatial` and `resample_cube_spatial` (#690)
- Move `ComparableVersion` to `openeo.utils.version` (related to #611)
- Deprecate `openeo.rest.rest_capabilities.RESTCapabilities` and introduce replacement `openeo.rest.capabilities.OpenEoCapabilities` (#611, #610)
- `MultiBackendJobManager`: start new jobs before downloading the results of finished jobs to use time more efficiently (#633)
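The "start before download" ordering in the last item can be sketched as a single scheduling pass over a toy job list (this `tick` function and its dict-based job records are invented for illustration; the real manager works against a job database):

```python
def tick(jobs, parallelism=2):
    """One scheduling pass: launch queued jobs first, then collect finished ones."""
    running = sum(1 for j in jobs if j["status"] == "running")
    # 1. Start new jobs first, so backends are kept busy...
    for j in jobs:
        if j["status"] == "queued" and running < parallelism:
            j["status"] = "running"
            running += 1
    # 2. ...and only then do the slow client-side downloads of finished jobs.
    for j in jobs:
        if j["status"] == "finished":
            j["status"] = "downloaded"

jobs = [{"status": "finished"}, {"status": "queued"}, {"status": "queued"}]
tick(jobs)
```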
Removed
- Remove unnecessary base class `openeo.capabilities.Capabilities` (#611)
Fixed
- `CsvJobDatabase`: work around GeoPandas issue (on Python > 3.9) when there is a column named "crs" (#714)
openEO Python Client v0.37.0
Added
- Added `show_error_logs` argument to `cube.execute_batch()`/`job.start_and_wait()`/... to toggle the automatic printing of error logs on failure (#505)
- Added `Connection.web_editor()` to build a link to the openEO backend in the openEO Web Editor
- Add support for `log_level` in `create_job()` and `execute_job()` (#704)
- Add initial support for "geometry" dimension type in `CubeMetadata` (#705)
- Add support for parameterized `bands` argument in `load_stac()`
- Argument `spatial_extent` in `load_collection()`/`load_stac()`: add support for Shapely objects, loading GeoJSON from a local path and loading geometry from a GeoJSON/GeoParquet URL (#678)
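The flexible `spatial_extent` handling in the last item amounts to dispatching on the input type. A standalone sketch (the `normalize_spatial_extent` helper is hypothetical; the real client also handles URLs and GeoParquet, which are omitted here):

```python
import json
import os
import tempfile

def normalize_spatial_extent(value):
    """Accept a bbox dict, an object with __geo_interface__ (e.g. a Shapely
    geometry), or a local GeoJSON path; return a geometry/bbox dict."""
    if isinstance(value, dict) and {"west", "south", "east", "north"} <= value.keys():
        return value  # classic bounding box dict
    if hasattr(value, "__geo_interface__"):
        return value.__geo_interface__  # Shapely-style geometry
    if isinstance(value, str) and os.path.exists(value):
        with open(value) as f:
            return json.load(f)  # local GeoJSON file, loaded client-side
    raise ValueError(f"Unsupported spatial_extent: {value!r}")

# Usage sketch with a temporary GeoJSON file:
geojson = {"type": "Polygon", "coordinates": [[[3, 50], [4, 50], [4, 51], [3, 50]]]}
with tempfile.NamedTemporaryFile("w", suffix=".geojson", delete=False) as f:
    json.dump(geojson, f)
    path = f.name
loaded = normalize_spatial_extent(path)
os.unlink(path)
```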
Changed
- Raise exception when providing an empty bands array to `load_collection`/`load_stac` (#424, Open-EO/openeo-processes#372)
- Start showing deprecation warnings on usage of GeoJSON "GeometryCollection" (in `filter_spatial`, `aggregate_spatial`, `chunk_polygon`, `mask_polygon`). Use a GeoJSON FeatureCollection instead. (#706, Open-EO/openeo-processes#389)
- The `context` parameter is now used in `execute_local_udf` (#556)
Fixed
- Clear capabilities cache on login (#254)
openEO Python Client v0.36.0
Added
- Automatically use `load_url` when providing a URL as geometries to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc. (#104, #457)
- Allow specifying `limit` when listing batch jobs with `Connection.list_jobs()` (#677)
- Add `additional` and `job_options` arguments to `Connection.download()`, `DataCube.download()` and related (#681)
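The URL-based geometry handling in the first item is essentially a dispatch on the input string. A sketch (the `geometry_loader` helper and the `"client_side_geojson"` placeholder node are invented; only `load_url` is a real openEO process):

```python
def geometry_loader(value: str) -> dict:
    """URLs become a server-side `load_url` process node; other strings are
    treated as local vector files (simplified placeholder branch here)."""
    if value.startswith(("http://", "https://")):
        return {"process_id": "load_url", "arguments": {"url": value}}
    # Simplified: the real client loads local GeoJSON client-side instead.
    return {"process_id": "client_side_geojson", "arguments": {"path": value}}
```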
Changed
- `MultiBackendJobManager`: costs are now tracked as a column in the tracking databases (#588)
- When passing a path/string as `geometry` to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc.: this is no longer automatically translated to deprecated, non-standard `read_vector` usage. Instead, if it is a local GeoJSON file, the GeoJSON data will be loaded directly client-side. (#104, #457)
- Move `read()` method from general `JobDatabaseInterface` to more specific `FullDataFrameJobDatabase` (#680)
- Align `additional` and `job_options` arguments in `Connection.create_job()`, `DataCube.create_job()` and related. Also, follow official spec more closely. (#683, Open-EO/openeo-api#276)
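One plausible reading of the `additional`/`job_options` alignment is merging both into the job-creation request body. The following is a hedged sketch only: the exact placement of these fields is an assumption for illustration, not the client's documented behavior.

```python
def build_create_job_payload(process_graph: dict, additional=None, job_options=None) -> dict:
    """Hypothetical merge of extra fields into a POST /jobs request body.
    Assumption (for illustration only): `additional` entries are merged at the
    top level, `job_options` is passed as a single top-level field."""
    payload = {"process": {"process_graph": process_graph}}
    if additional:
        payload.update(additional)
    if job_options:
        payload["job_options"] = job_options
    return payload

payload = build_create_job_payload(
    {"sr": {"process_id": "save_result"}},
    additional={"title": "My job"},
    job_options={"driver-memory": "2G"},
)
```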
openEO Python Client v0.35.0
Added
- Added `MultiResult` helper class to build process graphs with multiple result nodes (#391)
Fixed
- `MultiBackendJobManager`: fix issue with duplicate job starting across multiple backends (#654)
- `MultiBackendJobManager`: fix encoding issue of job metadata in `on_job_done` (#657)
- `MultiBackendJobManager`: avoid `SettingWithCopyWarning` (#641)
- Avoid creating empty file if asset download request failed
- `MultiBackendJobManager`: avoid dtype loading mistakes in `CsvJobDatabase` on empty columns (#656)
- `MultiBackendJobManager`: restore logging of job status histogram during `run_jobs` (#655)
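The restored status histogram in the last item is conceptually a frequency count of job statuses per loop iteration; a minimal standalone sketch (the `log_status_histogram` helper is invented for illustration):

```python
import logging
from collections import Counter

def log_status_histogram(statuses, logger=logging.getLogger("jobs")):
    """Count job statuses and log the histogram, as run_jobs does each pass."""
    histogram = Counter(statuses)
    logger.info("Job status histogram: %s", dict(histogram))
    return histogram

hist = log_status_histogram(["finished", "running", "finished", "error"])
```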
openEO Python Client v0.34.0
openEO Python Client v0.33.0
Added
- Added `DataCube.load_stac()` to also support creating a `load_stac` based cube without a connection (#638)
- `MultiBackendJobManager`: added `initialize_from_df(df)` (to `CsvJobDatabase` and `ParquetJobDatabase`) to initialize (and persist) the job database from a given DataFrame. Also added `create_job_db()` factory to easily create a job database from a given dataframe, with its type guessed from the filename extension. (#635)
- `MultiBackendJobManager.run_jobs()` now returns a dictionary with counters/stats about various events during the full run of the job manager (#645)
- Added (experimental) `ProcessBasedJobCreator` to be used as `start_job` callable with `MultiBackendJobManager` to create multiple jobs from a single parameterized process (e.g. a UDP or remote process definition) (#604)
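The extension-based guessing of `create_job_db()` can be sketched with dummy stand-ins for the two database classes (these minimal class definitions and the exact suffix handling are illustrative assumptions, not the client's implementation):

```python
from pathlib import Path

class CsvJobDatabase:
    """Stand-in for the real CSV-backed job database."""
    def __init__(self, path):
        self.path = path

class ParquetJobDatabase:
    """Stand-in for the real Parquet-backed job database."""
    def __init__(self, path):
        self.path = path

def create_job_db(path):
    """Pick the job database implementation from the filename extension."""
    suffix = Path(path).suffix.lower()
    if suffix == ".csv":
        return CsvJobDatabase(path)
    if suffix == ".parquet":
        return ParquetJobDatabase(path)
    raise ValueError(f"Could not guess job database type from {path!r}")

db = create_job_db("jobs.parquet")
```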
Fixed
- When using `DataCube.load_collection()` without a connection, it is not necessary anymore to also explicitly set `fetch_metadata=False` (#638)
openEO Python Client v0.32.0
Added
- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590)
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- Wrap OIDC token request failure in more descriptive `OidcException` (related to #624)
- Added `auto_add_save_result` option (on by default) to disable automatic addition of `save_result` node on `download`/`create_job`/`execute_batch` (#513)
- Add support for `apply_vectorcube` UDF signature in `run_udf_code` (Open-EO/openeo-geopyspark-driver#881)
- `MultiBackendJobManager`: add API to run the update loop in a separate thread, allowing controlled interruption
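Running an update loop in a separate thread with controlled interruption typically combines `threading.Thread` with a `threading.Event` for an interruptible sleep. A self-contained sketch of that pattern (the `UpdateLoop` class is invented for illustration, not the manager's actual API):

```python
import threading
import time

class UpdateLoop:
    """Periodic update loop in a background thread with clean shutdown."""

    def __init__(self, interval=0.01):
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, args=(interval,), daemon=True)
        self.ticks = 0

    def _run(self, interval):
        while not self._stop.is_set():
            self.ticks += 1            # stand-in for one job-tracking update pass
            self._stop.wait(interval)  # interruptible sleep: wakes early on stop()

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()     # request interruption...
        self._thread.join()  # ...and wait for the thread to exit cleanly

loop = UpdateLoop()
loop.start()
time.sleep(0.05)
loop.stop()
```

Using `Event.wait()` instead of `time.sleep()` is what makes the interruption "controlled": `stop()` takes effect immediately instead of after a full sleep interval.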
Changed
- `MultiBackendJobManager`: changed job metadata storage API, to enable working with large databases
- `DataCube.apply_polygon()`: rename `polygons` argument to `geometries`, but keep support for legacy `polygons` for now (#592, #511)
- Disallow ambiguous single string argument in `DataCube.filter_temporal()` (#628)
- Automatic adding of `save_result` from `download()` or `create_job()`: inspect whole process graph for pre-existing `save_result` nodes (related to #623, #401, #583)
- Disallow ambiguity of combining explicit `save_result` nodes and implicit `save_result` addition from `download()`/`create_job()` calls with `format` (related to #623, #401, #583)
Fixed
- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)