Releases · Open-EO/openeo-python-client
openEO Python Client v0.38.0
Added
- Add initial support for accessing Federation Extension related metadata (#668)
Changed
- Improved tracking of metadata changes with `resample_spatial` and `resample_cube_spatial` (#690)
- Move `ComparableVersion` to `openeo.utils.version` (related to #611); see the sketch after this list
- Deprecate `openeo.rest.rest_capabilities.RESTCapabilities` and introduce replacement `openeo.rest.capabilities.OpenEoCapabilities` (#611, #610)
- `MultiBackendJobManager`: start new jobs before downloading the results of finished jobs to use time more efficiently (#633)
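
A minimal sketch of using the relocated `ComparableVersion` helper from its new `openeo.utils.version` location; comparing directly against plain version strings is an assumption based on typical usage.

```python
# ComparableVersion now lives in openeo.utils.version.
# Rich comparison against plain version strings is assumed here.
from openeo.utils.version import ComparableVersion

api_version = ComparableVersion("1.2.0")
print(api_version >= "1.0.0")  # expected: True
print(api_version < "2.0.0")   # expected: True
```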
Removed
- Remove unnecessary base class `openeo.capabilities.Capabilities` (#611)
Fixed
- `CsvJobDatabase`: work around GeoPandas issue (on Python > 3.9) when there is a column named "crs" (#714)
openEO Python Client v0.37.0
Added
- Added `show_error_logs` argument to `cube.execute_batch()`/`job.start_and_wait()`/... to toggle the automatic printing of error logs on failure (#505)
- Added `Connection.web_editor()` to build a link to the openEO backend in the openEO Web Editor
- Add support for `log_level` in `create_job()` and `execute_job()` (#704)
- Add initial support for "geometry" dimension type in `CubeMetadata` (#705)
- Add support for parameterized `bands` argument in `load_stac()`
- Argument `spatial_extent` in `load_collection()`/`load_stac()`: add support for Shapely objects, loading GeoJSON from a local path and loading geometry from a GeoJSON/GeoParquet URL (#678); see the sketch after this list
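
A hedged sketch of two of the additions above: passing a Shapely geometry as `spatial_extent`, and silencing automatic error-log printing with `show_error_logs`. The backend URL, collection id and band names are placeholders.

```python
import openeo
import shapely.geometry

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()

# spatial_extent now also accepts Shapely objects (or a local GeoJSON path, or a GeoJSON/GeoParquet URL)
aoi = shapely.geometry.box(5.0, 51.0, 5.1, 51.1)
cube = connection.load_collection(
    "SENTINEL2_L2A",
    spatial_extent=aoi,
    temporal_extent=["2024-05-01", "2024-06-01"],
    bands=["B04", "B08"],
)

# show_error_logs=False disables the automatic printing of error logs on failure
cube.execute_batch("result.nc", show_error_logs=False)
```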
Changed
- Raise exception when providing an empty `bands` array to `load_collection`/`load_stac` (#424, Open-EO/openeo-processes#372)
- Start showing deprecation warnings on usage of GeoJSON "GeometryCollection" (in `filter_spatial`, `aggregate_spatial`, `chunk_polygon`, `mask_polygon`). Use a GeoJSON FeatureCollection instead (#706, Open-EO/openeo-processes#389); see the sketch after this list
- The `context` parameter is now used in `execute_local_udf` (#556)
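
A small sketch of the recommended FeatureCollection form when passing GeoJSON geometries to `aggregate_spatial`; the connection, collection and polygon coordinates are placeholders.

```python
import openeo

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()
cube = connection.load_collection(
    "SENTINEL2_L2A", temporal_extent=["2024-05-01", "2024-06-01"], bands=["B04"]
)

# Use a FeatureCollection (not a deprecated GeometryCollection) as geometries
feature_collection = {
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "properties": {},
        "geometry": {
            "type": "Polygon",
            "coordinates": [[[5.0, 51.0], [5.1, 51.0], [5.1, 51.1], [5.0, 51.1], [5.0, 51.0]]],
        },
    }],
}
means = cube.aggregate_spatial(geometries=feature_collection, reducer="mean")
```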
Fixed
- Clear capabilities cache on login (#254)
openEO Python Client v0.36.0
Added
- Automatically use `load_url` when providing a URL as geometries to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc. (#104, #457)
- Allow specifying `limit` when listing batch jobs with `Connection.list_jobs()` (#677)
- Add `additional` and `job_options` arguments to `Connection.download()`, `DataCube.download()` and related (#681); see the sketch after this list
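
A hedged sketch of the `limit` and `job_options` additions; the job option key shown is a hypothetical, backend-specific setting.

```python
import openeo

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()

# Only fetch the first 10 batch jobs
jobs = connection.list_jobs(limit=10)

cube = connection.load_collection(
    "SENTINEL2_L2A", temporal_extent=["2024-05-01", "2024-06-01"], bands=["B04"]
)
# job_options / additional are passed through to the backend (keys are backend-specific)
cube.download("subset.nc", job_options={"memory": "2GB"})  # hypothetical option key
```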
Changed
- `MultiBackendJobManager`: `costs` has been added as a column in tracking databases (#588)
- When passing a path/string as `geometry` to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc.:
  this is not translated automatically anymore to deprecated, non-standard `read_vector` usage.
  Instead, if it is a local GeoJSON file, the GeoJSON data will be loaded directly client-side.
  (#104, #457)
- Move `read()` method from general `JobDatabaseInterface` to more specific `FullDataFrameJobDatabase` (#680)
- Align `additional` and `job_options` arguments in `Connection.create_job()`, `DataCube.create_job()` and related.
  Also, follow the official spec more closely. (#683, Open-EO/openeo-api#276)
Fixed
openEO Python Client v0.35.0
Added
- Added `MultiResult` helper class to build process graphs with multiple result nodes (#391); see the sketch after this list
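
A hedged sketch of building one batch job with two result nodes via `MultiResult`; the import path `openeo.rest.multiresult` and the acceptance of a `MultiResult` by `Connection.create_job()` are assumptions, and the backend URL and collection are placeholders.

```python
import openeo
from openeo.rest.multiresult import MultiResult  # assumed import path

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()
cube = connection.load_collection(
    "SENTINEL2_L2A", temporal_extent=["2024-05-01", "2024-06-01"], bands=["B04", "B08"]
)

# Two explicit result nodes in a single process graph
red = cube.band("B04").save_result(format="GTiff")
nir = cube.band("B08").save_result(format="GTiff")

job = connection.create_job(MultiResult([red, nir]), title="multi-result job")
```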
Fixed
- `MultiBackendJobManager`: Fix issue with duplicate job starting across multiple backends (#654)
- `MultiBackendJobManager`: Fix encoding issue of job metadata in `on_job_done` (#657)
- `MultiBackendJobManager`: Avoid `SettingWithCopyWarning` (#641)
- Avoid creating an empty file if the asset download request failed.
- `MultiBackendJobManager`: avoid dtype loading mistakes in `CsvJobDatabase` on empty columns (#656)
- `MultiBackendJobManager`: restore logging of job status histogram during `run_jobs` (#655)
openEO Python Client v0.34.0
openEO Python Client v0.33.0
Added
- Added `DataCube.load_stac()` to also support creating a `load_stac` based cube without a connection (#638)
- `MultiBackendJobManager`: Added `initialize_from_df(df)` (to `CsvJobDatabase` and `ParquetJobDatabase`) to initialize (and persist) the job database from a given DataFrame.
  Also added `create_job_db()` factory to easily create a job database from a given dataframe, with its type guessed from the filename extension (#635); see the sketch after this list
- `MultiBackendJobManager.run_jobs()` now returns a dictionary with counters/stats about various events during the full run of the job manager (#645)
- Added (experimental) `ProcessBasedJobCreator` to be used as `start_job` callable with `MultiBackendJobManager` to create multiple jobs from a single parameterized process (e.g. a UDP or remote process definition) (#604)
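
A hedged sketch of the `MultiBackendJobManager` additions above (`create_job_db()` and the stats dictionary returned by `run_jobs()`); the dataframe columns, backend URL and collection id are placeholders, and the exact `start_job` callback signature is an assumption.

```python
import pandas as pd
import openeo
from openeo.extra.job_management import MultiBackendJobManager, create_job_db

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()

# One job per dataframe row; the job database type (CSV vs Parquet) is guessed from the extension
df = pd.DataFrame({"year": [2022, 2023, 2024]})
job_db = create_job_db(path="jobs.csv", df=df)

manager = MultiBackendJobManager()
manager.add_backend("example", connection=connection)

def start_job(row, connection, **kwargs):
    # Placeholder job factory; the experimental ProcessBasedJobCreator can play this role
    # when all jobs come from a single parameterized process (UDP / remote process definition).
    cube = connection.load_collection(
        "SENTINEL2_L2A",
        temporal_extent=[f"{row['year']}-01-01", f"{row['year']}-12-31"],
        bands=["B04"],
    )
    return cube.create_job(title=f"job-{row['year']}")

# run_jobs() now returns a dictionary with counters/stats about the run
stats = manager.run_jobs(job_db=job_db, start_job=start_job)
print(stats)
```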
Fixed
- When using `DataCube.load_collection()` without a connection, it is not necessary anymore to also explicitly set `fetch_metadata=False` (#638)
openEO Python Client v0.32.0
Added
- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590); see the sketch after this list
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- Wrap OIDC token request failure in more descriptive `OidcException` (related to #624)
- Added `auto_add_save_result` option (on by default) to disable automatic addition of a `save_result` node on `download`/`create_job`/`execute_batch` (#513)
- Add support for `apply_vectorcube` UDF signature in `run_udf_code` (Open-EO/openeo-geopyspark-driver#881)
- `MultiBackendJobManager`: add API to run the update loop in a separate thread, allowing controlled interruption.
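
A hedged sketch of the `cancel_running_job_after` option; the value is assumed to be a duration in seconds, and two hours is an arbitrary placeholder.

```python
import openeo
from openeo.extra.job_management import MultiBackendJobManager

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()

# Automatically cancel jobs that stay in "running" state longer than ~2 hours (assumed: seconds)
manager = MultiBackendJobManager(cancel_running_job_after=2 * 60 * 60)
manager.add_backend("example", connection=connection)
```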
Changed
- `MultiBackendJobManager`: changed job metadata storage API, to enable working with large databases
- `DataCube.apply_polygon()`: rename `polygons` argument to `geometries`, but keep support for legacy `polygons` for now (#592, #511)
- Disallow ambiguous single string argument in `DataCube.filter_temporal()` (#628)
- Automatic adding of `save_result` from `download()` or `create_job()`: inspect the whole process graph for pre-existing `save_result` nodes (related to #623, #401, #583)
- Disallow ambiguity of combining explicit `save_result` nodes and implicit `save_result` addition from `download()`/`create_job()` calls with `format` (related to #623, #401, #583); see the sketch after this list
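
A small sketch of the two unambiguous ways to combine `save_result` with `download()` after this change; the connection and collection are placeholders.

```python
import openeo

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()
cube = connection.load_collection(
    "SENTINEL2_L2A", temporal_extent=["2024-05-01", "2024-06-01"], bands=["B04"]
)

# Either: add an explicit save_result node and download without a format ...
explicit = cube.save_result(format="GTiff")
explicit.download("result.tiff")

# ... or: let download() add save_result implicitly and pass the format there.
cube.download("result.tiff", format="GTiff")
```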
Fixed
- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch.
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)
openEO Python Client v0.31.0
Added
- Add experimental `openeo.testing.results` subpackage with reusable test utilities for comparing batch job results with reference data
- `MultiBackendJobManager`: add initial support for storing job metadata in a Parquet file (instead of CSV) (#571)
- Add `Connection.authenticate_oidc_access_token()` to set up authorization headers with an access token that is obtained "out-of-band" (#598); see the sketch after this list
- Add `JobDatabaseInterface` to allow custom job metadata storage with `MultiBackendJobManager` (#571)
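
A minimal sketch of authenticating with an access token obtained out-of-band (e.g. from another tool or service); the backend URL and token value are placeholders.

```python
import openeo

connection = openeo.connect("https://openeo.example.org")
# Use an access token obtained elsewhere instead of going through a full OIDC flow
connection.authenticate_oidc_access_token(access_token="eyJhbGciOi...")  # placeholder token
```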
openEO Python Client v0.30.0
Added
- Add `openeo.udf.run_code.extract_udf_dependencies()` to extract UDF dependency declarations from UDF code
  (related to Open-EO/openeo-geopyspark-driver#237)
- Document PEP 723 based Python UDF dependency declarations (Open-EO/openeo-geopyspark-driver#237)
- Added more `openeo.api.process.Parameter` helpers to easily create "bounding_box", "date", "datetime", "geojson" and "temporal_interval" parameters for UDP construction; see the sketch after this list
- Added convenience method `Connection.load_stac_from_job(job)` to easily load the results of a batch job with the `load_stac` process (#566)
- `load_stac`/`metadata_from_stac`: add support for extracting band info from "item_assets" in collection metadata (#573)
- Added initial `openeo.testing` submodule for reusable test utilities
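
A hedged sketch of the `Parameter` helpers and the `load_stac_from_job()` convenience; the helper names follow the listing above, but their exact signatures, the backend URL and the job id are assumptions/placeholders.

```python
import openeo
from openeo.api.process import Parameter

# UDP parameter helpers (exact signatures are assumptions)
bbox = Parameter.bounding_box(name="bbox")
interval = Parameter.temporal_interval(name="interval")

connection = openeo.connect("https://openeo.example.org").authenticate_oidc()

# Load the results of an existing batch job as a cube, via the load_stac process
job = connection.job("j-240000000000")  # placeholder job id
results = connection.load_stac_from_job(job)
```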
Fixed
- Initial fix for broken `DataCube.reduce_temporal()` after `load_stac` (#568)