
Commit c03a7d3

run_udf: Simplified and clarified the schema for data #515 (#519)

* `run_udf`: Simplified and clarified the schema for `data` - no functional change. #515
* Applied proposed changes from code review

1 parent 605a7e8 · commit c03a7d3

3 files changed: +11 additions, -32 deletions

CHANGELOG.md (1 addition, 0 deletions)

```diff
@@ -38,6 +38,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `filter_spatial`: Clarified that masking is applied using the given geometries. [#469](https://github.com/Open-EO/openeo-processes/issues/469)
 - `load_collection` and `load_stac`: Clarified that scale and offset are not applied automatically when loading the data. [#503](https://github.com/Open-EO/openeo-processes/issues/503)
 - `mod`: Clarified behavior for y = 0
+- `run_udf`: Simplified and clarified the schema for `data` - no functional change.
 - `sqrt`: Clarified that NaN is returned for negative numbers.
 - Clarify allowed `FeatureCollection` geometries in `load_collection`, `mask_polygon`, `apply_polygon`, and `load_stac` [#527](https://github.com/Open-EO/openeo-processes/issues/527)
```

proposals/run_udf_externally.json (5 additions, 16 deletions)

```diff
@@ -1,7 +1,7 @@
 {
     "id": "run_udf_externally",
     "summary": "Run an externally hosted UDF container",
-    "description": "Runs a compatible UDF container that is either externally hosted by a service provider or running on a local machine of the user. The UDF container must follow the [openEO UDF specification](https://openeo.org/documentation/1.0/udfs.html).\n\nThe referenced UDF service can be executed in several processes such as ``aggregate_spatial()``, ``apply()``, ``apply_dimension()`` and ``reduce_dimension()``. In this case, an array is passed instead of a data cube. The user must ensure that the data is given in a way that the UDF code can make sense of it.",
+    "description": "Runs a compatible UDF container that is either externally hosted by a service provider or running on a local machine of the user. The UDF container must follow the [openEO UDF specification](https://openeo.org/documentation/1.0/udfs.html).\n\nThe referenced UDF service can be executed in several processes such as ``aggregate_spatial()``, ``apply()``, ``apply_dimension()`` and ``reduce_dimension()``. In this case, an array is passed instead of a data cube. The user must ensure that the data is provided in a way that the UDF code can make sense of it.",
     "categories": [
         "cubes",
         "import",
@@ -11,21 +11,10 @@
     "parameters": [
         {
             "name": "data",
-            "description": "The data to be passed to the UDF.",
-            "schema": [
-                {
-                    "title": "Array",
-                    "type": "array",
-                    "minItems": 1,
-                    "items": {
-                        "description": "Any data type."
-                    }
-                },
-                {
-                    "title": "Single Value",
-                    "description": "A single value of any data type."
-                }
-            ]
+            "description": "The data to be passed to the UDF.\n\nUsually, `run_udf` process is used as a child process of a data cube process such as ``reduce_dimension()`` or ``apply()``. These data cube processes define the actual type of this data. For example, in case of ``reduce_dimension()``, `data` will be an array of values, while in case of ``apply()`` it will be a single scalar value.",
+            "schema": {
+                "description": "A value of any data type."
+            }
         },
         {
             "name": "url",
```

run_udf.json (5 additions, 16 deletions)

```diff
@@ -1,7 +1,7 @@
 {
     "id": "run_udf",
     "summary": "Run a UDF",
-    "description": "Runs a UDF in one of the supported runtime environments.\n\nThe process can either:\n\n1. load and run a UDF stored in a file on the server-side workspace of the authenticated user. The path to the UDF file must be relative to the root directory of the user's workspace.\n2. fetch and run a remotely stored and published UDF by absolute URI.\n3. run the source code specified inline as string.\n\nThe loaded UDF can be executed in several processes such as ``aggregate_spatial()``, ``apply()``, ``apply_dimension()`` and ``reduce_dimension()``. The user must ensure that the data is given in a way that the UDF code can make sense of it.",
+    "description": "Runs a UDF in one of the supported runtime environments.\n\nThe process can either:\n\n1. load and run a UDF stored in a file on the server-side workspace of the authenticated user. The path to the UDF file must be relative to the root directory of the user's workspace.\n2. fetch and run a remotely stored and published UDF by absolute URI.\n3. run the source code specified inline as string.\n\nThe loaded UDF can be executed in several processes such as ``aggregate_spatial()``, ``apply()``, ``apply_dimension()`` and ``reduce_dimension()``. The user must ensure that the data is provided in a way that the UDF code can make sense of it.",
     "categories": [
         "cubes",
         "import",
@@ -10,21 +10,10 @@
     "parameters": [
         {
             "name": "data",
-            "description": "The data to be passed to the UDF.",
-            "schema": [
-                {
-                    "title": "Array",
-                    "type": "array",
-                    "minItems": 1,
-                    "items": {
-                        "description": "Any data type."
-                    }
-                },
-                {
-                    "title": "Single Value",
-                    "description": "A single value of any data type."
-                }
-            ]
+            "description": "The data to be passed to the UDF.\n\nUsually, `run_udf` process is used as a child process of a data cube process such as ``reduce_dimension()`` or ``apply()``. These data cube processes define the actual type of this data. For example, in case of ``reduce_dimension()``, `data` will be an array of values, while in case of ``apply()`` it will be a single scalar value.",
+            "schema": {
+                "description": "A value of any data type."
+            }
         },
         {
             "name": "udf",
```
