43 commits
08bdec4
Add first tests from process examples
m-mohr Oct 12, 2023
2db6186
Add tests and docs
m-mohr Oct 13, 2023
af46f66
Add more tests
m-mohr Oct 24, 2023
29b61af
More tests, clean-up
m-mohr Oct 25, 2023
ac702b8
Tests for arrays
m-mohr Oct 26, 2023
aa3a632
Remove optional tests in favor of returns or throws
m-mohr Oct 30, 2023
dac84c1
EditorConfig + fix invalid files
m-mohr Oct 30, 2023
cad3ee1
Add id and experimental flag to test files
m-mohr Oct 30, 2023
46f37df
Add tests for comparison processes
m-mohr Oct 30, 2023
aa308b8
Add tests for reducers etc.
m-mohr Oct 31, 2023
58d10b8
Add further tests
m-mohr Nov 5, 2023
a9291a8
title converted to comments, fix parameter name of exp in tests
m-mohr Dec 7, 2023
574abe8
Fix test for tan
m-mohr Dec 7, 2023
da0c512
Add tests for array_apply, more NaN tests for comparisons, add requir…
m-mohr Dec 11, 2023
4c11c9a
add array_apply and array_filter tests, fix several math tests
m-mohr Dec 11, 2023
b93b1e6
update readme
m-mohr Dec 11, 2023
9262522
Add first datacube test
m-mohr Dec 11, 2023
c24c1ad
Fix apply test
m-mohr Dec 12, 2023
67917fa
Add profile levels
m-mohr Dec 12, 2023
895b40c
Fix test in apply
m-mohr Dec 12, 2023
1967425
Add tests for reduce_dimension and apply_dimension
m-mohr Dec 12, 2023
3b77850
Add more tests
m-mohr Dec 13, 2023
25384af
Fix various test cases
m-mohr Dec 14, 2023
75fb59b
Special no-data handling, fix tests
m-mohr Dec 14, 2023
2e4c55d
Add filter_bands and filter_temporal
m-mohr Dec 14, 2023
1b475fc
Add mask, merge_cubes, refactor datacube object
m-mohr Dec 15, 2023
0702c08
Mark experimental processes
m-mohr Dec 15, 2023
fe5f948
Add additional test for apply_dimension
m-mohr Dec 19, 2023
4e5ae0d
Add tests for apply_kernel
m-mohr Dec 22, 2023
5248a13
Add tests for aggregate_temporal
m-mohr Dec 22, 2023
6e21fc0
Add tests for aggregate_temporal_period, fix aggregate_temporal
m-mohr Dec 22, 2023
17f5a6b
Add tests for filter_bbox, filter_spatial and mask_polygon
m-mohr Jan 2, 2024
81834d6
Use nodata type in tests properly
m-mohr Jan 3, 2024
6508be2
Add changelog entry
m-mohr Jan 3, 2024
20ef8bc
Fix invalid datetimes
m-mohr Jan 17, 2024
788955d
Apply suggestions from code review
m-mohr Jan 25, 2024
65c8534
Add comment
m-mohr Jul 16, 2025
412b6e8
Add consistency checks for test files and fix issues
m-mohr Jul 16, 2025
b88ff6a
Remove list of processes from README
m-mohr Jul 16, 2025
1ec59e8
Merge branch 'draft' into add-tests
m-mohr Aug 22, 2025
7d9c1ec
Update tests/README.md
m-mohr Aug 22, 2025
6442258
Update tests/schema/schema.json
m-mohr Aug 22, 2025
7f4e7b8
Update tests/README.md
m-mohr Aug 22, 2025
8 changes: 8 additions & 0 deletions .editorconfig
Original file line number Diff line number Diff line change
@@ -7,3 +7,11 @@ indent_style = spaces
indent_size = 4
insert_final_newline = true
trim_trailing_whitespace = true

[*.json5]
charset = utf-8
end_of_line = crlf
indent_style = spaces
indent_size = 2
insert_final_newline = true
trim_trailing_whitespace = true
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -9,6 +9,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added

- Implementation guide for implementing OGC API - Processes in openEO
- Unit Tests (see folder `tests`, moved specification tests and CI tools to `dev`)
- `export_collection`
- `export_workspace`
- `run_ogcapi`
12 changes: 0 additions & 12 deletions absolute.json
@@ -28,12 +28,6 @@
}
},
"examples": [
{
"arguments": {
"x": 0
},
"returns": 0
},
{
"arguments": {
"x": 3.5
@@ -45,12 +39,6 @@
"x": -0.4
},
"returns": 0.4
},
{
"arguments": {
"x": -3.5
},
"returns": 3.5
}
],
"links": [
4 changes: 2 additions & 2 deletions apply_kernel.json
@@ -1,7 +1,7 @@
{
"id": "apply_kernel",
"summary": "Apply a spatial convolution with a kernel",
"description": "Applies a 2D convolution (i.e. a focal operation with a weighted kernel) on the horizontal spatial dimensions (axes `x` and `y`) of a raster data cube.\n\nEach value in the kernel is multiplied with the corresponding pixel value and all products are summed up afterwards. The sum is then multiplied with the factor.\n\nThe process can't handle non-numerical or infinite numerical values in the data cube. Boolean values are converted to integers (`false` = 0, `true` = 1), but all other non-numerical or infinite values are replaced with zeroes by default (see parameter `replace_invalid`).\n\nFor cases requiring more generic focal operations or non-numerical values, see ``apply_neighborhood()``.",
"description": "Applies a 2D convolution (i.e. a focal operation with a weighted kernel) on the horizontal spatial dimensions (axes `x` and `y`) of a raster data cube.\n\nEach value in the kernel is multiplied with the corresponding pixel value and all products are summed up afterwards. The sum is then multiplied with the factor.\n\nThe process can't handle non-numerical or infinite numerical values in the data cube. Boolean values are converted to integers (`false` = 0, `true` = 1), but all other non-numerical, NaN, no-data, or infinite values are replaced with zeroes by default (see parameter `replace_invalid`).\n\nFor cases requiring more generic focal operations or non-numerical values, see ``apply_neighborhood()``.",
Review comment (Member): these changes look unrelated to the scope of this PR, right?

Reply (Member Author): Well, they were spotted when creating the tests. They should've been in #490, but it seems I didn't catch them all.

"categories": [
"cubes",
"math > image filter"
@@ -70,7 +70,7 @@
},
{
"name": "replace_invalid",
"description": "This parameter specifies the value to replace non-numerical or infinite numerical values with. By default, those values are replaced with zeroes.",
"description": "This parameter specifies the value to replace non-numerical, NaN, no-data, or infinite numerical values with. By default, those values are replaced with zeroes.",
"schema": {
"type": "number"
},
68 changes: 68 additions & 0 deletions dev/check-tests.js
@@ -0,0 +1,68 @@
// Ensure that each test is valid

const fs = require('fs');
const path = require('path');
const JSON5 = require('json5');
const ajv = require('ajv');

const testsDir = path.join(__dirname, '../tests');

const tests = fs.readdirSync(testsDir).filter(file => file.endsWith('.json5'));

const schemaPath = path.join(testsDir, 'schema', 'schema.json');
const schemaContent = fs.readFileSync(schemaPath, 'utf8');
const schema = JSON.parse(schemaContent);
const validate = new ajv().compile(schema);

const results = {};
for (const testFile of tests) {
const testPath = path.join(testsDir, testFile);

let testData;
// Ensure we can load the test file as JSON5
try {
testData = JSON5.parse(fs.readFileSync(testPath, 'utf8'));
} catch (error) {
results[testFile] = `Invalid JSON5: ${error.message}`;
continue;
}

// Ensure the file is valid against the schema
if (!validate(testData)) {
results[testFile] = `Schema validation failed: ${validate.errors.map(err => err.message).join(', ')}`;
continue;
}

// Make sure the id is the same as filename without extension
const expectedId = path.basename(testFile, '.json5');
if (testData.id !== expectedId) {
results[testFile] = `ID mismatch: expected ${expectedId}, got ${testData.id}`;
continue;
}

// Check if experimental is set to the same value as in the process itself
let processFile = path.join(__dirname, '../', expectedId + '.json');
if (!fs.existsSync(processFile)) {
processFile = path.join(__dirname, '../proposals/', expectedId + '.json');
}
if (fs.existsSync(processFile)) {
const processData = JSON.parse(fs.readFileSync(processFile, 'utf8'));
const expected = processData.experimental || false;
const actual = testData.experimental || false;
if (expected !== actual) {
results[testFile] = `Experimental flag mismatch: expected ${expected}, got ${actual}`;
continue;
}
}
}

if (Object.keys(results).length > 0) {
console.error('The following test files have issues:');
for (const [file, error] of Object.entries(results)) {
console.error(`- ${file}: ${error}`);
}
process.exit(1);
}
else {
console.log('All test files are valid and match the expected schema.');
}
44 changes: 44 additions & 0 deletions dev/has-tests.js
@@ -0,0 +1,44 @@
// Ensure that each process has a corresponding file in the tests directory.
// It can be empty, but it must exist to ensure people made that decision consciously.

const fs = require('fs');
const path = require('path');

const processesDir = path.join(__dirname, '../');
const proposalsDir = path.join(__dirname, '../proposals');
const testsDir = path.join(__dirname, '../tests');

const processes = [
...fs.readdirSync(processesDir),
...fs.readdirSync(proposalsDir),
]
.filter(file => file.endsWith('.json'))
.map(file => path.basename(file, '.json'));
const tests = fs.readdirSync(testsDir)
.filter(file => file.endsWith('.json5'))
.map(file => path.basename(file, '.json5'));

// Check which tests are missing for the processes
const missingTests = processes.filter(process => !tests.includes(process));

if (missingTests.length > 0) {
console.error('The following processes are missing tests:');
missingTests.forEach(process => console.error(`- ${process}`));
}

// Check whether there are tests for non-existing processes
const extraTests = tests.filter(test => !processes.includes(test));
if (extraTests.length > 0) {
console.error('\nThe following tests exist without a corresponding process:');
extraTests.forEach(test => console.error(`- ${test}`));
}

// todo: add check that json5 files are valid

if (missingTests.length === 0 && extraTests.length === 0) {
console.log('All processes have corresponding tests and vice versa.');
process.exit(0);
}
else {
process.exit(1);
}
9 changes: 8 additions & 1 deletion dev/package.json
@@ -23,8 +23,15 @@
"http-server": "^14.1.1"
},
"scripts": {
"test": "openeo-processes-lint testConfig.json",
"check-tests": "node check-tests.js",
"has-tests": "node has-tests.js",
"lint": "openeo-processes-lint testConfig.json",
"test": "npm run has-tests && npm run check-tests && npm run lint",
"generate": "concat-json-files \"../{*,proposals/*}.json\" -t \"processes.json\"",
"start": "npm run generate && http-server -p 9876 -o docs.html -c-1"
},
"dependencies": {
"ajv": "^8.17.1",
"json5": "^2.2.3"
}
}
119 changes: 119 additions & 0 deletions tests/README.md
@@ -0,0 +1,119 @@
# Tests

This folder contains test cases for the openEO processes.

## Assumptions

The test cases make a few assumptions, as they are an abstraction and not bound to a specific implementation:
- The JSON Schema type `number` explicitly includes the values `+Infinity`, `-Infinity` and `NaN`.
- No-data values are `null` by default for both input and output, unless a runner specifies otherwise.
- Input that is invalid according to the schemas is rejected upfront and is not tested. For example, the `absolute` process is only tested against the data types `number` and `null`; there are no tests for boolean or string input.
- Numerical data types such as uint8 don't matter, i.e. the tests don't check for overflows etc. This suite can't provide such tests as the underlying data type is not known.
- Unless otherwise specified, numbers are compared with a precision of 10 decimals, so expected return values should be given with at least 11 decimals.
Review comment (Member): I'm not sure I understand the goal of this. If only up to 10 decimals must be checked, why care about the 11th decimal?

Reply (Member Author): For whatever reason I sometimes ran into rounding errors or imprecisions, so requiring an 11th digit ensures any imprecision actually happens in the 11th digit, not the 10th.
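A runner could implement the 10-decimal tolerance described above roughly like this — a sketch under the README's assumptions, not part of the PR, and the function name is made up:

```javascript
// Sketch: compare an actual against an expected number with a tolerance
// of 10 decimal places. Special values (NaN, +/-Infinity) are compared
// by identity, since NaN !== NaN and a tolerance makes no sense for
// infinities.
function numbersMatch(actual, expected, decimals = 10) {
  if (typeof actual !== 'number' || typeof expected !== 'number') {
    return actual === expected;
  }
  if (Number.isNaN(expected)) return Number.isNaN(actual);
  if (!Number.isFinite(expected)) return actual === expected;
  return Math.abs(actual - expected) < 0.5 * Math.pow(10, -decimals);
}
```

With this, an imprecision in the 11th decimal passes while one in the 10th fails, which is what the 11-decimals rule is meant to guarantee.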

## Test Files

To allow for more data types (e.g. `Infinity` and `NaN` for numbers), all the files are encoded in **JSON5** instead of JSON.

The test files have the following schema: [schema/schema.json](./schema/schema.json)

### No-data values

No-data values have a special encoding in tests (see below).
The encoding is replaced with `null` unless otherwise specified by the runners.

```json5
{
"type": "nodata"
}
```
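A runner might resolve this encoding recursively before executing a test. A minimal sketch (the `null` default comes from the README above; the function name is an assumption):

```javascript
// Sketch: replace every {"type": "nodata"} object in a parsed test
// value with the runner's no-data value (null by default).
function resolveNodata(value, nodataValue = null) {
  if (Array.isArray(value)) {
    return value.map(v => resolveNodata(v, nodataValue));
  }
  if (value !== null && typeof value === 'object') {
    if (value.type === 'nodata' && Object.keys(value).length === 1) {
      return nodataValue;
    }
    const out = {};
    for (const [key, v] of Object.entries(value)) {
      out[key] = resolveNodata(v, nodataValue);
    }
    return out;
  }
  return value;
}
```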

### Datetimes

Datetime strings vary in precision, especially regarding milliseconds, and timezones are sometimes handled differently.

To make results easier to compare, datetimes in return values should be encoded as follows:

```json5
{
"type": "datetime",
"value": "2020-01-01T00:00:00Z"
}
```
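To compare such encoded datetimes independently of precision and timezone notation, a runner could normalize both sides to a timestamp — a sketch, not prescribed by the schema:

```javascript
// Sketch: compare an actual datetime string against the encoded
// expected value by parsing both to Unix timestamps, so that
// "2020-01-01T00:00:00Z", "2020-01-01T00:00:00.000Z" and
// "2020-01-01T01:00:00+01:00" all compare as equal.
function datetimesMatch(actual, expected) {
  const a = Date.parse(actual);
  const e = Date.parse(expected.value);
  if (Number.isNaN(a) || Number.isNaN(e)) return false;
  return a === e;
}
```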

### External references

Arguments and return values can point to external files, e.g.

```json5
{
"$ref": "https://host.example/datacube.json"
}
```

The test suite can currently only load JSON and JSON5 files.

### Labeled arrays

Labeled arrays can't be represented in JSON5 and will be provided as an object instead.

```json5
{
"type": "labeled-array",
"data": [
{
"key": "B01",
"value": 1.23
},
{
"key": "B02",
"value": 0.98
}
// ...
]
}
```
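In JavaScript, a runner could turn this encoding into a `Map`, which preserves the label order — a sketch, with the function name being an assumption:

```javascript
// Sketch: convert the labeled-array test encoding into a Map.
// Map preserves insertion order, so the label order is kept.
function toLabeledMap(encoded) {
  return new Map(encoded.data.map(({ key, value }) => [key, value]));
}
```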

### Datacubes

Datacubes can't be represented in JSON5 and will be provided as an object instead.
Vector datacubes are currently not supported.

```json5
{
"type": "datacube",
"data": [
// multi-dimensional array
// can be set to `null` if the data values are irrelevant for the test.
],
"nodata": [
NaN
],
"order": ["bands", "t", "y", "x"],
"dimensions": {
// similar to the STAC datacube extension
// properties: type, axis (if type = spatial), values, and reference_system (optional)
"bands": {
"type": "bands",
"values": ["blue","green","red","nir"]
},
"t": {
"type": "temporal",
"values": ["2020-06-01T00:00:00Z","2020-06-03T00:00:00Z","2020-06-06T00:00:00Z"]
},
"y": {
"type": "spatial",
"axis": "y",
"values": [5757495.0,5757485.0,5757475.0,5757465.0],
"reference_system": "EPSG:25832"
},
"x": {
"type": "spatial",
"axis": "x",
"values": [404835.0,404845.0,404855.0,404865.0,404875.0],
"reference_system": "EPSG:25832"
}
}
}
```
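A runner could sanity-check this encoding before use. The sketch below (not part of the PR) verifies that the nested `data` array matches the declared `order` and dimension labels, honoring the rule that `data` may be `null`:

```javascript
// Sketch: check that the nested data array's shape is consistent with
// the dimension order and the number of labels per dimension.
// Only the first element per level is followed; a full check would
// need to recurse over every branch.
function datacubeShapeMatches(cube) {
  if (cube.data === null) return true; // data irrelevant for the test
  let level = cube.data;
  for (const dim of cube.order) {
    const expectedLength = cube.dimensions[dim].values.length;
    if (!Array.isArray(level) || level.length !== expectedLength) {
      return false;
    }
    level = level[0];
  }
  return true;
}
```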
60 changes: 60 additions & 0 deletions tests/absolute.json5
@@ -0,0 +1,60 @@
{
"id": "absolute",
"level": "L1",
"tests": [
{
"arguments": {
"x": 0
},
"returns": 0
},
{
"arguments": {
"x": 1
},
"returns": 1
},
{
"arguments": {
"x": -1
},
"returns": 1
},
{
"arguments": {
"x": 2.5
},
"returns": 2.5
},
{
"arguments": {
"x": -2.5
},
"returns": 2.5
},
{
"arguments": {
"x": NaN
},
"returns": NaN
},
{
"arguments": {
"x": Infinity
},
"returns": Infinity
},
{
"arguments": {
"x": -Infinity
},
"returns": Infinity
},
{
"arguments": {
"x": {"type": "nodata"}
},
"returns": {"type": "nodata"}
}
]
}