
Commit 5e17e18

Issue #294 blacken code snippets in docs
cherry-picked improvements from running the blacken-docs tool on the docs with `git ls-files -z -- '*.rst' | xargs -0 blacken-docs`
1 parent 04d5823 commit 5e17e18

8 files changed: +70 −48 lines changed


docs/basics.rst

Lines changed: 6 additions & 5 deletions
@@ -126,8 +126,8 @@ We load an initial small spatio-temporal slice (a data cube) as follows:
     sentinel2_cube = connection.load_collection(
         "SENTINEL2_L2A",
         spatial_extent={"west": 5.14, "south": 51.17, "east": 5.17, "north": 51.19},
-        temporal_extent = ["2021-02-01", "2021-04-30"],
-        bands=["B02", "B04", "B08"]
+        temporal_extent=["2021-02-01", "2021-04-30"],
+        bands=["B02", "B04", "B08"],
     )

 Note how we specify the region of interest, a time range and a set of bands to load.
@@ -279,8 +279,8 @@ First, we load a new ``SENTINEL2_L2A`` based data cube with this specific ``SCL`
     s2_scl = connection.load_collection(
         "SENTINEL2_L2A",
         spatial_extent={"west": 5.14, "south": 51.17, "east": 5.17, "north": 51.19},
-        temporal_extent = ["2021-02-01", "2021-04-30"],
-        bands=["SCL"]
+        temporal_extent=["2021-02-01", "2021-04-30"],
+        bands=["SCL"],
     )

 Now we can use the compact "band math" feature again to build a
@@ -362,7 +362,7 @@ Building on the experience from previous sections, we first build a masked EVI c
     sentinel2_cube = connection.load_collection(
         "SENTINEL2_L2A",
         spatial_extent={"west": 5.14, "south": 51.17, "east": 5.17, "north": 51.19},
-        temporal_extent = ["2020-01-01", "2021-12-31"],
+        temporal_extent=["2020-01-01", "2021-12-31"],
         bands=["B02", "B04", "B08", "SCL"],
     )

@@ -413,6 +413,7 @@ which we massage a bit more:
     from openeo.rest.conversions import timeseries_json_to_pandas

     import json
+
     with open("evi-aggregation.json") as f:
         data = json.load(f)

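For context, a minimal sketch (not part of this commit) of how the imports in the hunk above are typically used: load the downloaded "evi-aggregation.json" file and convert the aggregated time series to a pandas DataFrame with timeseries_json_to_pandas.

    import json

    from openeo.rest.conversions import timeseries_json_to_pandas

    # Load the aggregation result that was downloaded as JSON earlier in the guide.
    with open("evi-aggregation.json") as f:
        data = json.load(f)

    # Convert the time series JSON structure to a pandas DataFrame.
    df = timeseries_json_to_pandas(data)
    print(df.head())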

docs/cookbook/job_manager.rst

Lines changed: 5 additions & 3 deletions
@@ -77,9 +77,11 @@ Basic usage example with a remote process definition:

     # Initialize job database from a dataframe,
     # with desired parameter values to fill in.
-    df = pd.DataFrame({
-        "start_date": ["2021-01-01", "2021-02-01", "2021-03-01"],
-    })
+    df = pd.DataFrame(
+        {
+            "start_date": ["2021-01-01", "2021-02-01", "2021-03-01"],
+        }
+    )
     job_db = create_job_db("jobs.csv").initialize_from_df(df)

     # Create and run job manager,
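A hedged sketch (not part of this commit) of how the snippet above typically continues, using the generic MultiBackendJobManager API rather than the remote-process-definition helper from the original page; the backend URL, the placeholder dates, and the start_job job factory are assumptions.

    import openeo
    from openeo.extra.job_management import MultiBackendJobManager


    def start_job(row, connection, **kwargs):
        # Hypothetical job factory: build a small batch job from one dataframe row.
        cube = connection.load_collection(
            "SENTINEL2_L2A",
            temporal_extent=[row["start_date"], "2021-12-31"],  # placeholder end date
            bands=["B02"],
        )
        return cube.create_job(title=f"job-{row['start_date']}")


    connection = openeo.connect("openeo.example").authenticate_oidc()

    job_manager = MultiBackendJobManager()
    job_manager.add_backend("example", connection=connection, parallel_jobs=2)
    job_manager.run_jobs(start_job=start_job, job_db=job_db)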

docs/cookbook/localprocessing.rst

Lines changed: 9 additions & 6 deletions
@@ -48,11 +48,12 @@ The following code snippet loads Sentinel-2 L2A data from a public STAC Catalog,
     >>> temporal_extent = ["2019-01-01", "2019-06-15"]
     >>> bands = ["red"]
     >>> properties = {"eo:cloud_cover": dict(lt=50)}
-    >>> s2_cube = local_conn.load_stac(url=url,
-    ...                                spatial_extent=spatial_extent,
-    ...                                temporal_extent=temporal_extent,
-    ...                                bands=bands,
-    ...                                properties=properties,
+    >>> s2_cube = local_conn.load_stac(
+    ...     url=url,
+    ...     spatial_extent=spatial_extent,
+    ...     temporal_extent=temporal_extent,
+    ...     bands=bands,
+    ...     properties=properties,
     ... )
     >>> s2_cube.execute()
     <xarray.DataArray 'stackstac-08730b1b5458a4ed34edeee60ac79254' (time: 177,
@@ -94,6 +95,7 @@ With some sample data we can now check the STAC metadata for the local files by
 .. code:: python

     from openeo.local import LocalConnection
+
     local_data_folders = [
         "./openeo-localprocessing-data/sample_netcdf",
         "./openeo-localprocessing-data/sample_geotiff",
@@ -162,10 +164,11 @@ We can perform the same example using data provided by STAC Collection:
 .. code:: python

     from openeo.local import LocalConnection
+
     local_conn = LocalConnection("./")

     url = "https://earth-search.aws.element84.com/v1/collections/sentinel-2-l2a"
-    spatial_extent = {"east": 11.40, "north": 46.52, "south": 46.46, "west": 11.25}
+    spatial_extent = {"east": 11.40, "north": 46.52, "south": 46.46, "west": 11.25}
     temporal_extent = ["2022-06-01", "2022-06-30"]
     bands = ["red", "nir"]
     properties = {"eo:cloud_cover": dict(lt=80)}
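A minimal sketch (not part of this commit), reusing the variables defined in the hunk above, of how the STAC-based example typically continues: load the cube, compute NDVI with band math, and execute it locally.

    s2_cube = local_conn.load_stac(
        url=url,
        spatial_extent=spatial_extent,
        temporal_extent=temporal_extent,
        bands=bands,
        properties=properties,
    )

    # Band math: normalized difference vegetation index.
    red = s2_cube.band("red")
    nir = s2_cube.band("nir")
    ndvi = (nir - red) / (nir + red)

    # Local processing executes client-side and returns an xarray.DataArray.
    result = ndvi.execute()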

docs/cookbook/tricks.rst

Lines changed: 6 additions & 4 deletions
@@ -69,11 +69,13 @@ For example:
 .. code-block:: python

     # `execute` with raw JSON string
-    connection.execute("""
+    connection.execute(
+        """
     {
         "add": {"process_id": "add", "arguments": {"x": 3, "y": 5}, "result": true}
     }
-    """)
+    """
+    )

     # `download` with local path to JSON file
     connection.download("path/to/my-process-graph.json")
@@ -99,7 +101,7 @@ where you could pass a *backend-side* path as ``geometries``, e.g.:

     cube = cube.aggregate_spatial(
         geometries="/backend/path/to/geometries.json",
-        reducer="mean"
+        reducer="mean",
     )

 The client would handle this by automatically adding a ``read_vector`` process
@@ -124,7 +126,7 @@ for example as follows:

     cube = cube.aggregate_spatial(
         geometries=process("read_vector", filename="/backend/path/to/geometries.json"),
-        reducer="mean"
+        reducer="mean",
     )

 Note that this also works with older versions of the openEO Python client library.
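A minimal sketch (not part of this commit) that makes the process helper used above explicit: it is the generic process-graph builder from openeo.processes, and cube is assumed to be an existing DataCube as in the snippet.

    from openeo.processes import process

    cube = cube.aggregate_spatial(
        geometries=process("read_vector", filename="/backend/path/to/geometries.json"),
        reducer="mean",
    )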

docs/data_access.rst

Lines changed: 1 addition & 1 deletion
@@ -274,7 +274,7 @@ For example, to filter on the relative orbit number of SAR data:
         "SENTINEL1_GRD",
         ...,
         properties={
-            "relativeOrbitNumber": lambda x: x==116
+            "relativeOrbitNumber": lambda x: x == 116,
         },
     )


docs/federation-extension.rst

Lines changed: 2 additions & 0 deletions
@@ -37,6 +37,7 @@ using :py:meth:`OpenEoCapabilities.ext_federation_backend_details() <openeo.rest
 .. code-block:: python

     import openeo
+
     connection = openeo.connect(url=...)
     capabilities = connection.capabilities()
     print("Federated backends:", capabilities.ext_federation_backend_details())
@@ -57,6 +58,7 @@ and can be inspected as follows:
 .. code-block:: python

     import openeo
+
     connection = openeo.connect(url=...)
     collections = connection.list_collections()
     print("Number of collections:", len(collections))

docs/udf.rst

Lines changed: 39 additions & 27 deletions
@@ -28,13 +28,15 @@ using the openEO Python Client library:
     import openeo

     # Build a UDF object from an inline string with Python source code.
-    udf = openeo.UDF("""
-    import xarray
+    udf = openeo.UDF(
+        """
+        import xarray

-    def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
-        cube.values = 0.0001 * cube.values
-        return cube
-    """)
+        def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
+            cube.values = 0.0001 * cube.values
+            return cube
+        """
+    )

     # Or load the UDF code from a separate file.
     # udf = openeo.UDF.from_file("udf-code.py")
@@ -173,7 +175,7 @@ In most of the examples here, we will start from an initial Sentinel2 data cube
         "SENTINEL2_L2A",
         spatial_extent={"west": 4.00, "south": 51.04, "east": 4.10, "north": 51.1},
         temporal_extent=["2022-03-01", "2022-03-31"],
-        bands=["B02", "B03", "B04"]
+        bands=["B02", "B03", "B04"],
     )

@@ -203,6 +205,7 @@ The UDF code is this short script (the part that does the actual value rescaling

     import xarray

+
     def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
         cube.values = 0.0001 * cube.values
         return cube
@@ -244,17 +247,19 @@ The UDF-specific part is highlighted.
         "SENTINEL2_L2A",
         spatial_extent={"west": 4.00, "south": 51.04, "east": 4.10, "north": 51.1},
         temporal_extent=["2022-03-01", "2022-03-31"],
-        bands=["B02", "B03", "B04"]
+        bands=["B02", "B03", "B04"],
     )

     # Create a UDF object from inline source code.
-    udf = openeo.UDF("""
-    import xarray
+    udf = openeo.UDF(
+        """
+        import xarray

-    def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
-        cube.values = 0.0001 * cube.values
-        return cube
-    """)
+        def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
+            cube.values = 0.0001 * cube.values
+            return cube
+        """
+    )

     # Pass UDF object as child process to `apply`.
     rescaled = s2_cube.apply(process=udf)
@@ -309,13 +314,15 @@ To invoke a UDF like this, the apply_neighborhood method is most suitable:

 .. code-block:: python

-    udf_code = Path('udf_modify_spatial.py').read_text()
+    udf_code = Path("udf_modify_spatial.py").read_text()
     cube_updated = cube.apply_neighborhood(
-        lambda data: data.run_udf(udf=udf_code, runtime='Python-Jep', context=dict()),
+        lambda data: data.run_udf(udf=udf_code, runtime="Python-Jep", context=dict()),
         size=[
-            {'dimension': 'x', 'value': 128, 'unit': 'px'},
-            {'dimension': 'y', 'value': 128, 'unit': 'px'}
-        ], overlap=[])
+            {"dimension": "x", "value": 128, "unit": "px"},
+            {"dimension": "y", "value": 128, "unit": "px"},
+        ],
+        overlap=[],
+    )

@@ -350,13 +357,17 @@ the datacube.

 .. code-block:: python

-    output_cube = inputs_cube.apply_neighborhood(my_udf, size=[
-        {'dimension': 'x', 'value': 112, 'unit': 'px'},
-        {'dimension': 'y', 'value': 112, 'unit': 'px'}
-    ], overlap=[
-        {'dimension': 'x', 'value': 8, 'unit': 'px'},
-        {'dimension': 'y', 'value': 8, 'unit': 'px'}
-    ])
+    output_cube = inputs_cube.apply_neighborhood(
+        my_udf,
+        size=[
+            {"dimension": "x", "value": 112, "unit": "px"},
+            {"dimension": "y", "value": 112, "unit": "px"},
+        ],
+        overlap=[
+            {"dimension": "x", "value": 8, "unit": "px"},
+            {"dimension": "y", "value": 8, "unit": "px"},
+        ],
+    )

@@ -396,7 +407,7 @@ and apply it along a dimension:

 .. code-block:: python

-    smoothing_udf = openeo.UDF.from_file('smooth_savitzky_golay.py')
+    smoothing_udf = openeo.UDF.from_file("smooth_savitzky_golay.py")
     smoothed_evi = evi_cube_masked.apply_dimension(smoothing_udf, dimension="t")

@@ -630,6 +641,7 @@ For example: to discover the shape of the data cube chunk that you receive in yo
     from openeo.udf import inspect
     import xarray

+
     def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
         inspect(data=[cube.shape], message="UDF logging shape of my cube")
         cube.values = 0.0001 * cube.values
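For reference, the complete UDF from the hunk above (the hunk cuts off just before the return statement; the final line matches the rescaling UDFs shown earlier on this page):

    from openeo.udf import inspect
    import xarray


    def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
        # Log the shape of the incoming data cube chunk, then rescale as before.
        inspect(data=[cube.shape], message="UDF logging shape of my cube")
        cube.values = 0.0001 * cube.values
        return cube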

docs/udp.rst

Lines changed: 2 additions & 2 deletions
@@ -474,12 +474,12 @@ building, storing, and finally executing the UDP.
         name="temporal_extent",
         description="The date range to calculate the EVI for.",
         schema={"type": "array", "subtype": "temporal-interval"},
-        default =["2018-06-15", "2018-06-27"]
+        default=["2018-06-15", "2018-06-27"],
     )
     geometry = Parameter(
         name="geometry",
         description="The geometry (a single (multi)polygon or a feature collection of (multi)polygons) to calculate the EVI for.",
-        schema={"type": "object", "subtype": "geojson"}
+        schema={"type": "object", "subtype": "geojson"},
     )

     # Load raw SENTINEL2_L2A data
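A hedged sketch (not part of this commit) of the "storing" step mentioned in the hunk header, assuming an EVI process graph evi_cube built from the parameters above and an authenticated connection; the UDP id "evi_udp" is a placeholder.

    connection.save_user_defined_process(
        user_defined_process_id="evi_udp",  # hypothetical UDP name
        process_graph=evi_cube,  # the parameterized EVI data cube built above (assumed)
        parameters=[temporal_extent, geometry],
    )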
