api-spec/filters.md (+42 -19)
@@ -6,7 +6,7 @@ and are required to be supported.
The filters are designed to be as simple as possible for a client to construct. To match the default JSON responses the
encoding of filters is done by default in JSON. This allows clients to support filtering without additional tools. The
-search enpoint will accept application/json format queries, and GET on the collection will support URL encoded JSON. POST
+search endpoint will accept application/json format queries, and GET on the collection will support URL encoded JSON. POST
search is the recommended way to filter results to avoid the URL encoding issues that can happen with GET.
Searching using POST will accept a JSON object where the top level keys specify which type of filter
@@ -20,33 +20,60 @@ This query will perform an intersects operation on the geometry values of the it
objects may provide a bbox property in addition to geometry, but it should not be used for the bbox filter since
it is an optional field in GeoJSON.
-example
+###### examples: ######
+POST:
```
{
"bbox": [-180,-90,180,90]
}
```
-The temporal query will be based on [RFC 3339](https://tools.ietf.org/html/rfc3339) and should support time ranges as well as equality. To support range
-queries, we are using a simple JSON based language. Ranges will be specified as an object with keys indicating the comparison to use.
+GET:
+```
+?bbox=[-180,-90,180,90]
+```
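As a rough illustration of how a client might issue the two bbox requests above, here is a minimal Python sketch. The server root and endpoint paths are placeholders rather than part of the spec, and the `requests` library is just one possible HTTP client.

```python
import json
import urllib.parse

import requests  # any HTTP client works; used here only for illustration

STAC_ROOT = "https://stac.example.com"    # hypothetical server root
SEARCH_URL = f"{STAC_ROOT}/search/stac"   # placeholder search endpoint path
COLLECTION_URL = f"{STAC_ROOT}/items"     # placeholder collection path

bbox = [-180, -90, 180, 90]

# Recommended: POST the filter as an application/json body to the search endpoint.
post_response = requests.post(SEARCH_URL, json={"bbox": bbox})

# Alternative: GET on the collection, with the same JSON value URL encoded.
encoded_bbox = urllib.parse.quote(json.dumps(bbox, separators=(",", ":")))
get_response = requests.get(f"{COLLECTION_URL}?bbox={encoded_bbox}")

print(post_response.status_code, get_response.status_code)
```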
+The temporal query will be based on [RFC 3339](https://tools.ietf.org/html/rfc3339) and should support time ranges as well as equality. It will compare against the datetime property of the STAC Item.

+###### To find items with an exact date ######

+POST:
+```
+{
+"time": "2007-03-01T13:00:00Z"
+}
+```
-Equality is specified as `{"time": "2018-03-20T16:11:44.353Z"}`
-Before is `{"time":{"lt":"2018-03-20T16:11:44.353Z"}}`
-After is `{"time":{"gt":"2018-03-20T16:11:44.353Z"}}`
-Before with Equality is `{"time":{"lte":"2018-03-20T16:11:44.353Z"}}`
-After with Equality is `{"time":{"gte":"2018-03-20T16:11:44.353Z"}}`
+GET:
+```
+?time=2007-03-01T13:00:00Z
+```
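Because the top level keys of the search body each name a filter type, a client should be able to send several filters in one request. The sketch below assumes the spatial and temporal filters can appear together as sibling keys; the endpoint URL is again a placeholder.

```python
import requests  # illustration only; any HTTP client works

# Hypothetical combined search body: each top-level key is one filter type.
search_body = {
    "bbox": [-180, -90, 180, 90],
    "time": "2007-03-01T13:00:00Z",
}

response = requests.post("https://stac.example.com/search/stac", json=search_body)
items = response.json()  # matching STAC Items in the default JSON response format
```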
-These queries can be combined to specify a search range:
+###### To specify a time range, use the interval syntax: ######
api-spec/wfs-stac.md (+42 -3)
@@ -27,9 +27,48 @@ STAC endpoint.
### WFS Structure
-TODO: Give an overview of the main WFS endpoints
+A Web Feature Service is a standard API that represents collections of geospatial data.

+```
+GET /collections
+```

+Lists the collections of data on the server that can be queried ([7.11](https://rawgit.com/opengeospatial/WFS_FES/master/docs/17-069.html#_feature_collections_metadata)),
+and each describes basic information about the geospatial data collection, like its name and description, as well as the
+spatial and temporal extents of all the data contained. A STAC search extension would only query those collections which
+have data that validates as STAC `Items` - with a datetime field and references to assets. But STAC data can live alongside
+other WFS collections; for example, an organization might keep its building and road data in ordinary WFS collections alongside
+its STAC-compatible imagery data.

+```
+GET /collections/{name}/items?bbox=160.6,-55.95,-170,-25.89
+```

+Requests all the data in the collection that falls within New Zealand. The filtering is made to be compatible with the STAC API,
+and the two specs seek to share the general query and filtering patterns. The key difference is that a STAC search endpoint
+will do cross-collection search. A typical WFS will have multiple collections, and each will just offer search for its particular
+collection.
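To make the two requests above concrete, a minimal client walk-through might look like the sketch below. The server root and the `landsat` collection name are placeholders, and the field names in the `/collections` response are assumptions based on the WFS 3.0 draft cited above.

```python
import requests  # illustration only; any HTTP client works

WFS_ROOT = "https://example.com"  # hypothetical WFS/STAC server root

# 1. Discover the collections the server offers; the "collections" array and its
#    name/description fields are assumed from the WFS 3.0 draft.
collections = requests.get(f"{WFS_ROOT}/collections").json()
for collection in collections.get("collections", []):
    print(collection.get("name"), "-", collection.get("description"))

# 2. Query a single collection's items, filtered to the New Zealand bounding box.
items = requests.get(
    f"{WFS_ROOT}/collections/landsat/items?bbox=160.6,-55.95,-170,-25.89"
).json()
print(len(items.get("features", [])), "features returned")
```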
### Strongly Typed STAC data
-TODO: explain the advantages of using WFS to provide schema information at a collection level, for stronger typing
-of data. Plus transactions.
+The scenario where using a WFS with a STAC search endpoint makes the most sense is when a data provider wants to provide more
+insight into heterogeneous data that is exposed on a STAC search. For example, they might have imagery data from different satellite providers
+and even some drone data. These will all have different fields. A single STAC endpoint can be used to expose them all. But it can be quite
+useful to let users inspect a particular data type. That area of the `/collections/{name}` hierarchy can be used to expose additional
+metadata and validation schemas that give more insight into that data, as well as a place to query just that data.

+In general it is recommended to provide as much information as possible about the different types of data, so using WFS is the preferred
+approach. But the standalone option is there for those who just want to expose their data as quickly and easily as possible. Note that a WFS can
+provide heterogeneous data from any of its collections endpoints, but the STAC API recommendation is to use one collection per
+logical data type.

+### Potential Transaction Extension

+The other benefit of individual collection endpoints is that they give a logical location for simple RESTful transactions.

+```
+POST /collections/landsat/items/
+```

+There have been a couple of implementations that have done transactions, and they will soon contribute an extension.
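Since the transaction extension has not been written yet, the following is only a sketch of what such an insert could look like, assuming the request body is a GeoJSON-encoded STAC Item and using a placeholder server root and a deliberately minimal Item.

```python
import requests  # illustration only; any HTTP client works

# A minimal, hypothetical STAC Item; real Items carry more properties, links, and assets.
new_item = {
    "type": "Feature",
    "id": "example-scene-001",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [170.0, -45.0], [171.0, -45.0], [171.0, -44.0],
            [170.0, -44.0], [170.0, -45.0],
        ]],
    },
    "properties": {"datetime": "2018-03-20T16:11:44Z"},
    "assets": {"thumbnail": {"href": "https://example.com/thumb.png"}},
}

# Hypothetical RESTful insert into the landsat collection's items endpoint.
response = requests.post("https://example.com/collections/landsat/items/", json=new_item)
print(response.status_code)
```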
"collection_description": "Landat 8 imagery that is radiometrically calibrated and orthorectified using gound points and Digital Elevation Model (DEM) data to correct relief displacement.",