
Commit 8c68300
Author: Ariana Barzinpour
Message: Merge branch 'integrate_1.9' of https://github.com/opendatacube/datacube-explorer into integrate_1.9
Parents: 4ed52f9, 587bae5


61 files changed: +6922 −2477 lines

.docker/create_db.sh

Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
 #!/usr/bin/env bash

-PGPASSWORD=${DB_PASSWORD} psql -h ${DB_HOSTNAME} -U ${DB_USERNAME} -c 'create database opendatacube_test'
+PGPASSWORD=${POSTGRES_PASSWORD} psql -h ${POSTGRES_HOSTNAME} -U ${POSTGRES_USER} -c 'create database opendatacube_test'
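The script above is driven entirely by environment variables, so the rename only changes what callers must export. A minimal sketch of invoking it with the new POSTGRES_* names set (the host, user, and password values are placeholders, not from this commit):

```python
# Sketch: run the updated script with the renamed POSTGRES_* variables set.
# The connection values below are placeholders for illustration only.
import os
import subprocess

env = {
    **os.environ,
    "POSTGRES_HOSTNAME": "localhost",
    "POSTGRES_USER": "postgres",
    "POSTGRES_PASSWORD": "example-password",
}
subprocess.run([".docker/create_db.sh"], env=env, check=True)
```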

.github/dependabot.yml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ updates:
     directory: "/"
     schedule:
       interval: "daily"
-    target-branch: "develop"
+    target-branch: "develop" # should this be removed? changed? duplicated?
   - package-ecosystem: docker
     directory: "/"
     schedule:

.github/workflows/deployment_test.yaml

Lines changed: 1 addition & 0 deletions

@@ -9,6 +9,7 @@ on: # yamllint disable-line rule:truthy
   push:
     branches:
       - develop
+      - integrate_1.9
     paths:
       - '**'

.github/workflows/docker.yml

Lines changed: 1 addition & 0 deletions

@@ -7,6 +7,7 @@ on:
   push:
     branches:
       - develop
+      - integrate_1.9
     paths:
       - "**"

.github/workflows/publish-pypi.yml

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ jobs:
       - name: Setup Python
         uses: actions/setup-python@v5
         with:
-          python-version: 3.8
+          python-version: 3.10

       - name: Install Twine
         run: |

.github/workflows/test.yml

Lines changed: 1 addition & 0 deletions

@@ -9,6 +9,7 @@ on:
   push:
     branches:
       - develop
+      - integrate_1.9
     paths:
       - '**'

README.md

Lines changed: 4 additions & 6 deletions

@@ -81,8 +81,6 @@ with [pyflakes](https://github.com/PyCQA/pyflakes).

 They are included when installing the test dependencies:

-    pip install --upgrade --no-deps --extra-index-url https://packages.dea.ga.gov.au/ 'datacube' 'digitalearthau'
-
     pip install -e .[test]

 Run `make lint` to check your changes, and `make format` to format your code
@@ -118,11 +116,11 @@ version uses virtualenvs which are incompatible with Conda's environments)

 Set ODC's environment variable before running the server:

-    export DATACUBE_ENVIRONMENT=staging
+    export ODC_ENVIRONMENT=staging

 You can always see which environment/settings will be used by running `datacube system check`.

-See the ODC documentation for config and [datacube environments](https://datacube-core.readthedocs.io/en/latest/user/config.html#runtime-config)
+See the ODC documentation for [datacube configuration and environments](https://opendatacube.readthedocs.io/en/latest/installation/database/configuration.html)

 ### How can I set different timezone
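A note on the variable rename above: the same environment selection can be made programmatically when opening a Datacube handle, which is a quick way to confirm what Explorer will connect to. A minimal sketch, assuming a `staging` environment is defined in your datacube configuration (the environment name is illustrative):

```python
# Minimal sketch: select a configured ODC environment from code, the
# programmatic equivalent of `export ODC_ENVIRONMENT=staging`.
# Assumes a "staging" environment exists in your datacube config file.
from datacube import Datacube

dc = Datacube(env="staging")
print(dc.index.url)  # shows which database this environment points at
```

As the README notes, `datacube system check` remains the command-line way to verify the same thing.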

@@ -218,8 +216,8 @@ Three roles are created:
 - **explorer-generator**: Suitable for generating and updating summaries (ie. Running `cubedash-gen`)
 - **explorer-owner**: For creating and updating the schema. (ie. Running `cubedash-gen --init`)

-Note that these roles extend the built-in datacube role `agdc_user`. If you
-created your datacube without permissions, a stand-alone creator of the `agdc_user`
+Note that these roles extend the built-in datacube role `agdc_user` (using postgres) or `odc_user` (using postgis).
+If you created your datacube without permissions, a stand-alone creator of the appropriate
 role is available as a prerequisite in the same [roles](cubedash/summary/roles)
 directory.
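Applying the roles described in this hunk is a plain Postgres GRANT. A hedged sketch of granting the summary-generation role to a database user that runs `cubedash-gen`, via SQLAlchemy (the connection URL and user name are placeholders, not part of this commit):

```python
# Hypothetical sketch: grant the explorer-generator role named above to a
# database user. Role names containing hyphens must be quoted in Postgres.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://localhost/datacube")  # placeholder URL
with engine.begin() as conn:
    conn.execute(text('GRANT "explorer-generator" TO summary_user'))
```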

cubedash/_audit.py

Lines changed: 1 addition & 1 deletion

@@ -69,7 +69,7 @@ def legacy_product_audit_page():
 @bp.route("/audit/datasets-metadata")
 def datasets_metadata_page():
     store = _model.STORE
-    all_products = {p.name for p in store.index.products.get_all()}
+    all_products = {p.name for p in store.all_products()}
     summarised_products = set(store.list_complete_products())
     unsummarised_product_names = all_products - summarised_products
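The hunk above swaps a low-level index call (`store.index.products.get_all()`) for the store's own `store.all_products()`; the surrounding audit logic is just a set difference. A standalone sketch of that computation, assuming a store object exposing the two methods used here:

```python
# Sketch of the audit computation above: products that are indexed but have
# no generated summaries yet. Assumes `store` provides all_products() and
# list_complete_products(), the methods used in this hunk.
def unsummarised_product_names(store) -> set[str]:
    all_products = {p.name for p in store.all_products()}
    summarised = set(store.list_complete_products())
    return all_products - summarised
```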

cubedash/_dataset.py

Lines changed: 7 additions & 7 deletions

@@ -37,8 +37,8 @@ def dataset_page(id_):
 def dataset_full_page(product_name: str, id_: UUID):
     derived_dataset_overflow = source_dataset_overflow = 0

-    index = _model.STORE.index
-    dataset = index.datasets.get(id_, include_sources=False)
+    store = _model.STORE
+    dataset = store.index.datasets.get(id_, include_sources=False)

     if dataset is None:
         abort(404, f"No dataset found with id {id_}")

@@ -59,19 +59,19 @@ def dataset_full_page(product_name: str, id_: UUID):
     provenance_display_limit = current_app.config.get(
         "CUBEDASH_PROVENANCE_DISPLAY_LIMIT", PROVENANCE_DISPLAY_LIMIT
     )
-    source_datasets, source_dataset_overflow = utils.get_dataset_sources(
-        index, id_, limit=provenance_display_limit
+    source_datasets, source_dataset_overflow = store.e_index.get_dataset_sources(
+        id_, limit=provenance_display_limit
     )

     dataset.metadata.sources = {}
     ordered_metadata = utils.prepare_dataset_formatting(dataset)

-    derived_datasets, derived_dataset_overflow = utils.get_datasets_derived(
-        index, id_, limit=provenance_display_limit
+    derived_datasets, derived_dataset_overflow = store.e_index.get_datasets_derived(
+        id_, limit=provenance_display_limit
     )
     derived_datasets.sort(key=utils.dataset_label)

-    footprint, region_code = _model.STORE.get_dataset_footprint_region(id_)
+    footprint, region_code = store.get_dataset_footprint_region(id_)
     # We only have a footprint in the spatial table above if summarisation has been
     # run for the product (...and done so after the dataset was added).
     #
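Both provenance calls above return a (datasets, overflow) pair, letting the page render at most `CUBEDASH_PROVENANCE_DISPLAY_LIMIT` entries plus a count of what was cut off. A generic sketch of that limit-plus-overflow pattern (the helper name is illustrative, not Explorer's API):

```python
# Generic sketch of the limit-plus-overflow pattern used above: return at
# most `limit` items plus a count of how many were cut off. The helper name
# is illustrative, not part of Explorer.
from typing import Iterable, TypeVar

T = TypeVar("T")

def take_with_overflow(items: Iterable[T], limit: int) -> tuple[list[T], int]:
    materialised = list(items)
    return materialised[:limit], max(0, len(materialised) - limit)
```

For example, 25 source datasets with a limit of 20 yields the first 20 plus an overflow of 5.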

cubedash/_filters.py

Lines changed: 2 additions & 2 deletions

@@ -44,7 +44,7 @@ def _format_datetime(date):

 @bp.app_template_filter("metadata_center_time")
 def _get_metadata_center_time(dataset):
-    return utils.center_time_from_metadata(dataset)
+    return utils.datetime_from_metadata(dataset)


 @bp.app_template_filter("localised_metadata_center_time")

@@ -141,7 +141,7 @@ def _all_values_none(d: Mapping):

 @bp.app_template_filter("dataset_day_link")
 def _dataset_day_link(dataset: Dataset, timezone=None):
-    t = utils.center_time_from_metadata(dataset)
+    t = utils.datetime_from_metadata(dataset)
     if t is None:
         return "(unknown time)"
     if timezone:
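The renamed helper's job is to pull one representative timestamp out of a dataset document for display. A hypothetical sketch of what a `datetime_from_metadata`-style helper might do, preferring an explicit centre time and falling back to the midpoint of the dataset's time range; this behaviour is an assumption for illustration, not Explorer's actual implementation:

```python
# Hypothetical sketch of a datetime_from_metadata-style helper. Behaviour is
# assumed: prefer center_time, else the midpoint of the (begin, end) range.
from datetime import datetime
from typing import Optional

def datetime_from_metadata(dataset) -> Optional[datetime]:
    md = dataset.metadata
    center = getattr(md, "center_time", None)
    if center is not None:
        return center
    time_range = getattr(md, "time", None)  # e.g. a Range(begin, end)
    if time_range is not None:
        return time_range.begin + (time_range.end - time_range.begin) / 2
    return None
```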
