
Commit a10e8dd

Complete full stack environment for dev & CI (#502)
* Diagram of primary models & relationships
* Add helpful commands to README, begin quickstart
* Simplify local dev by renaming local.yml to the default name
* Use env variable to specify API URL for backend
* Add compose config for CI workflows
* Bring back backend tests, fix most of them, disable a few of them
* Create sample data after migrations run
1 parent 7efa307 commit a10e8dd

21 files changed (+454 / -226 lines)

.envs/.ci/.django

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
+# General
+# ------------------------------------------------------------------------------
+USE_DOCKER=yes
+DJANGO_SETTINGS_MODULE="config.settings.local"
+
+# Redis
+# ------------------------------------------------------------------------------
+REDIS_URL=redis://redis:6379/0
+
+DJANGO_CSRF_TRUSTED_ORIGINS=http://localhost:3000,
+
+MINIO_ENDPOINT=http://minio:9000
+MINIO_ROOT_USER=amistorage
+MINIO_ROOT_PASSWORD=amistorage
+MINIO_DEFAULT_BUCKET=ami
+MINIO_STORAGE_USE_HTTPS=False
+MINIO_TEST_BUCKET=ami-test
+MINIO_BROWSER_REDIRECT_URL=http://minio:9001
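
These values only take effect if the CI compose file loads them. The `docker-compose.ci.yml` itself is not shown in this diff, but compose files in this layout typically point services at the env files via `env_file`; a minimal sketch, assuming conventional service names:

```yaml
# Hypothetical fragment of a CI compose file — not the actual docker-compose.ci.yml added in this commit.
services:
  django:
    env_file:
      - ./.envs/.ci/.django    # Django, Redis and Minio settings shown above
      - ./.envs/.ci/.postgres  # database credentials
    depends_on:
      - postgres
      - redis
      - minio
  postgres:
    env_file:
      - ./.envs/.ci/.postgres
```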

.envs/.ci/.postgres

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+POSTGRES_HOST=postgres
+POSTGRES_PORT=5432
+POSTGRES_DB=ami-ci
+POSTGRES_USER=4JXkOnTAeDmDyIapSRrGEE
+POSTGRES_PASSWORD=d4xojpnJU3OzPQ0apSCLP1oHR1TYvyMzAlF5KpE9HFL6MPlnbDibwI

.envs/.local/.django

Lines changed: 0 additions & 4 deletions
@@ -15,10 +15,6 @@ REDIS_URL=redis://redis:6379/0
 CELERY_FLOWER_USER=QSocnxapfMvzLqJXSsXtnEZqRkBtsmKT
 CELERY_FLOWER_PASSWORD=BEQgmCtgyrFieKNoGTsux9YIye0I7P5Q7vEgfJD2C4jxmtHDetFaE2jhS7K7rxaf
 
-# Monitoring
-NEW_RELIC_CONFIG_FILE=newrelic.ini
-NEW_RELIC_ENVIRONMENT=development
-
 DJANGO_CSRF_TRUSTED_ORIGINS=http://localhost:3000,
 
 MINIO_ENDPOINT=http://minio:9000

.envs/.local/.postgres

Lines changed: 0 additions & 2 deletions
@@ -1,5 +1,3 @@
-# PostgreSQL
-# ------------------------------------------------------------------------------
 POSTGRES_HOST=postgres
 POSTGRES_PORT=5432
 POSTGRES_DB=ami

.github/workflows/test.backend.yml

Lines changed: 13 additions & 14 deletions
@@ -33,21 +33,20 @@ jobs:
       - name: Run pre-commit
         uses: pre-commit/action@v3.0.1
 
-  # With no caching at all the entire ci process takes 4m 30s to complete!
-  # test:
-  #   runs-on: ubuntu-latest
-  #   steps:
-  #     - name: Checkout Code Repository
-  #       uses: actions/checkout@v4
+  test:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout Code Repository
+        uses: actions/checkout@v4
 
-  #     - name: Build the Stack
-  #       run: docker-compose -f local.yml build
+      - name: Build the Stack
+        run: docker compose -f docker-compose.ci.yml build
 
-  #     - name: Run DB Migrations
-  #       run: docker-compose -f local.yml run --rm django python manage.py migrate
+      - name: Run DB Migrations
+        run: docker compose -f docker-compose.ci.yml run --rm django python manage.py migrate
 
-  #     - name: Run Django Tests
-  #       run: docker-compose -f local.yml run --rm django python manage.py test
+      - name: Run Django Tests
+        run: docker compose -f docker-compose.ci.yml run --rm django python manage.py test
 
-  #     - name: Tear down the Stack
-  #       run: docker-compose -f local.yml down
+      - name: Tear down the Stack
+        run: docker compose -f docker-compose.ci.yml down
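
Because the workflow is plain Docker Compose, a failing CI run can usually be reproduced locally with the same commands the job executes:

```bash
docker compose -f docker-compose.ci.yml build
docker compose -f docker-compose.ci.yml run --rm django python manage.py migrate
docker compose -f docker-compose.ci.yml run --rm django python manage.py test
docker compose -f docker-compose.ci.yml down
```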

README.md

Lines changed: 100 additions & 51 deletions
@@ -1,101 +1,122 @@
 # Automated Monitoring of Insects ML Platform
 
-Platform for processing and reviewing images from automated insect monitoring stations.
+Platform for processing and reviewing images from automated insect monitoring stations. Intended for collaborating on multi-deployment projects, maintaining metadata, and orchestrating multiple machine learning pipelines for analysis.
 
-[![Built with Cookiecutter Django](https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg?logo=cookiecutter)](https://github.com/cookiecutter/cookiecutter-django/)
 [![Black code style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
 
-License: MIT
+## Quick Start
 
-## Settings
+The platform uses Docker Compose to run all backend services. To run them locally, install Docker and run the following command:
 
-Moved to [settings](http://cookiecutter-django.readthedocs.io/en/latest/settings.html).
+    $ docker compose up
 
-## Basic Commands
+Explore the API:
 
-### Setting Up Your Users
+- REST Framework: http://localhost:8000/api/v2/
+- Django admin: http://localhost:8000/admin/
+- OpenAPI / Swagger docs: http://localhost:8000/api/v2/docs/
 
-- To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go.
+Install and run the frontend:
 
-- To create a **superuser account**, use this command:
+```bash
+# Enter the ui directory
+cd ui
+# Install Node Version Manager
+curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
+# Install the required Node.js version
+nvm install
+# Install Yarn dependencies
+yarn install
+# Start the frontend
+yarn start
+```
 
-    $ python manage.py createsuperuser
+Visit http://localhost:3000/
 
-For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
+_TODO! Make a pre-built frontend available in the Docker compose stack._
 
-### Type checks
+## Development
 
-Running type checks with mypy:
+### Frontend
 
-    $ mypy ami
+#### Dependencies
 
-### Test coverage
+- [Node.js](https://nodejs.org/en/download/)
+- [Yarn](https://yarnpkg.com/getting-started/install)
 
-To run the tests, check your test coverage, and generate an HTML coverage report:
+#### Configuration
 
-    $ coverage run -m pytest
-    $ coverage html
-    $ open htmlcov/index.html
+By default the frontend will try to connect to http://localhost:8000 for the backend API. Use the env var `API_PROXY_TARGET` to change this. You can create multiple `.env` files in the `ui/` directory for different environments or configurations. For example, use `yarn start --mode staging` to load `.env.staging` and point `API_PROXY_TARGET` to a remote backend.
 
-#### Running tests with pytest
+### Backend
 
-    $ pytest
+#### Dependencies
 
-### Live reloading and Sass CSS compilation
+- [Docker](https://docs.docker.com/get-docker/)
+- [Docker Compose](https://docs.docker.com/compose/install/)
 
-Moved to [Live reloading and SASS compilation](https://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html#sass-compilation-live-reloading).
 
-### Celery
+#### Helpful Commands
 
-This app comes with Celery.
+##### Watch the logs of Django & the backend workers
 
-To run a celery worker:
+    docker compose logs -f django celeryworker
+
+##### Watch the logs of all services
+
+    docker compose logs -f
+
+##### Create a superuser account
+
+    docker compose exec django python manage.py createsuperuser
 
-```bash
-cd ami
-celery -A config.celery_app worker -l info
-```
 
-Please note: For Celery's import magic to work, it is important _where_ the celery commands are run. If you are in the same folder with _manage.py_, you should be right.
 
-To run [periodic tasks](https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html), you'll need to start the celery beat scheduler service. You can start it as a standalone process:
+##### Run tests
 
 ```bash
-cd ami
-celery -A config.celery_app beat
+docker compose run --rm django python manage.py test
 ```
 
-or you can embed the beat service inside a worker with the `-B` option (not recommended for production use):
+##### Run tests matching a pattern in the test name
 
 ```bash
-cd ami
-celery -A config.celery_app worker -B -l info
+docker compose run --rm django python manage.py test -k pattern
 ```
 
-### Email Server
+##### Launch the Django shell
 
-In development, it is often nice to be able to see emails that are being sent from your application. For that reason local SMTP server [MailHog](https://github.com/mailhog/MailHog) with a web interface is available as docker container.
+    docker compose exec django python manage.py shell
 
-Container mailhog will start automatically when you will run all docker containers.
-Please check [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html) for more details how to start all containers.
+    >>> from ami.main.models import SourceImage, Occurrence
+    >>> SourceImage.objects.filter(project__name='myproject')
 
-With MailHog running, to view messages that are sent by your application, open your browser and go to `http://127.0.0.1:8025`
+##### Install backend dependencies locally for IDE support (Intellisense, etc.)
 
-### Sentry
-
-Sentry is an error logging aggregator service. You can sign up for a free account at <https://sentry.io/signup/?code=cookiecutter> or download and host it yourself.
-The system is set up with reasonable defaults, including 404 logging and integration with the WSGI application.
+```bash
+python -m venv venv
+source venv/bin/activate
+pip install -r requirements/local.txt
+```
 
-You must set the DSN url in production.
+##### Generate the OpenAPI schema
 
-## Deployment
+```bash
+docker compose run --rm django python manage.py spectacular --api-version 'api' --format openapi --file ami-openapi-schema.yaml
+```
 
-The following details how to deploy this application.
+##### Generate TypeScript types from the OpenAPI schema
 
-### Docker
+```bash
+docker run --rm -v ${PWD}:/local openapitools/openapi-generator-cli generate -i /local/ami-openapi-schema.yaml -g typescript-axios -o /local/ui/src/api-schema.d.ts
+```
 
-See detailed [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html).
+##### Generate a diagram of Django models & relationships (Graphviz required)
 
+```bash
+docker compose run --rm django python manage.py graph_models -a -o models.dot --dot
+dot -Tsvg models.dot > models.svg
+```
 
 ## Project Data Storage
 
@@ -118,3 +139,31 @@ Bucket: ami
 - Upload some test images to a subfolder in the `ami` bucket (one subfolder per deployment)
 - Give the bucket or folder anonymous access using the "Anonymous access" button in the Minio web interface.
 - You _can_ test private buckets and presigned URLs, but you will need to add an entry to your local /etc/hosts file to map the `minio` hostname to localhost.
+
+## Email
+
+The local environment uses the `console` email backend. To view emails sent by the platform, check the console output (run `docker compose logs -f django celeryworker`).
+
+## Database
+
+The local environment runs a PostgreSQL database in a Docker container.
+
+### Backup and Restore
+
+    docker compose run --rm postgres backup
+
+### Reset the database
+
+    docker compose run --rm django python manage.py reset_db
+
+### Show backups
+
+    docker compose run --rm postgres backups
+
+### Restore a backup
+
+    docker compose run --rm postgres restore <backup_file_name>
+
+### Load fixtures with test data
+
+    docker compose run --rm django python manage.py migrate
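
The `API_PROXY_TARGET` variable described in the frontend configuration section above is set per mode via `.env` files in `ui/`; a hypothetical `ui/.env.staging` illustrating the idea (the hostname is a placeholder, not a real endpoint):

```bash
# ui/.env.staging — hypothetical example, loaded with `yarn start --mode staging`
API_PROXY_TARGET=https://api.staging.example.org
```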

ami/jobs/tests.py

Lines changed: 36 additions & 6 deletions
@@ -3,15 +3,25 @@
 
 from ami.base.serializers import reverse_with_params
 from ami.jobs.models import Job, JobProgress, JobState
-from ami.main.models import Project
+from ami.main.models import Project, SourceImageCollection
+from ami.ml.models import Pipeline
 from ami.users.models import User
 
 # from rich import print
 
 
 class TestJobProgress(TestCase):
     def setUp(self):
-        self.project = Project.objects.create(name="Job test")
+        self.project = Project.objects.create(name="Test project")
+        self.source_image_collection = SourceImageCollection.objects.create(
+            name="Test collection",
+            project=self.project,
+        )
+        self.pipeline = Pipeline.objects.create(
+            name="Test ML pipeline",
+            description="Test ML pipeline",
+        )
+        self.pipeline.projects.add(self.project)
 
     def test_create_job(self):
         job = Job.objects.create(project=self.project, name="Test job")
@@ -20,7 +30,13 @@ def test_create_job(self):
         self.assertEqual(job.progress.stages, [])
 
     def test_create_job_with_delay(self):
-        job = Job.objects.create(project=self.project, name="Test job", delay=1)
+        job = Job.objects.create(
+            project=self.project,
+            name="Test job",
+            delay=1,
+            pipeline=self.pipeline,
+            source_image_collection=self.source_image_collection,
+        )
         self.assertEqual(job.progress.stages[0].key, "delay")
         self.assertEqual(job.progress.stages[0].progress, 0)
         self.assertEqual(job.progress.stages[0].status, JobState.CREATED)
@@ -45,7 +61,17 @@ class TestJobView(APITestCase):
 
     def setUp(self):
        self.project = Project.objects.create(name="Jobs Test Project")
-        self.job = Job.objects.create(project=self.project, name="Test job", delay=0)
+        self.job = Job.objects.create(
+            project=self.project,
+            name="Test job",
+            delay=0,
+            pipeline=Pipeline.objects.create(name="Test pipeline"),
+            source_image_collection=SourceImageCollection.objects.create(
+                name="Test collection",
+                project=self.project,
+            ),
+        )
+
         self.user = User.objects.create_user(  # type: ignore
             email="testuser@insectai.org",
             is_staff=True,
@@ -82,9 +108,12 @@ def test_create_job(self):
         # request = self.factory.post(jobs_create_url, {"project": self.project.pk, "name": "Test job 2"})
         self.client.force_authenticate(user=self.user)
         job_data = {
-            "project_id": self.project.pk,
+            "project_id": self.job.project.pk,
             "name": "Test job 2",
+            "pipeline_id": self.job.pipeline.pk,  # type: ignore
+            "collection_id": self.job.source_image_collection.pk,  # type: ignore
             "delay": 0,
+            "start_now": False,
         }
         resp = self.client.post(jobs_create_url, job_data)
         self.client.force_authenticate(user=None)
@@ -94,7 +123,8 @@ def test_create_job(self):
         self.assertEqual(data["name"], "Test job 2")
         # self.assertEqual(data["progress"]["status"], "CREATED")
         progress = JobProgress(**data["progress"])
-        self.assertEqual(progress.summary.status, JobState.CREATED)
+
+        self.assertEqual(progress.summary.status, JobState.SUCCESS)
 
     def test_run_job(self):
         jobs_run_url = reverse_with_params("api:job-run", args=[self.job.pk], params={"no_async": True})

ami/main/apps.py

Lines changed: 7 additions & 1 deletion
@@ -1,7 +1,13 @@
 from django.apps import AppConfig
+from django.db.models.signals import post_migrate
 from django.utils.translation import gettext_lazy as _
 
 
-class UsersConfig(AppConfig):
+class MainConfig(AppConfig):
     name = "ami.main"
     verbose_name = _("Main")
+
+    def ready(self):
+        from tests.fixtures.signals import setup_complete_test_project
+
+        post_migrate.connect(setup_complete_test_project, sender=self)
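
The `setup_complete_test_project` receiver imported here lives in `tests/fixtures/signals.py`, which is not included in this diff. A minimal sketch of what a `post_migrate` receiver in that role generally looks like — the object names it creates are assumptions, not the project's actual implementation:

```python
# Hypothetical sketch of tests/fixtures/signals.py — not the code added in this commit.
def setup_complete_test_project(sender, **kwargs):
    """Create sample data after migrations run so a fresh stack has something to browse."""
    from ami.main.models import Project  # imported lazily, after the app registry is ready

    project, created = Project.objects.get_or_create(name="Demo project")  # name is an assumption
    if created:
        # ...create deployments, source image collections, occurrences, etc.
        pass
```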
