141 changes: 91 additions & 50 deletions README.md
@@ -1,101 +1,142 @@
# Automated Monitoring of Insects ML Platform

Platform for processing and reviewing images from automated insect monitoring stations. Intended for collaborating on multi-deployment projects, maintaining metadata, and orchestrating multiple machine learning pipelines for analysis.

[![Built with Cookiecutter Django](https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg?logo=cookiecutter)](https://github.com/cookiecutter/cookiecutter-django/)
[![Black code style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)

License: MIT

## Quick Start

The platform uses Docker Compose to run all of its services. To start everything locally, install Docker and run:

```bash
docker compose up
```

Explore the API:

- REST Framework: http://localhost:8000/api/v2/
- OpenAPI / Swagger: http://localhost:8000/api/v2/docs/
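
The list endpoints appear to return Django REST Framework-style paginated JSON. A minimal sketch of collecting every page of results by following `next` links; the `count`/`next`/`results` field names are the DRF defaults and the canned two-page response below stands in for real HTTP calls, so treat the exact shapes as assumptions:

```python
def collect_results(fetch_page, first_url):
    """Follow `next` links until the last page, accumulating `results`."""
    results = []
    url = first_url
    while url:
        page = fetch_page(url)
        results.extend(page["results"])
        url = page.get("next")
    return results

# Canned responses standing in for a live server (hypothetical data):
pages = {
    "/api/v2/taxa/": {
        "count": 3,
        "next": "/api/v2/taxa/?page=2",
        "results": [{"name": "Noctuidae"}, {"name": "Sphingidae"}],
    },
    "/api/v2/taxa/?page=2": {
        "count": 3,
        "next": None,
        "results": [{"name": "Saturniidae"}],
    },
}

taxa = collect_results(pages.__getitem__, "/api/v2/taxa/")
print([t["name"] for t in taxa])  # ['Noctuidae', 'Sphingidae', 'Saturniidae']
```

Swapping `pages.__getitem__` for a function that performs a real GET request against the running backend would give the same loop over live data.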

Install and run the frontend:
```bash
cd ui
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
nvm install
yarn install
yarn start
```

Visit http://localhost:3000/

By default, the frontend dev server proxies API requests to http://localhost:8000. Set the `API_PROXY_TARGET` environment variable to point it at a different backend.

Create a superuser account:

```bash
docker compose exec django python manage.py createsuperuser
```

Access the Django admin:

http://localhost:8000/admin/

## Helpful Commands

Generate the OpenAPI schema:

```bash
docker compose run --rm django python manage.py spectacular --api-version 'api' --format openapi --file ami-openapi-schema.yaml
```

Generate TypeScript types from the OpenAPI schema:

```bash
docker run --rm -v ${PWD}:/local openapitools/openapi-generator-cli generate -i /local/ami-openapi-schema.yaml -g typescript-axios -o /local/ui/src/api-schema.d.ts
```

Generate a diagram of Django models & relationships (requires Graphviz):

```bash
docker compose run --rm django python manage.py graph_models -a -o models.dot --dot
dot -Tsvg models.dot > models.svg
```

Run the tests:

```bash
docker compose run --rm django python manage.py test
```

Run tests whose names match a pattern:

```bash
docker compose run --rm django python manage.py test -k pattern
```
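
Django's `-k` option passes the pattern through to unittest's test-name matching, so only tests whose names match are loaded. A standalone stdlib illustration of that narrowing; the test class and method names here are made up for the example:

```python
import unittest


class ExampleTests(unittest.TestCase):
    """Hypothetical tests standing in for the project's real suite."""

    def test_taxa_list(self):
        self.assertTrue(True)

    def test_occurrences_for_project(self):
        self.assertTrue(True)


# Rough equivalent of `manage.py test -k taxa`:
# only method names matching *taxa* are collected.
loader = unittest.TestLoader()
loader.testNamePatterns = ["*taxa*"]
suite = loader.loadTestsFromTestCase(ExampleTests)
print(suite.countTestCases())  # 1
```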

Launch the Django shell:

```bash
docker compose exec django python manage.py shell
```

Install the dependencies locally for IDE support (IntelliSense, etc.):

```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements/local.txt
```

## Dependencies

### Backend

- [Docker](https://docs.docker.com/get-docker/)
- [Docker Compose](https://docs.docker.com/compose/install/)

### Frontend

- [Node.js](https://nodejs.org/en/download/)
- [Yarn](https://yarnpkg.com/getting-started/install)

0. Change to the frontend directory:

   ```bash
   cd ui
   ```

1. Install Node Version Manager:

   ```bash
   curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
   ```

2. Install Node.js:

   ```bash
   nvm install
   ```

3. Install Yarn:

   ```bash
   npm install --global yarn
   ```

4. Install the dependencies:

   ```bash
   yarn install
   ```

5. Create a `.env` file in the `ui` directory with the following content:

   ```bash
   REACT_APP_API_URL=http://localhost:8000
   ```

6. Start the frontend:

   ```bash
   yarn start
   ```

## Project Data Storage

11 changes: 8 additions & 3 deletions ami/main/tests.py
@@ -517,11 +517,16 @@ def test_occurrences_for_project(self):
        self.assertEqual(response.json()["count"], Occurrence.objects.filter(project=project).count())

    def test_taxa_list(self):
        # This currently fails! @TODO investigate

        response = self.client.get("/api/v2/taxa/", {"project": self.project_one.pk})
        taxa_for_project = self.project_one.taxa.all()
        self.assertEqual(response.status_code, 200)
        # Compare lists of taxa:
        self.assertListEqual(
            [taxon.name for taxon in taxa_for_project],
            [taxon["name"] for taxon in response.json()["results"]],
        )

    def _test_taxa_for_project(self, project: Project):
        """
2 changes: 2 additions & 0 deletions local.yml → docker-compose.yml
@@ -35,6 +35,8 @@ services:
    build:
      context: .
      dockerfile: ./compose/local/postgres/Dockerfile
    ports:
      - "5444:5432"
    volumes:
      - ami_local_postgres_data:/var/lib/postgresql/data
      - ami_local_postgres_data_backups:/backups
21 changes: 21 additions & 0 deletions docs/diagrams/models.mermaid
@@ -0,0 +1,21 @@
classDiagram
Project -- Deployment
Site -- Deployment
DeviceType -- Deployment
Deployment -- Event
Project -- Site
Event -- SourceImage
Event -- SourceImageCollection
SourceImageCollection -- SourceImage
User -- Identification
Identification -- Determination
Pipeline -- Algorithms
Job --> Detections : bbox, image embedding
Job --> Predictions : classification scores
Detections --> Occurrence : tracking
Occurrence -- Determination
Job <-- SourceImageCollection
Job <-- Pipeline
Project -- TaxaList
TaxaList -- Determination
Predictions -- Determination
1 change: 1 addition & 0 deletions docs/diagrams/models.mermaid.svg
58 changes: 31 additions & 27 deletions ui/vite.config.ts
@@ -1,38 +1,42 @@
import react from '@vitejs/plugin-react'
import childProcess from 'child_process'
import { defineConfig, loadEnv } from 'vite'
import eslint from 'vite-plugin-eslint'
import svgr from 'vite-plugin-svgr'
import viteTsconfigPaths from 'vite-tsconfig-paths'

const commitHash = childProcess
  .execSync('git rev-parse --short HEAD')
  .toString()

export default defineConfig(({ mode }) => {
  // Load env file based on `mode` in the current working directory.
  // Set the third parameter to '' to load all env regardless of the `VITE_` prefix.
  const env = loadEnv(mode, process.cwd(), '')

  return {
    base: '/',
    build: {
      outDir: './build',
    },
    plugins: [
      react(),
      viteTsconfigPaths(),
      svgr({ include: '**/*.svg?react' }),
      eslint({ exclude: ['/virtual:/**', 'node_modules/**'] }),
    ],
    define: {
      __COMMIT_HASH__: JSON.stringify(commitHash),
    },
    server: {
      open: true,
      port: 3000,
      proxy: {
        '/api': {
          target: env.API_PROXY_TARGET || 'http://localhost:8000',
          changeOrigin: true,
        },
      },
    },
  }
})