From de771f0b1714c4a644fd5089e87fda82e60f4c23 Mon Sep 17 00:00:00 2001
From: mart-r
Date: Thu, 20 Feb 2025 10:59:03 +0000
Subject: [PATCH 1/4] Add section in README regarding feature / component summary

---
 README.md | 15 +++++++++++++++
 1 file changed, 15 insertions(+)

diff --git a/README.md b/README.md
index 7686432..b8ebdf1 100644
--- a/README.md
+++ b/README.md
@@ -62,6 +62,21 @@ The following table summarises the servable model types with their respective ou
 
 ## Run ModelServe in the container environment:
 
+### Component / feature summary
+
+The core functionality is provided by services defined in `docker-compose.yml`.
+Additional features generally require running services in extra compose files.
+
+| Feature        | Category  | Additional compose file     | Feature description                                          |
+|:--------------:|:---------:|:---------------------------:|:------------------------------------------------------------:|
+| Serving        | Core      | N/A                         | Enables serving the model for inference                      |
+| Evaluating     | Core      | N/A                         | Enables evaluating model performance                         |
+| Training       | Auxiliary | `docker-compose-mlflow.yml` | Enables model training and lifecycle tracking through MLFlow |
+| Monitoring     | Auxiliary | `docker-compose-mon.yml`    | Enables monitoring the HTTP API usage                        |
+| Logging        | Auxiliary | `docker-compose-log.yml`    | Enables centralised logging and log analysis                 |
+| Proxying       | Auxiliary | `docker-compose-proxy.yml`  | Enables the reverse proxy service                            |
+| Authentication | Auxiliary | `docker-compose-auth.yml`   | Enables user authentication                                  |
+
 ### Configuration:
 
 Default configuration properties can be found and customised in `./docker//.envs`

From 2e5c3a17c1113f32229fa1119b1f345c034b1c4a Mon Sep 17 00:00:00 2001
From: mart-r
Date: Thu, 20 Feb 2025 11:09:08 +0000
Subject: [PATCH 2/4] Add note about environment variables for different compose files / services

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index b8ebdf1..dc54766 100644
--- a/README.md
+++ b/README.md
@@ -66,6 +66,8 @@ The following table summarises the servable model types with their respective ou
 
 The core functionality is provided by services defined in `docker-compose.yml`.
 Additional features generally require running services in extra compose files.
+Most additional services (as well as the core services) require specific environment variables to be set before running.
+See the relevant sections below for details.
 
 | Feature        | Category  | Additional compose file     | Feature description                                          |
 |:--------------:|:---------:|:---------------------------:|:------------------------------------------------------------:|

From 3a76ea040d7c665389884ab985e19c5b9c342832 Mon Sep 17 00:00:00 2001
From: mart-r
Date: Fri, 21 Feb 2025 11:38:50 +0000
Subject: [PATCH 3/4] Make note of Training being a core component

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index dc54766..c737392 100644
--- a/README.md
+++ b/README.md
@@ -73,7 +73,7 @@ See the relevant sections below for details.
 |:--------------:|:---------:|:---------------------------:|:------------------------------------------------------------:|
 | Serving        | Core      | N/A                         | Enables serving the model for inference                      |
 | Evaluating     | Core      | N/A                         | Enables evaluating model performance                         |
-| Training       | Auxiliary | `docker-compose-mlflow.yml` | Enables model training and lifecycle tracking through MLFlow |
+| Training       | Core      | N/A                         | Enables model training and lifecycle tracking through MLFlow |
 | Monitoring     | Auxiliary | `docker-compose-mon.yml`    | Enables monitoring the HTTP API usage                        |
 | Logging        | Auxiliary | `docker-compose-log.yml`    | Enables centralised logging and log analysis                 |
 | Proxying       | Auxiliary | `docker-compose-proxy.yml`  | Enables the reverse proxy service                            |

From 4203d4f33ca79dbb712759cd75a68268e530be1a Mon Sep 17 00:00:00 2001
From: mart-r
Date: Fri, 21 Feb 2025 12:27:56 +0000
Subject: [PATCH 4/4] Add optional workaround for running with mlflow component

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index c737392..fc6093a 100644
--- a/README.md
+++ b/README.md
@@ -88,6 +88,8 @@ To serve NLP models through a container, run the following commands:
 export MODEL_PACKAGE_FULL_PATH=
 export CMS_UID=$(id -u $USER)
 export CMS_GID=$(id -g $USER)
+# NOTE: use this if you wish to save models locally (i.e. run without the mlflow component)
+# export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
 docker compose -f docker-compose.yml up -d
 ```
 Then the API docs will be accessible at localhost on the mapped port specified in `docker-compose.yml`. The container runs
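
Editor's note, not part of the patches: the feature table above implies that each auxiliary feature is enabled by passing one extra `-f <compose file>` flag, since `docker compose` merges the files given with `-f` from left to right. A minimal sketch of the resulting workflow follows; the compose file names come from the table, while the model package path is a placeholder and not a real artifact.

```shell
# Export the variables the compose files expect. The README's commands use
# `id -u $USER` / `id -g $USER`; plain `id -u` / `id -g` is equivalent for
# the current user. The package path below is a hypothetical example.
export MODEL_PACKAGE_FULL_PATH="/path/to/model_package.zip"
export CMS_UID=$(id -u)
export CMS_GID=$(id -g)

# Core stack (Serving, Evaluating, Training) only:
#   docker compose -f docker-compose.yml up -d

# Core stack plus auxiliary features, one -f per extra compose file:
#   docker compose -f docker-compose.yml \
#                  -f docker-compose-mon.yml \
#                  -f docker-compose-log.yml \
#                  up -d
```

Remember that, per patch 2, most of these services also expect their own environment variables to be set before `up` is run.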