This repository provides a setup for integrating OLLAMA, OPEN-WEBUI, PIPELINES, and LANGFUSE using Docker. Follow the steps below to get everything up and running.
## Prerequisites

- Docker and the required GPU drivers installed on your system.
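Before starting the stack, you may want to confirm that Docker, Docker Compose, and GPU passthrough are working. A minimal sketch, assuming an NVIDIA GPU with the NVIDIA Container Toolkit installed (nvidia is the default driver in this setup):

```bash
# Check Docker and the Compose plugin
docker --version
docker compose version

# Check that containers can see the GPU (requires the NVIDIA Container Toolkit)
docker run --rm --gpus all ubuntu nvidia-smi
```

If `nvidia-smi` prints your GPU table, containers should be able to use the GPU.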
## Setup

1. **Clone this repository:**

   `git clone https://github.com/karaketir16/openwebui-langfuse.git`

   `cd openwebui-langfuse`
2. **Run the setup script:**

   `./run-compose.sh`

   or

   `docker compose -f docker-compose.yaml -f langfuse-v3.yaml up -d` (the default driver is nvidia)
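Once the stack is up, you can verify that the containers started correctly. A quick sketch using the compose files named above (exact service names depend on this repository's compose files):

```bash
# List the services started by both compose files and their status
docker compose -f docker-compose.yaml -f langfuse-v3.yaml ps

# Follow the logs if something fails to come up
docker compose -f docker-compose.yaml -f langfuse-v3.yaml logs -f
```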
## Documentation

- You can find up-to-date documentation here.
## Langfuse Integration

1. **Download the `langfuse_filter_pipeline.py` file (only if offline):**
   - If your setup does not have internet access:
     - You can manually download the script from
       https://github.com/open-webui/pipelines/blob/main/examples/filters/langfuse_filter_pipeline.py
     - Or use the local copy provided at `example/langfuse_filter_pipeline.py`
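If you are preparing an offline machine from one that does have access, the script can also be fetched on the command line; otherwise, just copy the bundled example. The raw URL below is the same one used in the upload step later:

```bash
# Online: download the raw script
curl -fsSL \
  https://raw.githubusercontent.com/open-webui/pipelines/refs/heads/main/examples/filters/langfuse_filter_pipeline.py \
  -o langfuse_filter_pipeline.py

# Offline: use the copy shipped with this repository
cp example/langfuse_filter_pipeline.py .
```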
2. **Access Langfuse:**
   - Open your browser and go to http://localhost:4000.
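If the page does not load right away, the Langfuse web container may still be starting. A simple reachability probe from the host (plain HTTP check, no Langfuse-specific endpoint assumed):

```bash
# Expect an HTTP status line once langfuse-web is ready
curl -sSI http://localhost:4000 | head -n 1
```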
3. **Create an Admin Account and Project:**
   - Create an admin account, then create an organization and a project.
   - Go to Project Settings and create an API key.
   - Retrieve the secret key and public key.
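You will paste these keys into the pipeline configuration later, so it helps to keep them at hand. The sketch below also shows an optional sanity check; the `/api/public/traces` endpoint and Basic-auth scheme (public key as username, secret key as password) are assumptions based on the standard Langfuse public API, so adjust if your version differs:

```bash
# Hypothetical placeholder values; use the keys from your project settings
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."

# Optional: confirm the keys are accepted (assumed Langfuse public API endpoint)
curl -s -u "$LANGFUSE_PUBLIC_KEY:$LANGFUSE_SECRET_KEY" \
  http://localhost:4000/api/public/traces | head -c 200
```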
4. **Access Open-WebUI:**
   - Open your browser and go to http://localhost:3000.
5. **Create an Admin Account:**
   - Create an admin account.
6. **Upload the Pipeline Script:**
   - Go to `Settings -> Admin Settings -> Pipelines`.
   - If online, paste
     https://raw.githubusercontent.com/open-webui/pipelines/refs/heads/main/examples/filters/langfuse_filter_pipeline.py
     into the `Install from Github URL` field and click the download button.
   - If offline or using a custom script, upload `langfuse_filter_pipeline.py` from your local machine via the `Upload Pipeline` section.
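To confirm the upload landed, you can look inside the pipelines container. The service name `pipelines` and the `/app/pipelines` directory are assumptions about the default open-webui/pipelines image, so adjust them to match this repository's compose files:

```bash
# List installed pipeline scripts inside the (assumed) pipelines service
docker compose -f docker-compose.yaml -f langfuse-v3.yaml exec pipelines ls /app/pipelines
```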
7. **Configure the Script:**
   - After uploading the pipeline, edit its configuration in the UI.
   - Replace the placeholder values as follows:
     - `your-secret-key-here` → your Langfuse secret key
     - `your-public-key-here` → your Langfuse public key
     - `https://cloud.langfuse.com` → `http://langfuse-web:4000` (local address)
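Because the pipeline reports to Langfuse over the internal Docker network, `http://langfuse-web:4000` must be reachable from the pipelines container. A sketch of a connectivity check; the service name `pipelines` is an assumption, and `curl` may not be present in that image:

```bash
# Probe the internal Langfuse address from inside the (assumed) pipelines service
docker compose -f docker-compose.yaml -f langfuse-v3.yaml exec pipelines \
  curl -sSI http://langfuse-web:4000 | head -n 1
```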
8. **Monitor Usage:**
   - You can now monitor Open-WebUI usage statistics from Langfuse.
## Pulling Ollama Models

1. **Access Open-WebUI:**
   - Open your browser and go to http://localhost:3000.
2. **Create an Admin Account:**
   - Create an admin account if you haven’t already.
3. **Pull Models:**
   - Navigate to `Settings -> Admin Settings -> Models`.
   - Enter a model tag to pull from the Ollama library (e.g., `phi3:mini`).
   - Press the pull button.
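Models can also be pulled from the command line instead of the UI. This sketch assumes the Ollama container is exposed as a compose service named `ollama`, which may differ in this repository's compose files:

```bash
# Pull the example model directly inside the (assumed) ollama service
docker compose -f docker-compose.yaml -f langfuse-v3.yaml exec ollama ollama pull phi3:mini
```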