Merged
66 changes: 66 additions & 0 deletions .github/workflows/ci.yaml
@@ -84,10 +84,76 @@ jobs:
run: cargo binstall --force --locked cargo-component@0.20.0
- name: Build all test components
run: cargo make build-test-components
ollama-integration-tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/cache@v3
with:
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
target/
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}-ollama-integration
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
override: true
- uses: davidB/rust-cargo-make@v1
- uses: cargo-bins/cargo-binstall@main
- name: Install tools
run: |
set -e
cargo binstall --force --locked cargo-component@0.20.0
cargo binstall golem-cli@1.2.2-dev.11 --locked --force --no-confirm
cargo binstall wac-cli --locked --force --no-confirm
- name: Start Ollama in Docker
run: |
set -e
docker run -d --name ollama -p 11434:11434 ollama/ollama:latest
timeout 60 bash -c 'until curl -f http://localhost:11434/api/version; do sleep 2; done'
echo "Pulling Qwen2.5:1.5b"
docker exec ollama ollama pull qwen2.5:1.5b
echo "Pulling Gemma2:2b"
docker exec ollama ollama pull gemma2:2b
echo "Verifying models are available"
docker exec ollama ollama list | grep -q "qwen2.5:1.5b" || exit 1
docker exec ollama ollama list | grep -q "gemma2:2b" || exit 1
echo "Ollama setup completed."
- name: Install and Run latest Golem Server
run: |
set -e
echo "Installing Golem server"
sudo curl -L https://github.com/golemcloud/golem-cli/releases/download/v1.2.3/golem-x86_64-unknown-linux-gnu -o ./golem
sudo chmod +x ./golem
sudo mv ./golem /usr/local/bin/golem
golem --version
golem profile switch local
nohup golem server run >golem-server.log 2>&1 &
echo "Golem server started."
- name: Build and test Ollama integration
run: |
set -e
cargo make build-ollama
cd test
golem-cli app build -b ollama-debug
golem-cli app deploy -b ollama-debug
golem-cli worker new -e GOLEM_OLLAMA_BASE_URL=http://localhost:11434 test:llm/ollama-1
golem-cli worker invoke test:llm/ollama-1 test1
golem-cli worker invoke test:llm/ollama-1 test2
golem-cli worker invoke test:llm/ollama-1 test3
golem-cli worker invoke test:llm/ollama-1 test4
golem-cli worker invoke test:llm/ollama-1 test5
golem-cli worker invoke test:llm/ollama-1 test6
golem-cli worker invoke test:llm/ollama-1 test7
publish-all:
needs:
- tests
- build-test-components
- ollama-integration-tests
runs-on: ubuntu-latest
permissions:
contents: write
48 changes: 43 additions & 5 deletions Cargo.lock

Generated file; diff not rendered by default.

2 changes: 1 addition & 1 deletion Cargo.toml
@@ -1,7 +1,7 @@
[workspace]
resolver = "2"

members = ["llm", "llm-anthropic", "llm-grok", "llm-openai", "llm-openrouter"]
members = ["llm", "llm-anthropic", "llm-grok", "llm-ollama", "llm-openai", "llm-openrouter"]

[profile.release]
debug = false
39 changes: 39 additions & 0 deletions Makefile.toml
@@ -10,6 +10,18 @@ args = ["clean"]
command = "cargo"
args = ["test"]

[tasks.build-ollama]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
args = ["build", "-p", "golem-llm-ollama"]


[tasks.build-ollama-portable]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
args = ["build", "-p", "golem-llm-ollama", "--no-default-features"]


[tasks.build-anthropic]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
@@ -56,6 +68,7 @@ dependencies = [
"build-grok",
"build-openai",
"build-openrouter",
"build-ollama",
]

[tasks.build-portable]
@@ -64,6 +77,7 @@ dependencies = [
"build-grok-portable",
"build-openai-portable",
"build-openrouter-portable",
"build-ollama-portable",
]

[tasks.build-all]
@@ -78,6 +92,7 @@ cp target/wasm32-wasip1/debug/golem_llm_anthropic.wasm components/debug/golem_ll
cp target/wasm32-wasip1/debug/golem_llm_grok.wasm components/debug/golem_llm_grok.wasm
cp target/wasm32-wasip1/debug/golem_llm_openai.wasm components/debug/golem_llm_openai.wasm
cp target/wasm32-wasip1/debug/golem_llm_openrouter.wasm components/debug/golem_llm_openrouter.wasm
cp target/wasm32-wasip1/debug/golem_llm_ollama.wasm components/debug/golem_llm_ollama.wasm

cm_run_task clean
cm_run_task build-portable
@@ -86,8 +101,20 @@ cp target/wasm32-wasip1/debug/golem_llm_anthropic.wasm components/debug/golem_ll
cp target/wasm32-wasip1/debug/golem_llm_grok.wasm components/debug/golem_llm_grok-portable.wasm
cp target/wasm32-wasip1/debug/golem_llm_openai.wasm components/debug/golem_llm_openai-portable.wasm
cp target/wasm32-wasip1/debug/golem_llm_openrouter.wasm components/debug/golem_llm_openrouter-portable.wasm
cp target/wasm32-wasip1/debug/golem_llm_ollama.wasm components/debug/golem_llm_ollama-portable.wasm
'''

[tasks.release-build-ollama]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
args = ["build", "-p", "golem-llm-ollama", "--release"]

[tasks.release-build-ollama-portable]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
args = ["build", "-p", "golem-llm-ollama", "--release", "--no-default-features"]


[tasks.release-build-anthropic]
install_crate = { crate_name = "cargo-component", version = "0.20.0" }
command = "cargo-component"
@@ -146,6 +173,7 @@ dependencies = [
"release-build-grok",
"release-build-openai",
"release-build-openrouter",
"release-build-ollama",
]

[tasks.release-build-portable]
@@ -154,6 +182,7 @@ dependencies = [
"release-build-grok-portable",
"release-build-openai-portable",
"release-build-openrouter-portable",
"release-build-ollama-portable",
]

[tasks.release-build-all]
@@ -170,6 +199,7 @@ cp target/wasm32-wasip1/release/golem_llm_anthropic.wasm components/release/gole
cp target/wasm32-wasip1/release/golem_llm_grok.wasm components/release/golem_llm_grok.wasm
cp target/wasm32-wasip1/release/golem_llm_openai.wasm components/release/golem_llm_openai.wasm
cp target/wasm32-wasip1/release/golem_llm_openrouter.wasm components/release/golem_llm_openrouter.wasm
cp target/wasm32-wasip1/release/golem_llm_ollama.wasm components/release/golem_llm_ollama.wasm

cm_run_task clean
cm_run_task release-build-portable
@@ -178,6 +208,7 @@ cp target/wasm32-wasip1/release/golem_llm_anthropic.wasm components/release/gole
cp target/wasm32-wasip1/release/golem_llm_grok.wasm components/release/golem_llm_grok-portable.wasm
cp target/wasm32-wasip1/release/golem_llm_openai.wasm components/release/golem_llm_openai-portable.wasm
cp target/wasm32-wasip1/release/golem_llm_openrouter.wasm components/release/golem_llm_openrouter-portable.wasm
cp target/wasm32-wasip1/release/golem_llm_ollama.wasm components/release/golem_llm_ollama-portable.wasm
'''

[tasks.wit-update]
@@ -221,6 +252,11 @@ rm -r llm-openrouter/wit/deps
mkdir llm-openrouter/wit/deps/golem-llm
cp wit/golem-llm.wit llm-openrouter/wit/deps/golem-llm/golem-llm.wit
cp wit/deps/wasi:io llm-openrouter/wit/deps
rm -r llm-ollama/wit/deps
mkdir llm-ollama/wit/deps/golem-llm
cp wit/golem-llm.wit llm-ollama/wit/deps/golem-llm/golem-llm.wit
cp wit/deps/wasi:io llm-ollama/wit/deps


rm -r test/wit
mkdir test/wit/deps/golem-llm
@@ -289,8 +325,11 @@ golem-cli app clean
golem-cli app build -b openai-debug
golem-cli app clean
golem-cli app build -b openrouter-debug
golem-cli app clean
golem-cli app build -b ollama-debug
'''


[tasks.set-version]
description = "Sets the version in all Cargo.toml files to the value of the VERSION environment variable"
condition = { env_set = ["VERSION"] }
5 changes: 5 additions & 0 deletions README.md
@@ -9,10 +9,12 @@ There are 10 published WASM files for each release:
| Name | Description |
|--------------------------------------|--------------------------------------------------------------------------------------|
| `golem-llm-anthropic.wasm` | LLM implementation for Anthropic AI, using custom Golem specific durability features |
| `golem-llm-ollama.wasm` | LLM implementation for Ollama, using custom Golem specific durability features |
| `golem-llm-grok.wasm` | LLM implementation for xAI (Grok), using custom Golem specific durability features |
| `golem-llm-openai.wasm` | LLM implementation for OpenAI, using custom Golem specific durability features |
| `golem-llm-openrouter.wasm` | LLM implementation for OpenRouter, using custom Golem specific durability features |
| `golem-llm-anthropic-portable.wasm` | LLM implementation for Anthropic AI, with no Golem specific dependencies. |
| `golem-llm-ollama-portable.wasm` | LLM implementation for Ollama, with no Golem specific dependencies. |
| `golem-llm-grok-portable.wasm` | LLM implementation for xAI (Grok), with no Golem specific dependencies. |
| `golem-llm-openai-portable.wasm` | LLM implementation for OpenAI, with no Golem specific dependencies. |
| `golem-llm-openrouter-portable.wasm` | LLM implementation for OpenRouter, with no Golem specific dependencies. |
@@ -34,6 +36,7 @@ Each provider has to be configured with an API key passed as an environment vari
| Grok | `XAI_API_KEY` |
| OpenAI | `OPENAI_API_KEY` |
| OpenRouter | `OPENROUTER_API_KEY` |
| Ollama | `GOLEM_OLLAMA_BASE_URL` |

Additionally, setting the `GOLEM_LLM_LOG=trace` environment variable enables trace logging for all the communication
with the underlying LLM provider.
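For local development, the Ollama connector is pointed at a running server rather than given an API key. A minimal sketch of the setup used by the CI job above (the worker name `test:llm/ollama-1` comes from the workflow; the `golem-cli` invocation is guarded so the snippet still runs where the CLI is not installed):

```shell
# Point the Ollama connector at a locally running Ollama server.
# Note: GOLEM_OLLAMA_BASE_URL is a base URL, not an API key.
export GOLEM_OLLAMA_BASE_URL=http://localhost:11434

# Pass the variable through to a new worker (skipped if golem-cli is absent).
if command -v golem-cli >/dev/null 2>&1; then
  golem-cli worker new -e GOLEM_OLLAMA_BASE_URL="$GOLEM_OLLAMA_BASE_URL" test:llm/ollama-1
fi
```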
Expand Down Expand Up @@ -134,6 +137,8 @@ Then build and deploy the _test application_. Select one of the following profil
|--------------|-----------------------------------------------------------------------------------------------|
| `anthropic-debug` | Uses the Anthropic LLM implementation and compiles the code in debug profile |
| `anthropic-release` | Uses the Anthropic LLM implementation and compiles the code in release profile |
| `ollama-debug` | Uses the Ollama LLM implementation and compiles the code in debug profile |
| `ollama-release` | Uses the Ollama LLM implementation and compiles the code in release profile |
| `grok-debug` | Uses the Grok LLM implementation and compiles the code in debug profile |
| `grok-release` | Uses the Grok LLM implementation and compiles the code in release profile |
| `openai-debug` | Uses the OpenAI LLM implementation and compiles the code in debug profile |
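The CI job exercises the `ollama-debug` profile end to end; a local run can follow the same shape. A sketch, assuming `golem-cli` is installed, a local Golem server is running, and the test workers are named as in the CI workflow (the invocation is guarded so the snippet is safe to run without the CLI):

```shell
# Build and deploy the test application with the Ollama debug profile,
# then invoke a few of the test functions.
PROFILE=ollama-debug
if command -v golem-cli >/dev/null 2>&1; then
  golem-cli app build -b "$PROFILE"
  golem-cli app deploy -b "$PROFILE"
  for t in test1 test2 test3; do
    golem-cli worker invoke test:llm/ollama-1 "$t"
  done
fi
```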