
v0.1.8

@KodyKendall KodyKendall released this 02 Jul 23:18
· 47 commits to main since this release
fcf2d21

🧩 llama_bot_rails – v0.1.8 – Developer Experience + Docker Support

This release is the first public OSS-ready version of the llama_bot_rails gem, built to integrate seamlessly with the LlamaBot backend and support full-stack LangGraph agents inside Rails apps.

🎯 What's Included:

🧠 Agent Chat UI: Adds a /llama_bot/agent/chat route and controller so you can talk to your backend agent from within Rails
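
To wire that route into an app, a Rails engine is typically mounted in config/routes.rb. The mount point and engine class name below follow standard Rails engine conventions and are an assumption, not confirmed by this release note — check the gem README for the exact line:

```ruby
# config/routes.rb — hypothetical mount; LlamaBotRails::Engine is an
# assumed class name based on the gem's name, not taken from the gem docs.
Rails.application.routes.draw do
  mount LlamaBotRails::Engine => "/llama_bot"
end
```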

🧰 Install Generator: rails generate llama_bot:install adds:

A llama_bot.rb initializer with api_base_url and an allowed_routes DSL

Auto-injection of config.hosts << /host.docker.internal/ in development.rb (Docker-safe!)
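
A sketch of what the generated config/initializers/llama_bot.rb might contain, assuming a standard configure-block DSL — only api_base_url and allowed_routes are named in this release, so the block form and default values here are assumptions:

```ruby
# config/initializers/llama_bot.rb — hypothetical shape of the generated
# initializer; verify against the file the generator actually writes.
LlamaBotRails.configure do |config|
  # Where the LlamaBot FastAPI backend is reachable from the Rails app
  # (matches the -p 8000:8000 mapping of the public Docker image).
  config.api_base_url = "http://host.docker.internal:8000"

  # Routes the agent is permitted to call; empty until you whitelist some.
  config.allowed_routes = {}
end
```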

🐳 Docker-Friendly Defaults: Built to work with the public Docker backend image:

```shell
docker run -e OPENAI_API_KEY=sk-... -p 8000:8000 kody06/llamabot-backend:v0.1.0
```

🛠 Prerequisites
Rails 6.1+

Compatible with any backend that speaks the LlamaBot HTTP spec

Uses http://host.docker.internal to connect Rails → FastAPI in dev
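
Per the install-generator notes above, the host entry that makes the Rails → FastAPI connection work in development looks like this:

```ruby
# config/environments/development.rb — entry injected by the install
# generator so Rails' host authorization accepts host.docker.internal.
Rails.application.configure do
  config.hosts << /host.docker.internal/
end
```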

📈 What’s Next

🧩 Whitelisted Tool DSL: Configure exactly which routes and verbs the agent can access:

```ruby
config.allowed_routes = {
  "refund_user" => { verb: :post, path: "/agent/users/:id/refund" }
}
```
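
The whitelist check this DSL implies can be illustrated in plain Ruby. This is a hypothetical sketch of how such a lookup might gate agent-initiated requests — `route_allowed?` is an illustration, not the gem's actual API:

```ruby
# Mirrors the allowed_routes configuration shown above. The matching
# helper below is a hypothetical illustration, not code from the gem.
ALLOWED_ROUTES = {
  "refund_user" => { verb: :post, path: "/agent/users/:id/refund" }
}.freeze

# Returns true only when the named tool is whitelisted and the HTTP
# verb matches the configured one.
def route_allowed?(tool_name, verb)
  rule = ALLOWED_ROUTES[tool_name]
  !rule.nil? && rule[:verb] == verb
end

puts route_allowed?("refund_user", :post)   # => true
puts route_allowed?("refund_user", :delete) # => false
puts route_allowed?("unknown_tool", :post)  # => false
```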

More scaffold generators (rails g llama_bot:action foo)

Fly.io one-click deploy support

Deeper multi-tenant awareness

📥 Feedback? Open an issue or ping us on Discord.