Imagine: your NPC merchant checks inventory, haggles on price, applies a discount, and processes the purchase — all through real function calls to your game code, streamed token-by-token into a chat bubble. No scripted branches. No fake replies. That's this package.
This is the Unity half of CoreAI: MEAI clients, VContainer wiring, UI Toolkit chat, streaming filters, production error diagnostics, and Editor menus that spare you copy-paste.
| Package | Depends on | Status |
|---|---|---|
| `com.nexoider.coreaiunity` — `package.json` (version) | `com.nexoider.coreai` — core `package.json` | ✅ Production |
Changelog: CHANGELOG.md (release notes; keep version in package.json in sync when you ship).
First time? Open DOCS_INDEX or go straight to QUICK_START. Need a one-liner from code? See COREAI_SINGLETON_API.
| Section | What's inside |
|---|---|
| CoreAi | Static Ask / Stream / orchestration — section below |
| Agent | AgentBuilder, tools, memory |
| Chat | One-click demo + CoreAiChatPanel |
| Streaming | HTTP / LLMUnity, filters, cancel |
| LLM modes | LocalModel, ClientOwnedApi, ClientLimited, ServerManagedApi, mixed routing |
| Docs · Tests · Install | End of this file |
Call the LLM from any script without DI boilerplate:

```csharp
using CoreAI;

string reply = await CoreAi.AskAsync("Hello!");

await foreach (var chunk in CoreAi.StreamAsync("Tell a story", "PlayerChat"))
    label.text += chunk;

if (CoreAi.TryGetChatService(out var chat)) { /* optional AI */ }
```

Full guide (beginner checklist + pro patterns): COREAI_SINGLETON_API
Release notes and version bumps live in CHANGELOG.md only (this file does not duplicate them). Bump version in package.json when you ship.
Current stable line: 1.0.0. It introduces public LLM execution modes and multi-mode role routing.
```csharp
var blacksmith = new AgentBuilder("Blacksmith")
    .WithSystemPrompt("You are a blacksmith. Sell weapons and remember purchases.")
    .WithTool(new InventoryLlmTool(myInventory))
    .WithMemory()
    .WithMode(AgentMode.ToolsAndChat)
    .WithStreaming(true) // per-agent override (0.20+)
    .Build();

blacksmith.ApplyToPolicy(CoreAIAgent.Policy);
await blacksmith.Ask("Show me your swords");
```

Docs: AGENT_BUILDER · TOOL_CALL_SPEC · MemorySystem
CoreAI → Setup → Create Chat Demo Scene
Generates a ready-made scene with CoreAILifetimeScope, CoreAiChatPanel, panel settings, and a CoreAiChatConfig_Demo asset. Just set your backend in CoreAISettings and press Play.
Manual setup, configuration hierarchy and styling: README_CHAT.
| Tool | Purpose |
|---|---|
| 🧠 MemoryTool | Per-role JSON memory on disk |
| 📜 LuaTool | Sandboxed Lua execution (steps/timeout guard, `<think>` stripped) |
| 🎒 InventoryLlmTool | NPC inventory queries |
| ⚙️ GameConfigTool | Read/modify game configs |
| 🌍 SceneLlmTool | Hierarchy & transforms in PlayMode |
| 📸 CameraLlmTool | Base64 JPEG screenshots for Vision models |
Create your own — implement ILlmTool and register via AgentBuilder.WithTool(...).
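As a sketch of what a custom tool might look like — the exact `ILlmTool` contract lives in TOOL_CALL_SPEC, so treat the `Name`, `Description`, and `InvokeAsync` members below as illustrative assumptions, not the real interface:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical member names — check TOOL_CALL_SPEC for the actual ILlmTool contract.
public sealed class DiceLlmTool : ILlmTool
{
    public string Name => "roll_dice";
    public string Description => "Rolls two six-sided dice and returns the total.";

    public Task<string> InvokeAsync(string argsJson, CancellationToken ct)
    {
        // Parse argsJson with your JSON library of choice; fixed dice count for brevity.
        var rng = new Random();
        int total = rng.Next(1, 7) + rng.Next(1, 7);
        return Task.FromResult(total.ToString());
    }
}

// Registration: new AgentBuilder("Gambler").WithTool(new DiceLlmTool()).Build();
```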
- **ClientOwnedApi, ClientLimited, ServerManagedApi:** `MeaiOpenAiChatClient` parses OpenAI-compatible SSE. Cancellation aborts the `UnityWebRequest` immediately.
- **LocalModel:** `LlmUnityMeaiChatClient` bridges LLMUnity's frame callbacks to `IAsyncEnumerable`.
- Both paths run through `ThinkBlockStreamFilter` — a state machine that removes `<think>…</think>` blocks even when tags are split across chunks.
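The split-tag case is the tricky part: a chunk can end mid-tag (`"Hello <thi"`), so the filter must hold back any suffix that could still become a tag. A minimal stand-alone stripper illustrating the technique (this is a sketch, not the actual `ThinkBlockStreamFilter` source — the real filter's API and buffering differ):

```csharp
using System.Text;

// Sketch of a <think>…</think> stripper that survives tags split across chunks.
public sealed class ThinkStripper
{
    private const string Open = "<think>";
    private const string Close = "</think>";
    private readonly StringBuilder _pending = new();
    private bool _inside;

    // Feed one streamed chunk; returns the visible text it produced.
    public string Push(string chunk)
    {
        _pending.Append(chunk);
        var output = new StringBuilder();
        while (true)
        {
            string buf = _pending.ToString();
            string tag = _inside ? Close : Open;
            int idx = buf.IndexOf(tag, System.StringComparison.Ordinal);
            if (idx >= 0)
            {
                if (!_inside) output.Append(buf, 0, idx); // text before <think> is visible
                _pending.Clear();
                _pending.Append(buf, idx + tag.Length, buf.Length - idx - tag.Length);
                _inside = !_inside;
                continue;
            }
            // No full tag yet: hold back a suffix that could be a split tag, emit the rest.
            int keep = LongestTagPrefixAtEnd(buf, tag);
            if (!_inside) output.Append(buf, 0, buf.Length - keep);
            _pending.Clear();
            _pending.Append(buf, buf.Length - keep, keep);
            return output.ToString();
        }
    }

    // Length of the longest suffix of buf that is a proper prefix of tag.
    private static int LongestTagPrefixAtEnd(string buf, string tag)
    {
        for (int len = System.Math.Min(tag.Length - 1, buf.Length); len > 0; len--)
            if (string.CompareOrdinal(buf, buf.Length - len, tag, 0, len) == 0)
                return len;
        return 0;
    }
}
```

For example, `Push("Hello <thi")` emits `"Hello "` and buffers `"<thi"`; the follow-up `Push("nk>secret</think> world")` swallows the hidden block and emits `" world"`.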
Priority: UI toggle → AgentMemoryPolicy.SetStreamingEnabled(role, bool) → AgentBuilder.WithStreaming(bool) → CoreAISettings.EnableStreaming (default true).
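The same chain, spelled out with the calls named in this README (a sketch — anything beyond the signatures shown here is an assumption):

```csharp
// 1. Runtime UI toggle (chat panel) — highest priority.

// 2. Per-role policy override:
AgentMemoryPolicy.SetStreamingEnabled("Blacksmith", false);

// 3. Per-agent default chosen at build time:
var agent = new AgentBuilder("Blacksmith")
    .WithStreaming(true)
    .Build();

// 4. Global fallback: the EnableStreaming flag on the CoreAISettings asset (default true).
```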
Deep dive: STREAMING_ARCHITECTURE.
| Level | Documents |
|---|---|
| 🟢 Beginner | QUICK_START · QUICK_START_FULL · COREAI_SINGLETON_API · AGENT_BUILDER · COREAI_SETTINGS · EXAMPLES |
| 💬 Chat & streaming | README_CHAT · STREAMING_ARCHITECTURE |
| 🟡 Intermediate | TOOL_CALL_SPEC · MemorySystem · AI_AGENT_ROLES · WORLD_COMMANDS · TROUBLESHOOTING |
| 🔴 Architecture | DEVELOPER_GUIDE · DGF_SPEC · MEAI_TOOL_CALLING · MULTIPLAYER_AI |
Full map: DOCS_INDEX.
| Model | Size | Tool calling | Notes |
|---|---|---|---|
| Qwen3.5-4B | 4B | ✅ Excellent | Recommended local GGUF |
| Qwen3.5-35B (MoE) via API | 35B/3A | ✅ Excellent | Fast as 4B, accurate as 35B |
| Gemma 4 26B (LM Studio) | 26B | ✅ Excellent | Great over HTTP API |
| Qwen3.5-2B | 2B | Occasional mistakes in multi-step | |
| Qwen3.5-0.8B | 0.8B | Most tests pass | |
🏆 Qwen3.5-4B passes the full PlayMode suite and is the production minimum.
```
Unity → Window → General → Test Runner
├── EditMode — large fast suite (no real LLM): streaming, Lua, tools, rate limit, CoreAi facade, orchestrator streaming, …
└── PlayMode — integration tests; needs HTTP (env vars) or local GGUF
```
Details: LLMUNITY_SETUP_AND_MODELS §7 (COREAI_OPENAI_TEST_* for HTTP).
Add via Unity Package Manager → Add package from Git URL:
```
https://github.com/NeoXider/CoreAI.git?path=Assets/CoreAI       # core first
https://github.com/NeoXider/CoreAI.git?path=Assets/CoreAiUnity  # then Unity layer
```
NuGet DLLs and Git dependencies for VContainer/MoonSharp/UniTask/MessagePipe/LLMUnity — see the root README §Quick Start.
Neoxider · NeoxiderTools · License: PolyForm Noncommercial 1.0
🎮 CoreAI Unity — stop writing dialogue trees. Wire the model once — ship chat, tools, and streaming without losing weekends to plumbing.