
🎮 CoreAI Unity — where the LLM meets your scene

Imagine: your NPC merchant checks inventory, haggles on price, applies a discount, and processes the purchase — all through real function calls to your game code, streamed token-by-token into a chat bubble. No scripted branches. No fake replies. That's this package.

This is the Unity half of CoreAI: MEAI clients, VContainer wiring, UI Toolkit chat, streaming filters, production error diagnostics, and Editor menus that spare you copy-paste.

| Package | Depends on | Status |
|---|---|---|
| com.nexoider.coreaiunity (package.json, version) | com.nexoider.coreaicore (package.json) | ✅ Production |

Changelog: CHANGELOG.md (release notes; keep version in package.json in sync when you ship).

Languages: English · Russian

First time? Open DOCS_INDEX or go straight to QUICK_START. Need a one-liner from code? See COREAI_SINGLETON_API.


Contents

  • CoreAi: static Ask / Stream / orchestration (section below)
  • Agent: AgentBuilder, tools, memory
  • Chat: one-click demo + CoreAiChatPanel
  • Streaming: HTTP / LLMUnity, filters, cancellation
  • LLM modes: LocalModel, ClientOwnedApi, ClientLimited, ServerManagedApi, mixed routing
  • Docs · Tests · Install: end of this file

🎯 CoreAi — one static entry point (new in 0.21)

Call the LLM from any script without DI boilerplate:

using CoreAI;

string reply = await CoreAi.AskAsync("Hello!");
await foreach (var chunk in CoreAi.StreamAsync("Tell a story", "PlayerChat"))
    label.text += chunk;
if (CoreAi.TryGetChatService(out var chat)) { /* optional AI */ }

Full guide (beginner checklist + pro patterns): COREAI_SINGLETON_API


Changelog

Release notes and version bumps live in CHANGELOG.md only (this file does not duplicate them). Bump version in package.json when you ship.

Current stable line: 1.0.0. It introduces public LLM execution modes and multi-mode role routing.


🏗️ Build an agent

var blacksmith = new AgentBuilder("Blacksmith")
    .WithSystemPrompt("You are a blacksmith. Sell weapons and remember purchases.")
    .WithTool(new InventoryLlmTool(myInventory))
    .WithMemory()
    .WithMode(AgentMode.ToolsAndChat)
    .WithStreaming(true)          // per-agent override (0.20+)
    .Build();

blacksmith.ApplyToPolicy(CoreAIAgent.Policy);
await blacksmith.Ask("Show me your swords");

Docs: AGENT_BUILDER · TOOL_CALL_SPEC · MemorySystem


💬 Add chat UI in 1 click

CoreAI → Setup → Create Chat Demo Scene

Generates a ready-to-use scene with CoreAILifetimeScope, CoreAiChatPanel, panel settings, and a CoreAiChatConfig_Demo asset. Set your backend in CoreAISettings and press Play.

Manual setup, configuration hierarchy and styling: README_CHAT.


🔧 Built-in tools

| Tool | Purpose |
|---|---|
| 🧠 MemoryTool | Per-role JSON memory on disk |
| 📜 LuaTool | Sandboxed Lua execution (step/timeout guards, `<think>` stripped) |
| 🎒 InventoryLlmTool | NPC inventory queries |
| ⚙️ GameConfigTool | Read/modify game configs |
| 🌍 SceneLlmTool | Hierarchy & transforms in PlayMode |
| 📸 CameraLlmTool | Base64 JPEG screenshots for vision models |

Create your own — implement ILlmTool and register via AgentBuilder.WithTool(...).
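A minimal custom tool might look like the sketch below. This is illustrative only: the actual members of ILlmTool are defined in TOOL_CALL_SPEC, and the Name/Description/InvokeAsync shape shown here is an assumption, as is the hypothetical TimeOfDayTool.

```csharp
// Hypothetical sketch — check ILlmTool's real members in TOOL_CALL_SPEC;
// the Name / Description / InvokeAsync shape below is an assumption.
using System.Threading;
using System.Threading.Tasks;

public sealed class TimeOfDayTool : ILlmTool
{
    public string Name => "get_time_of_day";
    public string Description => "Returns the current in-game time of day.";

    public Task<string> InvokeAsync(string argsJson, CancellationToken ct = default)
    {
        // Replace with your game's clock; argsJson carries the model's arguments.
        return Task.FromResult("{\"timeOfDay\":\"dusk\"}");
    }
}

// Registration goes through the builder shown above:
// new AgentBuilder("Guide").WithTool(new TimeOfDayTool()).Build();
```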


🌊 Streaming & cancellation

  • ClientOwnedApi, ClientLimited, ServerManagedApi: MeaiOpenAiChatClient parses OpenAI-compatible SSE. Cancellation aborts UnityWebRequest immediately.
  • LocalModel: LlmUnityMeaiChatClient bridges LLMUnity's frame callbacks to IAsyncEnumerable.
  • Both paths run through ThinkBlockStreamFilter — a state machine that removes <think>…</think> blocks even when tags are split across chunks.
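The tag-stripping idea can be sketched as a small state machine. This is an illustrative reimplementation, not the package's ThinkBlockStreamFilter source: it buffers a possible partial tag at the end of each chunk so `<think>…</think>` is removed even when a tag is split across chunk boundaries.

```csharp
// Illustrative sketch of a split-tag-safe <think> filter (not the package source).
public sealed class ThinkBlockFilterSketch
{
    private bool _inside;          // currently inside a <think> block?
    private string _pending = "";  // trailing text that might be a partial tag

    public string Feed(string chunk)
    {
        var text = _pending + chunk;
        _pending = "";
        var output = new System.Text.StringBuilder();
        int i = 0;
        while (i < text.Length)
        {
            string tag = _inside ? "</think>" : "<think>";
            int idx = text.IndexOf(tag, i, System.StringComparison.Ordinal);
            if (idx >= 0)
            {
                if (!_inside) output.Append(text, i, idx - i); // keep visible text
                _inside = !_inside;                            // flip state at the tag
                i = idx + tag.Length;
                continue;
            }
            // No complete tag: hold back a possible tag prefix for the next chunk.
            int keep = PartialTagLength(text, i, tag);
            if (!_inside) output.Append(text, i, text.Length - keep - i);
            _pending = text.Substring(text.Length - keep);
            break;
        }
        return output.ToString();
    }

    // Length of the longest suffix of `text` that is a proper prefix of `tag`.
    private static int PartialTagLength(string text, int start, string tag)
    {
        int max = System.Math.Min(tag.Length - 1, text.Length - start);
        for (int len = max; len > 0; len--)
            if (string.CompareOrdinal(text, text.Length - len, tag, 0, len) == 0)
                return len;
        return 0;
    }
}
```

Feeding `"Hello <thi"` then `"nk>secret</think> world"` emits `"Hello "` and then `" world"`: the split opening tag and the hidden block never reach the UI.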

Priority: UI toggle → AgentMemoryPolicy.SetStreamingEnabled(role, bool) → AgentBuilder.WithStreaming(bool) → CoreAISettings.EnableStreaming (default: true).

Deep dive: STREAMING_ARCHITECTURE.
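Cancelling mid-stream from the consumer side can be sketched like this. Assumptions: CoreAi.StreamAsync returns IAsyncEnumerable<string> (as the earlier example suggests), and WithCancellation (standard .NET) forwards the token to the enumerator; whether that token actually aborts the request is up to the client, per the bullets above.

```csharp
// Sketch: stop a stream after 10 s or on demand. Assumes StreamAsync returns
// IAsyncEnumerable<string>; token propagation to the underlying request is
// client-dependent (the HTTP path aborts the UnityWebRequest, per above).
using System;
using System.Threading;
using System.Threading.Tasks;

var cts = new CancellationTokenSource(TimeSpan.FromSeconds(10));
try
{
    await foreach (var chunk in CoreAi.StreamAsync("Tell a story", "PlayerChat")
                       .WithCancellation(cts.Token))
        label.text += chunk;
}
catch (OperationCanceledException)
{
    label.text += " [stopped]";   // surface the cutoff to the player
}
```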


📖 Documentation

| Level | Documents |
|---|---|
| 🟢 Beginner | QUICK_START · QUICK_START_FULL · COREAI_SINGLETON_API · AGENT_BUILDER · COREAI_SETTINGS · EXAMPLES |
| 💬 Chat & streaming | README_CHAT · STREAMING_ARCHITECTURE |
| 🟡 Intermediate | TOOL_CALL_SPEC · MemorySystem · AI_AGENT_ROLES · WORLD_COMMANDS · TROUBLESHOOTING |
| 🔴 Architecture | DEVELOPER_GUIDE · DGF_SPEC · MEAI_TOOL_CALLING · MULTIPLAYER_AI |

Full map: DOCS_INDEX.


📏 Recommended models

| Model | Size | Tool calling | Notes |
|---|---|---|---|
| Qwen3.5-4B | 4B | ✅ Excellent | Recommended local GGUF |
| Qwen3.5-35B (MoE) via API | 35B/3A | ✅ Excellent | As fast as a 4B, as accurate as a 35B |
| Gemma 4 26B (LM Studio) | 26B | ✅ Excellent | Works great over the HTTP API |
| Qwen3.5-2B | 2B | ⚠️ Works | Occasional mistakes in multi-step calls |
| Qwen3.5-0.8B | 0.8B | ⚠️ Basic | Most tests pass |

🏆 Qwen3.5-4B passes the full PlayMode suite and is the production minimum.


🧪 Tests

Unity → Window → General → Test Runner
  ├── EditMode — large fast suite (no real LLM): streaming, Lua, tools, rate limit, CoreAi facade, orchestrator streaming, …
  └── PlayMode — integration tests; needs HTTP (env vars) or local GGUF

Details: LLMUNITY_SETUP_AND_MODELS §7 (set the COREAI_OPENAI_TEST_* environment variables for HTTP).


📦 Install

Add via Unity Package Manager → Add package from Git URL:

https://github.com/NeoXider/CoreAI.git?path=Assets/CoreAI          # core first
https://github.com/NeoXider/CoreAI.git?path=Assets/CoreAiUnity     # then Unity layer

NuGet DLLs and Git dependencies for VContainer/MoonSharp/UniTask/MessagePipe/LLMUnity — see the root README §Quick Start.


🤝 Author

Neoxider · NeoxiderTools · License: PolyForm Noncommercial 1.0

🎮 CoreAI Unity — stop writing dialogue trees. Wire the model once — ship chat, tools, and streaming without losing weekends to plumbing.