AIChatBot is an AI-powered chatbot that runs both locally and online, built on open-source language models. The project supports locally hosted models (via Ollama) and cloud-based models (via OpenRouter), and demonstrates integrating AI with a .NET 9 API and an Angular 20 frontend.
Watch the AIChatBot in action on YouTube:
📺 AIChatBot Demo
To run this project locally, ensure the following:
- Windows 10/11 with WSL support
- Installed Ubuntu 20.04.6 LTS (via Microsoft Store)
- .NET 9 SDK
- Node.js (v20+)
- Angular CLI (`npm install -g @angular/cli`)
- Ollama installed in Ubuntu for running local AI models
- Optional: Account on https://openrouter.ai
- Go to Microsoft Store → Search for Ubuntu 20.04.6 LTS → Install.
- Open Ubuntu and create your UNIX user account.
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
To pull and run the desired models:
```bash
# Pull models
ollama pull phi3:latest
ollama pull mistral:latest
ollama pull gemma:2b
ollama pull llama3:latest

# Run models
ollama run phi3
ollama run mistral
ollama run gemma:2b
ollama run llama3

# List all pulled models
ollama list

# Stop a running model
ollama stop phi3

# View running models
ps aux | grep ollama
```
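Once a local model is running, the backend can talk to it over Ollama's REST API. The sketch below shows how such a request might be shaped in TypeScript; the endpoint and payload fields follow Ollama's documented `/api/generate` API, and the base URL assumes Ollama's default port 11434.

```typescript
// Sketch: shaping a request for Ollama's local REST API (POST /api/generate).
// Assumes Ollama is listening on its default port, 11434.

interface OllamaGenerateRequest {
  model: string;   // e.g. "phi3:latest"
  prompt: string;  // the user's message
  stream: boolean; // false = one JSON response instead of a token stream
}

function buildOllamaRequest(model: string, prompt: string) {
  const body: OllamaGenerateRequest = { model, prompt, stream: false };
  return {
    url: "http://localhost:11434/api/generate",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    },
  };
}

const req = buildOllamaRequest("phi3:latest", "Why is the sky blue?");
console.log(req.url); // http://localhost:11434/api/generate
// With Ollama running, `fetch(req.url, req.init)` would return the model's reply.
```

The same call works for any of the pulled models above; only the `model` tag changes.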
From the Ubuntu terminal:

```bash
shutdown now
```
Or simply close the terminal window if you don't need a full shutdown.
- Go to https://openrouter.ai and sign up.
- Navigate to API Keys in your profile and generate an API key.
- Set this key as an environment variable in your API project:

  ```bash
  export OPENROUTER_API_KEY=your_key_here
  ```

- Models used:
  - google/gemma-3-27b-it:free
  - deepseek/deepseek-chat-v3-0324:free
API requests are routed via OpenRouter using this key, supporting seamless AI chat.
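OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so a request from the app might be shaped as in this sketch. The URL and Bearer-token header follow OpenRouter's documented API; the model tag is one of the two listed above, and in practice the key would come from the `OPENROUTER_API_KEY` variable set in the previous step rather than a literal.

```typescript
// Sketch: shaping a chat request for OpenRouter's OpenAI-compatible endpoint.
// "your_key_here" stands in for the real key from OPENROUTER_API_KEY.

function buildOpenRouterRequest(apiKey: string, model: string, userMessage: string) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`, // OpenRouter auth header
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

const req = buildOpenRouterRequest(
  "your_key_here",
  "google/gemma-3-27b-it:free",
  "Hello!"
);
console.log(req.url); // https://openrouter.ai/api/v1/chat/completions
```

Switching between the two online models is just a matter of passing a different model tag.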
- Navigate to `AIChatBot.API/`
- Run the following commands:

  ```bash
  dotnet restore
  dotnet build
  dotnet run
  ```
- Ensure the `appsettings.json` file includes your OpenRouter key, e.g.:

  ```json
  { "ApiKey": "YOUR_KEY_HERE" }
  ```
- Navigate to `AIChatBot.UI/`
- Run:

  ```bash
  npm install
  ng serve
  ```

- Access the chatbot UI at http://localhost:4200/
| Model | Type | Source | Access |
|---|---|---|---|
| phi3:latest | Local | Ollama | `ollama run` |
| mistral:latest | Local | Ollama | `ollama run` |
| gemma:2b | Local | Ollama | `ollama run` |
| llama3:latest | Local | Ollama | `ollama run` |
| google/gemma-3-27b-it:free | Online | OpenRouter.ai | API key |
| deepseek/deepseek-chat-v3-0324:free | Online | OpenRouter.ai | API key |
```
AIChatBot/
│
├── AIChatBot.API/   # .NET 9 API for chatbot
├── AIChatBot.UI/    # Angular 20 UI frontend
└── README.md        # Project documentation
```
The AIChatBot supports two advanced operation modes beyond simple chat:
In this mode, the AI can recognize specific tasks in user prompts and use internal tools (functions) to perform actions. Integrated tools include:
| Tool Function | Description | Example Prompt |
|---|---|---|
| `CreateFile` | Creates a text file with given content | "Create a file called report.txt with the text Hello world." |
| `FetchWebData` | Fetches the HTML/content of a public URL | "Fetch the content of https://example.com" |
| `SendEmail` | Simulates sending an email (console-logged) | "Send an email to john@example.com with subject Hello." |
These functions are executed server-side in .NET, with input parsed from natural language prompts.
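The actual parsing happens in the .NET backend; as an illustration only, the sketch below uses simple keyword matching to stand in for that natural-language step. The tool names mirror the table above, but the matching rules here are hypothetical, not the project's implementation.

```typescript
// Sketch: a minimal tool selector, assuming keyword matching as a
// stand-in for real natural-language parsing. Tool names match the
// table above; the rules themselves are illustrative.

type ToolName = "CreateFile" | "FetchWebData" | "SendEmail" | "None";

function selectTool(prompt: string): ToolName {
  const p = prompt.toLowerCase();
  if (p.includes("create a file")) return "CreateFile";
  if (p.includes("fetch")) return "FetchWebData";
  if (p.includes("send an email")) return "SendEmail";
  return "None"; // no tool matched: fall back to plain chat
}

console.log(selectTool("Create a file called report.txt with the text Hello world."));
// CreateFile
console.log(selectTool("What is the capital of France?"));
// None
```

A real parser would also extract the tool's arguments (file name, URL, recipient) from the prompt before invoking the matching server-side function.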
The AI agent is capable of:
- Understanding high-level tasks
- Selecting and invoking appropriate tools
- Providing intelligent responses based on the outcome
This is powered by an `AgentService` that works with both local LLMs (via Ollama) and cloud models (via OpenRouter) to determine the right function to execute and handle the response.
You can toggle between AI modes via the UI:
- Chat-Only Mode
- AI + Tools Mode
- Agent Mode (multi-step planning, coming soon)
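One way to picture the toggle is as a mapping from each UI mode to backend behavior flags. This sketch is a hypothetical illustration of that mapping, not the project's actual API contract; the mode names come from the list above.

```typescript
// Sketch: mapping the UI's mode toggle to behavior flags
// (hypothetical structure; mode names from the list above).

type ChatMode = "chat-only" | "ai-tools" | "agent";

function modeFlags(mode: ChatMode) {
  return {
    toolsEnabled: mode !== "chat-only",   // tools are off in Chat-Only Mode
    multiStepPlanning: mode === "agent",  // Agent Mode (coming soon)
  };
}

console.log(modeFlags("ai-tools")); // { toolsEnabled: true, multiStepPlanning: false }
```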
- Choose your preferred model type (local or online).
- Start the backend using .NET 9 (`dotnet run`).
- Start the frontend using the Angular CLI (`ng serve`).
- Interact with AIChatBot at http://localhost:4200/
Pull requests and suggestions are welcome! Feel free to fork the repo and enhance it.
This project is open-source and available under the MIT License.