Welcome to Rusty Chat, a full-stack web application built using the Leptos web framework, Axum backend, and Tokio for asynchronous operations. The app integrates a local AI language model using Ollama to offer intelligent chat responses. This project is designed for developers interested in building performant, real-time chat applications with a modern Rust stack.
- Real-time Chat: 💬 Leverages HTTP streaming for seamless real-time messaging.
- AI-Powered Chat: 🤖 Integrate Ollama to run local AI models for enhanced chat responses.
- Full-Stack Development: 🌐 Leptos handles both the frontend and server-side code, with Axum as the backend server.
- Async Framework: ⚡ Tokio is used for handling asynchronous tasks efficiently.
- TailwindCSS: 🎨 Beautiful, responsive UI powered by TailwindCSS.
- WebAssembly: 🕹️ Support for WebAssembly (WASM) for better performance in the browser.
🎥 Demo Video
You can watch the demo of Rusty Chat here:
To get started with Rusty Chat, you'll need the following tools installed:
- Rust (Nightly version): Install Rust through rustup.
- Node.js & npm: Required for managing frontend dependencies and building assets.
- cargo-leptos: Install this tool to manage Leptos-specific tasks.
Install cargo-leptos with the following command:
```bash
cargo install cargo-leptos --locked
```

Clone the project to your local machine:

```bash
git clone https://github.com/Abhishek2010DevSingh/RustyChat
cd RustyChat
```

Install the required dependencies using Cargo:

```bash
cargo build
```

For frontend assets, you can install the necessary npm packages by running:

```bash
npm install
```

Start the development server with the following command:

```bash
cargo leptos watch
```

This will compile the Rust backend and the frontend assets, then serve the application locally.
By default, the app will be hosted at 127.0.0.1:3000.
Rusty Chat uses TailwindCSS for styling. Make sure `tailwind-input-file` is set to `"style/main.css"` in the `[package.metadata.leptos]` section of your Cargo.toml file. You can customize the styles according to your needs.
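A minimal sketch of the relevant Cargo.toml section — the `tailwind-input-file` value comes from this setup, while `output-name` is an illustrative assumption, so check it against your own Cargo.toml:

```toml
[package.metadata.leptos]
# Name of the compiled output bundle (illustrative value)
output-name = "rusty-chat"
# TailwindCSS entry point compiled by cargo-leptos
tailwind-input-file = "style/main.css"
```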
To start the build process for the frontend assets, run the following:
```bash
npm run build
```

If you want to use the AI chat functionality with Ollama, make sure you have Ollama installed and set up locally.
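For example, you can fetch a model and smoke-test the local Ollama API from the command line. The model name below is only an illustration — use whichever model the app is configured for (these commands assume the Ollama daemon is running on its default port, 11434):

```bash
# Pull a model into the local Ollama store (model name is illustrative)
ollama pull llama3

# Ask for a one-off completion to confirm the API is reachable
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello!", "stream": false}'
```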
Then start the app with the `ssr` feature enabled:

```bash
cargo leptos watch --features "ssr"
```

This enables server-side rendering (SSR) along with Ollama's local AI model integration.
Rusty Chat includes end-to-end testing with Playwright. You can run tests using the following command:
```bash
cargo leptos end-to-end
```

To run the tests in release mode:

```bash
cargo leptos end-to-end --release
```

For a production build, use the following command:

```bash
cargo leptos build --release
```

This will compile both the backend and the frontend assets and prepare them for deployment.
After building for release, you can deploy the compiled server and the static site to your production server.
Copy the following files from the target directory to your server:
- Server binary: located at `target/server/release`
- Site package: located at `target/site`
Set up the environment variables for your project:
```bash
LEPTOS_OUTPUT_NAME="rusty-chat"
LEPTOS_SITE_ROOT="site"
LEPTOS_SITE_PKG_DIR="pkg"
LEPTOS_SITE_ADDR="127.0.0.1:3000"
LEPTOS_RELOAD_PORT="3001"
```

Finally, run the server binary:
```bash
./target/server/release/rusty-chat
```

If you plan to run the app with WebAssembly, make sure the `wasm32-unknown-unknown` target is added:

```bash
rustup target add wasm32-unknown-unknown
```

You can then build the app for WebAssembly by running:

```bash
cargo leptos build --target wasm32-unknown-unknown
```

The app's design uses modern icons and clean UI elements that align with current web design trends. Custom icons are integrated into the app for a unique look and feel. Here's a small preview:
- Chat bubble: 💬
- AI-powered: 🤖
- Real-time: ⚡
- TailwindCSS: 🎨
If you encounter any issues or need help, feel free to open an issue on the GitHub repository.
Rusty Chat is a full-stack, real-time web chat app built with cutting-edge technologies in Rust. With features like local AI integration, real-time messaging, and full-stack development using Leptos and Axum, it provides a modern and efficient platform for building chat applications.
Happy coding! 🎉