
Car Dealership Agent with Redis Agent Memory Server


A car dealership AI agent that demonstrates how the Redis Agent Memory Server enables long-term memory and conversation context retrieval, allowing the agent to remember customer preferences across sessions and provide personalized car-purchase assistance.


Demo Objectives

  • Long-term memory storage using Redis Agent Memory Server for persistent customer preferences
  • Short-term/working memory using LangGraph checkpointers and Redis Agent Memory Server
  • Conversation context retrieval for personalized interactions across sessions
  • Agentic orchestration with LangGraph workflow stages (needs analysis → shortlist → test drive → financing)
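
The first three objectives can be illustrated with a small in-memory stand-in (all names below are hypothetical; the real Agent Memory Server persists to Redis and retrieves via semantic vector search, not the keyword overlap used here):

```python
# Hypothetical stand-in for the Agent Memory Server: long-term memories
# are stored per user and retrieved by naive keyword overlap. The real
# server persists to Redis and uses semantic (vector) search instead.
class MemoryStoreSketch:
    def __init__(self):
        self.long_term = {}  # user_id -> list of memory strings

    def save(self, user_id, memory):
        self.long_term.setdefault(user_id, []).append(memory)

    def search(self, user_id, query, top_k=2):
        # Rank stored memories by how many words they share with the query.
        words = set(query.lower().split())
        scored = [
            (len(words & set(m.lower().split())), m)
            for m in self.long_term.get(user_id, [])
        ]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [m for score, m in scored[:top_k] if score > 0]

store = MemoryStoreSketch()
store.save("alice", "prefers a 5-seater SUV")
store.save("alice", "budget around 30k USD")
context = store.search("alice", "looking for an SUV")
```

On a returning customer's first message, context like this is injected into the prompt so the agent can personalize immediately.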

Tech Stack

| Layer | Technology | Purpose |
|---|---|---|
| Memory | Redis Agent Memory Server | Long-term and working memory management |
| Database | Redis Cloud | Vector storage and session persistence |
| Orchestration | LangGraph | Stateful workflow management |
| LLM Framework | LangChain | LLM integration and prompting |
| Backend | FastAPI | Python REST API |
| Frontend | React 18 + TypeScript | User interface |
| Styling | Tailwind CSS | UI styling |
| LLM | OpenAI GPT-4 | Language model |
| Deployment | Docker + Terraform | Containerization and cloud infrastructure |

Prerequisites

  • Python 3.11+
  • Node.js 18+
  • Docker and Docker Compose
  • Redis Cloud account or local Redis instance
  • OpenAI API key

Getting Started

1. Clone the Repository

git clone <repository-url>
cd dealership-chatbot-agent-memory-demo

2. Environment Configuration

Create a .env file in the project root:

OPENAI_API_KEY=your_openai_api_key_here
REDIS_URL=redis://default:password@your-redis-host:port
MEMORY_SERVER_URL=http://localhost:8000
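If you are not using a helper such as python-dotenv, a minimal loader for this `.env` format might look like the following (a sketch; the project's backend may load configuration differently):

```python
import os

def load_env(text):
    """Parse KEY=VALUE lines (as in the .env above) into os.environ,
    skipping blanks and # comments; existing variables take precedence."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
        os.environ.setdefault(key.strip(), value.strip())
    return values

cfg = load_env("MEMORY_SERVER_URL=http://localhost:8000\n# a comment\n")
```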

3. Start Agent Memory Server

Get the pre-built Docker image from Docker Hub:

docker run -p 8000:8000 \
  -e REDIS_URL=redis://default:<password>@<your-redis-host>:<port> \
  -e OPENAI_API_KEY=<your-openai-api-key> \
  redislabs/agent-memory-server:latest \
  agent-memory api --host 0.0.0.0 --port 8000 --task-backend=asyncio

Note: This command starts the Agent Memory Server API with asyncio task backend. You must have a running Redis instance (e.g., Redis Cloud) accessible at the URL you provide.

4. Run with Docker

Build and start all services:

docker-compose up --build

Access the application at the host ports published in docker-compose.yml (the Agent Memory Server from step 3 listens on port 8000).

5. Run for Development

Backend:

cd backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python main.py

Frontend:

cd frontend
npm install
npm run dev

Screenshots

(Screenshots: landing page and chatbot interface)

Architecture

(Architecture diagram)

Architecture Flow

User Query
    ↓
[Retrieve Conversation Context] → Load past preferences from long-term memory
    ↓
[Parse Slots] → Extract car preferences using LLM
    ↓
[Ensure Readiness] → Check if all required slots are filled
    ↓
[Decide Next]
    ├→ Missing slots? → Ask follow-up question
    └→ All slots filled? → Advance to next stage
         ↓
    [Workflow Stages]
         ├→ Brand Selected? → Suggest Models
         ├→ Model Selected? → Suggest Test Drive
         ├→ Test Drive Completed? → Suggest Financing
         └→ Financing Discussed? → Prepare for Delivery
         ↓
    [Save to Memory] → Store conversation and preferences
         ↓
    Response to User
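
The Decide Next step above can be sketched as a slot-filling check (the slot names are illustrative; the actual logic lives in orchestrator.py and may differ):

```python
REQUIRED_SLOTS = ("brand", "model", "budget")  # illustrative slot names

def decide_next(slots):
    """Return a follow-up question for the first missing slot,
    or advance the workflow once every required slot is filled."""
    for name in REQUIRED_SLOTS:
        if not slots.get(name):
            return ("ask", f"What {name} are you considering?")
    return ("advance", "suggest_test_drive")
```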

Project Structure

dealership-chatbot-agent-memory-demo/
├── backend/
│   ├── main.py              # FastAPI application
│   ├── orchestrator.py      # LangGraph workflow
│   └── requirements.txt     # Python dependencies
├── frontend/
│   ├── src/
│   │   ├── components/      # React components
│   │   └── contexts/        # React contexts
│   ├── package.json
│   └── nginx.conf           # Production server config
├── docker/
│   ├── Dockerfile.backend
│   └── Dockerfile.frontend
├── terraform/
│   ├── main.tf               # AWS infrastructure
│   ├── variables.tf          # Variable definitions
│   ├── outputs.tf            # Output definitions
│   └── user_data.sh          # EC2 bootstrap script
├── docker-compose.yml
└── README.md

Usage

  1. Start a conversation by logging in with any username
  2. Share your preferences (e.g., "I'm looking for a 5-seater SUV")
  3. Browse recommendations based on your requirements
  4. Select a model and schedule a test drive
  5. Complete the journey through financing options

The agent remembers your preferences across sessions, so returning customers get personalized recommendations immediately.
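
Cross-session recall can be sketched like this: per-session state (stage, history) resets every login, but preferences keyed by the login username survive (a toy, dict-backed model of what the Agent Memory Server provides):

```python
# Stand-in for long-term memory: username -> saved preferences.
preferences = {}

def start_session(username):
    """Begin a fresh session; only the per-session state is new,
    while previously saved preferences are loaded back in."""
    session = {"username": username, "stage": "needs_analysis", "history": []}
    session["known_prefs"] = dict(preferences.get(username, {}))
    return session

# Session 1: the customer states a preference, which is persisted.
s1 = start_session("alice")
preferences.setdefault("alice", {})["body_style"] = "5-seater SUV"

# Session 2 (later): a brand-new session still recalls the preference.
s2 = start_session("alice")
```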

Cloud Deployment

Deploy to AWS EC2 using Terraform.

Prerequisites:

  • AWS account with credentials configured
  • Terraform installed (>= 1.0)
  • SSH key pair in AWS EC2

Quick Start:

cd terraform
cp terraform.tfvars.example terraform.tfvars
# Edit terraform.tfvars with your values
terraform init
terraform plan
terraform apply

Full deployment guide: See terraform/README.md for detailed instructions.

Resources

Maintainers

License

This project is licensed under the MIT License - see the LICENSE file for details.
