This is a simple chatbot project built using LangGraph, a graph-based framework for orchestrating LLMs, and powered by the Groq API with the LLaMA3-8B-8192 model. The chatbot demonstrates how to structure conversational flows using state graphs and message reducers for memory. Ideal for learning how to integrate LLMs into modular applications.


🤖 Basic Chatbot using LangGraph & Groq API

Welcome to this minimal yet functional chatbot project that leverages the LangGraph Graph API and Groq's LLaMA3-8B-8192 model! This chatbot demonstrates how to structure conversational flows using a graph-based state machine and large language models.


📌 Features

  • 🌐 Built using LangGraph – A graph-based framework for building LLM-powered applications.
  • 🚀 Powered by Groq API with LLaMA3-8B-8192 model.
  • 🔁 Graph execution flow: Define states → Add LLM node → Connect via edges → Compile → Invoke.
  • 🧠 Stateful memory with message history via add_messages.
  • 🔎 Visualizes graph using Mermaid.

📁 Project Structure

.
├── 1-basicchatbot.ipynb     # Jupyter Notebook with complete chatbot code
├── .env                     # Stores your Groq API Key
└── README.md                # Project documentation

🧰 Requirements

Make sure you have the following Python packages installed:

pip install langgraph langchain langchain-groq python-dotenv

🔐 .env Configuration

Create a .env file in the same directory as your notebook/script and add:

GROQ_API_KEY=your_groq_api_key_here
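In the notebook, load_dotenv() from python-dotenv reads this file into the process environment so ChatGroq can find the key. As a rough pure-Python sketch of what that call does (load_env_file is a hypothetical helper, not the library's implementation):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Rough sketch of python-dotenv's load_dotenv: parse KEY=VALUE
    lines and export them into os.environ (existing values win)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In practice just call load_dotenv() from the python-dotenv package before constructing the LLM.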

🧱 How It Works

1. Define State

from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages

class State(TypedDict):
    messages: Annotated[list, add_messages]

The chatbot keeps track of the conversation in a list. The add_messages reducer tells LangGraph how to merge state updates: new user and LLM messages are appended to the existing history instead of replacing it.
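Conceptually, add_messages is a reducer: it takes the existing message list and the update, and returns the merged list. A simplified pure-Python stand-in (the real add_messages also handles message IDs and format coercion) illustrates the idea:

```python
def append_messages(existing: list, new: list) -> list:
    # Simplified reducer: the real add_messages also deduplicates
    # by message ID and coerces dicts/tuples into Message objects.
    return list(existing) + list(new)

state = {"messages": []}
state["messages"] = append_messages(state["messages"], [("user", "Hiiii")])
state["messages"] = append_messages(state["messages"], [("assistant", "Hello!")])
# state["messages"] now holds both turns, in order
```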


2. Initialize Groq LLM

from dotenv import load_dotenv
from langchain_groq import ChatGroq

load_dotenv()  # loads GROQ_API_KEY from .env
llm = ChatGroq(model="llama3-8b-8192")

Alternatively, you can use init_chat_model("groq:llama3-8b-8192") from langchain.chat_models, LangChain's provider-agnostic initializer.


3. Define Node Function

def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

The chatbot node takes the current state, sends the full message history to the LLM, and returns a partial state update containing the LLM's reply; the add_messages reducer then appends it to the history.
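You can check this node contract without an API key by swapping the LLM for a stub (FakeLLM below is an illustrative stand-in, not part of LangChain):

```python
class FakeLLM:
    """Stand-in for ChatGroq so the node can run offline."""
    def invoke(self, messages):
        return ("assistant", "Hello! How can I help you today?")

fake_llm = FakeLLM()

def chatbot(state: dict):
    # The node returns a partial state update; the add_messages
    # reducer appends the new message to the running history.
    return {"messages": [fake_llm.invoke(state["messages"])]}

update = chatbot({"messages": [("user", "Hiiii")]})
```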


4. Build and Compile Graph

from langgraph.graph import StateGraph, START, END

graph_builder = StateGraph(State)
graph_builder.add_node("llmchatbot", chatbot)
graph_builder.add_edge(START, "llmchatbot")
graph_builder.add_edge("llmchatbot", END)

graph = graph_builder.compile()

This builds the LangGraph with a simple START → LLM → END flow.


5. Invoke the Chatbot

response = graph.invoke({"messages": [{"role": "user", "content": "Hiiii"}]})
print(response["messages"][-1].content)
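Since invoking the real graph needs a GROQ_API_KEY and network access, here is a pure-Python sketch that mimics the compiled START → node → END flow with a stubbed reply (compile_linear_graph and the canned message are illustrative stand-ins, not LangGraph APIs):

```python
from typing import Callable

def compile_linear_graph(node: Callable[[dict], dict]):
    """Toy stand-in for graph_builder.compile(): runs a single-node
    START -> node -> END flow and applies an append-style reducer."""
    def invoke(inputs: dict) -> dict:
        state = {"messages": list(inputs["messages"])}
        update = node(state)                       # run the one node
        state["messages"] += update["messages"]    # append-style reducer
        return state
    return invoke

def chatbot(state: dict) -> dict:
    # Canned reply; the real node calls llm.invoke(state["messages"])
    return {"messages": [("assistant", "Hello! How can I help you today?")]}

graph_invoke = compile_linear_graph(chatbot)
response = graph_invoke({"messages": [("user", "Hiiii")]})
```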

🧠 Graph Visualization

To visualize the LangGraph structure:

from IPython.display import Image, display
display(Image(graph.get_graph().draw_mermaid_png()))

📸 Demo Output

Input:  Hiiii
Output: Hello! How can I help you today?

💡 Notes

  • This is a minimal working prototype. You can extend it with more nodes like context handling, retrieval, tools, or memory chains.
  • You can swap out Groq with OpenAI, Mistral, or any other LLM backend supported by LangChain.
  • Don't forget to keep your API keys secure.

📜 License

This project is licensed under the MIT License. Feel free to modify and build upon it!


🧑‍💻 Author

Jiya Shetty
📧 jiyashetty173@somaiya.edu
👩‍🎓 TY BTech Electronics and Computer Engineering
📍 K. J. Somaiya College of Engineering
