Welcome to this minimal yet functional chatbot project that leverages the LangGraph Graph API and Groq's LLaMA3-8B-8192 model! This chatbot demonstrates how to structure conversational flows using a graph-based state machine and large language models.
- 🌐 Built using LangGraph – A graph-based framework for building LLM-powered applications.
- 🚀 Powered by the Groq API with the LLaMA3-8B-8192 model.
- 🔁 Graph execution flow: Define states → Add LLM node → Connect via edges → Compile → Invoke.
- 🧠 Stateful memory with message history via `add_messages`.
- 🔎 Visualizes the graph structure using Mermaid.
```
.
├── 1-basicchatbot.ipynb   # Jupyter Notebook with complete chatbot code
├── .env                   # Stores your Groq API Key
└── README.md              # Project documentation
```
Make sure you have the following Python packages installed:
```bash
pip install langgraph langchain langchain-groq python-dotenv
```
Create a `.env` file in the same directory as your notebook/script and add:

```
GROQ_API_KEY=your_groq_api_key_here
```
```python
from typing import Annotated

from typing_extensions import TypedDict
from langgraph.graph.message import add_messages

class State(TypedDict):
    messages: Annotated[list, add_messages]
```
The chatbot tracks the whole conversation in a list via the `add_messages` reducer, which appends new user/LLM messages to the state instead of overwriting it.
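The append behavior can be illustrated with a dependency-free sketch of the reducer pattern (plain Python, not LangGraph's actual implementation):

```python
# Sketch of the reducer idea behind add_messages: a node returns only the
# *new* messages, and the reducer appends them to the accumulated history
# rather than replacing it.
def append_reducer(existing: list, new: list) -> list:
    return existing + new

history: list = []
history = append_reducer(history, [{"role": "user", "content": "Hiiii"}])
history = append_reducer(history, [{"role": "assistant", "content": "Hello!"}])
print(len(history))  # 2 messages accumulated, nothing overwritten
```

Without a reducer annotation, a node's return value would simply replace the `messages` field on each step.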
```python
from langchain_groq import ChatGroq

llm = ChatGroq(model="llama3-8b-8192")
```

Alternatively, `init_chat_model("groq:llama3-8b-8192")` (from `langchain.chat_models`) gives a provider-agnostic way to create the same model in LangChain.
```python
def chatbot(state: State):
    # Call the LLM on the full message history; the returned message is
    # appended to state["messages"] by the add_messages reducer.
    return {"messages": [llm.invoke(state["messages"])]}
```

The chatbot node takes the current state and returns the new message generated by the LLM.
```python
from langgraph.graph import StateGraph, START, END

graph_builder = StateGraph(State)
graph_builder.add_node("llmchatbot", chatbot)
graph_builder.add_edge(START, "llmchatbot")
graph_builder.add_edge("llmchatbot", END)
graph = graph_builder.compile()
```
This builds the LangGraph with a simple START → LLM → END flow.
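The same START → node → END control flow can be traced without an API key by standing in a stub for the Groq model (a plain-Python sketch, not the LangGraph runtime):

```python
# Stub "LLM" that returns a canned reply instead of calling Groq.
def stub_llm(messages: list) -> dict:
    return {"role": "assistant", "content": "Hello! How can I help you today?"}

# Mirrors the chatbot node: return only the newly generated message.
def chatbot_node(state: dict) -> dict:
    return {"messages": [stub_llm(state["messages"])]}

def run_graph(state: dict) -> dict:
    # START -> "llmchatbot"
    update = chatbot_node(state)
    # add_messages-style reducer: append rather than overwrite
    state = {"messages": state["messages"] + update["messages"]}
    # "llmchatbot" -> END
    return state

result = run_graph({"messages": [{"role": "user", "content": "Hiiii"}]})
print(result["messages"][-1]["content"])
```

This makes the execution order explicit: the node computes a partial state update, the reducer merges it into the state, and the edge to END terminates the run.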
```python
response = graph.invoke({"messages": "Hiiii"})
print(response["messages"][-1].content)
```
To visualize the LangGraph structure:
```python
from IPython.display import Image, display

display(Image(graph.get_graph().draw_mermaid_png()))
```
```
Input:  Hiiii
Output: Hello! How can I help you today?
```
- This is a minimal working prototype; you can extend it with additional nodes for context handling, retrieval, tool use, or memory chains.
- You can swap out Groq with OpenAI, Mistral, or any other LLM backend supported by LangChain.
- Don't forget to keep your API keys secure.
This project is licensed under the MIT License. Feel free to modify and build upon it!
**Jiya Shetty**
📧 jiyashetty173@somaiya.edu
👩‍🎓 TY BTech Electronics and Computer Engineering
📍 K. J. Somaiya College of Engineering