6. Go to `http://localhost:8000/docs` to access the FastAPI Swagger documentation of the backend.
### Frontend usage
1. Enter your message in the chat interface on the Streamlit application.
2. Click the "Send" button to send the message to the backend API.

The interface sends the full chat history to the backend API to generate the response.

To clear the chat history, click the "New Chat" button.
### Backend API Usage
The backend API is a FastAPI application that handles the chatbot logic and interacts with the TinyLlama model to generate responses. The API exposes the following endpoints:
- **POST /api/v1/chat**: This endpoint receives chat messages from the frontend and generates a response using the TinyLlama model. See the Swagger documentation at `http://localhost:8000/docs` for more information.
- **GET /docs**: This endpoint provides the Swagger documentation for the backend API. You can access it by going to `http://localhost:8000/docs` in your web browser.
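
The chat endpoint can be called from any HTTP client, not just the Streamlit frontend. Below is a minimal Python sketch using only the standard library; note that the exact request payload shape (here assumed to be a `messages` list of role/content objects) is a guess for illustration — check the Swagger documentation at `http://localhost:8000/docs` for the schema the backend actually expects.

```python
import json
import urllib.request

# Hypothetical payload shape -- verify the real schema at /docs.
payload = {
    "messages": [
        {"role": "user", "content": "Hello, who are you?"},
    ]
}

# Build a POST request against the chat endpoint with a JSON body.
req = urllib.request.Request(
    "http://localhost:8000/api/v1/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the backend running, send the request and print the reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```

The actual call is left commented out so the snippet can be read without a running backend; uncomment it once the FastAPI server is up.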
### Building the Docker Image Locally
To build the Docker image for the backend FastAPI application, run the following command: