Something went wrong. Please check your internet connection. / Ollama Setup on Ubuntu #1697
Comments
Hmm, can you provide your .env file and tell me which Ollama model you are using? Also, do you see any errors inside your backend container?
My .env file:
API_KEY=None

......:~/DocsGPT$ curl http://localhost:11434/api/tags
...:/DocsGPT$ docker ps

Logs from the backend:
...: [2025-03-13 13:59:46 +0000] [1] [INFO] Handling signal: term
[2025-03-13 15:33:17 +0000] [1] [INFO] Starting gunicorn 23.0.0
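One common cause of this symptom (a sketch, not a confirmed diagnosis): when the backend runs in a Docker container, `localhost:11434` refers to the container itself, so an Ollama server listening on the host is unreachable even though `curl` from the host succeeds. A quick check, assuming a backend container named `docsgpt-backend` (a placeholder; substitute the container ID from `docker ps`):

```shell
# From the host, this works because Ollama listens on the host's loopback:
curl -s http://localhost:11434/api/tags

# From inside the backend container, localhost is the container itself,
# so the same URL fails. Try the Docker host gateway instead
# (container name is a placeholder; use your ID from `docker ps`):
docker exec docsgpt-backend curl -s http://host.docker.internal:11434/api/tags

# On Linux, host.docker.internal only resolves if the service is started with
#   extra_hosts: ["host.docker.internal:host-gateway"]
# otherwise the default bridge IP usually works:
docker exec docsgpt-backend curl -s http://172.17.0.1:11434/api/tags
```

If the in-container curl fails while the host one succeeds, the fix is to point the backend's Ollama URL at an address reachable from inside the container rather than `localhost`.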
Looks like the env vars did not pass correctly to the app.
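A quick way to confirm whether the variables from `.env` actually reached the backend process is to dump the container's environment. The container name and the filter pattern below are assumptions; substitute the ID from `docker ps` and whatever keys your `.env` defines:

```shell
# Print the environment the backend process actually sees and filter for
# likely DocsGPT keys (the pattern is a guess; adjust it to your .env).
docker exec docsgpt-backend env | grep -E 'API_KEY|LLM|OLLAMA|EMBEDDINGS'
```

If the variables you set in `.env` are missing from this output, the compose file is not passing them through, which would match the "env vars did not pass" diagnosis.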
These are the backend logs after the new install. I also got another error message: "Please try again later. We apologize for any inconvenience."

/DocsGPT$ docker logs 96f9e8caed43
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
I performed the standard installation using the setup.sh file.

After everything was successfully executed for the Ollama setup (without modifications, just the default settings), I can access the frontend. There, I can upload files and adjust options.
However, the chat does not work; it keeps showing the error message above.
Ollama is accessible on its default port and reports that it is running.
I tried installing Ollama separately and linking it, but the same error occurs. In the terminal, I can chat with the LLM without any issues.
I also tried adjusting the .env file afterward, but that didn’t help either.
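One detail that often bites here (an assumption about the setup, not a confirmed cause): Docker Compose reads `.env` only when containers are created, so editing the file while the stack is running has no effect until the containers are recreated:

```shell
# Recreate the containers so the edited .env is re-read
# (the service name "backend" is an assumption; check `docker compose ps`).
docker compose down
docker compose up -d

# Then watch the backend logs for startup errors:
docker compose logs -f backend
```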
I am running the system on Ubuntu.