Hi everyone,
I’ve been trying to connect Weaviate (running in Docker) to a local Ollama Llama3 model, but I’m running into connection issues. I’ve installed Weaviate and configured the Ollama embedders, but it keeps showing “Couldn’t connect to Ollama at http://localhost:11434”.
Setup details:
- Weaviate Chatbot installed via Docker
- Ollama Llama3 instance running locally
- Weaviate is set to connect to Ollama on localhost:11434 (rough compose sketch below)
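
For reference, this is roughly what my docker-compose.yml looks like; the image tag, module list, and environment values are approximations from memory, not my exact file:

```yaml
# Rough sketch of my compose file, not the exact one
services:
  weaviate:
    image: semitechnologies/weaviate:latest
    ports:
      - "8080:8080"
    environment:
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: "true"
      PERSISTENCE_DATA_PATH: "/var/lib/weaviate"
      # Ollama modules for embeddings and generation
      ENABLE_MODULES: "text2vec-ollama,generative-ollama"
      DEFAULT_VECTORIZER_MODULE: "text2vec-ollama"
```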
What I’ve tried:
- Verified that Docker is running and that the Ollama instance is installed.
- Confirmed that http://localhost:11434 is the correct URL (checks shown after this list).
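
These are roughly the checks I ran, including one from inside the Weaviate container, since localhost inside a container refers to the container itself rather than the host. The container name “weaviate” and the availability of wget in the image are assumptions on my part:

```bash
# From the host: Ollama should answer on 11434 if it is up
curl http://localhost:11434/api/tags

# From inside the Weaviate container ("weaviate" is a guess at the
# container name, and the image may not ship wget or curl).
# Here "localhost" means the container itself, not the host machine.
docker exec weaviate wget -qO- http://localhost:11434/api/tags

# host.docker.internal resolves to the host on Docker Desktop; on
# Linux it needs --add-host=host.docker.internal:host-gateway
docker exec weaviate wget -qO- http://host.docker.internal:11434/api/tags
```

If Ollama answers from the host but not from inside the container, I assume this points at Docker networking rather than Ollama itself, but I’d appreciate confirmation either way.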