Issue Connecting to Ollama Llama3 Instance in Docker – Need Help

Hi everyone,

I’ve been trying to set up the Ollama Llama3 model locally using Docker, but I’m running into connection issues. I’ve installed Weaviate and set up the Ollama embedders, but I keep getting the error “Couldn’t connect to Ollama at http://localhost:11434”.

Setup details:

  • Verba (the Weaviate chatbot) installed via Docker
  • Ollama Llama3 instance running locally
  • Weaviate is set to connect to Ollama on localhost:11434

What I’ve tried:

  1. Verified that Docker is running and that the Ollama instance is installed and responding.
  2. Confirmed that http://localhost:11434 is the correct URL (checked with the snippet below).
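
For reference, here’s the kind of check I ran for step 2, a minimal sketch that assumes Ollama’s default /api/tags endpoint (it lists the locally pulled models):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # the endpoint Weaviate is configured with

try:
    # /api/tags is Ollama's model-listing route; a JSON response
    # means the server is up and reachable from where this runs.
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.load(resp)["models"]]
        print(f"Ollama reachable at {OLLAMA_URL}; models: {models}")
except OSError as exc:
    print(f"Couldn't reach Ollama at {OLLAMA_URL}: {exc}")
```

Run from the host, this succeeds, so the Ollama server itself appears to be up.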

Hi @Deepak_Pachiannan!!

Welcome to our community :hugs: !!

When you run Weaviate as a Docker container, localhost from its perspective points to the Weaviate container itself, not to Ollama running on your host machine.

Setting the Ollama endpoint to localhost will only work if you run Verba directly from Python, outside of Docker.

You need to point the Ollama endpoint at the Docker host machine instead.

When you set the Ollama endpoint to http://host.docker.internal:11434, Weaviate will try to reach Ollama on the Docker host at port 11434.
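
To see the difference concretely, you can run a small probe like the one below once on the host and once inside the container (e.g. via docker exec). This is just a sketch: it assumes Python is available wherever you run it, and note that on a plain Linux Docker engine host.docker.internal only exists if the container was started with --add-host=host.docker.internal:host-gateway (Docker Desktop provides it automatically):

```python
import urllib.request

# Probe both candidate endpoints. From inside a container, localhost is
# the container itself, while host.docker.internal (when available)
# points back to the Docker host where Ollama is listening.
CANDIDATES = [
    "http://localhost:11434",
    "http://host.docker.internal:11434",
]

for base in CANDIDATES:
    try:
        # /api/tags is Ollama's model-listing route; a response means
        # the server is reachable from this network namespace.
        urllib.request.urlopen(f"{base}/api/tags", timeout=5)
        print(f"reachable:   {base}")
    except OSError as exc:
        print(f"unreachable: {base} ({exc})")
```

If you configure Verba through environment variables, the endpoint is the OLLAMA_URL value if I remember correctly, so that’s the one to set to http://host.docker.internal:11434.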

Let me know if this helps!

Thanks!