[Question] Getting Weaviate and Ollama working together running locally

I have been trying to get Weaviate and Ollama working together on my local machine, but I have yet to get Weaviate to successfully connect to Ollama at all. Does anyone have any tips?

Here is my docker-compose.yml file:

version: "3.7"
services:
  weaviate:
    image: cr.weaviate.io/semitechnologies/weaviate:1.28.2
    ports:
      - 8080:8080
      - 50051:50051
    environment:
      ENABLE_MODULES: text2vec-ollama
      OLLAMA_URL: http://ollama:11434
      TEXT2VEC_OLLAMA_BASE_URL: http://ollama:11434
      TEXT2VEC_OLLAMA_MODEL: nomic-embed-text
      WEAVIATE_HOST: 0.0.0.0
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: true
      PERSISTENCE_DATA_PATH: /var/lib/weaviate
      DEFAULT_VECTORIZER_MODULE: text2vec-ollama
      CLUSTER_HOSTNAME: node1
    volumes:
      - weaviate_data:/var/lib/weaviate
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama

volumes:
  weaviate_data:
  ollama_data:
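As an aside, separate from the connection error itself: the embedding model also has to be pulled inside the Ollama container before it can serve embeddings. Assuming the compose file above and the service name `ollama`, that might look like:

```shell
# Pull the embedding model inside the running Ollama container.
# The service name "ollama" comes from the compose file above;
# adjust if your project names it differently.
docker compose exec ollama ollama pull nomic-embed-text
```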

And the error I get when doing an insert:
} else {
318 | return Promise.reject(new WeaviateUnexpectedStatusCodeError(res.status, err));
      ^
WeaviateUnexpectedStatusCodeError: The request to Weaviate failed with status code: 500 and message: {"error":[{"message":"vectorize target vector default: update vector: send POST request: Post \"http://localhost:11434/api/embed\": dial tcp [::1]:11434: connect: connection refused"}]}
  code: 500,

Hi @PaulJPhilp!

Welcome to our community :hugs:

There aren't any variables such as TEXT2VEC_OLLAMA_BASE_URL or TEXT2VEC_OLLAMA_MODEL.

Do you remember where you saw those?

That configuration is defined per collection, as documented here:

In your case, you will need to use:

api_endpoint="http://ollama:11434",

Let me know if that helps!
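To make that concrete, here is a minimal sketch of creating a collection with the text2vec-ollama vectorizer using the Python client v4. The collection name `Articles` is illustrative; the endpoint and model come from the compose file above:

```python
import weaviate
from weaviate.classes.config import Configure

# Connect to the local Weaviate instance exposed by docker-compose.
client = weaviate.connect_to_local()

# The Ollama endpoint is configured per collection, not via environment
# variables. "http://ollama:11434" is the compose service name, which is
# what the Weaviate container can reach (localhost inside the Weaviate
# container points at Weaviate itself, hence the "connection refused").
client.collections.create(
    "Articles",  # illustrative collection name
    vectorizer_config=Configure.Vectorizer.text2vec_ollama(
        api_endpoint="http://ollama:11434",
        model="nomic-embed-text",
    ),
)

client.close()
```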

Thank you so much for sharing, it helped me.