Local Ollama Verba-Weaviate on docker odd errors when using chat

Hello

I am super new at this, so forgive my errors. I first tried to run Ollama and Verba locally on WSL2 and had too many issues, so I moved Verba to Docker. I could not get that to work until I used the WSL instance IP; Verba then connected, but now I get the errors below.

Hardware: Ryzen 9 5950X, 128 GB RAM, 1 TB M.2 (primary), 2 TB SSD (data), RTX 4090.

This is a local Ollama running llama3.1 (llama3.1:8b-instruct-fp16) on WSL2 Ubuntu 24.x.
The embedding model is snowflake-arctic-embed:latest.

The Docker Desktop for Windows YAML file is listed below.

Error 1, from the Weaviate log: {"action":"requests_total","api":"rest","build_git_commit":"353d907","build_go_version":"go1.22.7","build_image_tag":"1.26.5","build_wv_version":"1.26.5","class_name":"VERBA_SUGGESTION","error":"no moduleconfig for class VERBA_SUGGESTION present","level":"error","msg":"unexpected error","query_type":"objects","time":"2024-10-06T10:02:46Z"}
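One way to narrow this down is to look at the schema Weaviate actually holds, via its REST endpoint GET /v1/schema, and check whether the VERBA_* classes were created without any moduleConfig. The sketch below runs against a hypothetical sample response (the class definitions are made up for illustration; swap in the live fetch against your instance):

```python
def classes_missing_moduleconfig(schema: dict) -> list[str]:
    """Return names of classes that define no moduleConfig at all."""
    return [
        c["class"]
        for c in schema.get("classes", [])
        if not c.get("moduleConfig")
    ]

# Live check against a running instance (uncomment to use):
# import json, urllib.request
# schema = json.load(urllib.request.urlopen("http://localhost:8080/v1/schema"))

# Hypothetical sample mirroring the logged error: VERBA_SUGGESTION has no
# moduleConfig, while a second class (made up here) does.
sample = {
    "classes": [
        {"class": "VERBA_SUGGESTION", "vectorizer": "none"},
        {"class": "VERBA_CONFIG", "vectorizer": "none",
         "moduleConfig": {"text2vec-openai": {}}},
    ]
}
print(classes_missing_moduleconfig(sample))  # ['VERBA_SUGGESTION']
```

If the real schema shows the VERBA_* classes with no moduleConfig while Weaviate is configured with a default vectorizer module, that mismatch would line up with the "no moduleconfig for class … present" errors.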

Error 2, shown on the Verba page if you submit anything (see image):
:warning: Failed to set new RAG Config Object was not added! Unexpected status
code: 500, with response body: {'error': [{'message': 'no moduleconfig for class
VERBA_CONFIG present'}]}.

Error 3: on the Verba page, select Docker and then go to Settings (top right).
When the config opens, Version, Nodes, and Collections show loading animations the whole time. But in the Weaviate Docker logs I can see Verba errors, so I know it is connecting.



My Docker YAML:

version: '3.8'
services:
  weaviate:
    command:
    - --host
    - 0.0.0.0
    - --port
    - '8080'
    - --scheme
    - http
    image: cr.weaviate.io/semitechnologies/weaviate:1.26.5
    ports:
    - 8080:8080
    - 50051:50051
    volumes:
    - weaviate_data:/var/lib/weaviate
    restart: on-failure:0
    environment:
      QNA_INFERENCE_API: "http://qna-transformers:8080"
      NER_INFERENCE_API: "http://ner-transformers:8080"
      BIND_INFERENCE_API: 'http://multi2vec-bind:8080'
      RERANKER_INFERENCE_API: 'http://reranker-transformers:8080'
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'multi2vec-bind'
      ENABLE_MODULES: 'multi2vec-bind,ref2vec-centroid,reranker-transformers,qna-transformers,ner-transformers'
      CLUSTER_HOSTNAME: 'node1'
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8080/v1/.well-known/live || exit 1
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s

  ner-transformers:
    image: cr.weaviate.io/semitechnologies/ner-transformers:dbmdz-bert-large-cased-finetuned-conll03-english

  multi2vec-bind:
    image: cr.weaviate.io/semitechnologies/multi2vec-bind:imagebind
    environment:
      ENABLE_CUDA: '1'
      NVIDIA_VISIBLE_DEVICES: all
    deploy:
      resources:
        reservations:
          devices:
          - capabilities: [gpu]

  qna-transformers:
    image: cr.weaviate.io/semitechnologies/qna-transformers:bert-large-uncased-whole-word-masking-finetuned-squad
    environment:
      ENABLE_CUDA: '1'
      NVIDIA_VISIBLE_DEVICES: all
    deploy:
      resources:
        reservations:
          devices:
          - capabilities: [gpu]

  reranker-transformers:
    image: cr.weaviate.io/semitechnologies/reranker-transformers:cross-encoder-ms-marco-MiniLM-L-6-v2
    environment:
      ENABLE_CUDA: '0'

  verba:
    image: verba-verba
    ports:
      - 8000:8000
    environment:
      WEAVIATE_URL_VERBA: 'http://weaviate:8080'  # Corrected service name
      OLLAMA_URL: 'http://172.28.14.23:11434'  # Depending on your Docker environment (localhost or internal)
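      # Note: on Docker Desktop, the Windows/WSL host is usually also reachable
      # as http://host.docker.internal:11434, which avoids chasing the WSL
      # instance IP after reboots.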
    volumes:
      - ./data:/data/
    depends_on:
      weaviate:
        condition: service_healthy
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8000 || exit 1  # Probe Verba itself; probing weaviate here would hide Verba failures
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s

volumes:
  weaviate_data:
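For comparison, here is a minimal sketch of the Weaviate service stripped down for Verba. Since Verba computes embeddings itself (here via Ollama and snowflake-arctic-embed) and supplies the vectors to Weaviate, the transformer/bind modules and the multi2vec-bind default vectorizer above may not be needed, and the default vectorizer could be what trips the "no moduleconfig" errors. The empty module list and DEFAULT_VECTORIZER_MODULE: 'none' below are assumptions to test against a fresh volume, not a verified fix:

```yaml
# Hypothetical minimal Weaviate service for Verba (assumption: Verba brings
# its own vectors, so no vectorizer module is enabled on the Weaviate side).
weaviate:
  image: cr.weaviate.io/semitechnologies/weaviate:1.26.5
  ports:
    - 8080:8080
    - 50051:50051
  volumes:
    - weaviate_data:/var/lib/weaviate
  environment:
    QUERY_DEFAULTS_LIMIT: 25
    AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
    PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
    DEFAULT_VECTORIZER_MODULE: 'none'
    ENABLE_MODULES: ''
    CLUSTER_HOSTNAME: 'node1'
```

If classes were already created under the old default vectorizer, they may need to be recreated (or the volume wiped) before the change takes effect.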

Hi!

We have been discussing running everything locally with Docker here:

Let me know if you were able to run it.

Thanks!