Endpoint ignored in text2vec-ollama module at query time

At query time, the text2vec-ollama module ignores the custom endpoint and always falls back to localhost:11434.

I have the Weaviate, pipelines, and Open WebUI containers sharing the same custom Docker bridge network.

I can run a LangChain Python script and generate embeddings without any problem; a minimal sketch follows.
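For reference, this is roughly what the working script looks like (the langchain-ollama package and the nomic-embed-text model are assumptions on my part; substitute your own model):

```python
# Minimal sketch, assuming the langchain-ollama package and the
# nomic-embed-text model; Ollama is listening on the Windows host.
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(
    model="nomic-embed-text",           # placeholder model name
    base_url="http://localhost:11434",  # Ollama reached directly on the host
)

vector = embeddings.embed_query("hello world")
print(len(vector))  # embedding dimensionality confirms it worked
```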

When I query through Open WebUI and the pipelines container, however, Weaviate falls back to localhost:11434 as described above.

The OS is Windows 11 Pro.
Ollama runs on the host machine.
I use Docker Desktop.

All the relevant environment variables are set correctly. For completeness, a sketch of how the collection is configured is below.
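This is roughly how I would expect the endpoint to be pinned at collection creation with the v4 Python client; the collection name "Docs" and the model are placeholders, and host.docker.internal is the Docker Desktop alias through which a container can reach Ollama on the Windows host:

```python
# Minimal sketch, assuming the weaviate-client v4 API; "Docs" and the
# model name are placeholders. host.docker.internal resolves to the
# Windows host (where Ollama listens) from inside Docker Desktop containers.
import weaviate
from weaviate.classes.config import Configure

client = weaviate.connect_to_local()

client.collections.create(
    "Docs",
    vectorizer_config=Configure.Vectorizer.text2vec_ollama(
        api_endpoint="http://host.docker.internal:11434",  # custom endpoint
        model="nomic-embed-text",
    ),
)
client.close()
```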

Hi @lssindustry4,

Welcome to our community, it’s great to have you with us! :partying_face:

Just to confirm, is this the same issue as reported here?

Best regards,

Mohamed Shahin
Weaviate Support Engineer
(Ireland, UTC±00:00/+01:00)

Hi Mohamed,

Thank you for your response. This is a different thread, but the issue I am seeing is similar to the one reported there.

It always falls back to localhost:11434 at query time; a minimal reproduction sketch follows.
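For context, this is the query path that triggers the fallback (assuming the v4 Python client and a collection named "Docs"; adjust to your schema):

```python
# Minimal sketch, assuming the weaviate-client v4 API and a collection
# named "Docs" that uses the text2vec-ollama vectorizer. The near_text
# call is what makes Weaviate embed the query string, and that is the
# step where it reaches for localhost:11434 instead of the configured endpoint.
import weaviate

client = weaviate.connect_to_local()
docs = client.collections.get("Docs")

response = docs.query.near_text(query="test", limit=2)
for obj in response.objects:
    print(obj.properties)

client.close()
```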

Do you need any further specifics?

Best,

JC