Verba chat gets "no chunks available" error

Description

Hi all, I am new to this field and I managed to set up Llama3 + Verba + Weaviate. However, I am now stuck on the error "no chunks available" when trying to chat after adding documents. The log shows an error message like this:

✔ Received query: tell me about owaps testing
✘ The query retriever result in the window retriever contains an error:
({'locations': [{'column': 6, 'line': 1}], 'message': 'get vector input from
modules provider: VectorFromInput was called without vectorizer', 'path':
['Get', 'VERBA_Chunk_OLLAMA']})
ℹ No data found for VERBA_Chunk_OLLAMA.
ℹ Retrieved Context of 0 tokens
✔ Succesfully processed query: tell me about owaps testing in 0.02s

Server Setup Information

Any additional Information

These are indicators from Verba Overview (since I couldn’t put screenshot here)

  • VERBA_Chunk_OLLAMA 4629
  • VERBA_Config 1
  • VERBA_Document_OLLAMA 1
  • docx Available
  • openai Available
  • pypdf Available
  • tiktoken Available
  • OLLAMA_EMBED_MODEL Available
  • OLLAMA_MODEL Available
  • OLLAMA_URL Available
  • WEAVIATE_URL_VERBA Available
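For context, the variables flagged "Available" above are typically defined in a `.env` file before starting Verba. The values below are illustrative assumptions only (default Ollama and Weaviate ports, and the mxbai-embed-large embedder mentioned later in this thread), not confirmed settings:

```shell
# Hypothetical .env sketch — variable names match the Overview indicators above;
# the hosts, ports, and model names are assumptions, adjust to your setup.
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3
OLLAMA_EMBED_MODEL=mxbai-embed-large
WEAVIATE_URL_VERBA=http://localhost:8080
```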

Hopefully all this information is sufficient to describe what has gone wrong. Thanks in advance.

Regards,
YP.Ajie

Hi! Welcome to our community! :hugs:

Do you see success logs when importing the data?

Can you connect to the server and make sure the data is correctly indexed?

Thanks!

Hello! I installed Verba for the first time last night and ran into the same issue. What worked for me was making sure I had downloaded the models in Ollama before running Verba:

~$ ollama run llama3
~$ ollama run mxbai-embed-large

Press Ctrl+C to exit each model, then run Verba and the Ollama server as usual, and see if that works.


hi @arelyx !!

Welcome to our community :hugs:

Thanks for sharing!


You are my hero! Thanks bro


What was happening is that I did not define OLLAMA_EMBED_MODEL, so the embedder fell back to OLLAMA_MODEL, which is llama3, and embedding cannot be done with a chat model.
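The fallback described above can be sketched like this. This is a minimal illustration under the assumption that the embedder simply reads OLLAMA_EMBED_MODEL and falls back to OLLAMA_MODEL when it is unset; the actual Verba internals may differ:

```python
import os

# Reproduce the broken setup: only the chat model is defined.
# (Assumed behavior for illustration, not Verba's actual code.)
os.environ["OLLAMA_MODEL"] = "llama3"
os.environ.pop("OLLAMA_EMBED_MODEL", None)

# With OLLAMA_EMBED_MODEL unset, the embedder name falls back to the
# chat model, which cannot produce embeddings — hence "no chunks available".
embed_model = os.environ.get("OLLAMA_EMBED_MODEL") or os.environ["OLLAMA_MODEL"]
print(embed_model)  # llama3 — a chat model being used as an embedder

# Defining the variable explicitly avoids the fallback:
os.environ["OLLAMA_EMBED_MODEL"] = "mxbai-embed-large"
embed_model = os.environ.get("OLLAMA_EMBED_MODEL") or os.environ["OLLAMA_MODEL"]
print(embed_model)  # mxbai-embed-large
```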


Thanks for sharing, @YP_Ajie !!

We really appreciate it :slight_smile: