WeaviateQueryError

Description

I am getting this error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/weaviate/collections/grpc/query.py", line 609, in __call
    res, _ = self._connection.grpc_stub.Search.with_call(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/grpc/_channel.py", line 1198, in with_call
    return _end_unary_response_blocking(state, call, True, None)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/grpc/_channel.py", line 1006, in _end_unary_response_blocking
    raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNKNOWN
	details = "explorer: get class: vector search: object vector search at index faq: shard faq_g8JQfN2F0moQ: vector search: knn search: distance between entrypoint and query node: got a nil or zero-length vector at docID 0"
	debug_error_string = "UNKNOWN:Error received from peer  {created_time:"2024-05-14T14:42:36.515262638+02:00", grpc_status:2, grpc_message:"explorer: get class: vector search: object vector search at index faq: shard faq_g8JQfN2F0moQ: vector search: knn search: distance between entrypoint and query node: got a nil or zero-length vector at docID 0"}"
>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/src/httpd/./manage.py", line 27, in <module>
    main()
  File "/usr/src/httpd/./manage.py", line 23, in main
    execute_from_command_line(sys.argv)
  File "/usr/local/lib/python3.11/site-packages/django/core/management/__init__.py", line 442, in execute_from_command_line
    utility.execute()
  File "/usr/local/lib/python3.11/site-packages/django/core/management/__init__.py", line 436, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/local/lib/python3.11/site-packages/django/core/management/base.py", line 413, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/usr/local/lib/python3.11/site-packages/django/core/management/base.py", line 459, in execute
    output = self.handle(*args, **options)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/httpd/common/management/commands/wquery.py", line 24, in handle
    result = collection.query.near_text(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/weaviate/collections/queries/near_text/query.py", line 90, in near_text
    res = self._query.near_text(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/weaviate/collections/grpc/query.py", line 418, in near_text
    return self.__call(request)
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/weaviate/collections/grpc/query.py", line 618, in __call
    raise WeaviateQueryError(e.details(), "GRPC search")  # pyright: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
weaviate.exceptions.WeaviateQueryError: Query call with protocol GRPC search failed with message explorer: get class: vector search: object vector search at index faq: shard faq_g8JQfN2F0moQ: vector search: knn search: distance between entrypoint and query node: got a nil or zero-length vector at docID 0.

What does it mean? The same data, the same schema, and the same Weaviate version work fine on a different server. My Python code looks like:

collection = client.collections.get('Faq')
result = collection.query.near_text(
    target_vector='question',
    query='Hello World',
    limit=1,
)
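The message "got a nil or zero-length vector at docID 0" indicates that the HNSW index hit a stored object whose vector is missing or empty. One way to look for such objects is to fetch them with their vectors included and check each one; a sketch, assuming the v4 Python client and the named vector `question` (the `missing_vector_ids` helper is hypothetical):

```python
def missing_vector_ids(objects, vector_name="question"):
    """Return the UUIDs of objects whose named vector is absent or empty."""
    bad = []
    for obj in objects:
        vec = (obj.vector or {}).get(vector_name)
        if not vec:  # None or zero-length vector
            bad.append(obj.uuid)
    return bad

# Hypothetical usage against a running instance:
# collection = client.collections.get("Faq")
# res = collection.query.fetch_objects(limit=100, include_vector=True)
# print(missing_vector_ids(res.objects))
```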

Server Setup Information

  • Weaviate Server Version: 1.24.10
  • Deployment Method: docker
  • Number of Running Nodes: 1
  • Client Language and Version: Python 3.11.9, Weaviate Library v4

Any additional Information

Hi @christian.mayer! Welcome to our community :hugs:

This bug was fixed in 1.24.11! :smiley:

Please, upgrade to 1.24.11+ and let me know if this was fixed on your side too.
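To confirm which server version a client is actually talking to, `client.get_meta()` returns metadata that includes the version string; a quick sketch (the `needs_upgrade` helper is hypothetical and assumes plain dotted release numbers):

```python
def needs_upgrade(version: str, minimum: str = "1.24.11") -> bool:
    """True if a dotted version string is older than the release with the fix."""
    def parse(v):
        return tuple(int(part) for part in v.split("."))
    return parse(version) < parse(minimum)

# Hypothetical usage:
# print(needs_upgrade(client.get_meta()["version"]))
```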

Thanks!

Thank you so much. I upgraded to 1.24.12 and it’s now working. :pray:

This still occurs on weaviate:1.25.1 :frowning:

Hi!

Can you provide code so we can reproduce this in 1.25.1?

Thanks!

import weaviate
import ollama

client = weaviate.connect_to_embedded(
    persistence_data_path="./cache/weaviate",
    version="1.25.1",
    environment_variables={
        "ENABLE_MODULES": "text2vec-ollama,generative-ollama",
        "AUTOSCHEMA_ENABLED": "false",
        "DISABLE_TELEMETRY": "true",
    },
)

print("#### end of configure schema.")

# print("### col_toyou:", col_toyou)  # defined in schema-setup code omitted here

_query_text = "test"
_collection_name = "test"
chunks = client.collections.get(_collection_name)
# testing of fetch_objects
print("### querying chunks of:", _collection_name)

# An example prompt
prompt_text = "test"

# Generate an embedding for the prompt and retrieve the most relevant doc
response = ollama.embeddings(
    model="all-minilm",
    prompt=prompt_text,
)
print("### ollama.embeddings response:", response)

# queried_response = chunks.query.near_text(query=_query_text, limit=3)
results = chunks.query.near_vector(
    near_vector=response["embedding"],
    limit=1,
)
print("### nearVector queried results:", results)
FYI, thanks.

@zhou_yangbo

Here is a working Ollama recipe:

Let me know if this helps :slight_smile:

This has worked for me:

import ollama

query = "When Llamas were first domesticated and how long do they live?"
response = ollama.embeddings(
    model="all-minilm",
    prompt=query,
)
results = collection.query.near_vector(
    near_vector=response["embedding"],
    limit=2,
)
for obj in results.objects:  # "obj", not "object", to avoid shadowing the builtin
    print(obj.properties)

{'text': 'Llamas live to be about 20 years old, though some only live for 15 years and others live to be 30 years old'}
{'text': 'Llamas were first domesticated and used as pack animals 4,000 to 5,000 years ago in the Peruvian highlands'}
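For completeness, objects like the two above need stored vectors for `near_vector` to find them. A sketch of ingesting texts with client-side embeddings, using the v4 client's batch API (the `ingest` helper is hypothetical):

```python
def ingest(collection, texts, embed):
    """Embed each text client-side and store it with an explicit vector."""
    with collection.batch.dynamic() as batch:
        for text in texts:
            batch.add_object(properties={"text": text}, vector=embed(text))

# Hypothetical usage with ollama:
# ingest(
#     collection,
#     ["Llamas live to be about 20 years old...", "Llamas were first domesticated..."],
#     lambda t: ollama.embeddings(model="all-minilm", prompt=t)["embedding"],
# )
```

Storing the vector at import time keeps ingestion and querying on the same embedding model, which avoids the nil-vector situation from the original error.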