Hi Weaviate community,
We are using Weaviate as our embeddings storage and performing similarity searches against it. Recently we got the following warning message from the DB:
{"action":"unlimited_vector_search","level":"warning","msg":"maximum search limit of 10000 results has been reached","time":"2024-01-12T08:23:19Z"}
Can you please let us know:
1. What is the reason for this warning?
2. What are the consequences of this message? Does it mean we are no longer allowed to save vectors, or that we are constrained when performing similarity searches? If either is the case, can you suggest a solution?
I believe this is the limit at which Weaviate will stop querying through the index.
Here is where this is raised in the code:
So when performing searches or deletions, there are upper limits set to prevent excessive memory usage and long-running requests, which could lead to out-of-memory errors or other performance issues. For example, when deleting objects with a batch delete operation, Weaviate imposes a default limit of 10,000 objects that can be deleted in a single query to protect against unexpected memory surges.
You can still save vectors and objects; however, there is a default limit of 10,000 objects that can be returned per query. This value is set by the environment variable QUERY_MAXIMUM_RESULTS, as listed here:
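For example, if you run Weaviate with Docker Compose, you can raise the cap via the environment block of the Weaviate service. A minimal sketch (the image tag and other settings here are placeholders, adjust them to your own setup):

```yaml
# docker-compose.yml (excerpt) – minimal sketch, not a complete config
services:
  weaviate:
    image: semitechnologies/weaviate:1.23.0   # use your actual version
    ports:
      - "8080:8080"
    environment:
      QUERY_DEFAULTS_LIMIT: 25        # default page size for queries
      QUERY_MAXIMUM_RESULTS: 50000    # raises the 10,000 hard cap on returned results
```

Note that the variable must be set on the Weaviate container itself and requires a restart of that container to take effect.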
Hi Duda,
I can say that Verba has really changed the way I work.
Is there any possibility of changing the 10,000 upper limit? We have a powerful machine and most of our documents are very short txt files. We would like to raise the limit to 50,000.
@DudaNogueira
Thank you for your swift answer.
I am deploying Verba locally through a docker-compose file, using Cohere for embeddings and GPT-4 for queries.
Could you tell me where I can change the QUERY_MAXIMUM_RESULTS variable? I tried adding QUERY_MAXIMUM_RESULTS: 12000 under the QUERY_DEFAULTS_LIMIT: 25 line in the docker-compose.yml (see the excerpt below), but it didn't work.
The total number of documents in the GUI is still capped at 10k.
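For reference, this is roughly the relevant part of my docker-compose.yml after the change (service name and image tag may differ slightly from my actual file):

```yaml
# excerpt from my docker-compose.yml – roughly what I tried
services:
  weaviate:
    image: semitechnologies/weaviate:1.23.0
    environment:
      QUERY_DEFAULTS_LIMIT: 25
      QUERY_MAXIMUM_RESULTS: 12000   # added this line, but the GUI still shows 10k
```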