Hi Guys
We use Weaviate with the Docker image and we experience high CPU usage even when it is idle. Can anyone suggest any methods to limit it?
Thanks
Sam
Hi! Welcome to our community
How many objects do you have in your index? Also, do you see any outstanding logs?
One way of limiting CPU usage is defining those limits on the upper layer. In your case, using Docker.
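For example, here is a minimal docker-compose sketch that caps the container at the Docker level with a recent Docker Compose; the image tag and the limit values are only illustrative, not taken from this thread:

```yaml
services:
  weaviate:
    image: semitechnologies/weaviate:1.21.2
    ports:
      - "8080:8080"
    environment:
      PERSISTENCE_DATA_PATH: /var/lib/weaviate
    deploy:
      resources:
        limits:
          cpus: "2"      # container may use at most 2 CPU cores
          memory: 4G     # container is OOM-killed if it exceeds 4 GiB
```

With plain `docker run` the equivalent would be the `--cpus` and `--memory` flags.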
Let me know if that helps.
Thanks!
Hi Duda,
we are also experiencing a similar kind of issue.
How does it work when we have a high number of objects but all the vectors are currently set to None? Even in this case, memory and CPU usage are very high and it runs out of memory.
Thanks,
Saurabh
Hi @Saurabh ! Welcome to our community
So you are not providing your own vectors, and don't have a vectorizer configured for the class, right?
Weaviate will consume a lot of CPU while ingesting data, as it needs to build up both the vector and the inverted index for performing searches.
If you do not have vectors, the memory consumption should be lower.
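For context, this is roughly how a class without any vectorizer looks when created with the v3 Python client; the class name and property are made up for illustration:

```python
import weaviate

client = weaviate.Client("http://localhost:8080")

# "vectorizer": "none" means Weaviate only stores what you send it and
# does not spend CPU computing embeddings for this class.
article_class = {
    "class": "Article",              # hypothetical class name
    "vectorizer": "none",
    "properties": [
        {"name": "title", "dataType": ["text"]},
    ],
}
client.schema.create_class(article_class)
```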
How many objects are you ingesting? Do you see any outstanding logs?
I am also getting the same issue and I shared the stats. I have only 10k objects present inside 2 schemas and I don't get any logs regarding this.
What is the version you are running, @Dharanish?
Also, is there a public dataset we can test this on?
We are running v1.21.2, we don't use any public dataset, and we don't use any vectorizer module either.
According to here, there is an option, LIMIT_RESOURCES, where you can force Weaviate to not use all resources.
This is probably the reason it is consuming all the allocated resources.
Let me know if this helps.
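If it helps, a minimal sketch of how that flag can be set for the Docker image (everything besides LIMIT_RESOURCES itself is illustrative):

```yaml
services:
  weaviate:
    image: semitechnologies/weaviate:1.21.2
    environment:
      # When true, Weaviate limits its own memory and CPU consumption
      # instead of trying to use everything the host offers.
      LIMIT_RESOURCES: 'true'
```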
Thanks for the reply Duda!
Switched to the managed service for now; it was blocking us really badly.
We were using the LIMIT_RESOURCES option as well, but the pods were just dying out of memory; it's not happening with the managed version.
When time permits we will investigate it again, but currently the reason is unknown to us. The data objects were in the range of 500K-600K.
Regards,
Saurabh
I tried setting LIMIT_RESOURCES and also tried setting the container's max memory limit.
I am still hitting the max RAM of 62 GB, which is high… Weaviate keeps on crashing.
This is happening when I added 130,000 objects… my vectors are OpenAI embedding vectors (1536d), and I already have 200,000 vectors in the DB.
Any possible things that I am missing out?
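For a rough sense of scale, a back-of-envelope estimate of the raw vector memory implied by those numbers, assuming float32 storage (4 bytes per dimension); the HNSW graph and inverted index add overhead on top of this:

```python
# Back-of-envelope estimate based on the numbers above (not a measurement).
objects = 200_000 + 130_000      # existing vectors + newly added objects
dims = 1536                      # OpenAI embedding dimensionality
bytes_per_float = 4              # float32
raw_vector_bytes = objects * dims * bytes_per_float
print(f"{raw_vector_bytes / 1024**3:.1f} GiB")  # ≈ 1.9 GiB of raw vectors
```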
High CPU usage in Weaviate with Docker can occur due to a couple of reasons:
Data Ingestion: Weaviate consumes CPU while ingesting data, building search indexes. This is normal during initial data load.
Large Dataset: If your dataset is very large, even without active ingestion, maintaining the indexes can lead to high CPU usage.
Here are some solutions to consider:
Check for Ongoing Ingestion: Verify if any data ingestion processes are running. If not, the high CPU might be due to a large dataset.
Optimize Schema (if applicable): Reduce unnecessary properties or complex relations in your Weaviate schema to minimize indexing overhead.
Limit CPU Resources (Docker): You can configure CPU resource limits for the Weaviate Docker container using the --cpus flag (see the example after this list). This will restrict the maximum CPU Weaviate can utilize.
Horizontal Scaling (Large Datasets): For very large datasets, consider scaling Weaviate horizontally by running multiple instances and sharding your data across them.
Remember, limiting CPU usage might impact Weaviate’s performance.
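For instance, a minimal sketch of the --cpus flag mentioned above; the image tag and limit values are illustrative and should be adjusted to your hardware:

```sh
# Cap the Weaviate container at 2 CPUs and 4 GiB of RAM.
docker run -d \
  --cpus="2" \
  --memory="4g" \
  -p 8080:8080 \
  semitechnologies/weaviate:1.21.2
```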
hi @warner !! Welcome to our community and thanks!
@DudaNogueira Thank you. I’m new here.