Vector database

Consider that I have two files, one for HR and another for credit cards. I have to build a vector store for each of them, but separately: when I ask an HR-related query, it should go to the HR store and fetch only from there; it should not go to the credit card store. Will anyone please share documents/references for this so I can work on it?

Hi!

Check out this recipe we have using LlamaIndex:

LlamaIndex's RouterQueryEngine allows you to set up different indexes and, depending on the initial query, routes it to the most relevant one.
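As a minimal sketch of that routing pattern (the two indexes `hr_index` and `cc_index`, their descriptions, and the sample query are placeholders, not from the recipe):

```python
# Hypothetical sketch: route queries between two pre-built indexes.
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.tools import QueryEngineTool

# hr_index and cc_index are assumed to be existing VectorStoreIndex objects.
router = RouterQueryEngine.from_defaults(
    query_engine_tools=[
        QueryEngineTool.from_defaults(
            hr_index.as_query_engine(),
            description="Useful for questions about HR policies.",
        ),
        QueryEngineTool.from_defaults(
            cc_index.as_query_engine(),
            description="Useful for questions about credit cards.",
        ),
    ],
)

# The router reads the tool descriptions and sends HR questions to
# hr_index and credit card questions to cc_index.
response = router.query("How many vacation days do employees get?")
```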

Do you think this helps?

Thanks!

Thanks for the reply,
I am requesting a reference where I create two different collections for two different documents, i.e., the HR document is stored in one vector store and, similarly, the credit card document in a separate one.
How can I perform embedding on both documents simultaneously and push them into Weaviate under different collections?

Hi!

You can create the two collections, and define the vectorizer configurations for each.

Whenever you ingest content, Weaviate will vectorize it for you.

If you are using LlamaIndex, you can define the collection name using a different index_name.
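For the Weaviate side, here is a sketch using the Python v4 client; the collection names ("HRDocument", "CreditCardDocument"), the local connection, and the OpenAI vectorizer are assumptions you would adapt:

```python
# Hypothetical sketch: one collection per document set, each with its
# own vectorizer, so HR and credit card content never share an index.
import weaviate
from weaviate.classes.config import Configure

client = weaviate.connect_to_local()

for name in ["HRDocument", "CreditCardDocument"]:
    if not client.collections.exists(name):
        client.collections.create(
            name=name,
            # Weaviate will vectorize ingested objects using this config.
            vectorizer_config=Configure.Vectorizer.text2vec_openai(),
        )

client.close()
```

With the collections in place, anything you ingest into each one is embedded automatically under that collection.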

Let me know if that helps!

Thanks!

Thank you so much for your response. May I have some references, please?

Hi!

We have some recipes using llamaindex here:

I have recently rewritten them to use the v4 client and integration, so they are up to date.

Here you can find a simple query engine:

For instance:

from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.vector_stores.weaviate import WeaviateVectorStore

# Let's name our index properly as BlogPost, as we will need it later.
vector_store = WeaviateVectorStore(
    weaviate_client=client, index_name="BlogPost"
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    documents, storage_context=storage_context
)

This is how you can set the name of the collection (note the index_name).

Now this recipe, combined with the aforementioned example, can give you two indices that you can feed to the query router, as per the Query Router example.
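Concretely, you can repeat the snippet above once per collection; this sketch assumes a connected Weaviate `client` and two document lists (`hr_documents`, `cc_documents`), and the collection names are placeholders:

```python
# Hypothetical sketch: build one index per Weaviate collection.
from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.vector_stores.weaviate import WeaviateVectorStore

def build_index(client, index_name, documents):
    # Each index_name maps to its own Weaviate collection, so the two
    # document sets stay fully separated.
    vector_store = WeaviateVectorStore(
        weaviate_client=client, index_name=index_name
    )
    storage_context = StorageContext.from_defaults(vector_store=vector_store)
    return VectorStoreIndex.from_documents(
        documents, storage_context=storage_context
    )

hr_index = build_index(client, "HRDocument", hr_documents)
cc_index = build_index(client, "CreditCardDocument", cc_documents)
# Each index can then be exposed via .as_query_engine() and handed to
# the router from the Query Router example.
```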

Let me know if that helps.