Storing KnowledgeGraph Index and Vector Index via llama_index

I am using the RetrieverQueryEngine from llama_index to build a custom query engine that combines a vector index and a knowledge graph index. I previously stored my vector index in Weaviate when I only needed a vector index, but now I am using both. What would you suggest as a good, production-ready way to remotely store my VectorIndex AND KnowledgeGraph Index? Create two stores, as below, or is it more complicated?

from llama_index.vector_stores import WeaviateVectorStore

# construct the vector store backing the vector index
vector_store = WeaviateVectorStore(
    weaviate_client=client, index_name="content_vec", text_key="content"
)

# construct a second store for the knowledge graph index
kg_store = WeaviateVectorStore(
    weaviate_client=client, index_name="content_kg", text_key="content"
)


Sorry for the delay here :slight_smile:

I am not sure LlamaIndex will handle cross-references, so I don’t think there is a built-in way to store those references using LlamaIndex alone :thinking:

Thanks for the reply! How would you store a knowledge graph on Weaviate? Do you have a workaround?
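One common workaround, sketched below, is to flatten the knowledge graph into (subject, predicate, object) triplets and store each triplet as a plain object in a Weaviate class, since Weaviate has no native graph edges. The `Triplet` and `FlatTripletStore` names here are hypothetical and only illustrate the data shape; in practice you would create these objects through the Weaviate client and traverse the graph with `where` filters.

```python
# Hypothetical sketch (not a llama_index or Weaviate API): graph edges are
# flattened into records the way objects in a Weaviate class would be.
from dataclasses import dataclass


@dataclass(frozen=True)
class Triplet:
    subject: str
    predicate: str
    object: str


class FlatTripletStore:
    """Stores graph edges as flat records, as a Weaviate class would."""

    def __init__(self) -> None:
        self._triplets: list[Triplet] = []

    def add(self, subject: str, predicate: str, obj: str) -> None:
        self._triplets.append(Triplet(subject, predicate, obj))

    def neighbors(self, subject: str) -> list[Triplet]:
        # Graph traversal becomes a filtered scan over flat records;
        # in Weaviate this would be a `where` filter on the subject property.
        return [t for t in self._triplets if t.subject == subject]


store = FlatTripletStore()
store.add("llama_index", "integrates_with", "Weaviate")
store.add("llama_index", "builds", "KnowledgeGraphIndex")
store.add("Weaviate", "is_a", "vector database")

print([t.object for t in store.neighbors("llama_index")])
# prints ['Weaviate', 'KnowledgeGraphIndex']
```

The trade-off of this flattening is that multi-hop traversal turns into repeated filtered lookups rather than a single graph query, which is why a dedicated graph database is often preferred for the KG side in production.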