LLM-generated Weaviate query

Will it help if I ask an LLM (GPT-4o) to transform the user query into an embedder-friendly form to get more accurate search results?

For instance, I want to transform the following query to get accurate search results. I will also append some conversational memory to the transformed text.

“2 bedrooms, anywhere in london, near to schools, 1 bathroom, 1 toilets, furnished, maximum 1 million, no pet, minimum 50 square metre, apartment”

Hi @uma_shankar !

I am not sure that would help. Since this is a similarity search, all the content in your query needs to be close to the content you have indexed.

This will also depend on your vectorizer. The vectorizer needs to properly vectorize both your query and your indexed data, and the objects whose vectors are closest to the query vector will come first.

However, I wouldn’t expect the vectorizer to work as a natural-language query processor, especially when it comes to filtering on numbers.
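One pattern that tends to work better than rewriting the whole query for the embedder: have the LLM split the query into a structured part (numbers, booleans) that becomes a filter, and a short free-text part that goes to the vector search. Here is a toy sketch in plain Python of that split, using a regex parse as a stand-in for the LLM step; the property names (`bedrooms`, `max_price`, etc.) are made up for illustration:

```python
import re

def split_query(query: str):
    """Split a natural-language listing query into structured numeric
    filters (for a `where`-style filter) and free-text terms (for the
    vector search). The regexes stand in for the LLM extraction step."""
    filters = {}
    semantic_terms = []
    for part in [p.strip() for p in query.split(",")]:
        if m := re.match(r"(\d+)\s+(bedroom|bathroom|toilet)s?", part):
            filters[m.group(2) + "s"] = int(m.group(1))
        elif m := re.match(r"maximum\s+([\d.]+)\s*million", part):
            filters["max_price"] = int(float(m.group(1)) * 1_000_000)
        elif m := re.match(r"minimum\s+(\d+)\s+square metres?", part):
            filters["min_area_sqm"] = int(m.group(1))
        elif part == "no pet":
            filters["pets_allowed"] = False
        else:
            # Everything non-numeric stays in the semantic query text.
            semantic_terms.append(part)
    return filters, ", ".join(semantic_terms)

query = ("2 bedrooms, anywhere in london, near to schools, 1 bathroom, "
         "1 toilets, furnished, maximum 1 million, no pet, "
         "minimum 50 square metre, apartment")
filters, text = split_query(query)
print(filters)
# {'bedrooms': 2, 'bathrooms': 1, 'toilets': 1, 'max_price': 1000000,
#  'pets_allowed': False, 'min_area_sqm': 50}
print(text)
# anywhere in london, near to schools, furnished, apartment
```

The structured part would then map onto Weaviate filters combined with the vector search (for instance, in the v4 Python client, a `Filter.by_property(...).less_or_equal(...)` passed via the `filters` argument of `near_text`), while only the short semantic text gets embedded.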

Let me know if this helps!