How do I configure the vector size of a module?

I was following the attendance example here weaviate-examples/attendance-system-example at main · weaviate/weaviate-examples · GitHub; however, the near-vector search fails with the error “explorer: get class: vector search: object vector search at index students: shard students_hfx1X8zELvMi: vector search: knn search: distance between entrypint and query node: vector lengths don’t match: 2048 vs 128”

It looks like the img2vec-neural vectorizer returns a 2048-dimensional vector at insert time, while at search time face_recognition returns a 128-dimensional embedding.

How can we configure this to make it work?
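To see why the search fails, here is a pure-Python sketch of the length check a distance computation implies (`check_dims` is a hypothetical helper for illustration, not Weaviate code):

```python
# Distance functions (cosine, L2, etc.) require equal-length vectors,
# so a 128-dim query cannot be compared against 2048-dim stored vectors.
def check_dims(query_vector, index_dim):
    """Raise if the query vector does not match the index dimensionality."""
    if len(query_vector) != index_dim:
        raise ValueError(
            f"vector lengths don't match: {index_dim} vs {len(query_vector)}"
        )

face_embedding = [0.0] * 128   # what face_recognition produces at query time
index_dim = 2048               # what img2vec-neural stored at insert time

try:
    check_dims(face_embedding, index_dim)
except ValueError as e:
    print(e)  # vector lengths don't match: 2048 vs 128
```

This is the same mismatch the error message reports: the index holds 2048-dimensional vectors, but the query supplies 128 dimensions.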

Hi @litlig !! Welcome to our community :hugs:

That example is a little bit old. I will take a look at it and see if we can update it.

Meanwhile, do you know about our free, online workshops?

Thanks!

Hi!

So, the error happens because this app uses its own encoding for the images in order to search, so instead of doing a near-image search it does a near-vector search here:

I changed it to this:

res = client.query.get(
    "Students", ["labelName", "_additional {certainty}"]
).with_near_image(
    {"image": path}
).do()

So now Weaviate does the image vectorization before querying, using the same vectorizer that indexed the data.
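For context, the vectorizer used at query time is the one configured on the class schema. A hedged sketch of what the example's class definition might look like (the exact field values are assumptions, but img2vec-neural is the module the attendance example uses):

```python
# Hypothetical "Students" class definition for the attendance example.
# With img2vec-neural set as the class vectorizer, both inserts and
# near-image queries go through the same model, so dimensions match.
class_obj = {
    "class": "Students",
    "vectorizer": "img2vec-neural",
    "moduleConfig": {
        "img2vec-neural": {
            "imageFields": ["image"],  # blob property to vectorize
        }
    },
    "properties": [
        {"name": "labelName", "dataType": ["string"]},
        {"name": "image", "dataType": ["blob"]},
    ],
}
```

The key point is that the vectorizer is a class-level setting: any externally computed vector (like a face_recognition embedding) bypasses it and must match the stored dimensionality on its own.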

However… the results are far from optimal. It always finds the same single student. Something is still off.

If you want a nice example of how to play with images, check this one out:

thanks!


Thanks for the detailed reply @DudaNogueira. I’m wondering if it’s because the models are different for embedding at query and insert time. Anyways, I’ll check out the workshop you linked, thanks!


Yes, that’s probably it.

For some reason, something changed in a library, and it now returns vectors with different dimensions.

When you pass the image itself to Weaviate, it will vectorize the near-image query with the same vectorizer used to index the data, so the search will work.
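Under the hood, a near-image query carries the image as a base64 string. A minimal stdlib-only sketch of preparing that payload yourself (the bytes below are a stand-in for reading a real image file):

```python
import base64

# Stand-in for real image data, e.g. open(path, "rb").read()
image_bytes = b"\x89PNG\r\n\x1a\n...fake image data..."

# Weaviate expects the image content as a base64-encoded string
encoded = base64.b64encode(image_bytes).decode("utf-8")
near_image = {"image": encoded}

# With the v3 Python client, this payload would then be passed along
# the lines of:
#   client.query.get(...).with_near_image(near_image, encode=False).do()
```

Passing a file path and letting the client encode it (as in the snippet above) is equivalent; either way the image reaches the class vectorizer rather than an external embedding model.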

Hey, we have an image search example with CLIP here: