Issues faced while performing CRUD operations

I am facing an issue with deleting and retrieving data when I apply a where filter.

I want to delete only the object whose name is "test", but it is also deleting the "test limi" object I have in Weaviate. I don't want this to happen.
Similarly, the Get query does not behave like an exact lookup: for "test" I need only the objects that match "test" exactly, not "test limi", etc.

My query:
# Check which objects match the given name before deleting
check_query = f"""
{{
  Get {{
    {class_name}(
      where: {{
        path: ["name"],
        operator: Equal,
        valueString: "{name}"
      }}
    ) {{
      name
      _additional {{
        id
      }}
    }}
  }}
}}
"""
response = client.query.raw(check_query)

print("Response from Weaviate:", response)

# Delete every object whose "name" matches the filter
delete_query = {
    "operator": "Equal",
    "path": ["name"],
    "valueString": name,
}

x = client.batch.delete_objects(
    class_name=class_name,
    where=delete_query,
)
print(x)
Response from Weaviate: {'data': {'Get': {'Issue': [{'_additional': {'id': 'a5fff400-df0f-4320-8134-036ae28c4891'}, 'name': 'test limi'}, {'_additional': {'id': '9cb804bc-8375-4dfe-bf4d-f30402780386'}, 'name': 'test'}]}}}

Delete query response:

{'dryRun': False, 'match': {'class': 'Issue', 'where': {'operands': None, 'operator': 'Equal', 'path': ['name'], 'valueString': 'test'}}, 'output': 'minimal', 'results': {'failed': 0, 'limit': 10000, 'matches': 2, 'objects': None, 'successful': 2}}

Successfully deleted 4 exact matches for client name 'test'


If anyone is aware of this issue, please help me resolve it.
I am using the Docker version of Weaviate, and the tokenization for this property is whitespace.

Hi!

What is the server version?

Your property name probably has its tokenization set to word (the default).

This means that My Cool Test becomes three tokens: my, cool, and test, and if you now search for objects that have cool, the filter will find every object containing that token. The same applies to whitespace tokenization: in your case, test limi is tokenized into test and limi, so an Equal filter on test matches it as well.
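
You can double-check which tokenization the schema actually has with something like this (a small sketch using the v3 client; the class name Issue is taken from your output):

# Sketch: print the tokenization configured for each property of the class (v3 client)
schema = client.schema.get("Issue")
for prop in schema["properties"]:
    print(prop["name"], prop.get("tokenization"))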

Check here for a comprehensive doc on Tokenization in Weaviate:

You may want to set the tokenization to field to avoid this scenario.
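
For example, when creating the class with the v3 client, that could look something like this (a sketch; the class and property names are taken from your query, the rest of the schema is assumed):

# Sketch: "name" uses field tokenization, so a where filter with Equal
# matches the whole stored value instead of individual tokens
class_obj = {
    "class": "Issue",
    "properties": [
        {
            "name": "name",
            "dataType": ["text"],
            "tokenization": "field",
        }
    ],
}
client.schema.create_class(class_obj)

As far as I know, the tokenization of an existing property cannot be changed afterwards, so you would likely need to recreate the class and re-import your data.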

Let me know if that helps!

Thanks!

No, I have used "whitespace" tokenization in my schema while creating the class.

And I am using the Python client v3.

The best thing here would be to create some Python code, if possible with the Python client v4, so we can try to reproduce the issue you are facing.

Can you share code for this?
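
For reference, a minimal reproduction with the v4 client could look something like this (a sketch; the collection name, property, and values are taken from your output, the connection details are assumptions):

import weaviate
import weaviate.classes as wvc

# Connect to the local Docker instance (adjust if your setup differs)
client = weaviate.connect_to_local()

# Recreate the schema: "name" with whitespace tokenization, as described
client.collections.create(
    name="Issue",
    properties=[
        wvc.config.Property(
            name="name",
            data_type=wvc.config.DataType.TEXT,
            tokenization=wvc.config.Tokenization.WHITESPACE,
        )
    ],
)

issues = client.collections.get("Issue")
issues.data.insert({"name": "test"})
issues.data.insert({"name": "test limi"})

# Filter on name == "test"; with token-based tokenization this is expected
# to also match "test limi", since that value contains the token "test"
name_filter = wvc.query.Filter.by_property("name").equal("test")

result = issues.query.fetch_objects(filters=name_filter)
for obj in result.objects:
    print(obj.properties["name"])

# The delete uses the same filter, so both objects would be removed
print(issues.data.delete_many(where=name_filter))

client.close()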

Thanks!