I’m using the following code to import data into a collection:
with collection.batch.dynamic() as batch:
    for data_row in [
        {
            "filename": "feeds.pdf",
            "chunk": content,
            "chunk_n": 1,
        },
        ...
    ]:
        uuid = batch.add_object(
            properties=data_row,
        )
        print(uuid)
Some of the objects didn’t get ingested. I noticed this by comparing the expected number of documents against the actual count in the database.
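For context, this is roughly how I’m checking the count (a simplified sketch; the over_all(total_count=True) call is from my reading of the v4 client docs, and expected_count is just a placeholder):

    expected_count = 1234  # hypothetical: number of chunks I tried to import

    # total number of objects actually stored in the collection
    result = collection.aggregate.over_all(total_count=True)
    actual_count = result.total_count

    if actual_count != expected_count:
        print(f"Missing objects: expected {expected_count}, got {actual_count}")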
After some debugging, I figured out it was because some of the chunks were too large for the embedding model.
Is there a way to know whether each object in a batch was ingested successfully? No error was raised during the import.
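For example, I was hoping for something like the sketch below. I saw batch.number_errors and collection.batch.failed_objects mentioned in the v4 client docs, but I’m not sure whether they also surface server-side vectorization failures like the too-large chunks in my case:

    with collection.batch.dynamic() as batch:
        for data_row in data_rows:
            batch.add_object(properties=data_row)
            # stop early if too many objects are failing mid-import
            if batch.number_errors > 10:
                break

    # after the context manager exits, inspect objects the client recorded as failed
    failed = collection.batch.failed_objects
    if failed:
        print(f"{len(failed)} objects failed to import")
        print(failed[0].message)  # assumption: failed entries carry an error message

Is this the intended way to detect failures, or does it miss embedding errors like mine?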