Is it possible to easily change the Verba code so we can try using the new OpenAI “text-embedding-3-large” model? We’re curious to see if that improves retrieval for unusual words.
And is it possible to play with changing chunk size?
Hi!
Verba now has a modular structure that makes it easy to add new embedding components.
This is the component that controls the current OpenAI embedding:
We should have some content around this new model soon.
You can probably experiment with it by adapting this recipe:
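To give a rough idea, here is a minimal sketch of what such a component could look like, written against the current OpenAI Python client (openai >= 1.0). The class and method names are placeholders rather than Verba's actual embedder interface, so adapt them to whatever the recipe above uses:

```python
# Hypothetical sketch only: the class name and vectorize() method are
# placeholders, not Verba's real embedding component interface.
from openai import OpenAI


class LargeOpenAIEmbedder:
    """Embedder-style wrapper around OpenAI's text-embedding-3-large."""

    def __init__(self, model: str = "text-embedding-3-large"):
        self.client = OpenAI()  # expects OPENAI_API_KEY in the environment
        self.model = model

    def vectorize(self, texts: list[str]) -> list[list[float]]:
        """Embed a batch of text chunks and return one vector per chunk."""
        response = self.client.embeddings.create(
            model=self.model,  # 3072 dimensions by default
            input=texts,
            # dimensions=1024,  # optional: the -3 models can return shorter vectors
        )
        return [item.embedding for item in response.data]
```

The commented-out `dimensions` argument is specific to the new -3 models and lets you request shorter vectors if you don't need the full 3072 dimensions.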
Let me know if this helps.
Thanks!
Thanks. I think that means that for now there isn’t an easy way to use the 3072-dimension “text-embedding-3-large” embedder with Verba. https://openai.com/blog/new-embedding-models-and-api-updates
Not without changing the code.
But hopefully we’ll get it soon.
It would be worth raising this as an issue:
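In the meantime, if you mainly want to check whether the new model helps with unusual words, you can compare the two models directly against the OpenAI API, outside Verba. A quick, informal check might look like the sketch below; the query word and passage are purely illustrative:

```python
# Informal A/B check (not Verba code): compare how two OpenAI embedding
# models score an unusual query term against a candidate passage.
import math
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def similarity(model: str, query: str, passage: str) -> float:
    """Embed the query and passage with one model and return their similarity."""
    response = client.embeddings.create(model=model, input=[query, passage])
    vectors = [item.embedding for item in sorted(response.data, key=lambda d: d.index)]
    return cosine(vectors[0], vectors[1])


query = "borborygmus"  # stand-in for an unusual word from your own data
passage = "A rumbling noise produced by the movement of gas in the intestines."

for model in ("text-embedding-ada-002", "text-embedding-3-large"):
    print(model, round(similarity(model, query, passage), 4))
```

Note that once text-embedding-3-large is wired into Verba, existing documents would need to be re-embedded and re-imported, since 3072-dimension vectors can't be mixed with the 1536-dimension ada-002 vectors already stored.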
Done – issue created. Thanks, Duda.