How can I run it on my local machine, with my own embedding model, without using any APIs?
Hi @Ajay_Yadav !
You can run different inference models on your own machine. Note that these are usually resource-hungry, needing significant CPU/GPU and memory to run with good performance.
One nice inference model (not open source, unfortunately) that is very interesting to look at is this one:
Zain wrote a really nice blog post about this subject here:
Let me know if this helps!