Is it possible to use our own LLM to do generative search? If so, how can we set it up?
Hi!
Check this module for that:
Let me know if this helps!
@DudaNogueira Am I missing something? This module uses a transformer, right? I would like to use my own LLM to do generative search.
Hi @rhuang !
You can adapt this Docker container:
to serve your own model, and reuse all the APIs that Weaviate exposes to consume that service.
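To make the "adapt the container" idea concrete, here is a minimal sketch of what such a service could look like: a small HTTP server that wraps your own model behind a JSON endpoint. The endpoint path, payload shape, and the `generate` function are all assumptions for illustration — match them to whatever interface the container you adapt actually expects.

```python
# Minimal sketch of a self-hosted generation service.
# NOTE: the /generate route, {"prompt": ...} request, and {"text": ...}
# response are hypothetical -- align them with the container you adapt.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate(prompt: str) -> str:
    # Placeholder: swap in a call to your own LLM here.
    return f"(your model's answer to: {prompt})"


class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body and run it through the model.
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        answer = generate(body.get("prompt", ""))

        payload = json.dumps({"text": answer}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)


if __name__ == "__main__":
    # Listen on the port your Weaviate module config points at.
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```

Once the service answers on that port, you would point the generative module's endpoint setting at it in your Docker Compose / Weaviate configuration, so the usual generative-search APIs keep working unchanged.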