Description
The Quickstart page shows how to connect with OpenAI, but what if I want to use Ollama or open-source models from Hugging Face? How do I do that?
```python
try:
    questions = client.collections.create(
        name="Question",
        vectorizer_config=wvc.config.Configure.Vectorizer.text2vec_openai(),  # If set to "none" you must always provide vectors yourself. Could be any other "text2vec-*" as well.
        generative_config=wvc.config.Configure.Generative.openai()  # Ensure the `generative-openai` module is used for generative queries
    )
finally:
    client.close()
```
Server Setup Information
- Weaviate Server Version:
- Deployment Method: pip install -U weaviate-client
- Multi Node? Number of Running Nodes: one node, localhost
- Client Language and Version: python3.10
- Multitenancy?: no
Any additional Information
When I use Ollama like this:
```python
try:
    questions = client.collections.create(
        name="Question",
        vectorizer_config=wvc.config.Configure.Vectorizer.text2vec_ollama(),  # If set to "none" you must always provide vectors yourself. Could be any other "text2vec-*" as well.
        generative_config=wvc.config.Configure.Generative.ollama()  # Ensure the `generative-ollama` module is used for generative queries
    )
finally:
    client.close()
```
I get this error:
weaviate.exceptions.UnexpectedStatusCodeError: Collection may not have been created properly.! Unexpected status code: 422, with response body: {'error': [{'message': 'vectorizer: no module with name "text2vec-ollama" present'}]}.
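From what I can tell, the 422 means the Weaviate *server* does not have the Ollama modules enabled (the client call is fine). Would something like this Docker Compose sketch be the right way to enable them? The image tag, ports, and endpoint URL below are my assumptions, not from the Quickstart:

```yaml
# Minimal sketch, assuming the server runs via Docker Compose.
# text2vec-ollama / generative-ollama ship with Weaviate server v1.25+,
# so an older image would also produce the "no module with name" error.
services:
  weaviate:
    image: cr.weaviate.io/semitechnologies/weaviate:1.25.1  # assumed version
    ports:
      - "8080:8080"
      - "50051:50051"
    environment:
      ENABLE_MODULES: "text2vec-ollama,generative-ollama"
      # Ollama itself runs on the host; inside the container it would be
      # reached at something like http://host.docker.internal:11434 (assumption),
      # passed via api_endpoint in the collection config.
```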