Description
I’m trying to insert documents with inverted indexes into Weaviate (a local instance), but embeddings aren’t being created. As the embedding model I’m using the Azure OpenAI model “text-embedding-3-small”. While the documents are being inserted into the collection, I get the following error for each document:
{
  message: 'API Key: no api key found neither in request header: X-Openai-Api-Key nor in environment variable under OPENAI_APIKEY',
  object: [Object],
  originalUuid: undefined
}
Question: why is the client trying to use the X-Openai-Api-Key header instead of X-Azure-Api-Key for the text2VecAzureOpenAI vectorizer? I also tried replacing the text2vec-openai module in Docker with text2vec-azure-openai, but got an error that no such module exists. And when I replaced X-Azure-Api-Key with X-Openai-Api-Key, the client tried to connect to the OpenAI API instead of Azure.
Is it possible to use a remote (Azure) embedding model with a local Weaviate instance running in Docker?
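For completeness, the documents are inserted with a batch call roughly like the sketch below (simplified; chunks is a placeholder for my actual split documents), and the per-document error shown above is what comes back in its errors result:

const col = client.collections.get(`${collection}_${this.context.id}`);
// Each chunk becomes one object with the 'document' property defined on the collection.
const result = await col.data.insertMany(chunks.map((chunk) => ({ document: chunk })));
console.log(result.errors); // prints the error objects shown above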
Here is my config:
Connection to local instance (working):
const client = await weaviate.connectToLocal({
  host: "172.16.41.55",
  port: 8080,
  grpcPort: 50051,
  headers: {
    'X-Azure-Api-Key': this.embeddings.azureOpenAIApiKey || '',
  },
});
await client.isReady();
Create collection function call:
await client.collections.create({
  name: `${collection}_${this.context.id}`,
  properties: [
    {
      name: 'document',
      dataType: dataType.TEXT,
      description: 'Split document' as const,
      vectorizePropertyName: true,
    },
  ],
  invertedIndex: configure.invertedIndex({
    indexNullState: true,
    indexPropertyLength: true,
    indexTimestamps: true,
  }),
  vectorizers: [
    weaviate.configure.vectorizer.text2VecAzureOpenAI({
      name: 'title_vector',
      sourceProperties: ['title'],
      resourceName: this.embeddings.azureOpenAIApiInstanceName || '',
      deploymentId: this.embeddings.azureOpenAIApiDeploymentName || '',
    }),
  ],
});
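One way to double-check what the server actually registered would be to read the collection config back after creation; a minimal sketch, assuming the vectorizer settings are exposed on the returned config object:

const col = client.collections.get(`${collection}_${this.context.id}`);
const config = await col.config.get();
// Check which vectorizer (and therefore which module) the server stored for this collection.
console.log(JSON.stringify(config.vectorizers, null, 2));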
Server Setup Information
- Weaviate Server Version: 1.27.1
- Deployment Method: docker
- Multi Node? Number of Running Nodes: 1
- Client Language and Version: TS (3.2.2)
- Multitenancy?:
Any additional Information
Weaviate service in docker-compose file:
weaviate:
  command: --host 0.0.0.0 --port '8080' --scheme http
  container_name: dowow-weaviate
  image: cr.weaviate.io/semitechnologies/weaviate:1.27.1
  restart: always
  volumes:
    - weaviate_data:/var/lib/weaviate
  networks:
    dowow:
      ipv4_address: 172.16.41.55
  ports:
    - 8086:8080
    - 50051:50051
    - 2112:2112
  environment:
    QUERY_DEFAULTS_LIMIT: 25
    AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
    PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
    DEFAULT_VECTORIZER_MODULE: 'text2vec-openai'
    ENABLE_MODULES: 'text2vec-openai'
    CLUSTER_HOSTNAME: 'node1'

volumes:
  weaviate_data:
    driver: local
Response from /v1/meta endpoint:
{
  "grpcMaxMessageSize": 10485760,
  "hostname": "http://[::]:8080",
  "modules": {
    "text2vec-openai": {
      "documentationHref": "https://platform.openai.com/docs/guides/embeddings/what-are-embeddings",
      "name": "OpenAI Module"
    }
  },
  "version": "1.27.1"
}