HI!!
Hey @niknokseyer! Welcome to our community.
@Jonathan_Yapp Good news! I found a fix for this (or rather, I was able to run it in Lambda).
Sorry for the delay, I was out for the past few days.
I was able to make it work — here's how, with some example code.
First, create the package folder and install the dependencies into it:
mkdir -p my-lambda-function/package
cd my-lambda-function/package
pip install weaviate-client --platform manylinux2014_x86_64 -t . --only-binary=:all:
cd ..
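One refinement worth considering, depending on your setup: pinning --python-version as well, so pip resolves wheels for the Lambda runtime's interpreter rather than whatever Python runs locally. A sketch (the 3.12 here is an assumption — match it to your function's runtime):

```shell
# Same install as above, but also pinned to the Lambda runtime's Python.
# --platform/--only-binary force prebuilt manylinux wheels, since Lambda's
# Amazon Linux environment can't compile from source at runtime.
pip install weaviate-client \
    --platform manylinux2014_x86_64 \
    --python-version 3.12 \
    --only-binary=:all: \
    -t ./package
```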
Now the root folder contains a package folder with all the dependencies.
Next, I created a lambda_function.py file in that root folder with the following contents:
import json
import os

import requests
import weaviate
from weaviate import classes as wvc

CLUSTER_URL = os.environ.get(
    "CLUSTER_URL",
    "https://duda-test-sdfsdfsdf.weaviate.network"
)
CLUSTER_APIKEY = os.environ.get(
    "CLUSTER_APIKEY",
    "PasdasdpUrmLsdfsdfsdfsdfsdfokj"
)
OPENAI_APIKEY = os.environ.get(
    "OPENAI_APIKEY",
    "sk-j0sdfsdfsdfsdfdasdasdaO5MsdfsdfsdfYoGVw"
)


def lambda_handler(event, context):
    # if no ?query=some_query is passed, default to "biology";
    # note queryStringParameters is None (not {}) when the request
    # has no query string, hence the "or {}" guard
    query = (event.get("queryStringParameters") or {}).get("query", "biology")
    client = weaviate.connect_to_wcs(
        cluster_url=CLUSTER_URL,
        auth_credentials=weaviate.auth.AuthApiKey(CLUSTER_APIKEY),
        headers={
            "X-OpenAI-Api-Key": OPENAI_APIKEY  # Replace with your inference API key
        }
    )
    if not client.collections.exists("Question"):
        # create the collection and import some data
        questions = client.collections.create(
            name="Question",
            vectorizer_config=wvc.config.Configure.Vectorizer.text2vec_openai(),  # If set to "none" you must always provide vectors yourself. Could be any other "text2vec-*" as well.
            generative_config=wvc.config.Configure.Generative.openai()  # Ensure the `generative-openai` module is used for generative queries
        )
        # add some data
        resp = requests.get('https://raw.githubusercontent.com/weaviate-tutorials/quickstart/main/data/jeopardy_tiny.json')
        data = json.loads(resp.text)  # Load data
        question_objs = []
        for d in data:
            question_objs.append({
                "answer": d["Answer"],
                "question": d["Question"],
                "category": d["Category"],
            })
        questions.data.insert_many(question_objs)
    questions = client.collections.get("Question")
    # return results
    response = {}
    # single prompt
    results = questions.generate.near_text(
        query=query,
        limit=1,
        single_prompt="Explain {answer} as you might to a five-year-old."
    )
    response["single_prompt"] = results.objects[0].generated
    # grouped task
    results = questions.generate.near_text(
        query=query,
        limit=2,
        grouped_task="Write a tweet with emojis about these facts."
    )
    response["grouped_task"] = results.generated
    client.close()
    return {
        'statusCode': 200,
        'body': json.dumps(response)
    }
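One detail worth calling out in the handler: when the function sits behind API Gateway and the request has no query string, event["queryStringParameters"] arrives as None rather than {}, so a plain .get chain can raise AttributeError. A small standalone sketch of a safe extraction (extract_query is my own name for illustration, not part of any API):

```python
def extract_query(event, default="biology"):
    # API Gateway sends queryStringParameters as None (not {}) when
    # the request has no query string, so fall back to an empty dict.
    params = event.get("queryStringParameters") or {}
    return params.get("query", default)

print(extract_query({}))                                               # biology
print(extract_query({"queryStringParameters": None}))                  # biology
print(extract_query({"queryStringParameters": {"query": "physics"}}))  # physics
```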
Now you need to zip those files, in this specific order:
cd package
zip -r ../deployment.zip .
cd ..
zip deployment.zip lambda_function.py
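The order matters because lambda_function.py has to land at the root of the archive, right next to the dependency folders, or Lambda won't find the handler. Here's a self-contained sketch that reproduces the layout in a scratch directory and verifies it (the file names are placeholders; the commented aws command at the end is just a deployment hint):

```shell
# Scratch demo of the zip layout -- runnable anywhere.
workdir=$(mktemp -d) && cd "$workdir"
mkdir package
echo 'x = 1' > package/some_dependency.py               # stand-in for installed deps
echo 'def lambda_handler(event, context): pass' > lambda_function.py

(cd package && zip -qr ../deployment.zip .)             # deps at the archive root
zip -q deployment.zip lambda_function.py                # handler alongside them

# the handler must be listed with no package/ prefix
unzip -l deployment.zip | grep ' lambda_function.py$' && echo "layout OK"

# then upload, e.g.:
# aws lambda update-function-code --function-name my-function \
#     --zip-file fileb://deployment.zip
```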
Let me know if this works on your side, as this could become a starter-pack recipe for running the Weaviate client in Lambda.
Thanks!