Embedded server starts and shuts down right away

I am just trying to get started with the experimental embedded Weaviate.
It seems to download and run, but then it stops when the client tries to fetch metadata about the server version. Can you help me? I have tried opening ports and tried different ports, including 8080, 8888, and the default. What could be causing the meta query to come back empty? I think the local server is not accessible, but I do not know how to fix it.

Description

import weaviate
client = weaviate.connect_to_embedded(version="latest")

Server Setup Information

Started /home/dyoung/.cache/weaviate-embedded: process ID 106577
{"action":"startup","default_vectorizer_module":"none","level":"info","msg":"the default vectorizer modules is set to \"none\", as a result all new schema classes without an explicit vectorizer setting, will use this vectorizer","time":"2024-03-28T14:44:07-04:00"}
{"action":"startup","auto_schema_enabled":true,"level":"info","msg":"auto schema enabled setting is set to \"true\"","time":"2024-03-28T14:44:07-04:00"}
{"level":"info","msg":"No resource limits set, weaviate will use all available memory and CPU. To limit resources, set LIMIT_RESOURCES=true","time":"2024-03-28T14:44:07-04:00"}
{"level":"warning","msg":"Multiple vector spaces are present, GraphQL Explore and REST API list objects endpoint module include params has been disabled as a result.","time":"2024-03-28T14:44:07-04:00"}
{"action":"grpc_startup","level":"info","msg":"grpc server listening at [::]:50050","time":"2024-03-28T14:44:07-04:00"}
{"action":"restapi_management","level":"info","msg":"Serving weaviate at http://127.0.0.1:8079","time":"2024-03-28T14:44:07-04:00"}
{"action":"restapi_management","level":"info","msg":"Shutting down... ","time":"2024-03-28T14:44:07-04:00"}
{"action":"restapi_management","level":"info","msg":"Stopped serving weaviate at http://127.0.0.1:8079","time":"2024-03-28T14:44:07-04:00"}
{"action":"telemetry_push","level":"info","msg":"telemetry started","payload":"\u0026{MachineID:1a5178e4-1132-4b47-8cd8-37511d58d10a Type:INIT Version:1.24.6 Modules:generative-openai,qna-openai,ref2vec-centroid,reranker-cohere,text2vec-cohere,text2vec-huggingface,text2vec-openai NumObjects:0 OS:linux Arch:amd64}","time":"2024-03-28T14:44:08-04:00"}
{"action":"telemetry_push","level":"info","msg":"telemetry terminated","payload":"\u0026{MachineID:1a5178e4-1132-4b47-8cd8-37511d58d10a Type:TERMINATE Version:1.24.6 Modules:generative-openai,qna-openai,ref2vec-centroid,reranker-cohere,text2vec-cohere,text2vec-huggingface,text2vec-openai NumObjects:0 OS:linux Arch:amd64}","time":"2024-03-28T14:44:08-04:00"}

Any additional Information

self._weaviate_version = _ServerVersion.from_string(self.get_meta()["version"])
^^^^^^^^^^^^^^^
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/v4.py", line 581, in get_meta
res = _decode_json_response_dict(response, "Meta endpoint")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/util.py", line 929, in _decode_json_response_dict
raise UnexpectedStatusCodeError(location, response)
weaviate.exceptions.UnexpectedStatusCodeError: Meta endpoint! Unexpected status code: 403, with response body: None.

Hi @mail4dy ! Welcome to our community :hugs:

Can you share a reproducible code?

Also, note that the Embedded option is experimental and shouldn't be used for anything other than experimentation.

For a better developer experience while running locally, Docker is a great way to go.
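If the client cannot reach the server at all, a quick stdlib check of whether anything is listening on the expected ports can narrow things down. This is just a diagnostic sketch: the ports below are the embedded defaults from the startup log in your post (REST on 8079, gRPC on 50050), whereas `connect_to_local()` expects 8080/50051.

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Embedded Weaviate defaults: REST on 8079, gRPC on 50050.
for name, port in [("REST (embedded)", 8079), ("gRPC (embedded)", 50050)]:
    print(f"{name} port {port} reachable: {port_open('127.0.0.1', port)}")
```

If both ports report unreachable right after the client starts the embedded server, the process has likely already shut down, which matches the "Shutting down" lines in your log.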

Thanks!

Same problem using Docker. Please see below:
I start Docker and I know it is running because I get output when pointing my browser at port 8080:
(langchain) bash-4.4$ docker run -p 8080:8080 -p 50051:50051 --env-file ./env.list cr.weaviate.io/semitechnologies/weaviate:1.24.6
{"action":"startup","default_vectorizer_module":"none","level":"info","msg":"the default vectorizer modules is set to \"none\", as a result all new schema classes without an explicit vectorizer setting, will use this vectorizer","time":"2024-04-03T15:14:47Z"}
{"action":"startup","auto_schema_enabled":true,"level":"info","msg":"auto schema enabled setting is set to \"true\"","time":"2024-04-03T15:14:47Z"}
{"level":"info","msg":"No resource limits set, weaviate will use all available memory and CPU. To limit resources, set LIMIT_RESOURCES=true","time":"2024-04-03T15:14:47Z"}
{"action":"grpc_startup","level":"info","msg":"grpc server listening at [::]:50051","time":"2024-04-03T15:14:47Z"}
{"action":"restapi_management","level":"info","msg":"Serving weaviate at http://[::]:8080","time":"2024-04-03T15:14:47Z"}

I then try to connect with Python using the latest client, but I always get the same result.
Python 3.11.7 (main, Dec 15 2023, 18:12:31) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.

import weaviate
client = weaviate.connect_to_local()
Traceback (most recent call last):
File "", line 1, in
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/helpers.py", line 157, in connect_to_local
return __connect(client)
^^^^^^^^^^^^^^^^^
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/helpers.py", line 345, in __connect
raise e
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/helpers.py", line 341, in __connect
client.connect()
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/client.py", line 282, in connect
self._connection.connect(self.__skip_init_checks)
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/v4.py", line 655, in connect
super().connect(skip_init_checks)
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/v4.py", line 141, in connect
self._weaviate_version = _ServerVersion.from_string(self.get_meta()["version"])
^^^^^^^^^^^^^^^
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/connect/v4.py", line 579, in get_meta
res = _decode_json_response_dict(response, "Meta endpoint")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home_local/dyoung/anaconda3/envs/langchain/lib/python3.11/site-packages/weaviate/util.py", line 929, in _decode_json_response_dict
raise UnexpectedStatusCodeError(location, response)
weaviate.exceptions.UnexpectedStatusCodeError: Meta endpoint! Unexpected status code: 403, with response body: None.

I had the same problem with Docker. The problem was that no_proxy was not set. After setting NO_PROXY=127.0.0.1 when launching Docker, and adding export no_proxy='localhost' to my .bashrc, I was able to connect. I am not sure which one did the trick, but I think it was the lowercase no_proxy.
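For anyone hitting the same thing, here is a minimal sketch of setting the proxy exclusions from Python before the client is created (the exact host list is an assumption; adjust it to your proxy setup):

```python
import os

# Set both spellings before creating the client: different HTTP stacks
# check different casings of the variable, so cover both.
for var in ("no_proxy", "NO_PROXY"):
    os.environ[var] = "localhost,127.0.0.1"

# The client can then be created as usual, e.g.:
# import weaviate
# client = weaviate.connect_to_local()
```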


thanks for sharing, @mail4dy !! :heart:

Even after export no_proxy='localhost', I still get errors like the following:

Traceback (most recent call last):
File "/Users/yangboz/anaconda3/envs/py311/lib/python3.11/site-packages/weaviate/collections/grpc/query.py", line 649, in __call
res, _ = self._connection.grpc_stub.Search.with_call(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/yangboz/anaconda3/envs/py311/lib/python3.11/site-packages/grpc/_channel.py", line 1198, in with_call
return _end_unary_response_blocking(state, call, True, None)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/yangboz/anaconda3/envs/py311/lib/python3.11/site-packages/grpc/_channel.py", line 1006, in _end_unary_response_blocking
raise _InactiveRpcError(state) # pytype: disable=not-instantiable
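The traceback above fails inside the gRPC channel rather than the REST client, so the proxy exclusion may need to cover gRPC as well. gRPC's C core resolves its proxy from its own environment variables (grpc_proxy, falling back to https_proxy/http_proxy, with exclusions read from no_grpc_proxy or no_proxy). A hedged sketch worth trying before the client is created:

```python
import os

# gRPC resolves its proxy separately from the HTTP client. Its C core
# skips hosts listed in no_grpc_proxy (or, failing that, no_proxy),
# so make sure localhost is excluded for the gRPC port too.
os.environ["no_grpc_proxy"] = "localhost,127.0.0.1"
os.environ["no_proxy"] = "localhost,127.0.0.1"
```

Note these must be set before the gRPC channel is opened; changing them after `connect_to_local()` has run will not affect an existing channel.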