Successfully installed Verba on port 8000 and proxied 80 and 443 to 8000, but the site comes up as a blank white page?
Server Setup Information
Weaviate Server Version: Debian Linux
Deployment Method:
Multi Node? Number of Running Nodes: 1
Client Language and Version: English
Multitenancy?: No
Any additional Information
(verbavenv) getonthis@nginx-ai-2-vm:~/weaviate$ curl http://localhost:8000 returns the HTML for the start page.
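For comparison, checking the same paths through the proxy should show whether the page and its static assets also come back over 80/443. The hostname below is a placeholder, not the real server name:

# direct to uvicorn
curl -sI http://localhost:8000/static/icon.ico
# through the proxy (placeholder hostname)
curl -sI https://example.com/static/icon.ico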
Logs show:

(verbavenv) getonthis@nginx-ai-2-vm:~/weaviate/Verba$ verba start
No Ollama Model detected
No Ollama Model detected
INFO: Will watch for changes in these directories: ['/home/getonthis/weaviate/Verba']
WARNING: "workers" flag is ignored when reloading is enabled.
INFO: Uvicorn running on http://localhost:8000 (Press CTRL+C to quit)
INFO: Started reloader process [10909] using WatchFiles
No Ollama Model detected
No Ollama Model detected
INFO: Started server process [10916]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:37698 - "GET / HTTP/1.1" 200 OK
INFO: 104.63.129.184:0 - "GET / HTTP/1.1" 200 OK
INFO: 104.63.129.184:0 - "GET /static/icon.ico HTTP/1.1" 200 OK
INFO: 104.63.129.184:0 - "GET / HTTP/1.1" 200 OK
INFO: 104.63.129.184:0 - "GET / HTTP/1.1" 200 OK
INFO: 104.63.129.184:0 - "GET / HTTP/1.1" 200 OK
So I know the server is running.
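The proxy is nginx (hence the hostname), and the site config is a standard proxy_pass block roughly like the sketch below; this is the shape of it, not a verbatim copy, and server_name plus certificate paths are placeholders. The upgrade headers are there because Verba streams responses over a WebSocket, and a proxy that drops the upgrade can leave the UI partly broken:

server {
    listen 80;
    listen 443 ssl;
    server_name example.com;                        # placeholder
    ssl_certificate     /etc/ssl/example.crt;       # placeholder
    ssl_certificate_key /etc/ssl/example.key;       # placeholder

    location / {
        proxy_pass http://localhost:8000;
        proxy_http_version 1.1;
        # pass WebSocket upgrade headers through to uvicorn
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}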
And here is the docker-compose:
PS: Make sure that Ollama has plenty of resources. If you run Ollama under Docker, make sure there aren't tight resource constraints; otherwise it can fail at ingestion.
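Purely as an illustration (not copied from the compose file above), resource limits on an Ollama container usually look like the snippet below; if limits like these are set too low, model loading and ingestion can fail, so size them generously or leave them out:

services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # default Ollama API port
    volumes:
      - ollama:/root/.ollama   # persist pulled models
    deploy:
      resources:
        limits:
          cpus: "4"            # raise or remove if ingestion stalls
          memory: 16gb         # models need plenty of RAM
volumes:
  ollama: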