Hi,
I'm actually having the same issue. Apologies if this turns out to have a simple solution.
Below is what I see in the terminal up until the error:
INFO: Will watch for changes in these directories: ['/media/zain/Seagate 1TB/rag application/verba']
INFO: Uvicorn running on http://localhost:8000 (Press CTRL+C to quit)
INFO: Started reloader process [24553] using WatchFiles
ℹ Setting up client
ℹ Using Weaviate Embedded
Started /home/zain/.cache/weaviate-embedded: process ID 24558
{"action":"startup","default_vectorizer_module":"none","level":"info","msg":"the default vectorizer modules is set to \"none\", as a result all new schema classes without an explicit vectorizer setting, will use this vectorizer","time":"2024-05-28T15:47:13-05:00"}
{"action":"startup","auto_schema_enabled":true,"level":"info","msg":"auto schema enabled setting is set to \"true\"","time":"2024-05-28T15:47:13-05:00"}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_cache_minilm_OpOxyh0Yh1zs","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":19019}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_cache_ollama_TrbZXem6A3hA","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":17690}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_cache_text2vec_cohere_zoy81ihTCKNg","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":16050}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_cache_text2vec_openai_hW626EysIZbv","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":17140}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_chunk_minilm_PTizXIKxeTt4","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":73039}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_chunk_ollama_D3l7vB6x3M4S","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":25219}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_chunk_text2vec_cohere_G52kMo5c3Sdp","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":17409}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_chunk_text2vec_openai_RZLxs2iHSQNe","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":21040}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_config_7aWR2bffor77","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":22369}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_document_minilm_oBk2EwWpWBxr","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":17559}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_document_ollama_B5TfvXJ3e9Yu","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":32719}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_document_text2vec_cohere_EDJ9loflo0Bi","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":24100}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_document_text2vec_openai_v8nLrQv8CxPb","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":16710}
{"action":"hnsw_vector_cache_prefill","count":3000,"index_id":"verba_suggestion_NnsmZ4qnwRAx","level":"info","limit":1000000000000,"msg":"prefilled vector cache","time":"2024-05-28T15:47:13-05:00","took":17989}
{"level":"warning","msg":"Multiple vector spaces are present, GraphQL Explore and REST API list objects endpoint module include params has been disabled as a result.","time":"2024-05-28T15:47:13-05:00"}
{"action":"grpc_startup","level":"info","msg":"grpc server listening at [::]:50051","time":"2024-05-28T15:47:13-05:00"}
{"action":"restapi_management","level":"info","msg":"Serving weaviate at http://127.0.0.1:6666","time":"2024-05-28T15:47:13-05:00"}
✔ Connected to Weaviate
ℹ Setting up components
ℹ Retrieve Config From Weaviate
✔ Config Saved in Weaviate
ℹ Setting READER to BasicReader
ℹ Setting CHUNKER to TokenChunker
ℹ Setting EMBEDDER to OllamaEmbedder
ℹ Setting RETRIEVER to WindowRetriever
ℹ Setting GENERATOR to Ollama
ℹ Updating BasicReader config (document_type) Research -> Research
ℹ Updating TokenChunker config (overlap) 50 -> 40
INFO: Started server process [24555]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:49100 - "GET / HTTP/1.1" 200 OK
INFO: 127.0.0.1:49100 - "GET /static/media/2b3f1035ed87a788-s.p.woff2 HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49128 - "GET /static/media/3d9ea938b6afa941-s.p.woff2 HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49142 - "GET /static/media/c9a5bc6a7c948fb0-s.p.woff2 HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49138 - "GET /static/media/4049f3f580e14086-s.p.woff2 HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49114 - "GET /static/css/afc8501c2b22bb29.css HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49156 - "GET /static/css/7af5f0c0467cb98b.css HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49100 - "GET /static/chunks/737dfa3e-71fd4aa07f7d84a6.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49138 - "GET /static/chunks/23-5e3f67a9ac794630.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49142 - "GET /static/chunks/main-app-6d8fe3bc29305481.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49114 - "GET /static/chunks/fd9d1056-13318e87e7edaf08.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49156 - "GET /static/chunks/webpack-f7ec7a24106fdb21.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49128 - "GET /static/chunks/bc9c3264-d07564fa5e9c78e4.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49128 - "GET /static/chunks/ec3863c0-51ee858d5ca1a7f6.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49114 - "GET /static/chunks/39aecf79-4a889f14de9b85cb.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49156 - "GET /static/chunks/12038df7-6e0eda258325d644.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49142 - "GET /static/chunks/93854f56-29cce777bbb44957.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49100 - "GET /static/chunks/3627521c-57ae5a9df6c7e5b9.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49138 - "GET /static/chunks/9081a741-61a1020146c5d975.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49138 - "GET /static/chunks/558-ac85fa7667d15ac6.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49100 - "GET /static/chunks/app/page-a9dac4e4664785de.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49100 - "GET /api/health HTTP/1.1" 200 OK
INFO: 127.0.0.1:49138 - "GET /icon.ico HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:49138 - "GET /api/health HTTP/1.1" 200 OK
ℹ Config Retrieved
INFO: 127.0.0.1:49138 - "GET /api/config HTTP/1.1" 200 OK
INFO: ('127.0.0.1', 49168) - "WebSocket /ws/generate_stream" [accepted]
✔ Config Saved in Weaviate
ℹ Setting READER to BasicReader
ℹ Setting CHUNKER to TokenChunker
ℹ Setting EMBEDDER to OllamaEmbedder
ℹ Setting RETRIEVER to WindowRetriever
ℹ Setting GENERATOR to Ollama
INFO: 127.0.0.1:49138 - "POST /api/set_config HTTP/1.1" 200 OK
INFO: connection open
⚠ WebSocket connection closed by client.
INFO: connection closed
✔ Config Saved in Weaviate
ℹ Setting READER to BasicReader
ℹ Setting CHUNKER to TokenChunker
ℹ Setting EMBEDDER to OllamaEmbedder
ℹ Setting RETRIEVER to WindowRetriever
ℹ Setting GENERATOR to Ollama
ℹ Loading in cureus-0015-00000035077.pdf
✔ Loaded 1 documents in 0.63s
Chunking documents: 100%|██████████| 1/1 [00:00<00:00, 195.01it/s]
✔ Chunking completed with 224 chunks in 0.01s
Vectorizing Chunks: 0%| | 0/224 [00:00<?, ?it/s]
Vectorizing document chunks: 0%| | 0/1 [00:00<?, ?it/s]
INFO: 127.0.0.1:53612 - "POST /api/import HTTP/1.1" 200 OK
My .env file contains:
OLLAMA_URL=http://localhost:11434/
OLLAMA_MODEL=llama3
OLLAMA_EMBED_MODEL=mxbai-embed-large
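
Since the output stops with the vectorizing progress bars sitting at 0%, my first guess is that the embedding call to Ollama never returns. Here is the quick sanity check I can run, a minimal sketch that assumes Ollama's standard /api/embeddings endpoint and the OLLAMA_URL / OLLAMA_EMBED_MODEL values from my .env above (this script is just for diagnosis, not part of Verba):

```python
# Quick sanity check (not part of Verba): request one embedding from Ollama
# using the same URL and model that the OllamaEmbedder is configured with.
import requests

OLLAMA_URL = "http://localhost:11434"   # from OLLAMA_URL in .env (trailing slash removed)
EMBED_MODEL = "mxbai-embed-large"       # from OLLAMA_EMBED_MODEL in .env

resp = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": EMBED_MODEL, "prompt": "hello world"},
    timeout=60,
)
resp.raise_for_status()
embedding = resp.json().get("embedding", [])
print(f"{EMBED_MODEL} returned an embedding of length {len(embedding)}")
```

If this prints an embedding length quickly, Ollama itself seems fine and the stall is more likely on the Verba side; if it hangs or errors, that would match exactly where my log stops.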