osi1880vr / prompt_quill

Apache License 2.0

error after fresh installation #14

Open FeatureIsNotABug opened 1 week ago

FeatureIsNotABug commented 1 week ago

Fresh installation (NVIDIA + CUDA 12.1) with `one_click_install.bat` in `llama_index_pq`. If I try to chat, I get:

```
F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\transformers\models\bert\modeling_bert.py:439: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
  attn_output = torch.nn.functional.scaled_dot_product_attention(
Traceback (most recent call last):
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\queueing.py", line 527, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\route_utils.py", line 270, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1847, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1431, in call_function
    prediction = await fn(*processed_input)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\utils.py", line 772, in async_wrapper
    response = await f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\chat_interface.py", line 513, in _submit_fn
    response = await anyio.to_thread.run_sync(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 2177, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 859, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\ui.py", line 78, in run_llm_response
    prompt = self.interface.run_llm_response(query, history)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llm_interface_qdrant.py", line 202, in run_llm_response
    response = self.adapter.retrieve_llm_completion(query)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 372, in retrieve_llm_completion
    context = self.get_context_text(prompt)
              ^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 257, in get_context_text
    nodes = self.retrieve_context(query)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 253, in retrieve_context
    return self.direct_search(prompt,self.g.settings_data['top_k'],0,True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 219, in direct_search
    result = self.document_store.search(collection_name=self.g.settings_data['collection'],
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\qdrant_client.py", line 353, in search
    return self._client.search(
           ^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\qdrant_remote.py", line 521, in search
    search_result = self.http.points_api.search_points(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api\points_api.py", line 1524, in search_points
    return self._build_for_search_points(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api\points_api.py", line 704, in _build_for_search_points
    return self.api_client.request(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api_client.py", line 79, in request
    return self.send(request, type_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api_client.py", line 102, in send
    raise UnexpectedResponse.for_response(response)
qdrant_client.http.exceptions.UnexpectedResponse: Unexpected Response: 404 (Not Found)
Raw response content:
b'{"status":{"error":"Not found: Collection prompts_large_meta doesn\'t exist!"},"time":0.0000123}'
```

hexive commented 1 week ago

this is related to #13

the one-click installer initiates recovery of the downloaded snapshot into Qdrant, but the recovery fails with an error.

For what it's worth, I was eventually able to manually populate the data from the snapshot using the dockerized version of Qdrant, so maybe the issue lies somewhere in the Windows-native build of Qdrant?
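For anyone else hitting this, the manual route above looks roughly like the following (the snapshot filename and mount path are placeholders; use the snapshot file the one-click installer downloaded):

```shell
# Sketch: restore the downloaded snapshot into a dockerized Qdrant by hand.

# 1. Start Qdrant with a host directory (containing the snapshot) mounted in.
docker run -d -p 6333:6333 \
  -v "$(pwd)/snapshots:/qdrant/snapshots" \
  qdrant/qdrant

# 2. Recover the collection from the snapshot file via Qdrant's REST API
#    (PUT /collections/{collection_name}/snapshots/recover).
curl -X PUT "http://localhost:6333/collections/prompts_large_meta/snapshots/recover" \
  -H "Content-Type: application/json" \
  -d '{"location": "file:///qdrant/snapshots/prompts_large_meta.snapshot"}'

# 3. Verify the collection now exists before retrying the chat.
curl "http://localhost:6333/collections/prompts_large_meta"
```

If step 2 succeeds here but the equivalent step fails in the Windows-native Qdrant, that would support the theory that the problem is in the native build rather than the snapshot itself.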