wsxqaza12 / RAG_LangChain_streamlit

18 stars 8 forks

Sorry to bother you: some error messages appear after importing a PDF and entering a question. #2

Closed WesissoNB closed 5 months ago

WesissoNB commented 6 months ago

The error message is as follows:

NotFoundError: File Not Found

Traceback:
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 535, in _run_script
    exec(code, module.__dict__)
File "D:\py\git\streamlit\RAG_LangChain_streamlit-main\RAG_LangChain_streamlit-main\rag_engine.py", line 140, in <module>
    boot()
File "D:\py\git\streamlit\RAG_LangChain_streamlit-main\RAG_LangChain_streamlit-main\rag_engine.py", line 132, in boot
    response = query_llm(st.session_state.retriever, query)
File "D:\py\git\streamlit\RAG_LangChain_streamlit-main\RAG_LangChain_streamlit-main\rag_engine.py", line 63, in query_llm
    result = qa_chain({'question': query, 'chat_history': st.session_state.messages})
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\_api\deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 378, in __call__
    return self.invoke(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 163, in invoke
    raise e
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\conversational_retrieval\base.py", line 166, in _call
    answer = self.combine_docs_chain.run(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\_api\deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 550, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\_api\deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 378, in __call__
    return self.invoke(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 163, in invoke
    raise e
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\combine_documents\base.py", line 137, in _call
    output, extra_return_dict = self.combine_docs(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\combine_documents\stuff.py", line 244, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\llm.py", line 293, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\_api\deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 378, in __call__
    return self.invoke(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 163, in invoke
    raise e
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\llm.py", line 103, in _call
    response = self.generate([inputs], run_manager=run_manager)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain\chains\llm.py", line 115, in generate
    return self.llm.generate_prompt(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\language_models\chat_models.py", line 544, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\language_models\chat_models.py", line 408, in generate
    raise e
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\language_models\chat_models.py", line 398, in generate
    self._generate_with_cache(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_core\language_models\chat_models.py", line 577, in _generate_with_cache
    return self._generate(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\langchain_openai\chat_models\base.py", line 462, in _generate
    response = self.client.create(messages=message_dicts, **params)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\openai\resources\chat\completions.py", line 663, in create
    return self._post(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\openai\_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\openai\_base_client.py", line 889, in request
    return self._request(
File "C:\Users\luzif\anaconda3\envs\rag_streamlit\Lib\site-packages\openai\_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None

Could someone please help me out? Thank you very much!

wsxqaza12 commented 6 months ago

@WesissoNB Thanks for the question. At first glance it looks like the LLM model cannot be found. Do you have the setup steps you followed, along with screenshots?

WesissoNB commented 5 months ago

Yes. In the current setup I can call the Llama model locally via 127.0.0.1. If I don't first create a tmp folder inside streamlit's data folder, an error appears at the top of the page. After adding the tmp folder it seems able to read the PDF file, but once I start asking questions the error message above appears at the bottom.

The first three screenshots show llama.cpp starting successfully: llama_model1 llama_model2 llamacpp

Without a tmp folder in streamlit's data folder: Snipaste_2024-02-29_15-23-11

After creating the folder and submitting the file: streamlit1 streamlit2 streamlit3 streamlit4 streamlit5

Sorry for the trouble, and thank you very much!
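The missing-tmp-folder error described above can be avoided by creating the upload directory at startup instead of assuming it exists. A minimal sketch, assuming the `data/tmp` layout mentioned in this thread (the helper name `ensure_upload_dir` is hypothetical, not from the repo's actual code):

```python
import os

# Directory where uploaded PDFs are staged before indexing.
# The "data/tmp" path follows the folder layout described in this thread.
TMP_DIR = os.path.join("data", "tmp")

def ensure_upload_dir(path: str = TMP_DIR) -> str:
    """Create the upload directory if it does not exist and return its path."""
    os.makedirs(path, exist_ok=True)  # no error if the folder already exists
    return path
```

Calling `ensure_upload_dir()` once at the top of the Streamlit script would make the app work on a fresh clone without manual folder creation.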

wsxqaza12 commented 5 months ago

@WesissoNB Thanks for the detailed information. The URL in your screenshot points to http://127.0.0.1:8080, but since we are using the OpenAI-compatible interface, it should be http://127.0.0.1:8080/v1. See the llama.cpp documentation for details.
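To illustrate why the trailing /v1 matters: an OpenAI-compatible client appends route names like chat/completions to whatever base URL it is given, so a base URL without the /v1 prefix produces a path llama.cpp's server does not serve, which surfaces as the NotFoundError (HTTP 404) in the traceback above. A small stdlib-only sketch of that URL construction (the helper name is hypothetical, for illustration only):

```python
def chat_completions_url(base_url: str) -> str:
    """Build the chat-completions endpoint an OpenAI-style client would call."""
    # The client simply appends the route to the configured base URL,
    # so the base must already end in /v1 for llama.cpp's server.
    return base_url.rstrip("/") + "/chat/completions"

# Missing /v1: path not served by llama.cpp's OpenAI-compatible API -> 404.
print(chat_completions_url("http://127.0.0.1:8080"))
# -> http://127.0.0.1:8080/chat/completions

# With /v1: matches the server's OpenAI-compatible route.
print(chat_completions_url("http://127.0.0.1:8080/v1"))
# -> http://127.0.0.1:8080/v1/chat/completions
```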

WesissoNB commented 5 months ago

Got it, thank you :) Sorry, one more question: when setting up the llama_2b_chat model, do I need to install extra packages to give it Chinese question-answering ability?

wsxqaza12 commented 5 months ago

@WesissoNB Llama was released by Meta, and its pre-training data is mostly English, so even with a prompt it is hard to get it to answer in Chinese. Many developers in the Chinese-speaking community have therefore taken Llama as a base and run a second round of pre-training on Chinese corpora, turning it into a Chinese-capable Llama 2.

Here is a project I have been using:

There are actually many models of this kind; you can browse HuggingFace to see whether any of them fit your needs.

WesissoNB commented 5 months ago

Thank you very much!