linjungz / chat-with-your-doc

Chat with your docs in PDF/PPTX/DOCX format, using LangChain and GPT4/ChatGPT from both Azure OpenAI Service and OpenAI
140 stars 48 forks

After vector db initialisation, prompts not working #33

Open Dheemant6 opened 8 months ago

Dheemant6 commented 8 months ago

```
(.venv) PS E:\chat-with-your-doc> streamlit run chat_web_st.py

You can now view your Streamlit app in your browser.

  Local URL: http://localhost:8501
  Network URL: http://172.16.10.176:8501
```

```
Loaded vector db from local: ./data/vector_store/index
2024-03-20 16:48:26.612 Uncaught app exception
Traceback (most recent call last):
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "E:\chat-with-your-doc\chat_web_st.py", line 77, in <module>
    result_answer, result_source = docChatBot.get_answer(
  File "E:\chat-with-your-doc\chatbot.py", line 254, in get_answer
    result = self.chatchain({
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\conversational_retrieval\base.py", line 104, in _call
    new_question = self.question_generator.run(
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\base.py", line 239, in run
    return self(kwargs, callbacks=callbacks)[self.output_keys[0]]
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\llm.py", line 69, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chains\llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chat_models\base.py", line 143, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks)
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chat_models\base.py", line 91, in generate
    raise e
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chat_models\base.py", line 83, in generate
    results = [
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chat_models\base.py", line 84, in <listcomp>
    self._generate(m, stop=stop, run_manager=run_manager)
  File "E:\chat-with-your-doc\.venv\Lib\site-packages\langchain\chat_models\openai.py", line 321, in _generate
    role = stream_resp["choices"][0]["delta"].get("role", role)
IndexError: list index out of range
```
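For context on the failure mode: the `IndexError` is raised when `stream_resp["choices"][0]` is indexed on a streamed chunk whose `choices` list is empty. One plausible cause is that an Azure OpenAI deployment emits extra streaming chunks with no choices (e.g. content-filter metadata), which this version of `langchain` does not guard against. A minimal self-contained sketch of the pattern and a defensive guard follows; the chunk dicts are illustrative stand-ins, not captured API output:

```python
# Sketch of consuming a chat-completion stream defensively, assuming some
# chunks may arrive with an empty "choices" list (as the traceback suggests).
def consume_stream(chunks):
    role = None
    parts = []
    for chunk in chunks:
        choices = chunk.get("choices", [])
        if not choices:
            # Skip chunks with no choices; indexing choices[0] here is
            # exactly what raises IndexError in chat_models/openai.py.
            continue
        delta = choices[0].get("delta", {})
        role = delta.get("role", role)
        parts.append(delta.get("content", ""))
    return role, "".join(parts)

stream = [
    {"choices": []},  # a chunk like this would crash the unguarded code
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
]
print(consume_stream(stream))  # → ('assistant', 'Hello')
```

A workaround at the application level (rather than patching `langchain`) would be to disable streaming on the chat model, or to upgrade `langchain`, whose later releases skip empty-choice chunks.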