IntelligenzaArtificiale / Free-Auto-GPT

Free Auto GPT with no paid APIs is a repository that offers a simple version of Auto GPT, an autonomous AI agent capable of performing tasks independently. Unlike other versions, our implementation does not rely on any paid OpenAI API, making it accessible to anyone.
MIT License

Failed to parse response #133

Closed manishag1988 closed 1 year ago

manishag1988 commented 1 year ago

Have you already searched for your ISSUE among the resolved ones?

What version of Python do you have?

What version of operating system do you have?

What type of installation did you perform?

Desktop (please complete the following information):

Describe the bug: Hi, no matter how many times I try, I always end up getting this error:

```
ChatError: Failed to parse response: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
Traceback:
File "d:\Free AutoGPT\.venv\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
File "D:\Free AutoGPT\Camel.py", line 238, in <module>
    user_ai_msg = user_agent.step(assistant_msg)
File "D:\Free AutoGPT\Camel.py", line 67, in step
    output_message = self.model(str(input_message.content))
File "d:\Free AutoGPT\.venv\Lib\site-packages\langchain\llms\base.py", line 310, in __call__
    self.generate([prompt], stop=stop, callbacks=callbacks)
File "d:\Free AutoGPT\.venv\Lib\site-packages\langchain\llms\base.py", line 192, in generate
    raise e
File "d:\Free AutoGPT\.venv\Lib\site-packages\langchain\llms\base.py", line 186, in generate
    self._generate(prompts, stop=stop, run_manager=run_manager)
File "d:\Free AutoGPT\.venv\Lib\site-packages\langchain\llms\base.py", line 451, in _generate
    else self._call(prompt, stop=stop)
File "D:\Free AutoGPT\FreeLLM\HuggingChatAPI.py", line 42, in _call
    data = self.chatbot.chat(prompt, temperature=0.5, stream=False)
File "d:\Free AutoGPT\.venv\Lib\site-packages\hugchat\hugchat.py", line 267, in chat
    raise ChatError(f"Failed to parse response: {res}")
```

Screenshots: image_2023-06-09_194305620 (screenshot attached)


IntelligenzaArtificiale commented 1 year ago

Unfortunately, this problem occurs because the LLM models don't always return output in the expected format. It is normal at the moment; we will fix this in upcoming updates.
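As the traceback shows, the error is raised when HuggingChat returns an HTML error page instead of the expected JSON, and `hugchat` surfaces this as a `ChatError`. Until a proper fix lands, a common workaround is to retry the chat call a few times with a short pause. Below is a minimal, hypothetical sketch of such a retry helper; the function name `call_with_retries` and its parameters are illustrative, not part of the Free-Auto-GPT codebase.

```python
import time


def call_with_retries(fn, retries=3, delay=2.0, exceptions=(Exception,)):
    """Call fn(), retrying on the given exception types.

    Hypothetical workaround sketch: when the upstream service returns an
    HTML error page instead of JSON, the client raises a parse error.
    Waiting briefly and retrying often succeeds once the service recovers.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return fn()
        except exceptions as exc:
            last_exc = exc
            # Back off before the next attempt.
            time.sleep(delay)
    # All attempts failed; re-raise the last error for the caller to handle.
    raise last_exc
```

In `FreeLLM/HuggingChatAPI.py`, the failing line from the traceback could then (assuming `ChatError` is imported from `hugchat`) be wrapped as `data = call_with_retries(lambda: self.chatbot.chat(prompt, temperature=0.5, stream=False), exceptions=(ChatError,))`. This does not fix malformed model output, but it smooths over transient HTML error responses.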