IntelligenzaArtificiale / Free-Auto-GPT

Free Auto GPT with no paid APIs is a repository that offers a simple version of Auto GPT, an autonomous AI agent capable of performing tasks independently. Unlike other versions, our implementation does not rely on any paid OpenAI API, making it accessible to anyone.
MIT License

huggingchat doesn't work #115

Closed prehcp closed 1 year ago

prehcp commented 1 year ago

```
python autogpt.py
Select the model you want to use (1, 2, 3 or 4)
1) ChatGPT
2) HuggingChat
3) BingChat
4) Google Bard
5) HuggingFace
2
Enter the objective of the AI system: (Be realistic!) how to learn AI
Traceback (most recent call last):
  File "C:\chat\Auto-GPT1\autogpt.py", line 314, in <module>
    agent.run([input("Enter the objective of the AI system: (Be realistic!) ")])
  File "C:\Python311\Lib\site-packages\langchain\experimental\autonomous_agents\autogpt\agent.py", line 91, in run
    assistant_reply = self.chain.run(
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 239, in run
    return self(kwargs, callbacks=callbacks)[self.output_keys[0]]
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\chains\llm.py", line 69, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\chains\llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 127, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks)
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 176, in generate
    raise e
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 170, in generate
    self._generate(prompts, stop=stop, run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 379, in _generate
    else self._call(prompt, stop=stop)
  File "C:\chat\Auto-GPT1\FreeLLM\HuggingChatAPI.py", line 36, in _call
    self.chatbot = hugchat.ChatBot(cookie_path=self.cookiepath)
  File "C:\Python311\Lib\site-packages\hugchat\hugchat.py", line 40, in __init__
    self.current_conversation = self.new_conversation()
  File "C:\Python311\Lib\site-packages\hugchat\hugchat.py", line 101, in new_conversation
    self.accept_ethics_modal()
  File "C:\Python311\Lib\site-packages\hugchat\hugchat.py", line 92, in accept_ethics_modal
    raise Exception(f"Failed to accept ethics modal with status code {response.status_code}. {response.content.decode()}")
Exception: Failed to accept ethics modal with status code 403. Non-JSON form requests need to have a referer
```
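The 403 message ("Non-JSON form requests need to have a referer") suggests the server rejects form POSTs that arrive without a `Referer` header. A minimal illustration with the standard library showing what such a header-carrying request looks like; the URL and form field here are assumptions for illustration, not the actual hugchat internals:

```python
import urllib.request

# Hypothetical sketch: the 403 hints that the server wants a Referer
# header on form POSTs. urllib lets us attach one explicitly.
# The endpoint URL and form body below are illustrative assumptions.
req = urllib.request.Request(
    "https://huggingface.co/chat/settings",   # illustrative endpoint
    data=b"ethicsModalAccepted=true",         # form-encoded body
    headers={"Referer": "https://huggingface.co/chat"},
    method="POST",
)

# The request object now carries the header (nothing is sent here;
# urllib.request.urlopen(req) would perform the actual POST):
print(req.get_header("Referer"))  # https://huggingface.co/chat
```

Upgrading `hugchat` (as done below) is still the practical fix, since the library sets these headers itself in newer releases.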


prehcp commented 1 year ago

After running `pip install hugchat --upgrade`, the error changes to:

```
python autogpt.py
Select the model you want to use (1, 2, 3 or 4)
1) ChatGPT
2) HuggingChat
3) BingChat
4) Google Bard
5) HuggingFace
2
Enter the objective of the AI system: (Be realistic!) how to learn AI
Traceback (most recent call last):
  File "C:\chat\Auto-GPT1\autogpt.py", line 314, in <module>
    agent.run([input("Enter the objective of the AI system: (Be realistic!) ")])
  File "C:\Python311\Lib\site-packages\langchain\experimental\autonomous_agents\autogpt\agent.py", line 91, in run
    assistant_reply = self.chain.run(
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 239, in run
    return self(kwargs, callbacks=callbacks)[self.output_keys[0]]
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "C:\Python311\Lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\chains\llm.py", line 69, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\chains\llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 127, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks)
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 176, in generate
    raise e
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 170, in generate
    self._generate(prompts, stop=stop, run_manager=run_manager)
  File "C:\Python311\Lib\site-packages\langchain\llms\base.py", line 379, in _generate
    else self._call(prompt, stop=stop)
  File "C:\chat\Auto-GPT1\FreeLLM\HuggingChatAPI.py", line 42, in _call
    data = self.chatbot.chat(prompt, temperature=0.5, stream=False)
  File "C:\Python311\Lib\site-packages\hugchat\hugchat.py", line 217, in chat
    obj = json.loads(res[1:-1])
  File "C:\Python311\Lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Python311\Lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Python311\Lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```
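The crash happens because `json.loads(res[1:-1])` receives something that is not valid JSON, typically an empty or truncated response body. A small sketch of a defensive parse that degrades gracefully instead of raising; the helper name is ours, not part of hugchat:

```python
import json

def safe_json_loads(raw: str):
    """Return the parsed object, or None if the payload is not valid JSON."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None

# A well-formed chunk parses normally:
print(safe_json_loads('{"generated_text": "hello"}'))  # {'generated_text': 'hello'}

# A truncated long response (or an empty body) yields None instead of crashing:
print(safe_json_loads('{"generated_text": "hel'))      # None
print(safe_json_loads(''))                             # None
```

A caller could then retry the request or surface a readable error when `None` comes back, rather than dying with `JSONDecodeError: Expecting value: line 1 column 1 (char 0)`.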

IntelligenzaArtificiale commented 1 year ago

This bug has already been reported upstream to the hugchat library: when the model's response is long, the library fails to return it correctly.

We are working on a fix.

prehcp commented 1 year ago

> This bug has already been reported upstream to the hugchat library: when the model's response is long, the library fails to return it correctly.
>
> We are working on a fix.

Hi, thanks for your reply! It works now.

By the way, is it possible to use free-auto-gpt on local AI models, such as oobabooga/text-generation-webui? That will be more productive and easier to use.

IntelligenzaArtificiale commented 1 year ago

For the moment we have not found any local model that matches the performance of GPT or similar, so integrating them would be wasted effort.

Also consider that many users already struggle with copying a cookie; imagine asking them to spend several minutes configuring their PC to run a local LLM. And those who do succeed then complain about poor performance.

prehcp commented 1 year ago

oobabooga/text-generation-webui makes it easy to run a local LLM: it can download any model from huggingface.co directly. Being able to use Free Auto GPT with it would be very helpful.
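For what it's worth, text-generation-webui can expose an HTTP API, so in principle an agent could target a local server instead of a hosted model. A hedged sketch of building such a request with the standard library; the base URL, port, payload shape, and model name are all assumptions that depend on the server version and launch flags, not something Free Auto GPT supports today:

```python
import json
import urllib.request

# Hypothetical sketch: point a client at a locally hosted,
# OpenAI-style chat endpoint. Everything below (URL, port, route,
# model name, payload fields) is an assumption for illustration.
BASE_URL = "http://localhost:5000/v1"   # assumed local endpoint

payload = {
    "model": "local-model",             # name of the locally served model (assumption)
    "messages": [{"role": "user", "content": "how to learn AI"}],
    "temperature": 0.5,
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Nothing is sent here; urllib.request.urlopen(req) would perform the call
# once a local server is actually listening on that port.
print(req.full_url)  # http://localhost:5000/v1/chat/completions
```

The appeal of this shape is that a client written against it would not care whether the endpoint is local or remote, which is exactly what would make swapping in a local model cheap.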