Closed wambugu71 closed 11 months ago
Because the error message is
`{ "message": "..." }`
and it does not contain `type`, maybe?
Meaning it's not a bug?
No, if that were the problem it would still be a bug, since the package expects the `type` key to be present in the JSON response.
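As a sketch of what is going wrong (hypothetical function name; the real parsing lives in hugchat's `_stream_query`): the server's error body only contains a `message` key, so indexing `obj["type"]` raises `KeyError` before the server's error text is ever surfaced. A defensive check could turn it into a readable exception instead:

```python
import json


class ChatError(Exception):
    """Stand-in for hugchat.exceptions.ChatError, to keep the sketch self-contained."""


def parse_stream_line(line: str) -> dict:
    # Hypothetical version of the parsing step in _stream_query.
    obj = json.loads(line)
    if "type" not in obj:
        # Server-side errors (e.g. rate limiting) come back as
        # {"message": "..."} with no "type" key at all, so surface
        # the server's message instead of raising a bare KeyError.
        raise ChatError(obj.get("message", f"Failed to parse response: {line}"))
    return obj
```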
After some testing, I am getting the same error with the following code, on Windows with Python 3.10.7:
```python
from hugchat import hugchat
import logging

logging.basicConfig(level=logging.DEBUG)

email = "example@example.com"
chatbot = hugchat.ChatBot(cookie_path=f"usercookies/{email}.json")
chatbot.switch_llm(1)
chatbot.new_conversation(switch_to=True)
print(chatbot.active_model.id)
print(chatbot.query("Hello!"))
```
and this is the console output:
```
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): huggingface.co:443
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "GET /chat HTTP/1.1" 308 None
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "GET /chat/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "POST /chat/__data.json HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "POST /chat/conversation HTTP/1.1" 200 None
DEBUG:root:{"conversationId":"CONVERSATION_ID"}
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "GET /chat/conversation/CONVERSATION_ID/__data.json?x-sveltekit-invalidated=1_1 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "POST /chat/conversation HTTP/1.1" 200 None
DEBUG:root:{"conversationId":"CONVERSATION_ID"}
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "GET /chat/conversation/CONVERSATION_ID/__data.json?x-sveltekit-invalidated=1_1 HTTP/1.1" 200 None
codellama/CodeLlama-34b-Instruct-hf
DEBUG:urllib3.connectionpool:Resetting dropped connection: huggingface.co
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "POST /chat/conversation/CONVERSATION_ID HTTP/1.1" 200 None
# Above 2 lines repeat about 15 times
DEBUG:urllib3.connectionpool:Resetting dropped connection: huggingface.co
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "POST /chat/conversation/CONVERSATION_ID HTTP/1.1" 429 65
Traceback (most recent call last):
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\hugchat.py", line 625, in _stream_query
    _type = obj["type"]
KeyError: 'type'
DEBUG:root:RESPONSE_HEADERS # headers removed
Traceback (most recent call last):
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\hugchat.py", line 625, in _stream_query
    _type = obj["type"]
KeyError: 'type'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "%filepath%\test.py", line 17, in <module>
    print(chatbot.query("Hello!"))
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\message.py", line 212, in __str__
    return self.wait_until_done()
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\message.py", line 182, in wait_until_done
    raise self.error
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\message.py", line 83, in __next__
    a: dict = next(self.g)
  File "C:\Users\%user%\AppData\Local\Programs\Python\Python310\lib\site-packages\hugchat\hugchat.py", line 657, in _stream_query
    raise ChatError(f"Failed to parse response: {res}")
hugchat.exceptions.ChatError: Failed to parse response: {"message":"You are sending too many messages. Try again later."}
```
And after a quick look at the HuggingChat UI, it does appear that the message is getting through to the servers.
We may need a PR for capturing error messages like this.
Also, from the logs it seems this is a server-side error, so we should just catch this exception.
It happens frequently after changing the model, i.e. every time you make a new request. Is there any workaround for this?
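Until this is handled upstream, one workaround is to back off and retry when the rate-limit error appears. This is only a sketch: `query_with_retry` is a hypothetical helper, and `ChatError` here is a local stand-in for `hugchat.exceptions.ChatError` (in real use you would import and catch the latter around `chatbot.query`):

```python
import time


class ChatError(Exception):
    """Stand-in for hugchat.exceptions.ChatError, to keep the sketch self-contained."""


def query_with_retry(chatbot, prompt, retries=5, delay=5.0):
    # Retry with exponential backoff when the server answers with the
    # "too many messages" rate-limit error seen in the traceback above.
    for attempt in range(retries):
        try:
            return chatbot.query(prompt)
        except ChatError as e:
            # Re-raise anything that isn't the rate limit, or the final failure.
            if "too many messages" not in str(e) or attempt == retries - 1:
                raise
            time.sleep(delay * 2 ** attempt)  # back off: 5s, 10s, 20s, ...
```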