King007t opened 1 year ago
Got the same problem as well.
{
  ...
  "request": {
    "type": "SessionEndedRequest",
    "requestId": "******",
    "timestamp": "2022-12-23T02:33:41Z",
    "locale": "en-US",
    "reason": "ERROR",
    "error": {
      "type": "INVALID_RESPONSE",
      "message": "An exception occurred while dispatching the request to the skill."
    }
  }
}
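In case it helps with debugging: the fields Alexa sends in that SessionEndedRequest can be pulled out with just the standard library. A minimal sketch (the payload below simply mirrors the log above; how you log it in the actual handler is up to you):

```python
import json

# Abbreviated SessionEndedRequest envelope, copied from the log above.
envelope = json.loads("""
{
  "request": {
    "type": "SessionEndedRequest",
    "timestamp": "2022-12-23T02:33:41Z",
    "locale": "en-US",
    "reason": "ERROR",
    "error": {
      "type": "INVALID_RESPONSE",
      "message": "An exception occurred while dispatching the request to the skill."
    }
  }
}
""")

request = envelope["request"]
err = {}
if request["type"] == "SessionEndedRequest" and request.get("reason") == "ERROR":
    # Alexa only includes "error" when the session ended because of one.
    err = request.get("error", {})
    print(f"Session ended with {err.get('type')}: {err.get('message')}")
```

This prints the error type and message, which is roughly what CloudWatch would need to show before we can tell where the dispatch exception comes from.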
I'll try to debug it tomorrow; maybe I can provide a bit more information about what causes this error.
Edit: Didn't find anything on a quick look; my debugging skills in Python/Alexa are very limited. @paulotruta @inverse, please have a look.
The Python library that we were using got abandoned due to the security measures ChatGPT put in place to avoid automated scraping.
Looks like a new one is in the works, but it's a completely different toolchain.
Unfortunately these restrictions imposed by OpenAI make the current situation difficult. Ideally we'd integrate another way to access ChatGPT.
Another possible option, from my point of view, would be to integrate with GPT-3 instead, via the official API. This could also allow us to publish the skill officially and make it available to the general (non-techy) public!
I want to run some tests and see if it would be possible to build a usable conversational experience this way.
I'll keep you posted via discussions. Will leave the issue open for replies while this situation is ongoing.
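For anyone curious what the official-API route might look like, here's a minimal sketch of the request parameters a Completions call would take. The model name and values are only examples pulled from OpenAI's documentation, not anything the skill currently ships; with the `openai` package, the resulting dict would be passed to `openai.Completion.create(**payload)`:

```python
# Sketch only: example parameters for an official OpenAI Completions call.
# Nothing here is wired into the skill yet.
def build_completion_request(question: str) -> dict:
    return {
        "model": "text-davinci-003",  # example GPT-3 model from the docs
        "prompt": question,
        "max_tokens": 256,            # raise this if answers get cut off
        "temperature": 0.7,
    }

payload = build_completion_request("who are you")
```

The `max_tokens` value matters here: the API's default is quite low, which by itself can explain very short answers.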
Alright, thanks for checking it out. I don't think it would help much, but maybe you can have a look at my Discord bot, where I'm using the OpenAI GPT API with C#.
I tried implementing what you did here: https://github.com/paulotruta/jee-pee-tee/pull/28
But the responses from that model seem a lot shorter. I haven't investigated much yet, though.
(.venv) malachi@pulsar lambda $ python test_chat.py 'who are you'
Asking ChatGPT 'who are you'
I'm a human being.
Hmm, strange. I'm not 100% sure, but maybe I implemented the "old" GPT-3 model and not the current one that ChatGPT is using. I just implemented it with the model that OpenAI gives in their documentation, but same as you, I haven't investigated it either.
Hello guys,
I've been stuck trying to set this up and figure it all out for a bunch of hours now, and I'm bringing some points to the discussion:
Integrating directly through OpenAI's API needs some additional work with regard to "teaching" the model how to formulate sentences. That happens because we are not using the ChatGPT interface itself; we are using the raw model beneath it, so yeah... short and incomplete answers.
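One cheap way to do that "teaching" is to prepend an instruction block to every prompt so the raw completion model answers conversationally instead of in fragments. A hypothetical sketch (the preamble wording is just an example, not anything in the repo):

```python
# Hypothetical prompt wrapper: an instruction preamble nudges the raw
# completion model toward full, spoken-style sentences.
ASSISTANT_PREAMBLE = (
    "You are a helpful voice assistant. Answer the user's question "
    "in one or two complete, conversational sentences.\n\n"
)

def make_prompt(question: str) -> str:
    # The trailing "Assistant:" cues the model to continue as the assistant.
    return f"{ASSISTANT_PREAMBLE}User: {question}\nAssistant:"

prompt = make_prompt("who are you")
```

The completion returned for a prompt like this should read a lot more like a ChatGPT answer than what the bare question produces.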
The RevChatGPT dependency does have recent updates and seems to be working fine; however, a lot of errors appear to happen regarding the Chrome driver, which seems to be impossible to install on the Lambda using only the Alexa Dev Kit. It would require building the whole infrastructure and setting up a custom Lambda layer with Chrome pre-installed.
Hope we can find a workaround soon. I haven't had the chance to see this app working yet, but I'm really looking forward to it.
question: alexa ask jee pee tee about our solar system
answer: There was a problem with the requested skill's response