Open denisjoshua opened 1 week ago
I'm sorry... I just repeated the question to ChatGPT and this time it worked. I did nothing in the meantime :-) But now it gives me the temperature. Thanks again, Daniel
Interestingly enough, now I'm getting that issue as well. Went back to a regular OpenAI conversation with the same exposure and no issues. You didn't change anything, huh?
Well, the only thing I changed was the model, from gpt-3.5-turbo-1106 to gpt-3.5-turbo. But this morning I went back to gpt-3.5-turbo-1106 and then it gave me this error:
Sorry, I had a problem talking to OpenAI: Error code: 400 - {'error': {'message': "An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_UOtoIEQnsd2Lei6aEhIkSQTF", 'type': 'invalid_request_error', 'param': 'messages.[5].role', 'code': None}}
Then I restarted HassOS and now it works again using gpt-3.5-turbo-1106.
I can't understand it.
The strange thing is that I searched for the service sensor.get_temperature in HassOS and I still can't find it. I don't know if it really needs this service... because now it works :-)
Denis
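For anyone hitting the same 400 error: it means the message history sent to the Chat Completions API broke an ordering rule. Every assistant message that contains tool_calls must be immediately followed by one "tool" message per tool_call_id. A minimal sketch (not the integration's actual code; the helper name and the sample history are hypothetical) of how to check a history for this:

```python
def find_unanswered_tool_calls(messages):
    """Return tool_call_ids that have no matching 'tool' response message."""
    unanswered = []
    for i, msg in enumerate(messages):
        if msg.get("role") == "assistant" and msg.get("tool_calls"):
            expected = {tc["id"] for tc in msg["tool_calls"]}
            # Collect the "tool" responses that directly follow this message.
            j = i + 1
            while j < len(messages) and messages[j].get("role") == "tool":
                expected.discard(messages[j].get("tool_call_id"))
                j += 1
            unanswered.extend(sorted(expected))
    return unanswered

# A history like the failing one: the tool call was never answered,
# e.g. because the conversation log was truncated mid-exchange.
history = [
    {"role": "user", "content": "What is the room temperature?"},
    {"role": "assistant",
     "tool_calls": [{"id": "call_UOtoIEQnsd2Lei6aEhIkSQTF", "type": "function"}]},
    {"role": "user", "content": "Try again?"},  # the tool response is missing
]
print(find_unanswered_tool_calls(history))  # ['call_UOtoIEQnsd2Lei6aEhIkSQTF']
```

That would explain why a restart fixed it: a fresh conversation starts with a clean history, while a stored one that was cut off after the assistant's tool_calls message keeps failing until it is cleared.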
The issue here is that ChatGPT does not know the name of the entity and just guesses it. You need to give it the exact name or rename the temperature sensor.
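To illustrate: Home Assistant exposes sensor readings as entity states, not as services, so a service like sensor.get_temperature does not exist, which is why a guessed call fails. A reading can be fetched through HA's REST API given the exact entity id. A minimal sketch; the host and entity id below are placeholders, so substitute the exact id from your own instance:

```python
# Placeholders -- use your own HA host and the exact entity id
# shown in Settings > Devices & Services.
BASE_URL = "http://homeassistant.local:8123"
ENTITY_ID = "sensor.living_room_temperature"

# HA's REST API reads a state at /api/states/<entity_id>.
url = f"{BASE_URL}/api/states/{ENTITY_ID}"

# With a long-lived access token this would be fetched roughly as:
# requests.get(url, headers={"Authorization": f"Bearer {token}"}).json()["state"]
print(url)  # http://homeassistant.local:8123/api/states/sensor.living_room_temperature
```

The entity id must match character for character; if the model guesses a different name, the lookup (or an invented service call) fails exactly as in the error above.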
The problem is that sometimes it works and sometimes it doesn't.
Hi there, and first of all thanks a lot for this hack. I have installed it and nearly everything works... lights, TV, etc... I have some temperature and humidity sensors around my home. But when I ask ChatGPT to tell me the room temperature it returns this error:
Something went wrong: Service sensor.get_temperature not found
Can anyone help me, please? Thanks in advance, Denis