dresden-elektronik / deconz-rest-plugin

deCONZ REST-API plugin to control ZigBee devices
BSD 3-Clause "New" or "Revised" License

LIDL SilverCrest Smart Radiator Thermostat #5349

Open · synatis opened this issue 2 years ago

synatis commented 2 years ago

Device: LIDL SilverCrest Smart Radiator Thermostat

Screenshots (Endpoint and Node Info, Basic, Groups, Scenes, Thermostat): images omitted

Smanar commented 2 years ago

Do you have some logs, to check the command order?

If we first get 0x02/0x10 = 0, the deCONZ mode needs to be set to "off":

                        if (productId == "Tuya_THD SilverCrest Smart Radiator Thermostat" && (temp == 0 || temp == 6000))
                        {
                            QString mode = QLatin1String("auto");
                            if (temp == 0) { QString mode = QLatin1String("off"); }
                            if (temp == 6000) { QString mode = QLatin1String("heat"); }

So if afterwards we get 0x04/0x02 = 0, it needs to be skipped by:


                            // Skip unless the device is in "auto" mode
                            item = sensorNode->item(RConfigMode);
                            if (item && item->toString() != QLatin1String("auto"))
                            {
                                return;
                            }

Or in the log you can watch the websocket notifications, to see whether config/mode is modified (even if it's just for 0.5 s), because I don't see what is wrong in the code...

BTW I'm wrong about "heat" (bad value for the temperature), but it needs to work for "off".

rikroe commented 2 years ago

You are right, the deCONZ mode is set to off for a short period of time (starting at 21:43:46:311). It is reset at 21:43:52:045, but before that it only receives Tuya mode 1 (for manual) or temperature 0 from the device. Hope this log helps a little.

21:43:45:310 poll node 00:0b:57:ff:fe:4f:d0:6a-01
21:43:45:310 Poll light node Esszimmer 1
21:43:45:359 read attributes of 0x000B57FFFE4FD06A cluster: 0x0006: [ 21:43:45:360 0x0000 21:43:45:360 ]
21:43:45:361 add task 451758 type 19 to 0x000B57FFFE4FD06A cluster 0x0006 req.id 236
21:43:45:361 Poll APS request 236 to 0x000B57FFFE4FD06A cluster: 0x0006
21:43:45:460 Poll APS confirm 236 status: 0x00
21:43:45:461 Erase task req-id: 236, type: 19 zcl seqno: 58 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:45:492 Node data 0x000b57fffe4fd06a profileId: 0x0104, clusterId: 0x0006
21:43:45:493 0x000B57FFFE4FD06A: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:46:262 don't create binding for attribute reporting of sensor Thermostat 7
21:43:46:305 Send Tuya request 0x0C4314FFFE73C758 : Dp_type: 0x04, Dp_identifier 0x02, data: 01
21:43:46:306 add task 451764 type 37 to 0x0C4314FFFE73C758 cluster 0xEF00 req.id 243
21:43:46:306 Send Tuya request 0x0C4314FFFE73C758 : Dp_type: 0x01, Dp_identifier 0x10, data: 00000000
21:43:46:307 add task 451765 type 37 to 0x0C4314FFFE73C758 cluster 0xEF00 req.id 244
21:43:46:307 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:308 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:308 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:309 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:310 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:310 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:311 Websocket 172.17.0.1:37852 send message: {"config":{"battery":100,"comfort_heatsetpoint":2200,"eco_heatsetpoint":1700,"heatsetpoint":2150,"locked":false,"mode":"off","offset":0,"on":true,"preset":"auto","reachable":true,"schedule":{},"schedule_on":false},"e":"changed","id":"7","r":"sensors","t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 308)
21:43:46:312 Websocket 192.168.91.5:57049 send message: {"config":{"battery":100,"comfort_heatsetpoint":2200,"eco_heatsetpoint":1700,"heatsetpoint":2150,"locked":false,"mode":"off","offset":0,"on":true,"preset":"auto","reachable":true,"schedule":{},"schedule_on":false},"e":"changed","id":"7","r":"sensors","t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 308)
21:43:46:410 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:411 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:501 void deCONZ::zmNode::setFetched(deCONZ::RequestId, bool) fetched item: 8, node: 0x3808
21:43:46:509 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:510 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:514 poll node 7c:b0:3e:aa:00:ad:5e:50-03
21:43:46:515 Poll light node Osram Steckdose
21:43:46:563 Poll APS request to 0x7CB03EAA00AD5E50 cluster: 0x0006 dropped, values are fresh enough
21:43:46:609 delay sending request 244 dt 0 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:610 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:710 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:711 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:809 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:810 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:46:910 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:46:911 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:009 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:010 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:110 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:110 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:210 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:210 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:310 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:311 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:410 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:410 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:510 poll node 00:0b:57:ff:fe:4c:01:f4-01
21:43:47:511 Poll light node Wohnzimmer 1
21:43:47:511 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:511 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:559 read attributes of 0x000B57FFFE4C01F4 cluster: 0x0006: [ 21:43:47:560 0x0000 21:43:47:560 ]
21:43:47:561 add task 451771 type 19 to 0x000B57FFFE4C01F4 cluster 0x0006 req.id 252
21:43:47:561 Poll APS request 252 to 0x000B57FFFE4C01F4 cluster: 0x0006
21:43:47:610 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:610 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:654 Poll APS confirm 252 status: 0x00
21:43:47:654 Erase task req-id: 252, type: 19 zcl seqno: 61 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:47:655 delay sending request 244 dt 1 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:656 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:686 Node data 0x000b57fffe4c01f4 profileId: 0x0104, clusterId: 0x0006
21:43:47:687 0x000B57FFFE4C01F4: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:47:711 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:711 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:809 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:810 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:47:909 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:47:910 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:009 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:010 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:109 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:110 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:209 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:210 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:310 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:310 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:409 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:410 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:509 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:510 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:610 delay sending request 244 dt 2 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:610 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:659 poll node 00:15:8d:00:03:1d:7d:7c-01
21:43:48:659 Poll light node Arbeitszimmer 2
21:43:48:710 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:711 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:711 read attributes of 0x00158D00031D7D7C cluster: 0x0006: [ 21:43:48:712 0x0000 21:43:48:712 ]
21:43:48:713 add task 451777 type 19 to 0x00158D00031D7D7C cluster 0x0006 req.id 3
21:43:48:713 Poll APS request 3 to 0x00158D00031D7D7C cluster: 0x0006
21:43:48:809 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:810 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:855 Poll APS confirm 3 status: 0x00
21:43:48:856 Erase task req-id: 3, type: 19 zcl seqno: 62 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:48:856 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:857 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:48:903 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0006
21:43:48:903 0x00158D00031D7D7C: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:48:910 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:48:910 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:009 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:010 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:048 read attributes of 0x00158D00031D7D7C cluster: 0x0008: [ 21:43:49:049 0x0000 21:43:49:049 ]
21:43:49:049 add task 451779 type 19 to 0x00158D00031D7D7C cluster 0x0008 req.id 7
21:43:49:050 Poll APS request 7 to 0x00158D00031D7D7C cluster: 0x0008
21:43:49:110 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:110 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:159 Poll APS confirm 7 status: 0x00
21:43:49:160 Erase task req-id: 7, type: 19 zcl seqno: 63 send time 0, profileId: 0x0104, clusterId: 0x0008
21:43:49:161 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:162 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:162 read attributes of 0x00158D00031D7D7C cluster: 0x0300: [ 21:43:49:163 0x0007 21:43:49:164 0x0008 21:43:49:164 0x4001 21:43:49:164 ]
21:43:49:165 add task 451780 type 19 to 0x00158D00031D7D7C cluster 0x0300 req.id 8
21:43:49:165 Poll APS request 8 to 0x00158D00031D7D7C cluster: 0x0300
21:43:49:191 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0008
21:43:49:191 0x00158D00031D7D7C: update ZCL value 0x01/0x0008/0x0000 after 0 s
21:43:49:209 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:210 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:271 Poll APS confirm 8 status: 0x00
21:43:49:272 Erase task req-id: 8, type: 19 zcl seqno: 64 send time 0, profileId: 0x0104, clusterId: 0x0300
21:43:49:272 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:273 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:303 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0300
21:43:49:304 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x0007 after 0 s
21:43:49:304 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x0008 after 0 s
21:43:49:305 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x4001 after 0 s
21:43:49:309 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:310 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:410 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:410 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:509 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:510 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:610 delay sending request 244 dt 3 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:611 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:710 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:711 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:810 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:811 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:49:854 poll node 00:15:8d:00:03:1d:7b:d1-01
21:43:49:855 Poll light node Arbeitszimmer 1
21:43:49:905 Poll APS request to 0x00158D00031D7BD1 cluster: 0x0006 dropped, values are fresh enough
21:43:49:910 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:49:911 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:009 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:011 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:109 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:110 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:210 Poll APS request to 0x00158D00031D7BD1 cluster: 0x0008 dropped, values are fresh enough
21:43:50:210 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:211 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:309 read attributes of 0x00158D00031D7BD1 cluster: 0x0300: [ 21:43:50:310 0x0007 21:43:50:311 0x0008 21:43:50:311 0x4001 21:43:50:312 ]
21:43:50:312 add task 451786 type 19 to 0x00158D00031D7BD1 cluster 0x0300 req.id 20
21:43:50:312 Poll APS request 20 to 0x00158D00031D7BD1 cluster: 0x0300
21:43:50:313 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:313 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:359 Poll APS confirm 20 status: 0x00
21:43:50:359 Erase task req-id: 20, type: 19 zcl seqno: 65 send time 0, profileId: 0x0104, clusterId: 0x0300
21:43:50:360 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:360 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:391 Node data 0x00158d00031d7bd1 profileId: 0x0104, clusterId: 0x0300
21:43:50:392 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x0007 after 0 s
21:43:50:392 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x0008 after 0 s
21:43:50:393 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x4001 after 0 s
21:43:50:409 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:410 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:510 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:510 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:609 delay sending request 244 dt 4 ms to 0x0C4314FFFE73C758, ep: 0x01 cluster: 0xEF00 onAir: 1
21:43:50:610 delay sending request 244 ep: 0x01 cluster 0xEF00 to 0x0c4314fffe73c758 onAir 1
21:43:50:991 poll node 00:0b:57:ff:fe:92:30:6c-01
21:43:50:991 Poll light node Wohnzimmer 2
21:43:51:044 read attributes of 0x000B57FFFE92306C cluster: 0x0006: [ 21:43:51:044 0x0000 21:43:51:044 ]
21:43:51:045 add task 451790 type 19 to 0x000B57FFFE92306C cluster 0x0006 req.id 26
21:43:51:045 Poll APS request 26 to 0x000B57FFFE92306C cluster: 0x0006
21:43:51:159 Poll APS confirm 26 status: 0x00
21:43:51:160 Erase task req-id: 26, type: 19 zcl seqno: 66 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:51:224 Node data 0x000b57fffe92306c profileId: 0x0104, clusterId: 0x0006
21:43:51:224 0x000B57FFFE92306C: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:51:305 don't create binding for attribute reporting of sensor Thermostat 7
21:43:51:336 Erase task req-id: 243, type: 37 zcl seqno: 59 send time 5, profileId: 0x0104, clusterId: 0xEF00
21:43:51:560 APS-DATA.indication from child 0xEFD2
21:43:51:562 Tuya debug Request : Address 0x0C4314FFFE73C758, Endpoint 0x01, Command 0x02, Payload 00020204000101
21:43:51:562 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 00020204000101
21:43:51:563 Tuya debug 5 : Status: 0 Transid: 2 Dp: 1026 (0x04,0x02) Fn: 0 Data 1
21:43:51:625 don't create binding for attribute reporting of sensor Thermostat 7
21:43:51:640 Erase task req-id: 244, type: 37 zcl seqno: 60 send time 0, profileId: 0x0104, clusterId: 0xEF00
21:43:51:721 don't create binding for attribute reporting of sensor Thermostat 7
21:43:51:784 APS-DATA.indication from child 0xEFD2
21:43:51:786 Tuya debug Request : Address 0x0C4314FFFE73C758, Endpoint 0x01, Command 0x02, Payload 00020204000101
21:43:51:786 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 00020204000101
21:43:51:787 Tuya debug 5 : Status: 0 Transid: 2 Dp: 1026 (0x04,0x02) Fn: 0 Data 1
21:43:51:849 don't create binding for attribute reporting of sensor Thermostat 7
21:43:51:962 don't create binding for attribute reporting of sensor Thermostat 7
21:43:52:040 APS-DATA.indication from child 0xEFD2
21:43:52:041 Tuya debug Request : Address 0x0C4314FFFE73C758, Endpoint 0x01, Command 0x02, Payload 00101002000400000000
21:43:52:042 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 00101002000400000000
21:43:52:042 Tuya debug 5 : Status: 0 Transid: 16 Dp: 528 (0x02,0x10) Fn: 0 Data 0
21:43:52:045 Websocket 172.17.0.1:37852 send message: {"config":{"battery":100,"comfort_heatsetpoint":2200,"eco_heatsetpoint":1700,"heatsetpoint":2150,"locked":false,"mode":"auto","offset":0,"on":true,"preset":"auto","reachable":true,"schedule":{},"schedule_on":false},"e":"changed","id":"7","r":"sensors","t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 309)
21:43:52:045 Websocket 192.168.91.5:57049 send message: {"config":{"battery":100,"comfort_heatsetpoint":2200,"eco_heatsetpoint":1700,"heatsetpoint":2150,"locked":false,"mode":"auto","offset":0,"on":true,"preset":"auto","reachable":true,"schedule":{},"schedule_on":false},"e":"changed","id":"7","r":"sensors","t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 309)
21:43:52:046 Websocket 172.17.0.1:37852 send message: {"e":"changed","id":"7","r":"sensors","state":{"lastupdated":"2021-11-08T21:43:52.043","on":false,"temperature":2990},"t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 175)
21:43:52:047 Websocket 192.168.91.5:57049 send message: {"e":"changed","id":"7","r":"sensors","state":{"lastupdated":"2021-11-08T21:43:52.043","on":false,"temperature":2990},"t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 175)
21:43:52:106 don't create binding for attribute reporting of sensor Thermostat 7
21:43:52:168 APS-DATA.indication from child 0xEFD2
21:43:52:170 Tuya debug Request : Address 0x0C4314FFFE73C758, Endpoint 0x01, Command 0x02, Payload 00101002000400000000
21:43:52:171 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 00101002000400000000
21:43:52:171 Tuya debug 5 : Status: 0 Transid: 16 Dp: 528 (0x02,0x10) Fn: 0 Data 0
21:43:52:173 Websocket 172.17.0.1:37852 send message: {"e":"changed","id":"7","r":"sensors","state":{"lastupdated":"2021-11-08T21:43:52.172","on":false,"temperature":2990},"t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 175)
21:43:52:174 Websocket 192.168.91.5:57049 send message: {"e":"changed","id":"7","r":"sensors","state":{"lastupdated":"2021-11-08T21:43:52.172","on":false,"temperature":2990},"t":"event","uniqueid":"0c:43:14:ff:fe:73:c7:58-01-0201"} (ret = 175)
21:43:52:209 Daylight now: nightStart, status: 230, daylight: 0, dark: 1
21:43:52:210 poll node 00:0b:57:ff:fe:8d:3a:2e-01
21:43:52:212 Poll light node Esszimmer 2
21:43:52:218 don't create binding for attribute reporting of sensor Thermostat 7
21:43:52:259 read attributes of 0x000B57FFFE8D3A2E cluster: 0x0006: [ 21:43:52:260 0x0000 21:43:52:260 ]
21:43:52:260 add task 451796 type 19 to 0x000B57FFFE8D3A2E cluster 0x0006 req.id 41
21:43:52:260 Poll APS request 41 to 0x000B57FFFE8D3A2E cluster: 0x0006
21:43:52:330 don't create binding for attribute reporting of sensor Thermostat 7
21:43:52:360 Poll APS confirm 41 status: 0x00
21:43:52:361 Erase task req-id: 41, type: 19 zcl seqno: 67 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:52:424 Node data 0x000b57fffe8d3a2e profileId: 0x0104, clusterId: 0x0006
21:43:52:425 0x000B57FFFE8D3A2E: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:52:427 don't create binding for attribute reporting of sensor Thermostat 7
21:43:52:538 don't create binding for attribute reporting of sensor Thermostat 7
21:43:53:363 poll node 00:0b:57:ff:fe:4f:d0:6a-01
21:43:53:363 Poll light node Esszimmer 1
21:43:53:410 read attributes of 0x000B57FFFE4FD06A cluster: 0x0006: [ 21:43:53:411 0x0000 21:43:53:411 ]
21:43:53:412 add task 451801 type 19 to 0x000B57FFFE4FD06A cluster 0x0006 req.id 49
21:43:53:412 Poll APS request 49 to 0x000B57FFFE4FD06A cluster: 0x0006
21:43:53:545 Poll APS confirm 49 status: 0x00
21:43:53:546 Erase task req-id: 49, type: 19 zcl seqno: 68 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:53:593 Node data 0x000b57fffe4fd06a profileId: 0x0104, clusterId: 0x0006
21:43:53:594 0x000B57FFFE4FD06A: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:54:549 poll node 7c:b0:3e:aa:00:ad:5e:50-03
21:43:54:550 Poll light node Osram Steckdose
21:43:54:602 Poll APS request to 0x7CB03EAA00AD5E50 cluster: 0x0006 dropped, values are fresh enough
21:43:55:534 poll node 00:0b:57:ff:fe:4c:01:f4-01
21:43:55:535 Poll light node Wohnzimmer 1
21:43:55:586 read attributes of 0x000B57FFFE4C01F4 cluster: 0x0006: [ 21:43:55:587 0x0000 21:43:55:587 ]
21:43:55:588 add task 451811 type 19 to 0x000B57FFFE4C01F4 cluster 0x0006 req.id 61
21:43:55:588 Poll APS request 61 to 0x000B57FFFE4C01F4 cluster: 0x0006
21:43:55:611 void deCONZ::zmNode::setFetched(deCONZ::RequestId, bool) fetched item: 8, node: 0x4164
21:43:55:675 Poll APS confirm 61 status: 0xE9
21:43:55:675     drop item attr/modelid
21:43:55:676     drop item attr/swversion
21:43:55:676     drop item state/bri
21:43:55:677     drop item state/colormode
21:43:55:677 0x000B57FFFE4C01F4 error APSDE-DATA.confirm: 0xE9 on task
21:43:55:678 Erase task req-id: 61, type: 19 zcl seqno: 69 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:55:707 Node data 0x000b57fffe4c01f4 profileId: 0x0104, clusterId: 0x0006
21:43:55:708 0x000B57FFFE4C01F4: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:56:259 poll node 00:15:8d:00:03:1d:7d:7c-01
21:43:56:259 Poll light node Arbeitszimmer 2
21:43:56:309 read attributes of 0x00158D00031D7D7C cluster: 0x0006: [ 21:43:56:310 0x0000 21:43:56:310 ]
21:43:56:311 add task 451815 type 19 to 0x00158D00031D7D7C cluster 0x0006 req.id 67
21:43:56:311 Poll APS request 67 to 0x00158D00031D7D7C cluster: 0x0006
21:43:56:460 Poll APS confirm 67 status: 0x00
21:43:56:461 Erase task req-id: 67, type: 19 zcl seqno: 70 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:56:492 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0006
21:43:56:492 0x00158D00031D7D7C: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:56:673 read attributes of 0x00158D00031D7D7C cluster: 0x0008: [ 21:43:56:674 0x0000 21:43:56:674 ]
21:43:56:675 add task 451817 type 19 to 0x00158D00031D7D7C cluster 0x0008 req.id 71
21:43:56:675 Poll APS request 71 to 0x00158D00031D7D7C cluster: 0x0008
21:43:56:748 Poll APS confirm 71 status: 0x00
21:43:56:748 Erase task req-id: 71, type: 19 zcl seqno: 71 send time 0, profileId: 0x0104, clusterId: 0x0008
21:43:56:750 read attributes of 0x00158D00031D7D7C cluster: 0x0300: [ 21:43:56:750 0x0007 21:43:56:751 0x0008 21:43:56:751 0x4001 21:43:56:751 ]
21:43:56:752 add task 451819 type 19 to 0x00158D00031D7D7C cluster 0x0300 req.id 73
21:43:56:752 Poll APS request 73 to 0x00158D00031D7D7C cluster: 0x0300
21:43:56:796 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0008
21:43:56:796 0x00158D00031D7D7C: update ZCL value 0x01/0x0008/0x0000 after 0 s
21:43:56:860 Poll APS confirm 73 status: 0x00
21:43:56:860 Erase task req-id: 73, type: 19 zcl seqno: 72 send time 0, profileId: 0x0104, clusterId: 0x0300
21:43:56:908 Node data 0x00158d00031d7d7c profileId: 0x0104, clusterId: 0x0300
21:43:56:909 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x0007 after 0 s
21:43:56:909 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x0008 after 0 s
21:43:56:910 0x00158D00031D7D7C: update ZCL value 0x01/0x0300/0x4001 after 0 s
21:43:57:492 poll node 00:15:8d:00:03:1d:7b:d1-01
21:43:57:493 Poll light node Arbeitszimmer 1
21:43:57:545 Poll APS request to 0x00158D00031D7BD1 cluster: 0x0006 dropped, values are fresh enough
21:43:57:582 don't create binding for attribute reporting of sensor Thermostat 7
21:43:57:831 Poll APS request to 0x00158D00031D7BD1 cluster: 0x0008 dropped, values are fresh enough
21:43:57:927 read attributes of 0x00158D00031D7BD1 cluster: 0x0300: [ 21:43:57:927 0x0007 21:43:57:928 0x0008 21:43:57:928 0x4001 21:43:57:928 ]
21:43:57:929 add task 451824 type 19 to 0x00158D00031D7BD1 cluster 0x0300 req.id 82
21:43:57:929 Poll APS request 82 to 0x00158D00031D7BD1 cluster: 0x0300
21:43:58:061 Poll APS confirm 82 status: 0x00
21:43:58:061 Erase task req-id: 82, type: 19 zcl seqno: 73 send time 0, profileId: 0x0104, clusterId: 0x0300
21:43:58:093 Node data 0x00158d00031d7bd1 profileId: 0x0104, clusterId: 0x0300
21:43:58:093 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x0007 after 0 s
21:43:58:094 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x0008 after 0 s
21:43:58:095 0x00158D00031D7BD1: update ZCL value 0x01/0x0300/0x4001 after 0 s
21:43:58:693 poll node 00:0b:57:ff:fe:92:30:6c-01
21:43:58:693 Poll light node Wohnzimmer 2
21:43:58:745 read attributes of 0x000B57FFFE92306C cluster: 0x0006: [ 21:43:58:745 0x0000 21:43:58:746 ]
21:43:58:746 add task 451829 type 19 to 0x000B57FFFE92306C cluster 0x0006 req.id 89
21:43:58:746 Poll APS request 89 to 0x000B57FFFE92306C cluster: 0x0006
21:43:58:861 Poll APS confirm 89 status: 0x00
21:43:58:862 Erase task req-id: 89, type: 19 zcl seqno: 74 send time 0, profileId: 0x0104, clusterId: 0x0006
21:43:58:989 Node data 0x000b57fffe92306c profileId: 0x0104, clusterId: 0x0006
21:43:58:990 0x000B57FFFE92306C: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:43:59:864 poll node 00:0b:57:ff:fe:8d:3a:2e-01
21:43:59:865 Poll light node Esszimmer 2
21:43:59:913 read attributes of 0x000B57FFFE8D3A2E cluster: 0x0006: [ 21:43:59:913 0x0000 21:43:59:914 ]
21:43:59:914 add task 451834 type 19 to 0x000B57FFFE8D3A2E cluster 0x0006 req.id 97
21:43:59:915 Poll APS request 97 to 0x000B57FFFE8D3A2E cluster: 0x0006
21:44:00:046 Poll APS confirm 97 status: 0x00
21:44:00:047 Erase task req-id: 97, type: 19 zcl seqno: 75 send time 0, profileId: 0x0104, clusterId: 0x0006
21:44:00:126 Node data 0x000b57fffe8d3a2e profileId: 0x0104, clusterId: 0x0006
21:44:00:127 0x000B57FFFE8D3A2E: update ZCL value 0x01/0x0006/0x0000 after 0 s
21:44:01:048 poll node 00:0b:57:ff:fe:4f:d0:6a-01
21:44:01:049 Poll light node Esszimmer 1
21:44:01:100 read attributes of 0x000B57FFFE4FD06A cluster: 0x0006: [ 21:44:01:101 0x0000 21:44:01:101 ]
21:44:01:102 add task 451840 type 19 to 0x000B57FFFE4FD06A cluster 0x0006 req.id 104
21:44:01:102 Poll APS request 104 to 0x000B57FFFE4FD06A cluster: 0x0006
21:44:01:151 Poll APS confirm 104 status: 0x00
21:44:01:152 Erase task req-id: 104, type: 19 zcl seqno: 76 send time 0, profileId: 0x0104, clusterId: 0x0006
21:44:01:199 Node data 0x000b57fffe4fd06a profileId: 0x0104, clusterId: 0x0006
21:44:01:200 0x000B57FFFE4FD06A: update ZCL value 0x01/0x0006/0x0000 after 0 s
Smanar commented 2 years ago

I really don't understand.

21:43:46:311 Websocket 172.17.0.1:37852 send message: {"mode":"off","preset":"auto"}
21:43:51:787 Tuya debug 5 : Status: 0 Transid: 2 Dp: 1026 (0x04,0x02) Fn: 0 Data 1
21:43:52:042 Tuya debug 5 : Status: 0 Transid: 16 Dp: 528 (0x02,0x10) Fn: 0 Data 0
21:43:52:045 Websocket 172.17.0.1:37852 send message: {"mode":"auto","preset":"auto"}

On 0x02/0x10 the value can't move because temp = 0; on 0x04/0x02 the value can't move because mode = "off".

And there is nothing between both states.

21:43:46:306 Send Tuya request 0x0C4314FFFE73C758 : Dp_type: 0x01, Dp_identifier 0x10, data: 00000000

I have corrected that; the type was not right, we need 0x02/0x10. I have set the heat temperature to 30°C and added some more debug lines.

I really hope it will be solved; I'm getting bored and would prefer to work on the schedule feature.

rikroe commented 2 years ago

I can totally understand you being bored. Unfortunately it didn't help. Please find the logs here:

If you prefer to work on the schedule first: I have set a schedule on the device and received the following in the Tuya debug: https://gist.github.com/rikroe/d187e44e39e29db2e32f947414d641ca#file-schedule

Smanar commented 2 years ago

Haaaaaa ^^ I think I have found the problem

                            QString mode = QLatin1String("auto");
                            if (temp == 0) { QString mode = QLatin1String("off"); }
                            if (temp == 6000) { QString mode = QLatin1String("heat"); }

I was re-creating a new mode variable: each inner QString mode declares a fresh local variable that shadows the outer one, so the outer mode never changes.
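
For clarity, a sketch of the fix as just described: assign to the existing variable instead of declaring a new one.

                            QString mode = QLatin1String("auto");
                            if (temp == 0) { mode = QLatin1String("off"); }     // assignment, not a new declaration
                            if (temp == 6000) { mode = QLatin1String("heat"); }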

The code is online. After your validation I will update the whole branch to the latest official code and start retrieving the schedule into the schedule field.

rikroe commented 2 years ago

So I just quickly tested your code (and I think I found out how to filter only the relevant log entries: basically only lines with the hex device ID, "Tuya", or the unique ID; let me know if you need more):

    docker logs -f deconz | grep --line-buffered -E 'Tuya|0x0C4314FFFE73C758|0c:43:14:ff:fe:73:c7:58-01-0201'

OFF does work 👍 Auto still works 👍 Unfortunately, heat still jumps back to Auto :( Logs for the full switch (Heat > Auto > Off > Auto) can be found here: https://gist.github.com/rikroe/d187e44e39e29db2e32f947414d641ca#file-tuya_debug_2021-11-11_heat_auto_off_auto-log

Smanar commented 2 years ago

NP, it's corrected; I have also updated the code with the latest official.

Starting on the schedule now: I think you will get something during a schedule request.

20:34:34:284 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload (006d 6d00 00 12) 01 24-18 2b-33 23-44 2b-5c 22-60 2a-60 22-60 2a-60 22

0x12 is the length, 0x01 is the day, 0x24 is the temperature * 2, and 0x18 is the hour/minute, but encoded.

So for monday.

day = 01
Temperature = 36 / 2 = 18°C
Hour = 24 / 4 = 6h
Minute = 24 % 4 * 15 = 0 min

Seems to match your logs.

The last char is a temperature, but IDK what it is.
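
As a cross-check, here is a minimal standalone sketch (my illustration, not the plugin code) applying those rules to the first two byte pairs of the payload above, assuming, as the pairs suggest, that the temperature byte comes first. It prints 06:00 at 18.0° and 12:45 at 21.5°, which matches the schedule posted below (up to the truncated decimals discussed later).

    #include <cstdio>

    int main()
    {
        // First two entries from the payload: 24-18 and 2b-33
        // (temperature byte first, then the encoded time byte).
        const unsigned char entries[][2] = { { 0x24, 0x18 }, { 0x2b, 0x33 } };

        for (const auto &e : entries)
        {
            double celsius = e[0] / 2.0;      // 0x24 = 36 -> 18.0 degC
            int hour = e[1] / 4;              // 0x18 = 24 -> 6
            int minute = (e[1] % 4) * 15;     // remainder counts 15-minute steps
            std::printf("%02d:%02d -> %.1f degC\n", hour, minute, celsius);
        }
        return 0;
    }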

rikroe commented 2 years ago

Yes, modes heat, off and auto are working now! Cool! 👍

I do get a schedule back, some notes:

"schedule": {
    "W111": [
        {
            "heatsetpoint": 18,
            "localtime": "T06:00"
        },
        {
            "heatsetpoint": 21,
            "localtime": "T12:45"
        }
    ]
},
"schedule_on": false

Logs are here: https://gist.github.com/rikroe/a2e21d8f64d09bb0bc0707b333343a69

steff75 commented 2 years ago
  • it seems as if the decimal places in heatsetpoint are ignored. Maybe they should also be 1800 and 2150 instead?

Yes, in other examples I have seen only four-digit values so far

  • what does W111 stand for?

https://github.com/dresden-elektronik/deconz-rest-plugin/blob/master/tuya.h

// Monday = 64, Tuesday = 32, Wednesday = 16, Thursday = 8, Friday = 4, Saturday = 2, Sunday = 1
// If you want your schedule to run only on workdays, the value would be W124. (64+32+16+8+4 = 124)
// The API specifies 3 numbers, so a schedule that runs only on Monday would be W064.
//
// Workday = W124
// Not working day = W003
// Saturday = W002
// Sunday = W001
// All days = W127

This should be all days except Wednesday (127 - 16 = 111). The documentation for thermostat schedules is unfortunately still missing.
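
A quick sketch of that arithmetic, using the weights quoted above (illustration only):

    #include <cstdio>

    int main()
    {
        // Day weights from tuya.h.
        const int Mon = 64, Tue = 32, Wed = 16, Thu = 8, Fri = 4, Sat = 2, Sun = 1;

        const int workdays = Mon + Tue + Wed + Thu + Fri; // 124 -> "W124"
        const int allDays  = workdays + Sat + Sun;        // 127 -> "W127"
        const int noWed    = allDays - Wed;               // 111 -> "W111"

        std::printf("W%03d W%03d W%03d\n", workdays, allDays, noWed);
        return 0;
    }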

Thanks for your work on the thermostat!

Smanar commented 2 years ago

Lol, another piece of missing documentation? From memory it's a standardized encoding rule, but I can't find it again. Or maybe it was just from deCONZ and the first schedule code...

"schedule_on" is not used yet on this device, you need it ?

I have made some corrections and added a debug line, because you have some strange days, like W15...

rikroe commented 2 years ago

Making progress 👍 Now I am able to see all heatsetpoints in the schedule. Not sure if the last ones can be ignored (i.e. once localtime has reached T24:00, ignore the rest). There is still some issue with the temperatures; they look more off (or maybe just a missing conversion)?

"schedule": {
    "W111": [
        {
            "heatsetpoint": 8,
            "localtime": "T06:00"
        },
        {
            "heatsetpoint": 102,
            "localtime": "T12:45"
        },
        {
            "heatsetpoint": 214,
            "localtime": "T17:00"
        },
        {
            "heatsetpoint": 102,
            "localtime": "T23:00"
        },
        {
            "heatsetpoint": 164,
            "localtime": "T24:00"
        },
        {
            "heatsetpoint": 52,
            "localtime": "T24:00"
        },
        {
            "heatsetpoint": 164,
            "localtime": "T24:00"
        },
        {
            "heatsetpoint": 52,
            "localtime": "T24:00"
        }
    ]
},

Link to logs: https://gist.github.com/rikroe/b7f6ade4cf0780d25cb4c8452d830626#file-schedule-log-2021-11-12

Don't need schedule_on, was just wondering.

Smanar commented 2 years ago

Not sure if the last ones can be ignored (i.e. once localtime has reached T24:00, ignore the rest).

Ha yes, right, good idea. But how can we be sure it's not set at all (a user can use 24:00 in the schedule)? By checking against the default temperature?

There is still some issue with the temperatures

Yes, I have made a rollback; the temperature is not *100 in the schedule, so the correct display is "18".

About the schedule key (why W111): it seems the code updates the list itself, so if you have the same schedule for Saturday and Sunday, you will not get W001 and W002 but directly W003 (1 + 2). The code re-uses the previous schedule, so it's better to delete it first.

I have found some information on the schedule core: https://github.com/dresden-elektronik/deconz-rest-plugin/issues/2393#issuecomment-674513698

To delete it you can use DELETE /sensors/<ID>/config/schedule/W124

And as you are using the same schedule for all days, you need to have W124.
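
For example, a hypothetical curl invocation of that endpoint (host, API key and sensor ID are placeholders):

    curl -X DELETE http://<deconz-host>/api/<apikey>/sensors/<ID>/config/schedule/W124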

21:55:01:095 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 006d6d0000 12 01 24182b3323442b5c22602a6022602a6022
21:55:01:096 Tuya : Schedule debug 64
21:55:01:480 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 006e6e0000 12 02 24182b3323442b5c22602a6022602a6022
21:55:01:480 Tuya : Schedule debug 32
21:55:01:864 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 006f6f0000 12 03 24182b3323442b5c22602a6022602a6022
21:55:01:865 Tuya : Schedule debug 46
21:55:02:249 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 0070700000 12 04 24182b3323442b5c22602a6022602a6022
21:55:02:249 Tuya : Schedule debug 8
21:55:02:633 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 0071710000 12 05 24182b3323442b5c22602a6022602a6022
21:55:02:633 Tuya : Schedule debug 4
21:55:02:001 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 0072720000 12 06 24182b3323442b5c22602a6022602a6022
21:55:02:002 Tuya : Schedule debug 2
21:55:03:513 Tuya debug 4 : Address 0x0C4314FFFE73C758 Payload 0073730000 12 07 24182b3323442b5c22602a6022602a6022
21:55:03:514 Tuya : Schedule debug 1

Edit: BTW, why is Wednesday (16) missing?

Edit 2: OK, issue solved. I think you will get W124 without deleting it this time; you had W111 because Wednesday was missing (note the debug above shows 46 for day 03 instead of 16).

pillemats commented 2 years ago

Is it working now?

Smanar commented 2 years ago

I think the device works, yes; the schedule part is still missing.

pillemats commented 2 years ago

Ah ok, I don't know what that is. Is it possible to add the manual function in the next beta release?

Mimiix commented 2 years ago

We don't do half integrations here @pillemats 😉. I really would appreciate it if you'd just follow along and support where you can. Smanar does his best here 😅

pillemats commented 2 years ago

I know that! Many thanks for that! Tell me how I can help! I will do my best

rikroe commented 2 years ago

Sorry guys, didn't realize there was a new update. Will only be able to test next week, currently on vacation.

Smanar commented 2 years ago

Otherwise I can make the PR without the schedule part? But IDK how it will be done in the future with DDF...

rikroe commented 2 years ago

Yes, for me that would be fine. Get basic support in, and then with DDF coming up we can revisit this topic.

pillemats commented 2 years ago

Fine… thank you guys!

Do you know when the update with the basic support comes out?

ChristianH21220 commented 2 years ago

@Smanar: it looks like the PR is on hold, because it shows: "This branch has conflicts that must be resolved"

When you started the PR it was shown as OK, and now I see it has changed to conflict.

Smanar commented 2 years ago

Yep, it's because the official code was updated. I have just updated the branch too.

ChristianH21220 commented 2 years ago

Thx for the quick response. Now the PR looks fine again 👍

pillemats commented 2 years ago

How can I see when the device is supported?

Will the updates happen when deCONZ gets a new update, or will they be applied automatically in the background?

Sorry for the noob question.

Smanar commented 2 years ago

This issue will be closed automatically. Or you can check the PR state.

But ATM new device inclusions are blocked because of DDF, so few of them are validated.

popokatapepel commented 2 years ago

Oh no... What does DDF stand for? Is there anything we as a community can do? At the moment I am running @Smanar's branch in production and would like to switch back to the official repo :D

Smanar commented 2 years ago

DDF is a long story: https://dresden-elektronik.github.io/deconz-dev-doc/modules/ddf/ and here is the DDF editor description: https://github.com/dresden-elektronik/deconz-rest-plugin/wiki/DDF-cheat-sheet

And yes, it will be something FOR the community: a simple JSON file to edit with a plain text editor, one JSON per device. You can already use it for some devices. Unfortunately Tuya, with its special cluster, needs a lot more work, so it will be the last thing added, I think. So that can be a reason for the PR to be validated.

Otherwise don't worry, just ask and I can update the branch with the latest code from the official repo.

popokatapepel commented 2 years ago

ok, I will check the documentation and see if I can create such a file for the thermostat

Smanar commented 2 years ago

Nope, don't waste your time for the moment; the Tuya cluster is not yet supported in the DDF core.

slabyvladislav commented 2 years ago


Hi @Smanar, sorry for going off topic. I bought a Tuya Radiator Actuator TRV TS0601 / _TZE200_hue3yfsn thermostat that deCONZ can't identify; it only shows an unnamed device with manufacturer _TZE200_hue3yfsn. Is there a chance to get that device supported? Thanx

dodo124 commented 2 years ago

Hi there,

I just got the Silvercrest thermostat and I tried to follow the whole conversation here, but I'm not sure if I got it right.

From my experience the device looks and works pretty much the same as an eQ-3 Model N, which I've been using for several years now. Here is the link to the manual: https://www.eq-3.com/products/homematic/detail/radiator-thermostat-model-n.html

From my understanding of how the device works:

There are three modes: Auto, Manual, Holiday. In addition, two special ones: Sun and Moon (which I have not used in 6 years). Moreover, the boost mode: maximum power for 300 sec.

Modes:

Auto = a schedule you program for each day, or for weekdays, weekends, or the whole week.

Always from 0:00 up to 23:59, where you can define seven timestamps.

Starting at 0:00 (starting time) --> then temperature (at starting time, e.g. 17 °C) --> then next timestamp (e.g. 6:00) --> then next temperature (e.g. 22 °C) --> next timestamp --> next temperature --> until 23:59.

Manual = set a specific temperature for the whole time (as you do with the normal analog ones). Only in this mode can you set the thermostat to "off" and thereby shut the valve.

Holiday = I never used that, but it is intended to set a specific temperature for a specific time and switch to Auto after that.

Special: as said above, I did not use them, but you can set two different temperatures here to "quick access" them at any time.

Booster: press the big round button in the middle to get a boost for 300 sec with the valve fully open.

OK, that is the Model N; now to the model from Silvercrest:

The menu and all functions work the same except for the booster, which says "NO.SE"; probably the value to set the valve to is missing. HOWEVER, it is not responding at all when it is connected via Zigbee.

OK I hope I brought some light to it.

Greetings, Dom

Smanar commented 2 years ago

@slabyvladislav New device additions are stopped for the moment, until the DDF core is finished. It's hard even for devices already started, so for new ones...

@dodo124 yes, it's possible; the devices really look the same.

dodo124 commented 2 years ago

I will disassemble the Silvercrest to see what's under the hood. Hopefully it is easy to open... well, that went badly 👎 board destroyed 😕 anyway...

So I can't test anything right now. Sorry.

Here are the pictures. It seems it's a totally new design, integrating the Zigbee module on one side and the "old" features on the backside.

(Photos omitted: Screenshot_20211213-185400, Screenshot_20211213-185501, Screenshot_20211213-190553)

matra01 commented 2 years ago

Hi @Smanar, I had hoped your TRV integration would make it into the current stable, but unfortunately it was released without it. If it's not too much trouble, could you please rebase your branch on the current stable release? That would be awesome.

Smanar commented 2 years ago


Code just updated.

matra01 commented 2 years ago

Thank you, works wonderfully!

Mimiix commented 2 years ago

@Smanar @matra01 Is the PR ready and working? If so, which PR needs to be merged?

Smanar commented 2 years ago

If I'm right, all is working except the schedule?

matra01 commented 2 years ago

For nearly 3 weeks now, I have had 5 TRVs in action in my Home Assistant installation. I am still facing some issues, but I don't know if they are caused by the REST plugin, Home Assistant, or the TRV itself. My issues:

@Smanar Do you think it is worthwhile to analyze these issues, or is it wasted time and should this be delayed until DDF support is available for the Tuya cluster?

Smanar commented 2 years ago

should this be delayed until DDF support is available for the Tuya cluster?

This will take a long time; integrating the Tuya cluster in DDF will not be easy. So if Mimiix can help get the PR validated, I think making a PR can be useful.

But it's strange that all 5 TRVs don't react the same; isn't there something different in the device JSON?

matra01 commented 2 years ago

How can I check this? I clicked across the Node Info pages of the TRVs in the deCONZ program, but to me they all look the same. In contrast, in the Home Assistant integration they are presented slightly differently, since I do not have all functions available on 3 of the TRVs. (Screenshot omitted. To be exact, the 4th TRV (Kinderzimmer) can be brought to automatic mode, but nevertheless it has no presets to choose from; so it seems that there are 3 characteristics.)

If you can describe what kind of test I can perform or which log I can produce, I will naturally provide it to you. Maybe it is possible to control the TRVs without Home Assistant or another third-party tool? Can I issue some commands to the TRVs from deCONZ or the CLI?
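
For reference, commands can also go straight to the REST API. A hypothetical request setting the target temperature to 21.00 °C (host, API key and sensor ID are placeholders; heatsetpoint is in hundredths of a degree, as in the websocket messages above):

    curl -X PUT -H 'Content-Type: application/json' -d '{"heatsetpoint": 2100}' http://<deconz-host>/api/<apikey>/sensors/<id>/config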

Smanar commented 2 years ago

Ha yes, I know this issue. When HA creates the device entry, it checks the values in the JSON and creates sensors according to which values are present or absent.

So if you have "preset": null, HA will not add the preset command; but if you wait long enough for the device to send a default value, HA will make a better integration.

To be sure, you can check the device JSON in Phoscon / Help / API Information.

I think this issue could be solved by using a default value other than null, but the logic is that "null" means the value exists but has not yet been sent by the device.
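
The same JSON can also be fetched directly from the REST API; a hypothetical example with placeholder host, API key and sensor ID:

    curl http://<deconz-host>/api/<apikey>/sensors/<id>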

matra01 commented 2 years ago

Ahh, you're right. If I pair (or re-pair) the TRV while it is in a "special" mode, these options are "unlocked" in the Home Assistant integration. Then my only remaining issue is that the TRVs sometimes get "frozen", but I will investigate this a little more.

Smanar commented 2 years ago

Yeah, there is something to improve, on the HA plugin side and on the deCONZ side. For the moment there are some devices that don't support all attributes but still expose them; we need to make something stricter when doing the DDF in the future for those TRVs.

langemar commented 2 years ago

@matra01 if your TRV freezes, does that mean forever? The TRVs from AVM, for example, have some kind of sleep mode for power conservation, and it might take up to 15 min for a (DECT) command to be transferred to them.

matra01 commented 2 years ago

Until now, the behavior of the TRV remains mysterious to me. Yesterday, one of the TRVs was no longer controllable through Home Assistant. I retried randomly over the next 5 hours, but the TRV did not react to any change request. Without me touching it or doing anything else, this morning it worked as intended and applied my temperature changes. I cannot say what brought it back to work. I don't know the process of delivering a temperature command to the Zigbee TRV, but I think it is a one-shot. So, if it is really in sleep mode, it seems probable that it will miss the command and there is no retry. @Smanar Can you confirm this, or should there be some caching mechanism in deCONZ or the ConBee stick?

matra01 commented 2 years ago

And there is another issue, until now worked around by a daily cron-scheduled reboot of my deCONZ server. Without this reboot, it seems that Home Assistant loses the connection to the TRVs at certain times: (screenshot from 2022-01-13 14:19 omitted)

After restarting deCONZ (systemctl restart deconz.service) they are visible and controllable again. This issue also does not appear if deCONZ is restarted once per day in advance.

Smanar commented 2 years ago

Do you have the JSON from when the device has lost the connection? To check whether it's from HA or deCONZ?

I don't know the process of delivering a temperature command to the Zigbee TRV, but I think it is a one-shot. So, if it is really in sleep mode, it seems probable that it will miss the command and there is no retry. @Smanar Can you confirm this, or should there be some caching mechanism in deCONZ or the ConBee stick?

No, there is more than one try, and it's visible in the log (with the info and error flags). And yes, it can be a possibility (even with more tries), but this device sleeps all the time, so you would then have the issue right after 15 min, and for all of them, for example.

matra01 commented 2 years ago

Here is the JSON of such an unreachable TRV:

tvr_unavailable.txt

In contrast, the JSON of an available TRV:

tvr_available.txt