nlrb / com.weather-sensors

Wireless weather sensor app for Athom Homey
https://apps.athom.com/app/com.weather-sensors
MIT License

Fine-tune oregon and cresta signal #42

Closed · jeroenvollenbrock closed this issue 5 years ago

Ogge66 commented 5 years ago

And how do I implement this change?

nlrb commented 5 years ago

Hi @jeroenvollenbrock Thanks for the changes. Can you explain a couple of things?

  1. Why is fw 1.5.13 needed (is that for agc?)
  2. How have you determined min/max length for Oregon v2?
  3. How do you determine the sensitivity (0.13/0.1)?
  4. Any reason for the difference between v2 & v2.2 agc (v2.2 should be the same as v2 except for freq/manchesterUnit)?
  5. Why has the sof for Oregon v3 been reduced (and no agc)?
  6. Why would you update the default value for inactiveTime? This is not related to signal reception and user configurable anyway.

Thanks!

jeroenvollenbrock commented 5 years ago

Hi @jeroenvollenbrock Thanks for the changes. Can you explain a couple of things?

  1. Why is fw 1.5.13 needed (is that for agc?)

I have updated the compatibility to 1.5.13 because that is what this branch has been tested with. I'm not sure whether this version is compatible with older Homey versions and did not test it, so in order to avoid breaking things for users who did not update their Homey, it seems safer to update the compatibility to reflect this. (See also my remarks at 3.)
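For reference, the change amounts to something like the following manifest field (a sketch, shown here as an object literal; the exact range string in the actual commit may differ):

```ts
// Sketch of the relevant app.json manifest field; the exact range
// string used in this branch is an assumption.
const manifest = {
  compatibility: ">=1.5.13", // minimum Homey firmware this branch was tested on
};
```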

  2. How have you determined min/max length for Oregon v2?

I have taken the original lengths, divided them by two to compensate for the combined words, and added an offset to the max length to allow the longer preamble/sync word bits to be included.
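As a minimal sketch of that arithmetic (the word counts below are hypothetical; only the transformation reflects the change):

```ts
// Hypothetical original word counts; only the transformation mirrors
// the adjustment described above.
const origMinLength = 32;
const origMaxLength = 96;
const syncHeadroom = 8; // assumed offset for extra preamble/sync word bits

const minLength = origMinLength / 2;                // combined words halve the count
const maxLength = origMaxLength / 2 + syncHeadroom; // headroom so the longer
                                                    // preamble/sync still fits
```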

  3. How do you determine the sensitivity (0.13/0.1)?

Sensitivity is a delicate thing: too much sensitivity may cause random data to be recognized as meaningful data, too little may cause meaningful data not to be recognized. If you simply set the sensitivity to the maximum, the chance increases that a start of frame is detected just before the actual start of frame comes in, which in turn makes the parser and clock detection get 'out of sync' with the actual data.

0.5 is a really high sensitivity, as it basically means 50% of the signal is allowed to not match the incoming data. During our tests this turned out to cause problems in the Oregon case, as the AGC+SOF actually contains patterns that could be words as well. This was, for instance, visible as inverted data coming in (e.g. all 1's became 0's and all 0's became 1's), because 0.5 also allowed the data to be accepted with half a clock cycle of offset.

1.5.13-rc.14 contains a few modifications to the signal parser that attempt to auto-correct for clock drift and jitter, which makes the mismatch detection much more resilient and allows for a lower maximum mismatch rate (= sensitivity). (After this change I actually only very rarely noticed cases where oregonv2 and oregonv2_2 didn't both trigger with the same data at the same time.) Heuristics showed that in the Oregon v2 case a sensitivity above 0.15 started to accept data as SOFs it really shouldn't have accepted, while it needed at least 0.1 to function properly. 0.13 gave the best overall results.
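To make the trade-off concrete, here is an illustrative sketch (not the actual Homey firmware) that treats sensitivity as the maximum allowed fraction of mismatching SOF samples:

```ts
// Illustrative only: "sensitivity" read as the maximum fraction of the
// expected SOF pattern that may mismatch before a candidate is rejected.
function sofAccepted(received: number[], expected: number[], sensitivity: number): boolean {
  let mismatches = 0;
  for (let i = 0; i < expected.length; i++) {
    if (received[i] !== expected[i]) mismatches++;
  }
  return mismatches / expected.length <= sensitivity;
}

// With sensitivity 0.5, up to half the samples may differ, so an inverted
// or half-clock-shifted pattern can still pass; 0.13 rejects those candidates.
```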

  4. Any reason for the difference between v2 & v2.2 agc (v2.2 should be the same as v2 except for freq/manchesterUnit)?

Good catch, I missed a line and amended my commit to correct this. The AGC sequence is actually only used when sending data, so it is not doing much in the current situation, but I included it to keep the signal complete and functional in case you ever do want to start sending data as well.
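Roughly, the intent is that the two definitions share everything except the timing/frequency fields. A hedged sketch (field names follow the terms used in this thread, not necessarily the exact Homey signal schema; values are placeholders):

```ts
// Hedged sketch; placeholder values, field names taken from this thread.
const oregonV2 = {
  agc: [/* preamble, only used when transmitting */],
  sof: [/* shared start-of-frame */],
  sensitivity: 0.13,
  manchesterUnit: 488, // placeholder timing value
};

// v2.2 should be identical to v2 except for freq/manchesterUnit:
const oregonV2_2 = {
  ...oregonV2,
  manchesterUnit: 976, // placeholder; the only intended difference
};
```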

  5. Why has the sof for Oregon v3 been reduced (and no agc)?

Same as 4.

  6. Why would you update the default value for inactiveTime? This is not related to signal reception and user configurable anyway.

A few of the sensors supported by the app have a reporting interval that is actually greater than the previous default (the Oregon sensors I tested with reported every 128 seconds, for instance). I therefore changed the default to 5 minutes instead of 1 minute, as this covers the reporting interval of all sensors and avoids ping-ponging between available and unavailable for sensors with a greater interval. A good default should be the proper setting for as many users as possible, and 5 minutes is a much better trade-off between false positives and being accurate/recent, so it serves more users right imho. Also, as this is an sdkv1 app, the settings page is not visible in Homey v2, and I think a default of 5 minutes greatly decreases the dependency on the settings page.
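The arithmetic behind the new default, as a small sketch (the 128 s interval is the value observed above; the rest is just the comparison spelled out):

```ts
// Sensor reporting every 128 s vs. the old 60 s inactiveTime default:
const reportInterval = 128; // seconds between reports (Oregon sensors tested)
const oldDefault = 60;      // old inactiveTime default (s)
const newDefault = 300;     // new inactiveTime default (s)

// With the old default, every gap between reports exceeds inactiveTime,
// so the sensor flaps between available and unavailable:
console.log(reportInterval > oldDefault); // true  -> ping-pong
console.log(reportInterval > newDefault); // false -> stays available
```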

nlrb commented 5 years ago

Thanks for the detailed feedback! I'll start playing with it when I am back home in a week's time.

nlrb commented 5 years ago

I've tried the old & new signal definitions with fw 1.5.13: