Closed Master92 closed 1 year ago
Closes: #161
The `settings_load` method initialized the global setting with a non-existing entry, resulting in always loading the default value of 0.
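For context, the failure mode can be pictured with a minimal sketch (all names here are illustrative, not the actual HDZero settings API): when the load path asks for a key that was never written, the lookup silently falls back to the default of 0, which is exactly the symptom fixed here.

```c
#include <stddef.h>
#include <string.h>

/* Illustrative settings store: keys map to stored integer values. */
typedef struct {
    const char *key;
    int value;
} setting_t;

setting_t store[] = {
    { "power.cell_count", 4 },
    { "power.calibration_offset", 3 }, /* tenths of a volt */
};

/* Look up a key; fall back to a default when the entry does not exist. */
int settings_get(const char *key, int fallback) {
    for (size_t i = 0; i < sizeof store / sizeof store[0]; i++) {
        if (strcmp(store[i].key, key) == 0)
            return store[i].value;
    }
    return fallback;
}
```

Querying a key that doesn't exist in the store, e.g. `settings_get("power.calibration", 0)`, always returns the fallback of 0, while the correctly-named entry returns the stored value.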
Here are some screenshots showing how the power page will look after this PR:
Batch 1 goggles
Batch 2 goggles
Batch 2 goggles with voltage per cell
Since my goggle's battery voltage reading was 0.3V too low, the low voltage warning would trigger much too early.
I added a configuration slider to the power page, storing an integer with the other power settings, and used that value in every call querying the battery voltage. The calibration range is currently ±2.5 volts, which I think should be enough.
Tested it in the emulator and my batch 2 goggles - both displays look correct and behave correctly.
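A rough sketch of how such an offset can be applied at a single choke point, so that every caller querying the battery voltage sees the calibrated value (function and variable names are hypothetical, not the actual PR code):

```c
#include <stdint.h>

/* Calibration offset in tenths of a volt, clamped to the ±2.5 V range. */
static int8_t g_voltage_offset_dV = 0; /* -25 .. +25 */

void set_voltage_offset_dV(int offset_dV) {
    if (offset_dV > 25) offset_dV = 25;
    if (offset_dV < -25) offset_dV = -25;
    g_voltage_offset_dV = (int8_t)offset_dV;
}

/* Raw ADC-derived reading in tenths of a volt (stubbed for the sketch). */
int raw_battery_voltage_dV(void) {
    return 117; /* pretend the hardware reports 11.7 V */
}

/* Single accessor: every voltage query goes through here, so the
 * calibration applies uniformly (display, low-voltage warning,
 * voltage-per-cell view). */
int battery_voltage_dV(void) {
    return raw_battery_voltage_dV() + g_voltage_offset_dV;
}
```

Routing all reads through one accessor is what keeps the display, the warning threshold, and the per-cell view consistent with each other.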
Have you investigated why your goggle voltage reading is too low? I wonder what happened.
I just took a look at different voltage readings while operating the goggles from a DC voltage source (ISDT Q8 charger). I set different voltages in the charger, checked them with a digital multimeter (while powering the goggles) and noted the voltage displayed by the goggles:
DC source | Multimeter | Goggles | Difference | Difference / Multimeter
---|---|---|---|---
10V | 9.9V | 9.7V | -0.2V | -0.0202
12.3V | 12.1V | 11.8V | -0.3V | -0.02479
15V | 15.1V | 14.7V | -0.4V | -0.02649
20V | 20.1V | 19.5V | -0.6V | -0.02985
While I first thought that the difference might come from a voltage drop in the power cable, the measurements speak against that. Although I didn't open my goggles and measure directly at the barrel plug, a voltage drop caused by the cable would be less significant when operating at a higher voltage (and therefore lower current), wouldn't it?
I saw that another user on Discord had the same issue (only slightly more significant) so I think it's not only my set that suffers from the offset. And as @robhaswell mentioned, there is an open issue #161 addressing this topic.
@Master92 I can confirm that the power lead drops a little voltage, and the value actually measured by the goggles is fine. At different power consumption levels, the voltage dropped across the lead differs, since it depends on the goggles' supply current. I power mine from 12V, and the menu shows 11.7V; while scanning, it drops to 11.6V or even 11.5V. I think this method of voltage calibration is only suitable for power supplies with a stabilized output voltage. If a 4S lithium battery powers the goggles directly, the calibration will not be accurate. Although a fully charged 4S battery is used for power supply, the actual output voltage will also drop considering the internal resistance of the battery and the operating current. I'd rather have the goggles alarm early than late, as a delayed alarm could over-discharge the battery.
Based on your measurements and the power consumption of the goggles stated in the manual, the resistance of your cable should be roughly 0.418Ω (8.4W / 11.7V = 0.718A -> 0.3V / 0.718A = 0.418Ω).
Doing the math the other way around, you would end up with 19.825V (20V - (8.4W / 20V) * 0.418Ω) when powering from a 20V DC source, and 9.65V when powering from 10V. Can you confirm that this is the case with your goggles? Because with mine, this behavior is not reflected.
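The arithmetic above can be double-checked in a few lines (assuming the 8.4W consumption figure from the manual and a purely resistive cable):

```c
/* Voltage expected at the goggles after the drop across a resistive cable:
 * I = P / V_source (approximation), V_goggles = V_source - I * R_cable. */
double predicted_voltage(double v_source, double power_w, double r_cable) {
    double current_a = power_w / v_source;
    return v_source - current_a * r_cable;
}
```

`predicted_voltage(20.0, 8.4, 0.418)` gives about 19.82V and `predicted_voltage(10.0, 8.4, 0.418)` about 9.65V, matching the numbers above; the measured readings of 19.5V and 9.7V don't fit that pattern.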
I am aware that a battery's voltage drops under load due to its internal resistance; however, if the voltage reading in the goggles were accurate, this drop would be reflected in the GUI. As you stated, your voltage drops to 11.5V when powering the HDZero RF. That pretty much matches my calculation above: the 200mV drop originates from the internal resistance of your battery, while the cable's resistance is still 0.418Ω.
> Although a fully charged 4S battery is used for power supply, the actual output voltage will also drop considering the internal resistance of the battery and the operating current.
I'm not saying anyone should calibrate their goggles to a full battery voltage when powering from a freshly charged pack. Of course, you have to measure the battery's voltage while it is powering the goggles.
Don't you think it's strange that, even though I'm powering from two different sources (battery and DC power source), the voltage reading difference is the same?
My Goggles read the voltage about 0.6V lower than measured at the battery through the balance leads. Now, maybe that drop is actually due to the power leads, and not a fault with the voltmeter. Does that matter though? The purpose of the voltage reading is for me to know the state of my battery. Therefore, being able to calibrate it to include the losses in the system is exactly what calibration is for.
In my case, the measured voltage discrepancy equates to about 0.2V per cell, which is a lot of runtime in the packs I am using. So I would like a calibrated, accurate reading of the pack voltage in my Goggles.