zjwhitehead opened this issue 2 years ago
I did some research into SOC estimation techniques, and it looks like the most common approach is coulomb counting: integrate the current over time, subtract that accumulated charge from the remaining charge (in amp-hours or amp-seconds), and divide by the total charge to get a percentage that is the current state of charge. Its accuracy relies on precise battery current measurement and an accurate initial state of charge reading. This method would solve the voltage-sag-under-load issue (since voltage sags under high current) and would also be less affected by temperature, since it measures current directly instead of relying on the temperature-sensitive voltage curve. It also tends to work better with Li-ion batteries than voltage alone does, because of their relatively flat discharge voltage curves.
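As a minimal sketch of the idea (plain C++; the names are hypothetical and not from the codebase):

```cpp
// Minimal coulomb-counting sketch (hypothetical names, not from eppg-controller).
// SOC(t) = SOC(0) - (1 / Q_total) * integral of I dt
struct CoulombCounter {
  float socPercent;       // current estimate, 0-100, seeded from the voltage curve
  float packCapacityAs;   // total charge in amp-seconds (Ah * 3600)

  // Call at a fixed rate with the latest current sample from ESC telemetry.
  // Positive current = discharging; a negative (charging) current raises SOC.
  void update(float currentAmps, float dtSeconds) {
    socPercent -= (currentAmps * dtSeconds) / packCapacityAs * 100.0f;
    if (socPercent < 0.0f) socPercent = 0.0f;
    if (socPercent > 100.0f) socPercent = 100.0f;
  }
};
```

The estimate is only as good as its seed, hence the questions below about getting a reliable initial SOC. A few questions: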
SOC estimation seems to usually be done inside the battery management system (BMS), which would have constant access to inputs like current, voltage, and temperature. In that case, coulomb counting would factor in current during both the charging and discharging phases and could estimate the SOC at any point in time, along with the battery's state of health. Just to confirm: this improvement focuses on estimating SOC during the discharge phase, using inputs primarily from the ESC telemetry data? Is it possible to interface with the BMS directly, or to estimate SOC within it?
What is the nominal voltage for the battery (used to calculate total charge)? Is it the same for both battery sizes?
During my research I found that voltage tends to sag during startup of something like an electric motor. My current plan is to estimate the initial state of charge from the voltage right after this sag, using the voltage vs. SOC curve. Do you think this voltage would be under roughly constant current? (I'm thinking that shortly after starting the motor there will be some downtime, after the voltage sag but before the user pushes the throttle to draw more current. During this downtime the current might be fairly consistent, and a voltage vs. SOC curve measured at this current could be used to estimate the initial SOC; see the sketch just below.)
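As a sketch of what I mean, something like this could wait for the current to settle before sampling the voltage for the initial estimate (all thresholds are made up and would need tuning):

```cpp
// Hypothetical sketch: sample voltage for the initial SOC estimate only once
// the current has stayed below a threshold for a short settling window.
const float SETTLE_CURRENT_A = 2.0f;   // "near idle" current threshold (made up)
const float SETTLE_TIME_S    = 1.5f;   // how long it must stay there (made up)

float settledSeconds   = 0.0f;
bool  initialSocLocked = false;

void sampleInitialSoc(float currentAmps, float packVolts, float dtSeconds) {
  if (initialSocLocked) return;
  // Accumulate time while the current stays low; reset on any spike.
  settledSeconds = (currentAmps < SETTLE_CURRENT_A) ? settledSeconds + dtSeconds : 0.0f;
  if (settledSeconds >= SETTLE_TIME_S) {
    // initialSoc = voltageToSoc(packVolts);  // existing curve lookup (name hypothetical)
    initialSocLocked = true;
  }
}
```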
Related to the previous question: was the existing voltage curve from load testing measured under a constant current? And if so, was that current representative of the motor during the cruise phase, no load, or something else?
I'm still not entirely sure how to best distinguish between the two battery sizes. You mentioned that a 2.2 kWh battery at the same current will drop in voltage faster than the 3.7 kWh, so my current idea is to track the voltage derivative and guess which size is in use based on the current, the voltage derivative, and the initial SOC estimate (rough sketch below). I think this would require a fair bit of testing and would still be a rough heuristic, but I don't know how else to do it unless there are some key differences between the two battery sizes that would immediately signal which one is in use.
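Very roughly, the heuristic I'm imagining might look like this (entirely illustrative; every number is a placeholder that would need bench testing):

```cpp
// Illustrative sketch of the dV/dt idea (placeholder numbers, needs testing).
// At a steady current I, SOC drops at rate I/Q, so the smaller (2.2 kWh) pack's
// voltage should fall noticeably faster than the 3.7 kWh pack's at the same I.
enum PackSize { PACK_UNKNOWN, PACK_2_2_KWH, PACK_3_7_KWH };

PackSize guessPackSize(float dVdtVoltsPerMin, float currentAmps) {
  if (currentAmps < 20.0f) return PACK_UNKNOWN;  // need a steady, real load first
  // Normalize the observed droop by current, then compare against a midpoint
  // between the droop rates expected for the two packs (value is made up).
  float droopPerAmp = -dVdtVoltsPerMin / currentAmps;  // positive when discharging
  const float MIDPOINT = 0.02f;                        // V/min/A, placeholder
  return (droopPerAmp > MIDPOINT) ? PACK_2_2_KWH : PACK_3_7_KWH;
}
```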
Alternatively, some systems do rely on the voltage method for SOC estimation, compensating with a correction term based on temperature and current (a standard IR-style correction; sketched below). I don't have any way to get the testing data myself, though, and the temperature correction term would usually be measured directly by the BMS as the battery's internal temperature rather than the ambient temperature. Right now I'm leaning towards coulomb counting, since measuring current directly sidesteps both the temperature dependence and the flat discharge voltage profile of Li-ion batteries.
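For reference, the textbook version of that current correction just estimates the open-circuit voltage before doing the curve lookup; a minimal sketch (the resistance value is a placeholder, not a measured figure):

```cpp
// Textbook sag compensation: approximate the open-circuit voltage from the
// terminal voltage and an estimated pack internal resistance, then feed that
// into the existing voltage-vs-SOC curve. The resistance is a placeholder.
float estimateOpenCircuitVolts(float terminalVolts, float currentAmps) {
  const float PACK_RESISTANCE_OHMS = 0.06f;  // would need to be measured per pack
  return terminalVolts + currentAmps * PACK_RESISTANCE_OHMS;
}
```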
Thanks for the help here @trebula13579
> SOC estimation seems to usually be done inside the battery management system (BMS), which would have constant access to inputs like current, voltage, and temperature.
Yes, this does happen in our newer BMS, but in the current version the BMS data is only accessible via a special Bluetooth app on the user's smartphone. In the future we will integrate with the BMS more directly, but for now we can assume we only have access to the ESC data.
> What is the nominal voltage for the battery (used to calculate total charge)? Is it the same for both battery sizes?
It's 3.6 V per cell. Here is the datasheet: https://www.molicel.com/wp-content/uploads/INR21700P42A-V4-80092.pdf You can see in our crude polyline estimation mapping that the pack goes from 60 V to 100 V: https://github.com/openppg/eppg-controller/blob/34aa9ae7d7334756e56b93a07da59bd7b08964b0/src/sp140/power.ino#L6
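For rough numbers: the 60 V to 100 V range is consistent with a 24S pack (24 × 2.5 V = 60 V empty, 24 × 4.2 V = 100.8 V full), which gives a nominal pack voltage of about 24 × 3.6 V = 86.4 V. Assuming the rated energy is quoted at nominal voltage, total charge works out to roughly 3.7 kWh / 86.4 V ≈ 42.8 Ah for the big pack and 2.2 kWh / 86.4 V ≈ 25.5 Ah for the small one (ballpark figures only).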
> was the existing voltage curve from load testing measured under a constant current?
No, and that is part of the problem. Under load the voltage sags, so we get inaccuracies when flying at different power draws. For reference, on a typical flight I fly straight and level at a cruise power of between 3.7 kW and 5 kW, depending on the conditions and the size/surface area of my wing/paraglider.
> I'm still not entirely sure how to best distinguish between the two battery sizes. You mentioned that a 2.2 kWh battery at the same current will drop in voltage faster than the 3.7 kWh, ...
Ideally the algorithm would be smart enough to figure out the battery size/health based on resting voltage vs. sag under load over a certain number of seconds or minutes. However, I anticipated this and have an option (currently not user-editable, but it can be) to help the code distinguish between battery sizes: https://github.com/openppg/eppg-controller/blob/34aa9ae7d7334756e56b93a07da59bd7b08964b0/inc/sp140/structs.h#L29
Just submitted some pull requests for review, let me know if you have any feedback or suggestions!
As of v5.8 we are still relying solely on battery voltage for state of charge (SOC) estimation. This results in inaccuracies, especially since the voltage-to-SOC relationship is not linear and batteries sag under load. We need to get to a more accurate estimator by factoring in at least:

- battery voltage
- battery current
- temperature
We can only get things like battery cycles/health from the BMS (Bluetooth interface), but all of the above are presently available to the controller.