stawel / cheali-charger

cheap lipo charger

Unstable voltage for Cell 6 #203

Open neosunrise opened 8 years ago

neosunrise commented 8 years ago

Guys,

I have flashed the cheali-charger firmware onto both my clone B6AC (ATmega32L MCU) and my Dynamite Passport Ultraforce 200W (ATmega32A MCU) chargers. It looks great and there are tons of customized features, which I really love.

I have now found a problem on the Dynamite that does not seem to occur on the B6AC. Here is the thing: when I was calibrating the voltages on the Dynamite, the first 5 cells seemed fine. After I entered the numbers read from my DMM, they sometimes dropped 0.001V when I hit ENTER, which I think is normal. The 6th cell, however, fluctuated a lot. For example, the voltage read on the DMM was 3.829V (it's a good, accurate DMM that I trust) and I set this on the charger, but the displayed number did not hold and fluctuated all the way from 3.820V to 3.837V, which is very unstable and not acceptable. I bought this charger used and the stock firmware was not backed up correctly, so I can't go back. I don't think the original firmware gave a very accurate charge either.

With this fluctuation, it was really hard to complete the balancing procedure when charging my 6S LiPo battery, as the voltage of the last cell was very unstable. This is not a problem with 3S batteries, since they do not use the last channel.

As a side note, in the voltage calibration menu, when I disconnected the balance port, all but the last cell read somewhere around 0.015V, while the 6th cell showed 0.375V and kept fluctuating on its own even though nothing was connected.

I suspect that the 6th channel's voltage sensing is unstable and that the resistors used to measure the voltage might be faulty. I have attached several pictures for your reference.

My B6AC has no problem at all: all channels are stable. I was wondering if I can take some of its resistors to repair my Dynamite.

The other thing I suspect is that the pin mapping on the ATmega32 differs between chargers, so my Vb6 actually reads something unrelated. But how do I check which MCU pin is connected to which balancer pin?

I have shot a video demonstrating the problem.

https://youtu.be/1Q9JgVUmUuU

Please help me out. Thank you!

dsc_1016 dsc_1017 dsc_1019

diggit commented 8 years ago

Which file did you flash into Dynamite? Those 2 chargers are probably not compatible...

neosunrise commented 8 years ago

Thanks for the reply! I should have mentioned this - sorry for that!

I flashed the iMAX B6 clone firmware onto the B6AC and the GT Power A6-10 200W firmware onto the Dynamite. Everything else works fine, though.

stawel commented 8 years ago

I suspect that your PCB is broken. Can you send us cell voltage measurements with the default calibration? Do:

  1. reset your calibration: "options"->"reset default"
  2. connect a battery
  3. go to voltage calibration: "options"->"calibrate"->"voltage"
  4. send us the displayed voltages in Vb1 to Vb6 and the measured cell voltages (do not calibrate the charger)
  5. disconnect battery while you are in voltage calibration menu.
  6. send also Vb1 to Vb6 when battery is not connected

You can also check: Vcc, GND, and the PCB traces to pins 1, 2, 3 of the LM2904 chip in your third picture (http://circuits.datasheetdir.com/37/FAIRCHILDSEMI-LM2904-pinout.jpg). Compare the voltage on pin 1 to the voltage on pin 7 (LM2904) when:

  1. 6xcell battery is connected to the charger (voltages should be close to each other)
  2. battery not connected (voltages should be near 0V)

pin 1 is the sixth cell voltage, pin 7 is the fifth cell voltage (guessed from your pics)

neosunrise commented 8 years ago

Thank you for your reply, Stawel. Here are the voltages I just checked.

  1. Using my multimeter, the battery read: Cell 1: 3.822V, Cell 2: 3.819V, Cell 3: 3.826V, Cell 4: 3.819V, Cell 5: 3.820V, Cell 6: 3.821V, Cells 1-6: 22.91V
  2. After resetting the EEPROM, the values shown in the voltage calibration menu were: Cell 1: 3.803V, Cell 2: 3.806V, Cell 3: 3.805V, Cell 4: 3.807V, Cell 5: 3.863V, Cell 6: 3.964V - 3.972V (fluctuating; when I pressed any button, it seemed to drop as low as 3.902V and then came back into this interval), Cells 1-6: 23.046V (fluctuating)
  3. In the calibration menu, after disconnecting the battery, the readings were: Cell 1: 0.014V, Cell 2: 0.022V, Cell 3: 0.017V, Cell 4: 0.000V (yes, all zeros), Cell 5: 0.039V, Cell 6: 0.349V - 0.381V (this was significantly higher)

neosunrise commented 8 years ago

I also measured the voltages on the LM2904 (against GND, i.e., the black lead of my multimeter touched ground on the PCB), and here are the readings:

  1. When the battery was disconnected: Pin 1: -0.0440V; Pin 2: started at 0.4998V, went up to 0.6238V, and was still rising; Pin 3: basically the same as Pin 2; Pin 7: -0.0369V
  2. When the battery was connected: Pin 1: 1.9324V; Pin 2: 7.691V; Pin 3: 7.520V; Pin 7: 1.9244V

I hope this information is helpful. I also suspect that the balancer pins are not connected to the same ATmega32 pins on my charger as on the GT Power A6-10.

Thank you!

neosunrise commented 8 years ago

I did a check and confirmed that the ATmega32 pins are connected to the correct balancer pins. Hmm, I really don't know what to do now :(

stawel commented 8 years ago

I did a check and have confirmed that the atmega32 pins are connected to the correct balancer pins.

That's great; you've eliminated one possible failure.

You should also check Vcc and GND (your PCB traces may be broken):

  1. what is the voltage between Vcc and GND compared to your power supply's voltage?
  2. while the power supply is turned off: a) what is the resistance between Vcc (LM2904) and the power supply "+" wire? b) what is the resistance between GND (LM2904) and any other GND, for example HCF4051 pin 8?

You may also check the PCB traces for pins 1, 2, 3; see: https://drive.google.com/file/d/0B1RXXTatsA1cb1R5NHM3MEtsakE/edit "balance6" and "balance5" in the yellow rectangle; the "7 2/2" op-amp is your LM2904

If Vcc and GND are OK, and the traces are also OK, then your LM2904 is broken; please replace it. (There is a chance that the HCF4051 is faulty, but this is less likely since all the other cell measurements are probably OK.)

stawel commented 8 years ago

please also check the resistances between the HCF4051 and ATmega32 pins (HCF4051 - ATmega32):

  1. pin 11 - pin 3
  2. pin 10 - pin 2
  3. pin 9 - pin 1

When you measure resistances, make sure the power supply and battery are disconnected, and wait a while so that all capacitors discharge (or discharge them yourself).

neosunrise commented 8 years ago

Thanks again for the reply. It looks like there are multiple LM2904s, but I assume you mean the one in the lower-right area, close to the balance connectors. There are also two 4051s and I am not really sure which one to measure, so I did both. Here are the answers to your questions:

1. what is the voltage between Vcc and GND compared to your power supply's voltage?

I assume you mean the Vcc and GND of the ATmega32; I have labeled the pins in the picture. The voltage between Vcc and GND was 4.949V, and my power supply's voltage was 15.695V.

2. while power supply is turned off

a) what is the resistance between Vcc (lm2904) and power supply "+" wire

The lower-right LM2904's Vcc and the power supply "+" wire had a resistance of 10.3 ohms (and interestingly enough, one of the other LM2904s' Vcc seemed to be very close to zero ohms to the "+" wire).

b) what is the resistance between GND (lm2904) and any other GND for example HCF4051 (pin8)

zero ohms

The PCB seems okay; at least I didn't see any visible broken trace.

please check also resistances between HCF4051 and atmega32 pins:

HCF4051 - atmega32

  1. pin 11 - pin 3
  2. pin 10 - pin 2
  3. pin 9 - pin 1

The upper 4051:

  1. pin 11 - pin 3: 0.18 ohm
  2. pin 10 - pin 2: 0.19 ohm
  3. pin 9 - pin 1: 0.20 ohm

The lower 4051:

  1. pin 11 - pin 3: 0.14 ohm
  2. pin 10 - pin 2: 0.13 ohm
  3. pin 9 - pin 1: 0.16 ohm

The power supply and the battery were both disconnected when measured.

btw, I tried to measure the resistance between balance6 (in the yellow rectangle; I assume this is the pin that connects to the battery's 7th wire) and pin 2 of the "7 2/2" op-amp. It was about 1.98 megaohms, which is a lot different from the circuit diagram (which shows only a 499K resistor, R89, in between).

I don't recall whether the stock firmware also suffers from this voltage fluctuation. If you can share the GT Power A6-10 stock firmware with me, I could try it on my charger.

Thank you so much! pcb

neosunrise commented 8 years ago

I managed to flash the Turnigy A6 200W stock firmware onto my charger and it seems to work. However, the calibration was very inconsistent: some cells were very accurate but some were way off. Anyway, the last cell's voltage did not fluctuate; it stayed rock solid and was actually very accurate compared to my multimeter.

And one thing I need to point out (maybe I should have mentioned it earlier) is that the firmware was built from the source code on the ev-peak branch using Cygwin. I did this because the hex files built from the "master" and "develop" branches got stuck at the first screen ("eeprom reset:15 yes") and did not respond to any input, while the code on the ev-peak branch gave me a running firmware, though with the fluctuation problem.

I just tested the hex files provided by cheali-charger and the fluctuation seems to be gone. Can you let me know what the differences are between the ev-peak and the master code?

stawel commented 8 years ago

I assume that you mean the Vcc and GND to Atmega32..

I meant the LM2904's Vcc; it should be the same as your power supply's voltage. But it looks like your PCB traces are OK.

I tried to measure the resistance between balance6 and 7 2/2 's pin2, it was about 1.98mega ohms, which is a lot different from the one in the circuit diagram...

there is a bug in the circuit diagram: the op-amp's "+" input is on pin 3 of the LM2904 (and LM358)

And one thing I need to point out ...the firmware was built from the source code under the ev-peak branch using cygwin

Yes, this information is quite important. I do read the source code to make sure I don't miss anything ;)

"master" and "develop" branch got stuck at the first screen "eeprom reset:15 yes"

you have to press the "start" button

I just tested the hex files provided by cheali-charger and the fluctuation seemed to have gone. Can you let me know what the differences are bettern the ev-peak and the master code?

We take 10 measurements in a row to get the 6th cell voltage. As far as I remember, in the master branch we ignore the first measurement to eliminate cross-talk from the previously measured input (the "down" button): https://github.com/stawel/cheali-charger/blob/master/src/hardware/atmega32/generic/200W/AnalogInputsADC.cpp#L125-L126 But this cross-talk was very small, about ~0.001V on a fully functional charger. In the "GT Power A6-10" cheali-charger firmware we treat all balance port inputs equally (not true for the 50W chargers), so you should see the cross-talk on all cells.
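The "ignore the first sample" trick can be sketched like this (a simplified illustration, not the actual cheali-charger code; the function name and structure are invented for the example). The first sample after switching the multiplexer can carry cross-talk from the previously selected input, so it is dropped and only the remaining samples are averaged:

```cpp
#include <cassert>
#include <cstdint>

// Average `count` ADC samples while discarding the first one, which may be
// polluted by the previously selected multiplexer channel (cross-talk).
uint16_t averageIgnoringFirst(const uint16_t *samples, uint8_t count) {
    uint32_t sum = 0;                    // 32-bit accumulator: no overflow
    for (uint8_t i = 1; i < count; i++)  // i = 1: skip the first sample
        sum += samples[i];
    return static_cast<uint16_t>(sum / (count - 1));
}
```

With 10 samples where only the first one is polluted, the result reflects the clean samples alone; averaging all 10 would drag the value toward the previous channel's voltage.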

To summarize: I still suspect that your LM2904 is broken, mainly because of:

Cell 6: 0.349V - 0.381V (this was significantly higher)

If I were you, I would replace the LM2904 and the HCF4051 (to be extra sure); they cost only about $2 for 10 chips ;)

neosunrise commented 8 years ago

Thank you so much for the info, your advice, and your effort; I really appreciate it. I rebuilt the code on the master branch using my other laptop, which has Atmel Studio + CMake + MHV AVR Tools, and it gave me zero problems during the build. I then flashed the hex files (50W / 200W) onto my B6 and Dynamite and both work fine. The Dynamite does not fluctuate anymore. This is really weird, as I can't figure out why the build tools can affect the results so much. I then uninstalled all AVR tools (WinAVR, CMake, Cygwin, etc.) on my main laptop and reinstalled only CMake and MHV AVR Tools, but it gave me errors like "gcc is broken". I am still trying to fix it but have no clue.

As for the charger, the firmware seems to work fine. The only problem is that if the charger is calibrated when cool, then after it has been discharging batteries for a while and gets hot, the voltage readings are about 0.01-0.015V off. I think this is due to the precision of the resistors used in the charger and there is nothing I can do about it (nor do I want to).

btw, I saw that there is a calibration point option in the voltage calibration menu (all the way down at the bottom). I can select 0 or 1. May I know what that is for? Note that I am not talking about the calibration point in the Expert menu.

stawel commented 8 years ago

... discharging batteries for a while and gets hot, the voltage readings are about 0.01-0.015V off. I think this is due to the precision of the resistors

Yes, you are right. To be more precise, it has to do with the resistors' temperature coefficient.

btw, I saw that there is a calibration point option in the voltage calibration menu (all the way down at the bottom of the menu). I can select 0 or 1. May I know what that is for? Please note that I am not talking about the calibration point in the Expert menu.

Each measured input (voltage, temperature, current) has two calibration points: point 0 and point 1. When you do voltage calibration you are setting only the second calibration point (point 1): you are telling cheali-charger what the current ADC value means in terms of voltage. By default, the first calibration point (point 0) for voltage inputs is set to (0,0) (ADC = 0 means 0V), see: https://github.com/stawel/cheali-charger/blob/master/src/hardware/atmega32/targets/GTPowerA6-10-original/defaultCalibration.cpp From these two points we extrapolate any measured value. If you want, you can also change calibration point 0, but it's not recommended and I'm not even sure how to do it correctly.
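The two-point scheme described above amounts to mapping an ADC reading onto the straight line through the two calibration points. Here is an illustrative snippet (not the actual cheali-charger API; all names are made up, voltages in mV):

```cpp
#include <cassert>
#include <cstdint>

// Two-point linear calibration: point 0 is (adc0, v0), point 1 is
// (adc1, v1); any ADC reading is interpolated/extrapolated on the line
// through them. With the default point 0 = (0, 0), this reduces to a
// simple proportionality v = adc * v1 / adc1.
int32_t calibratedValue(int32_t adc,
                        int32_t adc0, int32_t v0,
                        int32_t adc1, int32_t v1) {
    // v = v0 + (adc - adc0) * (v1 - v0) / (adc1 - adc0)
    return v0 + (adc - adc0) * (v1 - v0) / (adc1 - adc0);
}
```

For example, if calibration recorded that ADC = 1000 means 3800 mV (with point 0 left at the default (0, 0)), a reading of 500 maps to 1900 mV, and readings beyond the calibration point are extrapolated on the same line.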

neosunrise commented 8 years ago

Pawel,

Is it possible to add items to the Settings or Voltage calibration menu so that people can set a voltage compensation / internal temperature correlation coefficient? More specifically,

  1. When a user is calibrating the input voltage (V_in) as well as the cell voltages (Vb1 - Vb6), the corrected values are recorded at that particular temperature.
  2. When the temperature changes, the user can enter the voltage calibration menu again and adjust the voltage compensation/temperature coefficients for each voltage channel (V_in and Vb1, Vb2, ..., Vb6, separately) so that the voltages are compensated accordingly. (I believe having a value for each channel is reasonable, as not all channels are affected by temperature equally.) The coefficient could be anywhere between -1.000V/℃ and 1.000V/℃.

This would be extremely useful when the parts in the charger do not have good temperature coefficients.

I am not sure if this is viable. I am also looking at the source code but don't yet have a clear idea of how to modify it.

Thank you!

stawel commented 8 years ago

Is it possible to add items in the Settings or Voltage calibration menu

Yes, it is possible, but I don't have the time to implement it. I encourage you to try it yourself.

Although you should be aware that:

  1. your temp. sensor measures the temperature of your charger's transistors, not the measuring resistors
  2. the place of heat accumulation strongly depends on the current program (charge, balance, discharge...)
  3. we are running out of program memory (~2kB flash is available)
  4. most of the chargers do not have a temperature sensor
  5. it would greatly complicate the calibration process

Overall, it's hard to implement this feature the right way (maybe even impossible).

Not sure if this is viable and I am also looking at the source code but haven't got a clear clue as how to modify it.

a good place to start would be (at least for the balance port voltage): https://github.com/stawel/cheali-charger/blob/master/src/core/AnalogInputs.cpp#L563-L565

Here we set the balance port voltage; currently we just copy the Vb1_pin-Vb6_pin "pin" voltages to Vb1-Vb6 (Vb1-Vb6 are the voltages used throughout the firmware). You could modify the voltage here.
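A minimal sketch of what a correction at that spot might look like (purely illustrative; the function name and the per-cell `comp_mV` offsets are hypothetical, not part of the real code, which does a plain copy):

```cpp
#include <cassert>
#include <cstdint>

// Copy the raw "pin" voltages to the per-cell voltages, applying a
// hypothetical per-cell millivolt correction along the way. Widening to
// int32_t keeps the addition safe when the offset is negative.
void copyBalancePortVoltages(const uint16_t pin_mV[6],
                             const int16_t comp_mV[6],
                             uint16_t cell_mV[6]) {
    for (int i = 0; i < 6; i++)
        cell_mV[i] = static_cast<uint16_t>(
            static_cast<int32_t>(pin_mV[i]) + comp_mV[i]);
}
```

With all offsets set to zero this degenerates to the plain copy the firmware currently performs.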

neosunrise commented 8 years ago

Although you should be aware that:

  1. your temp. sensor measures the temperature of your charger's transistors, not the measuring resistors

Yes, I am aware of that, but you can set a different compensation value for each channel so that some would be compensated more and some less.

  2. the place of heat accumulation strongly depends on the current program (charge, balance, discharge...)

I understand that, and that's why my charger's voltmeter always drifts way off when it heats up. Usually, when the charger is calibrated at 25C and the temperature goes above 40C, some channels read 0.02V lower than the actual voltage. Basically, when the temperature goes up, the voltages displayed on the screen go down. This is not a problem when discharging a battery, as the cutoff voltage would be higher than the actual voltage, so the battery will never be over-discharged; for charging, though, it's dangerous.

  3. we are running out of program memory (~2kB flash is available)

Yes, I am aware of that. I have actually implemented this feature and the flash usage is around 32088 out of 32768 bytes, so it's almost full. Compared to the stock firmware for my Dynamite (30476) and iMax B6 (27788), it's bigger. Would there be any problem if we almost use up the 32K flash? BTW, I believe my build is somewhat big because I changed some display text in the firmware and the new strings might be a bit longer.

  4. most of the chargers do not have a temperature sensor

True, but if their users also suffer from this problem, they can install the external temperature sensor inside the charger and use it to track the internal temperature.

  5. it would greatly complicate the calibration process

Exactly. At the moment, what I do is calibrate the internal temperature first, since it is recorded as a reference value. When calibrating the voltages, the charger saves the user-entered voltages as well as the temperature at that moment. Then the user can go ahead and charge/discharge batteries. If he notices the voltages drifting, he needs to go into the settings menu, turn on the voltage/temperature compensation option, and adjust the values for the corresponding channels (Vinput, Vb1, Vb2, ..., Vb6, as well as Voutput). However, I think the voltage sampling resistors' resistance changes much more slowly after the temperature reaches a certain point, so in my implementation I let the user set a lower and an upper temperature threshold. When the internal temperature goes beyond the upper threshold, the threshold itself is used as the current temperature and does not change unless the temperature drops back below it. The same applies to the lower threshold.

My personal experience is that this works, but you need to go back and forth making many changes, and it really does complicate the calibration procedure.

a good place to start would be (at least for the balance port voltage): https://github.com/stawel/cheali-charger/blob/master/src/core/AnalogInputs.cpp#L563-L565 here we set the balance port voltage, currently we just copy Vb1_pin-Vb6_pin "pin" voltage to Vb1-Vb6, (Vb1-Vb6: voltage used throughout the firmware). You could modify the voltage here.

That's exactly what I found in the code. I also add the compensation to the input and output voltages. Originally, I used uint16_t as the type of the compensation (in volts) but quickly realized this value can be negative when the internal temperature is lower than the reference temperature, so I changed it to int16_t. Another problem is that I calculate the temperature difference and then multiply it by the compensation voltage, both of which are held in uint16_t variables. Since this type only goes up to 65535, the multiplication can overflow in some cases. For example, if the temperature difference is 20C (reference temp 25C, current internal temp 45C), stored as 2000, and the user-set compensation is 0.033V/C, stored as 33, then 2000 * 33 = 66000, which is just over the limit and overflows. What I did was cast both values to int32_t before multiplying, then divide the result by 100 to cancel the x100 scaling of the temperature. (It turns out this makes the minimum compensation 0.01V for a 10C difference, so I divide by 1000 instead, meaning the compensation voltage set by the user is per 10C of change.) It seems to work fine but I need to do more tests.
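The overflow-safe arithmetic described above can be sketched like this (names and units are illustrative, matching the example in the text: temperature difference in hundredths of a degree, coefficient in mV per 10 degrees C):

```cpp
#include <cassert>
#include <cstdint>

// tempDiff_cC: temperature difference in hundredths of a degree C
// (2000 == +20.00 C), may be negative. coeff_mVper10C: user-set coefficient
// in mV per 10 C (33 == 0.033 V per 10 C). The raw product 2000 * 33 ==
// 66000 would overflow a 16-bit type, so both operands are widened to
// int32_t first; dividing by 1000 cancels the x100 temperature scaling and
// the per-10C coefficient scaling in one step.
int16_t tempCompensation_mV(int16_t tempDiff_cC, int16_t coeff_mVper10C) {
    int32_t comp = static_cast<int32_t>(tempDiff_cC) *
                   static_cast<int32_t>(coeff_mVper10C);
    return static_cast<int16_t>(comp / 1000);
}
```

So a +20.00 C difference at 0.033 V/10C yields a 66 mV correction, and a negative difference yields a negative correction, which is why the signed int16_t type matters.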

Here are my settings and they seem to make sense. Voltages were calibrated at around 24C (my internal temperature sensor is not very accurate, btw, but that doesn't matter since we only care about the difference). LOW: 5C, HIGH: 34C. Input: 0.032V/10C, C1: 0.020V/10C, C2: 0.017V/10C, C3: 0.030V/10C, C4: 0.005V/10C, C5: 0.030V/10C, C6: 0.019V/10C, Output: 0.000V/10C.

stawel commented 8 years ago

hm... if it works I can only congratulate you; it would be nice to see a release with this feature