Ralim / IronOS

Open Source Soldering Iron firmware
https://ralim.github.io/IronOS/
GNU General Public License v3.0

Simple high precision temperature calibration #303

ruslan-khudyakov opened 6 years ago

ruslan-khudyakov commented 6 years ago

The simplest way to calibrate is to dip the tip into boiling water (water boils at 100 °C / 212 °F).

The currently used method (ambient temperature) assumes ≈30 °C and is not accurate.

Procedure:

  1. Heat the water to a boil;
  2. Dip tip in the water;
  3. Press "Calibrate"
  4. Done! )))
dhiltonp commented 5 years ago

That's great!

The TS-ILS went from being almost 30 °C off of a linear fit at max to being 5 °C off; @Ralim's two-point calibration plus the improved PID will really improve performance.

Now to hit equilibrium faster...

Repled commented 5 years ago

Equilibrium: I don't see anything wrong with overshooting a bit. Let's say you set it for 350 °C but target 360 °C or higher, and after passing 350 °C you set the final target back to 350 °C. With the right PID parameters you can certainly make it overshoot, but how repeatable will that be across all the different tips? Just an idea....

Let me know if you want me to confirm any new ideas or developments!

Berre1959 commented 5 years ago

I noticed that calibration can only be done reasonably accurately when the tip and handle are at equal (room) temperature. Holding the handle in (warm) hands for a couple of minutes raises the internal temperature of the handle up to 32 °C, which may differ from the tip by more than 15 °C in a cool environment. Prior to calibration one should not handle the iron with warm hands for at least 15 minutes. BTW, has anybody tried measuring the temp of the tip using an IR laser meter? (Works between -50 °C and +380 °C, accuracy 1.5 °C.)

Eldenroot commented 5 years ago

When? :)

evyd13 commented 5 years ago

Just calibrated my TS-I using the simple calibration method; wondering if it could be added to the firmware's list of tips? What information would you need?

whitehoose commented 5 years ago

Reading this thread shows a great deal of ingenuity and lateral thinking. I can't help thinking, however, that it's being wasted on a tiny molehill. I've been soldering since I was 8 years old (now 63). Copper slug with a handle on a stove (we had it hard in them days!), 50 V and 240 V elements heating various weights of copper (the only regulation was in good hardware design), a 3 V NiCad portable, and an assortment of gas apparatus in various configurations. It's only in the last couple of years I got a solder rework station which happened to have temperature regulation/display. One iron now fits all (most) requirements.

In the past I'd look at a job and select the appropriately sized iron based on the area I intended to cover in 60/40 or 63/37. Can't say I've ever looked at a job and thought that it needs 326.87 °C. My internal conversation was more like "That's a hefty lump - better go for the big 'un", or "Better be careful and use the mini".

With 63/37 (eutectic) the melting point is pretty certain, but with reputable brands the datasheet states the melting point anyway. The aim is to transfer solder to a surface and raise its temperature sufficiently to fuse the alloy to the surface at a molecular level. If you don't achieve that bond, you're going to have problems.

The devil in the detail is in danger of eclipsing the real aim. With temp control I just want consistency: when I set my ideal, I want to be certain that the tip will always be where I set it. The thermostat provides the thermal muscle rather than tip mass. This magic happens at a set temperature - but just how, or how long it takes to achieve, varies quite considerably depending on physical and environmental circumstances. That's where I come in.

If you know how - knowing the actual tip temp to +/- 0.5 °C is, I'll admit, nice to know, especially for us mildly aspergers / obsessive compulsive engineer types - but knowing doesn't actually affect the quality of the job. In fact, if you blindly assume the join is good based only on temperature, you are in for a world of grief.

You can see the solder break and melt; that moment when the solder fuses to the part is unmistakable. Doesn't matter what the temperature says (even if it's spot on). If it's not taken, or if it won't take because of impurities, or the insulation or casing is dripping onto the floor, something isn't right. If a gourmet cook can settle for a 1-10 (or 11) heat scale, experience and/or a separate instrument (meat or sugar thermometer) fills in the gaps and confirms the concept. The rest is just dressing; if you put too much faith in one detail you'll miss the bigger point.

The firmware does a fair job. I like the feel better than the stock. From a temp perspective I can live with 5 degrees or so, as long as the gradient is consistent I'll fill in the blanks.

jjakob commented 4 years ago

So I performed the simple room temperature calibration on 2 TS100 irons today. Both were loaded with the latest 2.09.1 firmware. The temperatures of both irons when operating are ~17% high: when set to 300 °C they both read ~350 °C. The 2 thermocouples I measured the tip temps with were checked at 0 °C and 100 °C and they are both within 1%.

I also noticed a weird behaviour with the tip calibration: after I do the calibration on an iron that is cold (cooled down for >1 h), the standby tip temperature reads ~30 °C, even though the room temp is ~22 °C. Is the calibration assuming the room temp is 30 °C?

I also noticed that after leaving the iron plugged in for a few minutes in standby (the tip was cold and wasn't heated), the handle gets noticeably warm. I assume the circuitry is generating the heat (voltage regulator, MCU, or display). If the software reads the internal temperature and assumes the tip is at that same temperature during calibration, the calibration will be off, turning the tip gain up higher to match the sensor in the handle.

IMO the most accurate calibration would be the 0 °C / 100 °C two-point method, with the second option being manually adjustable coefficients.

Edit: retried the calibration, this time with the iron unplugged and at room temp; plugged it in and immediately did the calibration. The result is not much better - measured temp 345 °C, set at 300 °C. The standby temp displayed was 26 °C while the room temp was 22 °C. I'll double check the meter measurement with the solder melting point method.

Edit 2: the solder melting point confirms the meter measurement. 63/37 solder melts between the 150 °C and 160 °C settings, its actual melting point being 183 °C; 183/155 = 1.18, or 18% high, identical to the thermocouple measurement.

Firebie commented 4 years ago

The code uses a constant of 24.9 uV/°C, but if the TS100 tip is similar to the T12 - which is Type C - the coefficient should be 21 uV/°C.

jjakob commented 4 years ago

These are the stock B2 tips that came with the iron, from the original packaging. I wonder if the coefficient changed, and what it was before that yielded the more accurate calibration with previous firmwares.

I looked at the current calibration routine, and it just saves the raw measured (filtered) value to CalibrationOffset https://github.com/Ralim/ts100/blob/44e5ceeedfc8ceaedf4c374d0e1fe44e6d392e21/workspace/TS100/Core/Src/gui.cpp#L638

https://github.com/Ralim/ts100/blob/e725e63b3f86e4aafe5faf62d6be6f0506fd5234/workspace/TS100/Core/Src/TipThermoModel.cpp#L42-L61 This doesn't make sense to me: it takes the measured thermocouple voltage and subtracts the tip offset. But the tip offset was measured at room temp (293 K), so the room temp is subtracted here too. The result is the current TC voltage minus the TC voltage at 293 K minus the offset (in ADC counts). It doesn't take the cold junction compensation correctly into account either? Of course, a consequence of this is that the resulting corrected voltage can dip below 0, so it has to catch that, which seems weird.

https://github.com/Ralim/ts100/blob/e725e63b3f86e4aafe5faf62d6be6f0506fd5234/workspace/TS100/Core/Src/TipThermoModel.cpp#L82-L88 The resulting voltage is then converted to a temperature by a single factor, which again seems odd. The room temp should be taken into account in one of these steps, ideally subtracted during the offset calibration. Otherwise, if the room temp drops below the room temp at calibration time, the measured value will be stuck at 0 and the temperature stuck at its minimum too - I doubt this does the PID any good, as it then goes open loop? It shouldn't do anything catastrophic, since at or below room temp we're blasting the power at 100% anyway; it's just wrong IMO.
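
To illustrate what I mean, here is a rough paraphrase of that flow with made-up names (an illustration only, not the actual IronOS code):

```cpp
// Rough paraphrase of the flow described above (hypothetical names, not the
// real code): raw ADC counts minus the stored room-temperature calibration
// offset, clamped at zero, then a single fixed uV/°C factor.
#include <cstdint>

// Convert a raw (filtered) tip reading to thermocouple microvolts, subtracting
// the offset captured during the room-temperature calibration.
uint32_t tipRawToUv(uint32_t rawAdc, uint32_t calibrationOffsetRaw, uint32_t uvPerAdcCount) {
  if (rawAdc <= calibrationOffsetRaw) {
    return 0;  // the subtraction would go negative, so it has to be clamped here
  }
  return (rawAdc - calibrationOffsetRaw) * uvPerAdcCount;
}

// Convert microvolts to degrees with one fixed coefficient (24.9 uV/°C, stored
// x10 to stay in integer math). Note there is no explicit room-temp term here.
uint32_t tipUvToRelativeDegC(uint32_t tipUv) {
  constexpr uint32_t kUvPerDegCx10 = 249;
  return (tipUv * 10) / kUvPerDegCx10;
}
```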

jjakob commented 4 years ago

Reading some thermocouple theory made me realize: a) the output voltage depends on the temperature difference between the hot and cold junctions; b) the cold junction, if it's just a junction to a different metal and not a real thermocouple junction (as is the case here), needs to be compensated by measuring its temperature and subtracting the TC voltage that would occur at that temperature.

This means that as the handle-to-tip contact temperature changes, the offset should change by the same amount too. This is typically done with a separate sensor thermally coupled to the connection. https://www.maximintegrated.com/en/design/technical-documents/app-notes/4/4026.html
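
Under a linear (constant uV/°C) approximation, the compensation boils down to adding the cold-junction temperature to the temperature rise implied by the measured voltage. A minimal sketch with hypothetical names:

```cpp
// Minimal cold-junction compensation sketch (hypothetical names): the
// thermocouple voltage only encodes the temperature *difference* between the
// hot junction (tip) and the cold junction (handle-to-tip contacts).
float tipTemperatureC(float measuredUv, float coldJunctionC, float uvPerDegC) {
  float deltaC = measuredUv / uvPerDegC;  // rise of the tip above the cold junction
  return coldJunctionC + deltaC;          // absolute tip temperature estimate
}
```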

So if I understand this correctly, the current routine does factor out the cold junction temperature with the offset calibration (the room temperature doesn't matter, just that both junctions are at the same temperature), but doesn't account for changes in the cold junction temperature during use with changing iron handle temperature (conducted from the tip). Unless the iron has a temperature sensor inside the handle (maybe even inside the MCU?), this is impossible to solve. Of course we don't really care about the absolute temperature during either calibration or use, just about the temperature difference between the tip TC and the handle-to-tip connection.

A workaround would be to calibrate with the 2- or multi-point method at room temp (cold junction/offset cal) and at operating temp (I don't think 100 °C would be enough; since the tip temperature affects the cold junction temperature, a higher difference would be better, preferably normal operating temperature). This would calibrate the thermocouple coefficient too. It would still be suboptimal, with the tip temp undershooting with a cold handle and overshooting with a hot handle. (Thinking about this, it introduces some positive feedback and might make the temperature unstable or even run away if not compensated for. Whether there is enough positive feedback to cause a runaway, I don't know. But as the tip becomes hotter, the cold junction gets hotter too, the junction difference decreases, and thus the measured temperature is less than it really is. An unfortunate consequence of the design of these tips.)
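
A rough sketch of the two-point math (illustrative only, hypothetical names and numbers): point A is taken with the whole iron at one uniform temperature (junction difference ≈ 0), point B at a known hot tip temperature measured with an external thermometer.

```cpp
// Two-point calibration sketch (not IronOS code): solve for the thermocouple
// gain (uV/°C) and residual offset from two reference points.
#include <cstdio>

struct CalPoint {
  float tipUv;     // measured thermocouple voltage in microvolts
  float tipC;      // true tip temperature from an external reference
  float handleC;   // handle / cold-junction temperature at that moment
};

struct TipCal {
  float offsetUv;  // residual voltage at zero junction difference
  float uvPerDegC; // thermocouple gain
};

TipCal twoPointCalibrate(const CalPoint& cold, const CalPoint& hot) {
  float dTcold = cold.tipC - cold.handleC;  // ideally ~0 for the room-temp point
  float dThot  = hot.tipC - hot.handleC;
  TipCal cal;
  cal.uvPerDegC = (hot.tipUv - cold.tipUv) / (dThot - dTcold);
  cal.offsetUv  = cold.tipUv - cal.uvPerDegC * dTcold;  // voltage when dT == 0
  return cal;
}

int main() {
  // Hypothetical numbers: cold point with everything at 22 °C, hot point at a
  // true 350 °C tip temperature with the handle warmed to 30 °C.
  TipCal cal = twoPointCalibrate({120.0f, 22.0f, 22.0f}, {7000.0f, 350.0f, 30.0f});
  std::printf("offset %.1f uV, gain %.2f uV/C\n", cal.offsetUv, cal.uvPerDegC);
  return 0;
}
```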

Firebie commented 4 years ago

https://hackaday.io/project/94905-hakko-revenge/log/144548-hakko-t12-thermocouple-is-not-type-k/discussion-119449

jjakob commented 4 years ago

So at the very least, the constant needs to be corrected to 21uV/C to match the type C thermocouple, even if the advanced calibration isn't implemented. @Ralim

Ralim commented 4 years ago

My understanding here is that the Miniware tips are not Type C, at least from my testing of the ~5 tips I have available. I get a similar over-reading if I use Hakko tips, but very close to on-point readings when using the Miniware tips here. One concern here, though, is that some of my tips are getting on four years old, so it could be them aging.

The concept behind the current code is to use the handle temperature sensor as cold junction temp (its as thermally bonded to the contacts as it can be - it was designed for this use case).

The current code just models the signal path itself, so it converts the ADC readings into uV. Then in later code the cold junction temperature is subtracted out to give the relative reading.

Advanced calibration has been implemented twice, two different ways, and it was less accurate than the current method, which is why I pulled it out (got sick of emails about it not working). I'm happy for someone else to write one, but I think the concept of it will never really work unless the person doing it has a high accuracy thermocouple anyway, and at that point it should really just be a cold junction calibration + a ~350 °C one.

There is a temp sensor inside the MCU, but it's less accurate than the one in the handle (due to calibration).

One thing I am aware of that should be fixed first is the 3.3 V calibration: the current code assumes it's 3.3 V, but it usually never is, and if that is off by X percent, all temps will be off by that same percentage (roughly).

The 24.9uV constant came from my 5 tips, and I'm happy to adjust that to suit the majority of users :)

My thoughts here are that the order to fix would be:

  1. Fix the 3.3 V rail so it's measured, to compensate for internal inaccuracies
  2. Change the constant to 21 uV/°C
  3. Get a varied mix of tips tested
  4. Evaluate what the constant should be

Firebie commented 4 years ago

3.3V - is it about this line? `uint32_t vddRailmVX10 = 33000; //TODO use ADC Vref to calculate this`
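
For reference, a common way to do this on an STM32F103 is to sample the internal bandgap reference (VREFINT, ~1.20 V typical) and back-calculate the real rail instead of assuming 3.3 V. A rough sketch using the ST HAL; the ADC handle and exact configuration are assumptions, not the IronOS code:

```cpp
// Sketch: estimate the real VDD/VREF+ rail by reading the STM32F103's internal
// ~1.20 V bandgap (VREFINT) against it. Assumes a configured STM32 HAL ADC handle.
#include "stm32f1xx_hal.h"

extern ADC_HandleTypeDef hadc1;

uint32_t measureVddRailmVX10(void) {
  ADC_ChannelConfTypeDef cfg = {0};
  cfg.Channel = ADC_CHANNEL_VREFINT;               // internal bandgap channel
  cfg.Rank = ADC_REGULAR_RANK_1;
  cfg.SamplingTime = ADC_SAMPLETIME_239CYCLES_5;   // long sample time for the high-impedance source
  HAL_ADC_ConfigChannel(&hadc1, &cfg);

  HAL_ADC_Start(&hadc1);
  HAL_ADC_PollForConversion(&hadc1, 10);
  uint32_t raw = HAL_ADC_GetValue(&hadc1);         // 12-bit reading of VREFINT vs. VDD
  HAL_ADC_Stop(&hadc1);

  if (raw == 0) {
    return 33000;                                  // fall back to the old 3.3 V assumption
  }
  // VDD [mV*10] = 1200 mV * 10 * 4095 / raw, matching the vddRailmVX10 scaling
  return (12000UL * 4095UL) / raw;
}
```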

jjakob commented 4 years ago

Is the cold junction temp compensated in hardware, before the ADC? I see no code to do that after the ADC reading.

Also, I'm pretty sure my tips are Type C, as they read the temperature 17% low, and the difference between 21 uV/°C and 24.9 uV/°C is 18%. They may be different tips, or the age of the tips changes their characteristics, or there are other hardware differences (my iron says DFU: 3.45 in download mode). I'll try some more new tips (different shapes) to see if there are any differences.

Ralim commented 4 years ago

Yeah, if you look in the code the handle temperature is added to the measured value (in the uV to degrees conversion)

It would be nice and easy if my tips just have wrong uV values 😅

Firebie commented 4 years ago

From my experience with the TS100, the handle temp is not very reliable. If the iron is connected to power, the debug menu shows 30 °C for the handle while the environment temp is around 20 °C.

Ralim commented 4 years ago

This is actually correct, because ambient doesn't matter at all, what matters is the temperature of the "cold" junction. The inside of the handle warms up quickly, but that doesn't matter for this case.

Firebie commented 4 years ago

Another thing - I have original mini tips, and their resistance varies from 6.5 Ω to 8.5 Ω at room temp, and the raw temp reading on them varies from 600 to 800 in firmware units.

jjakob commented 4 years ago

I see the temp sensor inside is right next to the contacts. I actually found the schematic now and looked at some datasheets:

XC6206P 3.3 V regulator: ±2% at 25 °C, 100 ppm/°C
STM32F103T8 Vref: ±3% over -40 to +85 °C, 100 ppm/°C (the tempco only works out to ±0.6% over that same temp range, so most of the 3% is the initial accuracy)

So the voltage regulator has a higher initial accuracy than the STM Vref, and both have the same tempco. Using the internal reference for calibration would likely yield worse results.

The TMP36 temperature sensor is typically ±1 °C, max ±3 °C. The worst case for the internal temp reading is (750 mV × ±2%) / (10 mV/°C) = ±1.5 °C, plus ±3 °C, = ±4.5 °C (ADC accuracy is negligible), though typically it will be ±2.5 °C or better. So it's good enough for compensating the tip.

Re: code, I see the handle offset is done in `TipThermoModel::getTipInC`...; the math seems correct. So only the tip coefficient seems to be to blame for the inaccuracy.

I measured 3 of my tips. They all have the Miniware logo and two codes printed on them: one is the tip shape TS-xx, the other some unknown code (FS15, FS22, NR51). They come in packaging marked Miniware that looks OEM.

Calibration values (first is cold, then a minute later, then 5 minutes later):

| Tip | Cold | +1 min | +5 min |
|---|---|---|---|
| TS-B2 | 957 | 939 | 915 |
| TS-C4 | 1008 | 984 | 972 |
| TS-BC2 | 947 | 928 | 906 |

Temperatures, measured with the last/smallest calibration value (set, measured, error):

| Tip | Set (°C) | Measured (°C) | Error |
|---|---|---|---|
| TS-B2 | 200 | 226 | +13% |
| TS-B2 | 250 | 286 | +14.4% |
| TS-B2 | 300 | 345 | +15% |
| TS-C4 | 200 | 220 | +10% |
| TS-C4 | 250 | 280 | +12% |
| TS-C4 | 300 | 336 | +12% |
| TS-BC2 | 200 | 220 | +10% |
| TS-BC2 | 250 | 280 | +12% |
| TS-BC2 | 300 | 334 | +11.3% |

The errors were smaller now, but I think that's because I waited a few minutes for the handle to warm up internally before calibrating. Just from being powered in standby, the displayed tip temperature went from 24 °C to 30 °C (the internal sensor likely heated by the same amount). Retrying with a cold iron and immediately calibrating gives me the higher 16-17% errors again.

Perhaps the easiest would be to give a user-adjustable coefficient (uV/°C, centered around 21 uV/°C, in 1 uV/°C steps between 20 and 25) in the advanced settings?

Firebie commented 4 years ago

Experimental version: it allows specifying the offset and tip coefficient (uV/°C) manually (not automatically) in the advanced settings; the handle temp is not used.

KoopsF commented 4 years ago

I have a Hakko FG-100 clone. Is this still a calibration option? I'm on v2.08

Firebie commented 4 years ago

I did uV/°C calculations for US6087631A (rows are tens of °C, columns are hundreds of °C, temperature = row + column; left block is the EMF in mV, right block is the EMF divided by temperature, i.e. the average mV/°C):

```
        EMF (mV)                                  EMF / T (mV/°C)
      0      100    200    300    400            0      100    200    300    400
  0   0.000  1.731  3.622  6.332   8.410     0   0.000  0.017  0.018  0.021  0.021
 10   0.175  1.939  3.830  6.521   8.626    10   0.018  0.018  0.018  0.021  0.021
 20   0.381  2.079  4.044  6.724   8.849    20   0.019  0.017  0.018  0.021  0.021
 30   0.587  2.265  4.400  6.929   9.060    30   0.020  0.017  0.019  0.021  0.021
 40   0.804  2.470  4.691  7.132   9.271    40   0.020  0.018  0.020  0.021  0.021
 50   1.005  2.676  4.989  7.356   9.531    50   0.020  0.018  0.020  0.021  0.021
 60   1.007  2.899  5.289  7.561   9.748    60   0.017  0.018  0.020  0.021  0.021
 70   1.107  3.081  5.583  7.774  10.210    70   0.016  0.018  0.021  0.021  0.022
 80   1.310  3.186  5.879  7.992  10.219    80   0.016  0.018  0.021  0.021  0.021
 90   1.522  3.422  6.075  8.200  10.429    90   0.017  0.018  0.021  0.021  0.021
100   1.731  3.622  6.332  8.410  10.649   100   0.017  0.018  0.021  0.021  0.021
```

Gromily4 commented 4 years ago

I also have problems with the temperature accuracy, within +20 to +30 degrees Celsius. With 200 degrees set I get 230 degrees.

jjakob commented 4 years ago

I haven't had the time to test the custom firmware from above. I managed to get mine calibrated "close enough" by one of the methods I described above (I can't remember whether it was immediately after it had been unplugged for a long time, or after letting it sit in standby for a few minutes).

rezin8 commented 4 years ago

https://hackaday.io/project/94905-hakko-revenge/log/144548-hakko-t12-thermocouple-is-not-type-k/discussion-119449

Of course they're not. They're Type N. I thought this was common knowledge by now.

I found this issue because my iron is running much hotter than the set temp and was looking for calibration tips or tricks.

Is there any way to just use this ITS-90 table for the Type N thermocouple to cross-reference the reading you're getting from the tip to the temp?

https://srdata.nist.gov/its90/download/type_n.tab
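
Something like a piecewise-linear lookup over that table would do it: add the EMF corresponding to the cold-junction temperature to the measured EMF, then interpolate back to a temperature. A rough sketch (illustrative only, not firmware code), reusing the coarse 0-400 °C values from the table Firebie posted above:

```cpp
// Illustrative EMF-to-temperature lookup (hypothetical, not IronOS code):
// piecewise-linear interpolation over a reference table such as the NIST
// ITS-90 Type N data. Input is the cold-junction-corrected EMF in mV.
#include <cstddef>

struct TablePoint { float mV; float degC; };

// Coarse example values (the 0-400 °C points from the table posted earlier in
// this thread); substitute the full table for the tip's actual thermocouple type.
constexpr TablePoint kTable[] = {
  {0.000f, 0.0f}, {1.731f, 100.0f}, {3.622f, 200.0f}, {6.332f, 300.0f}, {8.410f, 400.0f},
};

float emfToDegC(float mV) {
  const size_t n = sizeof(kTable) / sizeof(kTable[0]);
  if (mV <= kTable[0].mV) return kTable[0].degC;      // clamp below the table
  for (size_t i = 1; i < n; ++i) {
    if (mV <= kTable[i].mV) {
      float t = (mV - kTable[i - 1].mV) / (kTable[i].mV - kTable[i - 1].mV);
      return kTable[i - 1].degC + t * (kTable[i].degC - kTable[i - 1].degC);
    }
  }
  return kTable[n - 1].degC;                          // clamp above the table
}
```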

Firebie commented 4 years ago

They are not Type N; they are a custom type: https://patents.google.com/patent/US6087631A/en with a coefficient of ~21 uV/°C.