Ralim / IronOS

Open Source Soldering Iron firmware
https://ralim.github.io/IronOS/
GNU General Public License v3.0
7k stars · 700 forks

Simple high precision temperature calibration #303

Open ruslan-khudyakov opened 6 years ago

ruslan-khudyakov commented 6 years ago

The simplest way to calibrate is to dip the tip into boiling water (water boils at 100 °C / 212 °F).

The currently used method (ambient temperature) assumes ≈30 °C and is not accurate.

Procedure:

  1. Heat the water to a boil;
  2. Dip tip in the water;
  3. Press "Calibrate"
  4. Done! )))
Ralim commented 6 years ago

Hi, This has been suggested before, and I might implement this as a second stage calibration.

However, "water" does not boil at 100 °C. Pure H2O does, when it's at 1 atmosphere (and thus standard pressure). Any deviation from this can cause errors in the measurement.

At the moment, the temperature sensor in the handle is used (thus the warning to ensure the handle is the same temperature as the tip). This provides a temperature in 0.1 °C increments that is used for the calibration.
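Ralim's pressure caveat can be made concrete with a rough rule of thumb (my own approximation, not from this thread): the boiling point of pure water drops by about 1 °C for every ~285 m of elevation.

```python
def boiling_point_c(altitude_m: float) -> float:
    """Approximate boiling point of pure water vs. altitude.

    Rule-of-thumb linear model (~1 degC drop per 285 m); only a rough
    guide below ~2000 m, not a real barometric correction.
    """
    return 100.0 - altitude_m / 285.0

print(boiling_point_c(0.0))             # 100.0 at sea level
print(round(boiling_point_c(1000.0), 1))  # ~96.5 at 1000 m
```

This is roughly in line with the "97 °C at 1000 m" figure mentioned later in the thread.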

ruslan-khudyakov commented 6 years ago

Hi, thanks for answer!

Right now my room is at 23 °C, but after calibration the soldering iron shows the tip as 33 °C. (It was in standby mode, etc.)

However, this is so far from 0.1°C…

Yes, the ideal conditions for 100 °C are sea level and 1 atmosphere, but even at 1000 meters water boils at 97 °C. How many people live 0.5 miles above sea level? )))

P.S. As a second method it would be cool!

yschaeff commented 6 years ago

What about calibrating at 37 degrees Celsius? More practical than boiling water and more accurate than "room temperature". Come to think of it, the iron could double as a thermometer. ;)

ruslan-khudyakov commented 6 years ago

@yschaeff, yeah, "rectal calibration mode", LOL!!!

JohnEdwa commented 6 years ago

The temp calibration should simply be a user-settable offset, like the voltage, so it doesn't matter whether you do it at 15 °C, 37 °C or 100 °C. Everyone owns a thermometer of some sort.

Scratch that, I didn't read Ralim's post with any actual intelligence. (Need more coffee.)

Eldenroot commented 6 years ago

Anyway, we need a better way to calibrate properly - at least a two-point calibration, with two temperatures - 30 and 100 °C? What do you think?

ruslan-khudyakov commented 6 years ago

The best way, in my opinion, is 100 °C plus a manual user setting, like the voltage.

Eldenroot commented 6 years ago

37 and 100 °C for a two-point calibration would be best. One-point calibration is not precise.

Ralim commented 6 years ago

@Eldenroot I'm not against a multiple point calibration option. But generally it will just provide a means for the user to mess up the calibration of the iron.

Going further would involve calibrating the thermocouple and op-amp; calibrating out the tolerance of the op-amp circuitry would require two stages of calibration, at two different ambient temperatures, to allow for a change of the cold junction.

At the moment the firmware assumes an average value for the temperature gain of the system, and the calibration is there to correct the offset that remains unaccounted for after cold-junction correction.

The actual correction required for cold junction can vary tip to tip, along with both the tolerance of the op-amp and the temperature curve for the tip.

So the calibrations required for this would ultimately inflate to a four-step process, done for two different tips, to profile the unit fully - which would still only improve the actual tip temperature slightly.
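A minimal sketch of the scheme Ralim describes (hypothetical names and gain value; the real firmware differs): a fixed average gain is assumed, the handle sensor provides the cold-junction reference, and the calibration only solves for the residual offset, captured while tip and handle are at the same temperature.

```python
AVG_GAIN_C_PER_COUNT = 0.05  # assumed average system gain (hypothetical value)

def tip_temp_c(raw_counts: int, offset_counts: int, handle_temp_c: float) -> float:
    """Convert raw thermocouple counts to a tip temperature.

    Cold-junction correction: the thermocouple measures the tip
    relative to the handle, so the handle temperature is added back.
    """
    return (raw_counts - offset_counts) * AVG_GAIN_C_PER_COUNT + handle_temp_c

def calibrate_offset(raw_counts_cool: int) -> int:
    """With a cool iron, tip temp == handle temp, so any nonzero
    reading is pure offset; store it and subtract it from then on."""
    return raw_counts_cool

offset = calibrate_offset(120)        # counts read with a cool iron
print(tip_temp_c(120, offset, 23.0))  # 23.0 - matches the handle after cal
```

Because the gain is a fixed average, tips whose true gain deviates from it will still read wrong away from the calibration point, which is exactly the tip-to-tip error discussed below.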

Note that there is currently an issue with the screen showing a higher temperature than is actually at the tip, due to the filtering done for the UI. This is most pronounced at the low end of the scale and diminishes when the iron is operating. This offset is a bug that will be handled in future.

Eldenroot commented 6 years ago

Ok, you are right. I am looking forward to seeing this bug fixed. Keep up the good work, perfect firmware.

ruslan-khudyakov commented 6 years ago

We have two constants available: 1. boiling water, 100 °C; 2. melting ice, 0 °C.

The best for home use.

blitzcode commented 6 years ago

Using melting ice & boiling water seems a bit weird to me. The more obvious and relevant calibration point would be the melting point of some eutectic solder alloy like normal 63/37. Besides, a multimeter with a thermocouple is pretty common equipment, and cheap Hakko tip thermometer clones are available for under 10 bucks.

I noticed on my TS100 that the temperature is completely off:

http://www.minidso.com/forum.php?mod=viewthread&tid=3168&extra=page%3D1

This is also not a constant offset and increases with the set temperature, so a single bias value would not be enough. Calibration with the second temperature sensor at room temperature does not seem useful as the error increases with temperature, so at 20C or so it is negligible.

I would really appreciate a feature in the custom firmware that allows this to be mitigated, as with the current official & custom firmware the difference between actual and displayed temperature is >30 °C.

ruslan-khudyakov commented 6 years ago

Melting & boiling points weird? Sorry, but the Celsius scale is based on them. )))

blitzcode commented 6 years ago

Yes, I think so. It's going to be tricky / messy / unreliable / impossible to heat the iron to exactly 100C with boiling water or cool it to exactly 0C with a bunch of ice cubes. Slowly increasing the set temperature till you reach the well-defined melting point of 63/37 solder (which you likely have) seems much simpler and more reliable. The melting point of solder is also closer to the temperatures which you'll actually be using the iron at, making the calibration more likely to be useful in the presence of any non-linearities like the ones I observed when measuring my TS100. Besides, if you have an external reference like a DMM or tip thermometer you can also save yourself from messing around with kitchen supplies to calibrate your iron ;-)

But I'm not an expert on measuring & calibrating soldering irons, I'm happy with any solution that would allow me to get my TS100 to heat up with <10C error from the set temperature.

ruslan-khudyakov commented 6 years ago

@blitzcode, the iron in the boiling water will be at exactly 100 °C, and we will get an accuracy of ±1 °C (or better).

JohnEdwa commented 6 years ago

If you really want to know and calibrate the tip temperature, get a Hakko FG-100 clone; you can find them for under $15. Or if you really want to cheap out, the sensors are just bog-standard K-type thermocouples that almost any multimeter can read, and they can be bought at $4 for 10 pcs.

Talking of the FG-100:

@Ralim, I tested my three tips (B2, BC2 and D24), and only the original B2 was actually anywhere near accurate; the other two show notably low values. Before each test, I let the iron cool down and did the tip temp calibration. Or did I just get two dodgy tips?

[image: tip temperature graphs]

Ralim commented 6 years ago

Hi, I'll try and get something a bit better in a coming firmware.

Can I ask what tips everyone is using? I only have offset issues on the non-production tip I was sent and on Hakko tips.

It is looking like some models of tips have a much higher error than others.

I own two of the clone Hakko FG-100 units, and neither gives the same result (a difference of about 10 °C).

When testing, my older BC2 tips and C1 tips come in real close to the set point; however, the newer BC2 tip is about 15 °C off.

I have noticed a significant offset can occur if the offset error cancellation is done with a warm handle or a warm tip.

At the least I will try and get an updated firmware that lists the temp offset out when I'm next working on this.

Also, none of the tips are rated above 400 °C, and prolonged use above 400 °C can degrade the temperature accuracy.

ruslan-khudyakov commented 6 years ago

@JohnEdwa A clone does not guarantee accuracy. (The original HAKKO FG-100 has a tolerance of ±3 °C.)

blitzcode commented 6 years ago

@JohnEdwa

Exactly, this stuff is not expensive. Nice graphs! ;-)

Here are my measurements from the other thread:

Set  TipTherm Fluke
300  267      269
350  308      314
400  355      358

TipTherm = FG-100 clone, Fluke is a Fluke DMM with a K-type thermocouple. I used a D24 tip. I also have a BC2 tip, also very inaccurate.

It seems the FG-100 clone and my DMM agree reasonably close. If I place the DMM thermocouple on the D24 tip the measurements are a few C below the FG-100, if I place it a bit higher on the tip they're a few degrees above.

I also found this video:

https://www.youtube.com/watch?v=DEEaLMv6dog&feature=youtu.be&t=12m49s

Some users claim they get very accurate temperatures with their TS100, I certainly have an error > 30C. Official firmware is no better. Both my tips have this huge error.

ruslan-khudyakov commented 6 years ago

@Ralim After replacing a tip we need to recalibrate. All tips are a little different (thermocouple position, weight, volume, etc.).

JohnEdwa commented 6 years ago

@ruslan-khudyakov

Clone does not guarantee accuracy. (Original HAKKO FG-100 tolerance of ±3°C)

Of course not; you pay $250 for that promise of accuracy. But the device itself is just a K-type thermocouple reader, and the sensor is a K-type thermocouple with a piece of metal crimped on top - both super simple things for a clone to do at a fraction of the price of the 'name brand'.

And seeing that my $15 clone is within 1.7 °C of my Brymen BM869 (0.3% ±1.5 °C), and that when testing the same temperature with a regular K-type the difference is just 0.8 °C, I'd say it's accurate enough for 16% of the cost, wouldn't you say?

[images: device and sensor]

ruslan-khudyakov commented 6 years ago

@JohnEdwa 1.7 °C between the clone and the BM869 doesn't mean anything. Accuracy is the tolerance from the real correct temperature, not a comparison between cheap gadgets' measurements.

Here we don't know what temperature you really have. We just know that these measurements are close. To assess the accuracy we need a temperature standard (etalon).

JohnEdwa commented 6 years ago

@ruslan-khudyakov

Sure, we have no idea what the actual spec for the FG-100 clone is, how accurate and precise it actually is, or whether its measurements are linear at all. But we do know it for the BM869 - it has a 0.3% ±1.5 °C spec on temperature. For the 294.5 °C I measured, the real temperature should be between 292 °C and 297 °C. And when that same temperature was tested with a different thermocouple, the BM869 read 293.7 °C, meaning it should be between 291 °C and 296 °C.

And in this instance, the FG-100 measured 292 °C, which fits. If I wanted to, I could check whether my FG-100 clone also fits the 0.3% ±1.5 °C spec - all I would have to do is a bucketload of measurements with both, and check that they all fall within the range of the BM869 spec.
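JohnEdwa's band arithmetic can be checked mechanically. Assuming the BM869's stated 0.3%-of-reading + 1.5 °C temperature spec, the true-value band around a reading is:

```python
def bm869_band(reading_c: float) -> tuple[float, float]:
    """True-temperature band for a BM869 reading (0.3% of reading + 1.5 degC)."""
    err = 0.003 * reading_c + 1.5
    return reading_c - err, reading_c + err

lo, hi = bm869_band(294.5)
print(round(lo), round(hi))   # 292 297, matching the first band above

lo2, hi2 = bm869_band(293.7)
print(round(lo2), round(hi2))  # 291 296, matching the second band
```

The clone's 292 °C sits right at the lower edge of the first band, which is why it "fits" only to the nearest degree.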

blitzcode commented 6 years ago

I got a KU tip today, quick test at 320C set shows ~290C with the tip thermometer, seems like the temperature is completely wrong with this one as well. I don't think there's much need to discuss the finer details of measuring temperature and calibrating soldering irons, the problem appears to be in a different order of magnitude ;-)

LarsSimonsen commented 6 years ago

Can I ask what tips everyone is using?

BC2

JohnEdwa commented 6 years ago

Just received my C1 tip; it is way off as well. I redid the measurements and graphs to suit, though I went from just 200 to 400 °C instead of 150-450 this time.

[image: measurement graphs]

Ralim commented 5 years ago

FYI: I'm working on this at the moment slowly as I have time / patience. @JohnEdwa Since you have a few tips at your disposal, could you give me a hand by getting a spread of 2-4 temps over the temp range per tip and recording the "Rtip" measurement that is now shown in the debug menu (long hold the rear button on idle screen). Doesn't need to be perfect temps, but even just some rough values to get an idea of the different tip curves.

JohnEdwa commented 5 years ago

@Ralim Sure, though these tips cool off really fast and even more so if I actually try to measure the temperature at the same time, so a way to see Rtip while the iron is on would be immensely helpful.

Repled commented 5 years ago

Hi All,

I bought two TS100s, one with the full kit of tips. I have now made some tests, and this is what I have come up with. I have a Fluke 87 mk V with a thermocouple. I used all tips, uncalibrated and calibrated, and the latest Ralim release (May 7th, 2018).

  1. It is very important to wet the tip with solder when measuring the tip temperature, and to give it time; otherwise smaller tips give much lower values than larger tips, and the time is needed to heat the external Fluke measurement thermocouple. These measurements showed that the tips were in the range of 10-20 °C below the set temperature, at approx. 200 °C.

  2. When using a 63/37 tin/lead solder with a melting point of about 183 °C, all my tips melt the solder in the range 190-200 °C. The important thing here as well is to overheat the tip and wet it with solder before going down in temperature to test the melting point. Otherwise the same thing happens: smaller tips give higher temperatures.

I have a Weller TS80 and that one is off by about 30 degC (just an analog turning knob though).

The problem I see when going up in temperature is that the tips deviate more and more the higher they get; this is also confirmed by JohnEdwa's measurements.

Thinking out loud, I wonder if this is partly due to the design of the tips, where the thermocouple and the heater are integrated. If you have a fixed temperature, you will have a more or less fixed PWM duty cycle for that temperature; the higher you go, the higher the duty cycle will be. At higher temperatures, if the wire (heater + thermocouple) within the tip does not reach equilibrium, you measure a higher temperature than the outer part of the tip actually has. The temperature gradient (cooling of the tip) also gets steeper the higher you go in temperature.

Does this make any sense?
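Repled's hypothesis can be illustrated with a toy two-node steady-state model (all coefficients are made-up values, chosen only to show the trend): heat flows from the heater/sensor node through the tip body to ambient, so at equilibrium the sensor-to-surface gap is proportional to the power being pushed through, and therefore grows with the setpoint.

```python
def steady_state(power_w: float, k_internal: float, k_loss: float,
                 t_amb: float) -> tuple[float, float]:
    """Two-node steady state: heater/sensor node -> tip surface -> ambient.

    At equilibrium all of power_w flows through both thermal
    conductances (W/degC), so each interface drops power/k degrees.
    """
    t_tip = t_amb + power_w / k_loss          # surface rise above ambient
    t_sensor = t_tip + power_w / k_internal   # sensor rise above surface
    return t_sensor, t_tip

# hypothetical conductances; note the sensor-vs-surface gap scales with power
for p in (5.0, 10.0, 20.0):
    s, t = steady_state(p, k_internal=2.0, k_loss=0.06, t_amb=25.0)
    print(f"{p:4.1f} W: sensor {s:6.1f} C, surface {t:6.1f} C, gap {s - t:4.1f} C")
```

With these numbers the gap goes 2.5 → 5 → 10 °C as the maintenance power doubles, which is the same qualitative pattern as the growing high-temperature error in the measurements.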


P.S. I would try to push the ADC measurement as far away from the PWM pulse as possible. (I have not looked at the exact implementation, so perhaps this is already the case?)

Ralim commented 5 years ago

@johnedwa Good point, I'll try and get you a testing build soon.

@repled

I agree with your two points completely, and when measuring I usually try to ensure the top of the thermocouple is inside of the solder glob.

The deviation is a combination of non-linearity, and above 400 °C the tip starts to run out of ADC/op-amp headroom.

The design comes into this because heating and measuring the tip temperature are mutually exclusive, and there is a recovery time after the end of the heating pulse train for the sensor to stabilise and the op-amp to desaturate. The tip appears to recover faster than the op-amp.

In the current system the hardware is set up to trigger the ADC automatically after a delay at the end of the PWM period. This delay is not perfect, and on some devices the first ADC sample will catch the tail end of this recovery, which leads to a slight over-reading. However, this only occurs at full PWM duty (usually only during the main heat-up), and it is mostly hidden by the PID and the thermal reaction time of the tip.

https://ralimtek.com/stm32_double_pwm/

That is a short article on how I set up the timers to get nicely scheduled ADC readings.

Cherenkov11 commented 5 years ago

Just an idea: given the variety of tip types, I think it would be a good idea to add presets to store the calibration setting for each tip.

Repled commented 5 years ago

Thank you for the link, very enlightening, and your comments ! You have done your homework.

It's very tempting to start modifying the hardware, but that is perhaps to go one step too far ;o) (Even though it's a very tweak-friendly design).

In practice, when soldering you probably want to be in the range 330-380 °C to heat and wet the joint quickly. Just using a fixed offset for each tip would perhaps be the simplest solution (this is what I personally would prefer). On the other hand, this requires that you have some means to measure the tip temperature.

The other way is to try to model the non-linear behaviour in combination with some sort of calibration, but that might be a tough road to walk - or not. If you get the temperature right to within ±20 degrees, that is probably good enough.

The current calibration does a fairly good job at getting the temperature right at 200 degC.

On the other hand, I have lived with my 'analog' Weller for almost two decades and have learned to set the dial between 350 and 420 degC depending on situation and tip, and I just now found out that the temperature is off by 30 degC.

Just curious, how do you debug this design? Eval board?

Ralim commented 5 years ago

I would not modify the hardware, for two reasons: safety and consistency. Having two firmware builds for drastically different hardware is a pain. The cap (C10) actually provides a means of safety, where a stuck pin state will not cause heating.

The calibration in the unit already does an offset calibration, but as each model of tip has slightly different gain values, some are closer than others. This is all the stock firmware did, which is why it's what I carried forward. However, going to an option to either do a single-point calibration + select tip model OR a two-point calibration would be ideal. This allows a fast cal option for most users, with oddball tips supported by a two-stage cal.
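The difference between the two options is what gets solved for: a single-point cal fixes only the offset (as sketched earlier in the thread), while a two-point cal determines both gain and offset from two (raw reading, known temperature) pairs. A hypothetical helper, not the firmware's actual code:

```python
def two_point_cal(raw1: float, temp1: float,
                  raw2: float, temp2: float) -> tuple[float, float]:
    """Solve temp = gain * raw + offset from two reference points."""
    gain = (temp2 - temp1) / (raw2 - raw1)
    offset = temp1 - gain * raw1
    return gain, offset

# e.g. hypothetical readings at ambient (25 degC) and in boiling water (100 degC):
gain, offset = two_point_cal(500, 25.0, 2000, 100.0)
print(gain, offset)          # 0.05 degC/count, offset 0.0
print(gain * 3500 + offset)  # 175.0 degC predicted at 3500 counts
```

The wider apart the two reference temperatures are, the less measurement noise corrupts the computed gain, which is why a higher second point than 100 °C water would be preferable.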

I measure the tip temperature during the cal by asking the user to use a cool iron, so the handle sensor and tip should be in alignment.

To debug: Solder the SWDIO & SWCLK wires to the small mcu board inside the iron :)

Ralim commented 5 years ago

@JohnEdwa I have made the attached firmware. Its accuracy will most likely be worse than stock, as I'm partway through enabling the second ADC to get a bit more resolution in temperature measurements, and I haven't calibrated this in at all.

But if you load this firmware and enable the detailed soldering screen, the top line will show you the raw tip temp. This will be as noisy as the thermocouple is (quite), so you may need to eyeball the figures a bit, I'm afraid. Generally it's more stable the bigger the tip.

TS100A.zip

Would love a dataset of measured temp vs the raw temp (RTemp) for the different tips, with as much detail as you can be bothered to do. At the least I'll try to push out some updated curves while I try to nail down why my two-stage cal isn't working yet (I think the fact that boiling water isn't that hot doesn't help).

Cherenkov11 commented 5 years ago

I saw a video of the T12 soldering station where calibration is done at three points for each tip. I do not know if it is applicable to the TS100, due to some limitations of the hardware.

3 Point Calibration

JohnEdwa commented 5 years ago

@Ralim I didn't manage to test them all yet, but as we are on exactly the opposite timezones I might as well link the BC2 and C1 I did so far before going to bed. https://docs.google.com/spreadsheets/d/1CJ1jZ5H-96_IjHNsdhNipEQ9NROWWIWTls93KeOXh8Y/edit?usp=sharing

The C1 does feel suspiciously low, though. But it is linear, and it did start melting the solder just before the 200 °C mark like all the others (the 60/40 melting point is 188 °C), so it might be true. If you do add that curve to the firmware and someone configures the iron for a C1 but puts a B2 in it and cranks it to 450 °C, it would heat up to around 560 °C.
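JohnEdwa's 560 °C figure is just the ratio of the two tip gains applied to the setpoint. A sketch with hypothetical gain values chosen to reproduce the example (offsets ignored for simplicity):

```python
def actual_temp_c(set_temp_c: float, configured_gain: float,
                  installed_gain: float) -> float:
    """If the firmware regulates raw counts using the configured tip's
    gain (degC per count) but a different tip is installed, the real
    tip temperature scales by the ratio of the gains."""
    raw_target = set_temp_c / configured_gain  # counts the firmware drives to
    return raw_target * installed_gain         # what the installed tip reaches

# hypothetical gains picked to match the ~450 -> ~560 degC example:
print(round(actual_temp_c(450.0, configured_gain=0.045, installed_gain=0.056)))
```

So a tip-model selection menu is only safe as long as the user actually selects the tip that is installed.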

@Cherenkov11 That requires you to have a way to measure the tip accurately, which not everyone can do, so the "stick the tip into boiling water" approach would be better, if it ends up working well enough. Though, as both boil down (har har) to just being a temperature curve offset, I'd reckon adding a temp-based calibration wouldn't be that hard either.

Repled commented 5 years ago

@Ralim

I would not modify the hardware for two reasons; Safety and consistancy. Having two firmware builds for drastically different hardware is a pain. The cap (C10) actually provides a means of safety, where a stuck pin state will not cause heating.

Yes, as I said, probably one step too far :o) - and yes, I fully understand the reason for C10.

If doing two point, I would personally prefer a well defined solder, though the ideal would be something higher up in the 300-400 degC range.

To debug: Solder the SWDIO & SWCLK wires to the small mcu board inside the iron :)

Simple and effective ! Thank you.

If I get some time when I am back home again, I will see if I can contribute some tip measurements.

Ralim commented 5 years ago

@JohnEdwa I've built a slightly nicer firmware that is much, much easier to read (attached). The hardware will cap out at a maximum of somewhere between 430 and 455 °C depending on the tip curve, so nothing will detonate, but the cal will be as bad as it is now with the wrong tip :P

If you could also record the H: reading for each time (handle temp) it would save me an extra step too! TS100A.zip

Edit: @JohnEdwa With your two sets of measurements for some of the tips, are these two different tips? @Repled Extra readings won't hurt :) I'm going to basically try to find average curve fits for each tip type. This will be faster to push out than support for a dual-point calibration, but I'm slowly poking that along too.

A two-point calibration with as high a temperature separation as possible would be ideal, since it improves accuracy. The dream would be to have two styles of calibration: one where you can use ambient + 100 °C water, and another where it heats up to what it thinks is 300 °C and you can trim its gain until it lines up.

But more time to implement :(

Ralim commented 5 years ago

Hi All,

Pushing up a new build here. It's not a finished build, but it has new gain values from @JohnEdwa's numbers.

Added to the advanced menu is a tip type selection. Note that the "Custom Tip" is not functional yet (needs next set of calibrations). I highly recommend you also do the normal calibration routine on your handle the first time, this makes a large difference in the accuracy. TS100A.zip

A build is coming when I get time to finish the coding for a more advanced calibration option. This will be performed using a two-point reference to derive the gain value for your tip, with two options: one using ambient and hot water, and another using a tip temperature measurement tool.

Repled commented 5 years ago

I have not had time yet to measure the tips, but I have been experimenting with a thermocouple made out of really thin wire, approx. 0.07 mm. This seems to give much better, more accurate results that settle more or less at once. I will hopefully have some time to measure tomorrow.

Ralim commented 5 years ago

@Repled Thank you. If you could use the firmware in my last post, it would be nice to see how closely it tracks for some of the tips as well. :)

Repled commented 5 years ago

It's not easy to measure tip temperatures! The wires in the very fine thermocouple that I found are not really good to solder on; they don't take the solder well, and they got full of solder waste and oxides and more or less stopped working at higher temperatures because of that. A fix, squeezing a very thin piece of nickel sheet metal (1.5 x 1.5 mm) onto the top of the thermocouple, made a huge difference.

Here is a preliminary peek on my findings so far. Will upload the data tomorrow when I hopefully have made some more measurements.

[image: preliminary measurements]

Repled commented 5 years ago

Hi All,

Here are the complete measurements. These values are raw values, so no calibration is applied. In the first graph a linear approximation has been added; this is used in the second graph to remove the linear part and magnify the differences.

Graph 1: [image]

In Graph 2 below, the Y-axis scale is approximately 15 to 16 °C per 1000 units.

Graph 2: [image]

Graph 3: these were the initial measurements with the latest FW 2.04, with only room-temperature calibration. This calibration does a fairly good job of getting it right at around 200 °C. Note that the x- and y-axes are switched in comparison to Graphs 1 and 2.

Graph 3: [image]

Attached are the measurements as an .ods-file (Open Document Format Spreadsheet). Please, let me know if you want some other format.

TS100_Tip_Measurements_2018-08-17.zip

@Ralim No, I have not yet tried the last FW. Please review what has been done here first :O)

Repled commented 5 years ago

Good to see some WIP even if slowly :o)

Ralim commented 5 years ago

It's getting there. The hot water cal appears to be working the few times I have tested it. Haven't implemented and tested the two point hot cal though (advanced cal)

dhiltonp commented 5 years ago

@Repled: Could you run the updated PID firmware and verify the wattage doesn't oscillate with the I tip?

While you're at it, could you double check the performance of the C4 tip - I suspect it'll overshoot then stabilize at the target temp.

My tuning has been with the BC2, you could use that as a reference for 'ideal' performance.

Here's the firmware: https://github.com/Ralim/ts100/issues/275#issuecomment-420197231

If there are irregularities, it'll make sense to add thermal capacity calibration!

Repled commented 5 years ago

Will try to get some time over the weekend doing some measurements.

dhiltonp commented 5 years ago

Excellent!

One thing to play with on that firmware - if you use the advanced display, you can see the watts going into the tip. The temp hits its target pretty fast, then the wattage slowly drops to a base wattage to maintain that temp.

That slow decay must be due to the tip slowly achieving thermal equilibrium. I look forward to your observations!

dhiltonp commented 5 years ago

There is a thermal gradient along the tip at all times. The base dissipates heat slowest, having the smallest surface area/volume, and the tip has the most area/volume. That gradient may not be linear, but it'll be good to have temp data from when the watts have stabilized.

Hitting equilibrium fastest probably requires overshoot, avoiding oscillation is going to be the annoyance...

Repled commented 5 years ago

Below are the latest measurements, with observations. This is starting to look quite good; I could live with these results. The TS-ILS was off by 50 °C at 420 °C before; now it's more like 12 °C, excluding the offset.

Sorry, I missed checking the C4 tip... I might do that later.

[image: measurement results]

TS100_Tip_Measurements_2018-09-15.zip

P.S. I have a really hard time loading the FW. It does not work at all with Win10. It seems to work well with an old Win7 computer, but of the two TS100s, one is working and the other looks like it works but nothing changes - still the same FW afterwards.