Tig3rch3n closed this issue 7 months ago.
Let me share with you my understanding:
Geiger-Müller tubes are, obviously, made of matter. With the currently available technology, it is not possible to make a tube without atoms subject to radioactive decay. Thus the tube itself will produce radiation. And this intrinsic radiation will depend on the tube type, tube size, and even the production batch.
Therefore you should not take your measurements as the true, actual background level. It is even expected that two Geiger counters disagree on that.
For a proper comparison you should use a radioactive source and similar measurement conditions.
Also, the min-max values are not useful for comparison. Always use the average value, and take the confidence interval into consideration.
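To illustrate, here is a minimal sketch of how such an average and confidence interval can be computed, assuming simple Poisson counting statistics (the function name and the numbers are made up for the example, not taken from Rad Pro):

```python
import math

def average_with_ci(total_counts, minutes, z=1.96):
    """Mean count rate (cpm) with an approximate 95% confidence
    interval, assuming Poisson statistics: for N recorded counts,
    the standard deviation is sqrt(N)."""
    cpm = total_counts / minutes
    sigma = math.sqrt(total_counts) / minutes  # propagate sqrt(N) to the rate
    return cpm, (cpm - z * sigma, cpm + z * sigma)

# Hypothetical example: 250 counts recorded over 10 minutes
mean, (lo, hi) = average_with_ci(250, 10)
print(f"{mean:.1f} cpm, 95% CI [{lo:.1f}, {hi:.1f}] cpm")
```

This is also why averaging as long as possible helps: the relative uncertainty shrinks as 1/sqrt(N), so averaging four times as long halves it.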
Nevertheless, there is a discrepancy between both firmwares. Can you make average measurements on both of your counters, with both the original firmware and Rad Pro? Try to average as long as possible.
Hey ho, you were absolutely right: with the Factory 2 setting and the J321/J613 tube settings, the average values are more or less identical per device. There is still a difference of 2x-5x between the two devices.
So my issue is closed on my side.
Do your devices have WCH or Geehy microcontrollers?
It seems that the electronics boards of devices with a WCH and a Geehy MCU are different.
A user with a WCH tried "Factory default 2" (in beta22, "Factory default (Geehy)") and reported that the voltage at the tube was too low.
So, if you happen to have a GC-01 with a Geehy, it could be that "Factory default 1" (in beta22, "Factory default (WCH)") produces too high a voltage. That could explain the increased count. Is this what you are seeing?
Both are Geehy, but one has the J613 tube and the other has the J321 tube in it... so it had a higher detection rate with the original firmware anyway. The question is whether there is an option for offsetting the counting rate.
In the "Conversion factor" menu you can set a custom conversion factor for converting cpm (counts per minutes) to µSv/h!
The important thing is that the "Factory default 2" HV profile led to the same readings in the original firmware and Rad Pro. Right?
Excellent.
Hey ho, my main issue is that I have two different GC-01s (one with the J321 and one with the J613 tube), and the readings of the two devices differ a lot. With the device with the tiny tube (J613) and the original 1.62 firmware I had a reading of MIN-MAX 0.03-0.10 µSv/h, while with the J321 on the original FW I got MIN-MAX 0.06-0.21 µSv/h over 10 min. With the Rad Pro FW I get MIN-MAX 0.022-0.130 µSv/h, avg 0.056 µSv/h, while the J321 with the original FW still sits at MIN-MAX 0.10-0.26 µSv/h, avg 0.12 µSv/h.
All of the testing with the J613 was done with the HV profile set to Factory 2. I have seen there are a bunch of different conversion factor presets. I tried all of the 'J' ones (J305 through J614); the J613 preset is indeed the closest to the J321 values, but still misses by roughly 2x-4x.
Any recommendation on how to deal with the 'missing' counts?
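One possible approach, building on the custom conversion factor mentioned above: rescale a preset factor so the J613 device's average dose rate matches the J321 device's average under identical conditions. A sketch with hypothetical numbers (the 60.0 cpm per µSv/h preset is a placeholder, not the actual J613 value):

```python
def calibrated_factor(preset_factor, avg_this, avg_reference):
    """Rescale a conversion factor (cpm per µSv/h) so that this
    device's average dose rate matches a reference device's average,
    measured under the same conditions.  Because dose = cpm / factor,
    using factor * (avg_this / avg_reference) multiplies the displayed
    dose rate by (avg_reference / avg_this)."""
    return preset_factor * (avg_this / avg_reference)

# Averages from the thread: J613 device 0.056 µSv/h, J321 device 0.12 µSv/h.
print(calibrated_factor(60.0, 0.056, 0.12))  # => 28.0
```

As noted earlier in the thread, such a cross-calibration is more reliable with a radioactive check source than with background readings, since each tube's intrinsic background differs.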