Reason: I had calibration times in the range of 1-20 ns with task step times of 1 ms. As a result, the computed runtimes were far off.
Changed the variables to long and the calibration to store the 1000-loop average instead of the single-loop average. The integer division truncates and causes an error of up to one unit in the last digit; for a 20 ns calibration result this is a 5% error, and even more for smaller values.
Changed the configuration parser to also detect doubles, and to store both int and double JSON values as the calibration value in picoseconds, again as a long (resolution increased by a factor of 1000).