Samraksh / eMote

eMote OS -- Multiple Ports (using .NET MF v4.3)

RealTime timer affected by background threads #253

Open WilliamAtSamraksh opened 9 years ago

WilliamAtSamraksh commented 9 years ago

The RealTime timer shows more jitter when at least one background thread is running. The code and the spreadsheet are in the share:

Dev eMote\GitHub Issues Attachments\2015.03.11 Timer Jitter.zip

This was run under commit 4aceeb7f48f6d27265a190cf9343cd13aa9c6054 with calibration enabled.

Referring to the spreadsheet below: mean error is calculated as the mean of the absolute differences between each sampled interval and the specified tick interval, 100 ms. When at least one background thread is running, mean error, range, and standard deviation all increase substantially compared with the no-background-thread case.

[image: spreadsheet of timer jitter statistics (mean error, range, std) with and without background threads]
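The statistics described above can be reproduced with a short script. This is a sketch with invented interval samples (the real data is in the spreadsheet attachment); the sample values are assumptions for illustration only:

```python
# Sketch: compute the jitter statistics described above for a list of
# measured timer intervals. Sample values are invented for illustration.
TICK_MS = 100.0  # specified tick interval

def jitter_stats(intervals_ms):
    """Return (mean error, range, standard deviation) in milliseconds."""
    n = len(intervals_ms)
    # mean error: average absolute deviation from the specified interval
    mean_error = sum(abs(t - TICK_MS) for t in intervals_ms) / n
    spread = max(intervals_ms) - min(intervals_ms)
    mean = sum(intervals_ms) / n
    std = (sum((t - mean) ** 2 for t in intervals_ms) / n) ** 0.5
    return mean_error, spread, std

# A run with a background thread might show one interval stretched by a
# scheduler time slice, which inflates all three statistics:
print(jitter_stats([100.1, 99.9, 100.3, 119.8, 100.0, 100.2]))
```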

The RealTime timer should not be affected by other threads so this difference is surprising.

CLARIFICATION edit: RT = RealTime timer; DN = DotNow (the standard timer)

ChrisAtSamraksh commented 8 years ago

I looked into this issue and ran the app note that generates the data above.

If any number of make-work threads is running, the C# user timer can be up to 20 ms (plus processing time) late, because of the way MF hands out time slices to threads in 20 ms chunks. With multiple user timers the jitter would be even higher (I believe an additional 20 ms of jitter per timer for the max values).
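The time-slice effect described above can be put into a toy worst-case model. This is not NETMF scheduler code; the 20 ms quantum and the per-timer term come from the comment above, and everything else is an assumption:

```python
QUANTUM_MS = 20  # MF scheduler time slice, per the description above

def worst_case_timer_jitter_ms(num_user_timers, num_background_threads):
    """Rough worst-case lateness of a C# user timer callback, excluding
    processing time. A model for illustration, not measured behavior."""
    if num_background_threads == 0:
        # nothing competes for the CPU, so the timer thread runs promptly
        return 0
    # one full quantum of delay, plus (per the estimate above) roughly one
    # extra quantum for each additional user timer contending for slices
    return QUANTUM_MS * num_user_timers
```

So a single timer with any busy background thread can fire up to 20 ms late, and two timers up to 40 ms late, matching the estimate above.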

The real-time timer doesn't have this time-slice problem. If there are additional threads, there is an additional 30 us of processing-time overhead that we cannot avoid.

The only mitigation would be to make the real-time timer aware of the number of threads and, when there is more than one, compensate by subtracting the 30 us of processing time from the timer interval.
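That compensation could be sketched as follows. The function name, the fixed 30 us constant, and the microsecond units are illustrative assumptions, not actual eMote code:

```python
SCHEDULING_OVERHEAD_US = 30  # extra per-thread processing time (see above)

def compensated_interval_us(requested_us, runnable_threads):
    """Shorten the programmed real-time timer interval to absorb the
    scheduling overhead when more than one thread is runnable."""
    if runnable_threads > 1:
        # clamp at zero so a very short interval cannot go negative
        return max(0, requested_us - SCHEDULING_OVERHEAD_US)
    return requested_us
```

As the next comment points out, baking a measured constant like this into the timer ties correctness to one clock speed and build configuration.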

Nathan-Stohs commented 8 years ago

> The only mitigation would be to make the real-time timer aware of the number of threads and, when there is more than one, compensate by subtracting the 30 us of processing time from the timer interval.

In general, this type of compensation is more trouble than it's worth. Clock speeds change. Compilers change. Release vs. Debug builds change the timing. The resulting code is not maintainable.

Can you elaborate on the 30 us difference?