IIIM-IS / AERA

Use microseconds for Timestamp period #270

Closed: jefft0 closed this 1 year ago

jefft0 commented 1 year ago

Background: In pull request #71, we changed to distinguish int values for Timestamp from duration in std::chrono::microseconds. This works, but the Timestamp type uses std::chrono::system_clock which does not use microseconds internally. Instead it uses units of the system clock "tick" where each tick period is 100 nanoseconds. This means that each math operation between a Timestamp and a duration in microseconds requires converting between the two different representations. This is done automatically by the compiler, but it is not desirable. There can be loss of accuracy and the Timestamp does not hold the full precision of the 64-bit microseconds value.

Fortunately, the chrono types in the C++ standard library are parameterized, and I noticed that it is easy to control the Timestamp representation. This pull request uses the updated CoreLibrary, in which the Timestamp type is based on the new system_clock_us, whose period is one microsecond (not 100 nanoseconds). We update the code to use this new definition and remove some methods which are now redundant because Timestamp::duration is the same as std::chrono::microseconds, as desired. We also update Utils_MaxTime to be the full 64-bit range of Timestamp::max().
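
For reference, a custom clock with a microsecond period can be written as in the following minimal sketch. This is only an illustration of the technique, assuming system_clock_us wraps std::chrono::system_clock; the actual definition in CoreLibrary may differ:

```cpp
#include <chrono>
#include <type_traits>

// Sketch of a clock whose native duration is std::chrono::microseconds.
// Assumption: CoreLibrary's system_clock_us may be defined differently.
struct system_clock_us {
  using duration = std::chrono::microseconds;
  using rep = duration::rep;
  using period = duration::period;
  using time_point = std::chrono::time_point<system_clock_us>;
  static constexpr bool is_steady = false;

  static time_point now() noexcept {
    // Truncate the system clock's native ticks to whole microseconds.
    return time_point(std::chrono::duration_cast<duration>(
        std::chrono::system_clock::now().time_since_epoch()));
  }
};

using Timestamp = system_clock_us::time_point;

// Timestamp::duration is now exactly std::chrono::microseconds, so
// arithmetic with microsecond durations needs no conversion, and
// Timestamp::max() covers the full 64-bit microseconds range.
static_assert(
    std::is_same<Timestamp::duration, std::chrono::microseconds>::value,
    "Timestamp should count microseconds");
```

With a definition along these lines, a maximum-time constant such as Utils_MaxTime can simply be Timestamp::max(), since the time point's representation is already a 64-bit count of microseconds.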