OpenSimulationInterface / open-simulation-interface

A generic interface for the environmental perception of automated driving functions in virtual scenarios.

Timestamp Definition #433

Open doganulus opened 3 years ago

doganulus commented 3 years ago

As a tool developer, I find it tedious to work with a timestamp definition that has two separate fields (seconds, nanos): it complicates the timing arithmetic that is abundant in real-time systems.
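
To illustrate, here is a minimal C++ sketch of that arithmetic, using a plain struct that mirrors the (seconds, nanos) pair (names are mine, not OSI's):

#include <cstdint>

// Plain struct mirroring a (seconds, nanos) timestamp pair.
struct Timestamp {
  std::int64_t seconds;  // whole seconds
  std::uint32_t nanos;   // fractional part in [0, 999999999]
};

// Adding two timestamps needs an explicit carry on the fractional part.
Timestamp add(Timestamp a, Timestamp b) {
  Timestamp r{a.seconds + b.seconds, a.nanos + b.nanos};
  if (r.nanos >= 1000000000u) {  // normalize: carry one second
    r.nanos -= 1000000000u;
    r.seconds += 1;
  }
  return r;
}

// With a single int64 tick count, the same operation is one addition:
std::int64_t add_us(std::int64_t a, std::int64_t b) { return a + b; }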

In the osi_common.proto file, it is noted that this was done for future compliance with the google/protobuf/timestamp.proto definition, but I am not sure that is still a valid concern. So what are the use cases that require two separate time fields, especially in the automotive simulation domain?

Ideally, I'd prefer a single int64 field with an implicit microsecond** precision (the simplest option), or, as a more general solution, something à la std::chrono::duration (with num and den fields).

** I say microseconds because AUTOSAR uses microseconds, but I wonder whether there exists a use case requiring nanosecond precision in automotive.
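
For reference, std::chrono::duration encodes exactly such a num/den pair at the type level via std::ratio; a small sketch of what I mean:

#include <chrono>
#include <cstdint>
#include <ratio>

// Microseconds as a single int64 tick count with a (1, 1000000) ratio.
using us = std::chrono::duration<std::int64_t, std::ratio<1, 1000000>>;

int main() {
  us t{1500000};                             // 1.5 s
  auto later = t + std::chrono::seconds{2};  // unit conversion is implicit
  return later.count() == 3500000 ? 0 : 1;   // 3.5 s in microsecond ticks
}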

DerBaertige commented 3 years ago

Representing timestamps as integers for the seconds and the fractional part is one of the common methods, besides the string-based ISO 8601 and RFC 3339 formats.

From the viewpoint of harmonization, this representation keeps OSI comparable with third-party standards, implementations, and runtime environments such as POSIX, ROS, and the aforementioned Protobuf timestamp. All of these provide native functionality for working with such or similar structures.

C11 and the POSIX standard define their timing types, struct timespec and struct timeval, as structures of two integers. In addition, the POSIX real-time extensions (POSIX.1b) use this representation, so this kind of arithmetic has been in use in real-time systems and implementations for many years.
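
As a sketch of that interoperability (the struct below is a stand-in for the OSI timestamp message, field names assumed):

#include <cstdint>
#include <ctime>

// Stand-in for the OSI timestamp message.
struct OsiTimestamp {
  std::int64_t seconds;
  std::uint32_t nanos;
};

// The (seconds, nanos) layout maps directly onto POSIX/C11 timespec.
std::timespec to_timespec(const OsiTimestamp& t) {  // std::timespec: C++17
  return {static_cast<std::time_t>(t.seconds), static_cast<long>(t.nanos)};
}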

The case for nanosecond resolution can be made based on the application, for example when recording data from real sensors, where jitter can also be of interest in post-processing. State-of-the-art operating systems and platforms therefore provide timers with resolutions finer than one microsecond. A solution similar to std::chrono::duration can cause additional problems if multiple scenarios with different resolutions have to be compared.
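
For instance, comparing two recordings kept at different resolutions forces a lossy conversion at some point:

#include <chrono>

int main() {
  using namespace std::chrono;
  milliseconds a{1};     // scenario recorded with 1 ms ticks
  microseconds b{1500};  // scenario recorded with 1 us ticks: 1.5 ms
  bool less = a < b;     // comparison promotes to the finer unit: 1000 < 1500
  // Converting the finer stream down to milliseconds truncates:
  auto b_ms = duration_cast<milliseconds>(b);  // 1 ms; the 0.5 ms is lost
  return (less && b_ms == a) ? 0 : 1;
}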

Furthermore, the standard should not only reflect the current state of the art but should also accommodate future application scenarios, where nanoseconds will probably play a bigger role than they do now.

doganulus commented 3 years ago

Thank you for the info and explanations @DerBaertige. I now see the issue as embedded C (time.h) vs. modern C++ (std::chrono), and perhaps this choice depends on the target audience of the standard, which I don't have a say in.

For those who are interested, I was experimenting with a definition like the one below:

syntax = "proto2";

package chrono;

message Ratio {
  // Tick scale in seconds, expressed as num/denom; defaults to microseconds.
  optional int32 num = 1 [default = 1];
  optional int32 denom = 2 [default = 1000000];
}

message Time {
  // Timestamp in seconds = value * ratio.num / ratio.denom.
  optional Ratio ratio = 1;
  optional int64 value = 2;
}

Then I would probably set the ratio in the first message and not change it afterwards (see #439 on not sending unchanged values). Moreover, if I know the system uses regular sampling with a fixed period, I would set the ratio to that period; this eliminates the need to send a full timestamp, since Time.value would simply increase one by one, as sketched below.
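
As a minimal consumer-side sketch (the C++ structs are hypothetical mirrors of the proto messages above):

#include <chrono>
#include <cstdint>

// Hypothetical mirrors of the chrono.Ratio and chrono.Time messages.
struct Ratio { std::int32_t num = 1; std::int32_t denom = 1000000; };
struct Time  { Ratio ratio; std::int64_t value = 0; };

// Wall-clock time: value * num / denom seconds, here widened to nanoseconds.
// (Naive multiplication; a real implementation would guard against overflow.)
std::chrono::nanoseconds to_ns(const Time& t) {
  return std::chrono::nanoseconds{
      t.value * 1000000000LL * t.ratio.num / t.ratio.denom};
}

int main() {
  // Fixed 10 ms simulation period: ratio = 1/100 s, value counts frames.
  Time frame{{1, 100}, 42};                          // frame number 42
  return to_ns(frame).count() == 420000000 ? 0 : 1;  // 420 ms
}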