Closed joaobrittoneto closed 7 years ago
One thing about the slam components that has been bothering me for a while is the lack of consistency in the samples' timestamps.
It provides samples with a zero timestamp (at the beginning of the stream) and with repeated timestamps. I am not sure if the component was supposed to work like that, but IMHO a stream should be a function of time (one timestamp per sample). The test shows both cases.
No, you are right, this behavior should be fixed. Thanks for the unit tests. I'll take care of it.
Please check if rock-slam/slam-orogen-uwv_kalman_filters#4 solves the issue.
You probably have to increase the timeouts of your calls to assert_has_one_new_sample, or reduce the transformer_max_latency in the configuration.
Is there a minimum number of samples required by the filter? It looks like just increasing the timeout or reducing the latency doesn't work.
No, but the stream aligner might wait for older samples or drop samples. Check that this isn't the case.
The unit test was working on my machine when setting the transformer_max_latency to zero (telling the stream aligner not to wait).
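For reference, a hedged sketch of what that setting might look like in an orogen configuration file (the file layout and section name are assumptions; transformer_max_latency is the property named above):

```yaml
--- name:default
# Tell the stream aligner not to wait for potentially older samples.
transformer_max_latency: 0.0
```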
It works. I include the base_time, but if different input samples have the same timestamp, it will be propagated to the output, producing a stream with repeated timestamps. Do you think it is worth handling this case? Or does the unlikelihood of this happening justify ignoring it?
but if different input samples have the same timestamp, it will be propagated to the output, producing a stream with repeated timestamps. Do you think it is worth handling this case? Or does the unlikelihood of this happening justify ignoring it?
I think that is a valid behavior.
:+1:
I include the base_time, but if different input samples have the same timestamp, it will be propagated to the output, producing a stream with repeated timestamps.
From the same stream, or from different streams ?
I tested with different streams. I'd say it may also happen for the same stream. Do those cases require tests?
From the point of view of the estimator, I agree with Sascha, it probably makes sense (you basically get two estimates for the same time). @saarnold are you sure that the filter would like that, however?
It is difficult for everyone else, though. In practice, the only time it really could happen is if two of the inputs come from the same source.
@saarnold are you sure that the filter would like that, however?
The prediction step will not be executed if the time delta is smaller than a certain threshold (default 1.0e-9), in order to avoid numeric issues.
It is difficult for everyone else, though. In practice, the only time it really could happen is if two of the inputs come from the same source.
Currently the task will provide a new sample every time at least one new measurement is integrated (and therefore the state of the filter has changed). If it happens to fall within the same microsecond, the timestamps will of course be the same. However, two measurements from the same source having the same timestamp should be prevented; the input must be constant in time.
However, two measurements from the same source having the same timestamp should be prevented; the input must be constant in time.
I did not express myself properly. I was thinking of a device that would provide two different measurements (kind of a composite device, or e.g. another filter that would provide both IMU and depth information). These measurements could be synchronized in time, but would be seen by the filter as two separate data streams.
How did you determine the 1 microsecond threshold for numerical stability?
How did you determine the 1 microsecond threshold for numerical stability?
There is actually no matrix inversion involved in the prediction step, so there shouldn't be numerical issues. Not executing the prediction if the delta is smaller than 1 nanosecond is therefore only a performance decision.
Something that bothers me in the logic ...
However two measurements from the same source having the same timestamp should be prevented, the input must be constant in time.
So you're actually relying on the assumption that the inputs should be constant in time ... which is a common assumption. But you don't hold the estimators to the same standard. Does that really make sense?
I'm not sure I follow. Since our timestamps have microsecond resolution, only zero steps will be skipped right now.
I interpret the last part of:
Currently the task will provide a new sample every time at least one new measurement is integrated (and therefore the state of the filter has changed). If it happens to fall within the same microsecond, the timestamps will of course be the same.
as "if two input streams have the same timestamp, the output stream will have two estimates with the same timestamp". However, you yourself see having two inputs with the exact same timestamp as a problem.
The question is whether we should enforce "never two samples with the same timestamp on any given stream".
If we consider using the output as input to another filter, it would make sense to enforce it, yes. But more importantly, the task should run periodically instead of port-driven. Maybe it is time to change the default setting as well. However, this will only guarantee that the output rate is bounded by the periodicity.
Check for zero and repeated timestamps