Description
When using the program SimulateOrbit to simulate an orbit over a longer period (e.g., 31 days), I see an increasing time lag between the epochs of the integrated orbit and the input time series (in my case a simple uniform time series with 5 s sampling). The last epoch in the orbit file (after 31 days) has the value (MJD) 52306.000000000262559752, which is about 2.3 µs off from the last epoch of the uniform time series (= 52306). For a LEO satellite, 2.3 µs already corresponds to a displacement of about 18 cm. Numerically, I cannot explain this: with dynamic step sizes the integrator should interpolate the result onto the requested epochs anyway, so to my understanding the result should match the input epochs exactly (up to double-precision accuracy).
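For what it's worth, the offset is on the order of a few ULPs of an MJD value near 52306 (one ULP of 52306 in double precision is about 0.6 µs), so a plausible mechanism, purely a guess on my side, is that the epochs are generated by repeatedly adding the 5 s step (5/86400 MJD, which has no exact binary representation) rather than computing `start + i * step`. A small Python sketch of that suspected accumulation (not GROOPS code; the start epoch 52275 is hypothetical, chosen 31 days before 52306):

```python
from fractions import Fraction

start = 52275.0          # hypothetical MJD of the first epoch, 31 days before 52306
dt    = 5.0 / 86400.0    # 5 s sampling in days; not exactly representable in binary
n     = 31 * 86400 // 5  # number of 5 s steps in 31 days (535680)

# Generate the last epoch by repeated addition: each step adds one rounding error.
t = start
for _ in range(n):
    t += dt

exact = start + 31.0     # 52306.0 is exactly representable
drift_us = (t - exact) * 86400.0 * 1e6
print(f"accumulated epoch: {t:.18f}")
print(f"drift: {drift_us:.3f} µs")

# 5/86400 itself already carries a rounding error as a double.
assert Fraction(5, 86400) != Fraction(dt)
```

Computing each epoch as `start + i * dt` instead would keep the error bounded by a single rounding, independent of the series length. Again, I do not know whether this is what happens internally, but the magnitude of the observed offset would be consistent with it.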
GROOPS version
main (latest commit)
Operating systems
Log output
No response