In principle, this is done like this by design. The user inputs an expected Delta T, and the data are checked against that Delta T during validation, flagging entries that do not follow that separation as suspicious (see the checks done in the validation). This was implemented by Pablo and, I presume, it was one of the checks done directly in SQL code. Delta T is not needed for anything else, so if it is not input by the user, I don't think there is any need to calculate it from the data, and it can be removed altogether. It has always looked like a fragile setting to me, anyway. What do you think, @ICHydro?
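For reference, a check of this kind boils down to comparing consecutive timestamps against the expected Delta T. The sketch below is a hypothetical Python illustration of that idea, not the SQL check Pablo implemented; the function and variable names are made up.

```python
# Minimal sketch (hypothetical): flag readings whose gap from the previous
# reading deviates from a user-supplied expected Delta T.
from datetime import datetime, timedelta

def flag_unexpected_gaps(timestamps, expected_delta_t, tolerance=timedelta(seconds=0)):
    """Return indices of readings whose spacing differs from expected_delta_t."""
    suspicious = []
    for i in range(1, len(timestamps)):
        gap = timestamps[i] - timestamps[i - 1]
        if abs(gap - expected_delta_t) > tolerance:
            suspicious.append(i)
    return suspicious

readings = [
    datetime(2024, 1, 1, 0, 0),
    datetime(2024, 1, 1, 0, 5),
    datetime(2024, 1, 1, 0, 12),  # 7-minute gap: flagged
    datetime(2024, 1, 1, 0, 15),  # 3-minute gap: flagged
]
print(flag_unexpected_gaps(readings, timedelta(minutes=5)))  # [2, 3]
```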
Let's remove this for now. I agree that it looks fragile, and in principle the system should be able to deal with irregular time series. If there is a need to do a sanity check on time intervals, there are more sophisticated ways to deal with that. Also, if we want to store a Delta T value, it may be easier to store it directly as a float in the sensor table.
Is your feature request related to a problem or limitation? Please describe.
We should not need the user to create an object "Delta T". This is the time interval of a station's time series and can be calculated from the data. Also, the time interval is not necessarily constant for a (raw) time series; for example, data may be available every 5 minutes for one part of the series and every 10 minutes for another part.
Describe the solution you'd like
The system should be able to calculate Delta T from the time series, and ideally be able to deal with irregular time series (which have a varying Delta T).
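As an illustration of the requested behaviour, here is a minimal Python sketch (not part of the codebase; it assumes pandas is available) that derives Delta T from the timestamps themselves and exposes the interval distribution, so a series whose spacing changes from 5 to 10 minutes can still be characterised:

```python
# Minimal sketch (hypothetical): derive Delta T from the data rather than
# asking the user for it. The dominant spacing between consecutive readings
# serves as the series' Delta T, while the full distribution of spacings
# reveals any change of interval over the course of the series.
import pandas as pd

def infer_delta_t(timestamps: pd.DatetimeIndex) -> pd.Timedelta:
    """Most common spacing between consecutive readings."""
    diffs = pd.Series(timestamps).diff().dropna()
    return diffs.mode().iloc[0]

# Example: 5-minute data that switches to 10-minute data part-way through.
ts = pd.DatetimeIndex(
    list(pd.date_range("2024-01-01 00:00", periods=8, freq="5min"))
    + list(pd.date_range("2024-01-01 00:45", periods=6, freq="10min"))
)
print(infer_delta_t(ts))                    # dominant interval of the series
print(pd.Series(ts).diff().value_counts())  # distribution showing both regimes
```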