I agree that using N for the time-stamp is not accurate, but from reading man
rrdtutorial my understanding is: when a measurement does not fall on an exact
interval, "RRDtool therefore interpolates the data, so they are stored on exact
intervals."
By rounding to a multiple of 300 you introduce an error and make it impossible
for that interpolation to correct it.
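To illustrate what I mean, a throwaway example (file name, DS and values are
made up, not from our scripts): with a 300 s step RRD, an update whose
time-stamp is not a multiple of 300 is still consolidated onto the exact 300 s
boundaries by RRDtool itself, so there is no need to round on our side.

    # 300 s step RRD starting shortly before the sample
    rrdtool create demo.rrd --start 1334222000 --step 300 \
        DS:temp:GAUGE:600:U:U \
        RRA:AVERAGE:0.5:1:24
    # update at a time-stamp that is NOT a multiple of 300
    rrdtool update demo.rrd 1334222555:21.5
    # fetch shows the data stored on exact 300 s intervals
    rrdtool fetch demo.rrd AVERAGE --start 1334222000 --end 1334223000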
I think a more accurate approach is to take the time-stamp from the downloaded
file: the content of the downloaded data and the time-stamp set on the file by
the download utility are very close together.
Last but not least, the most accurate time-stamp IMHO is the one inside the XML
file, tagged with <lsup>. It is the time-stamp of when the data was recorded,
whereas what our scripts record is the time-stamp of when we query the XML
file.
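For what it's worth, a minimal sketch of how that value could be read. This is
a guess on my side, not tested against the real feed: the sed expression and
the assumption that GNU "date -d" can parse the <lsup> string are both
assumptions.

    # option #4 sketch: pull <lsup> out of the downloaded XML and convert it
    # to seconds since the epoch
    get_time_stamp_from_XML() {
        local lsup
        lsup=$(sed -n 's|.*<lsup>\(.*\)</lsup>.*|\1|p' "${ZIP}.xml")
        date -d "${lsup}" +%s
    }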
BTW: for my location the downloaded XML file changes in a 20 min interval;
weather.com says it will change in a 25 min interval.
Summary:
- Putting the time-stamp in a shell variable and reusing it for every
  "rrdtool update" makes sense, because the data was also sampled only once.
  (The issue shows up on slow or heavily loaded machines: 'N' could then be
  different on each "rrdtool update", which is definitely wrong.)
- More difficult is deciding what the right time-stamp is:
  * sample the time-stamp on every call to rrdtool update
      rrdtool update ... N: ...          # 1. current solution
  * sample the time-stamp only once
      TS=$(date +%s)                     # 2. use local time
      TS=$(stat -c "%Z" "${ZIP}.xml")    # 3. use time when the XML file was downloaded
      TS=$(get_time_stamp_from_XML)      # 4. use time when the data was captured
      rrdtool update ... $TS: ...        # pass the same time to ALL "rrdtool update" calls
My personal choice for now is #3 (easy to implement and low impact), but #4 is
the most accurate solution. I have no clue about its side effects, though,
because the RRDs are defined with a 300 s step while that time-stamp seems to
be updated less frequently. A sketch of #3 follows below.
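To make #3 concrete, a minimal sketch (the RRD file names and the
$TEMP/$HUM/$PRESS variables are placeholders; only "${ZIP}.xml" and the stat
call come from the list above):

    # sample the time-stamp once, from the downloaded file itself, and reuse
    # it for every "rrdtool update" so all RRDs get the same consistent time
    TS=$(stat -c "%Z" "${ZIP}.xml")      # time the XML file was downloaded
    rrdtool update temperature.rrd "${TS}:${TEMP}"
    rrdtool update humidity.rrd    "${TS}:${HUM}"
    rrdtool update pressure.rrd    "${TS}:${PRESS}"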
best regards heinz
Original comment by heinz.bl...@gmail.com
on 12 Apr 2012 at 1:35
Original issue reported on code.google.com by
sebastien.wains
on 28 Jul 2009 at 10:43