twitter / AnomalyDetection

Anomaly Detection with R
GNU General Public License v3.0

Period not set if granularity is "sec" #66

Open SFerrazLeite opened 8 years ago

SFerrazLeite commented 8 years ago

Hi there,

I'm trying to run an analysis on a time series that has a granularity of 20 seconds. Unfortunately, there is a bug in the code...

In AnomalyDetectionTs, I run into the following problem: I have tested the get_gran function, and it does return "sec". Later on, the data is aggregated to minutes, but unfortunately the variable gran is never updated to "min". So when period is assigned, the switch statement has no case for "sec" - which is still the value of gran - and period remains NULL. This makes the subsequent call to detect_anoms crash with the error message "must supply period length for time series decomposition".

  # Aggregate data to minutely if secondly
  if(gran == "sec"){
    x <- format_timestamp(aggregate(x[2], format(x[1], "%Y-%m-%d %H:%M:00"), eval(parse(text="sum"))))
  }

  period = switch(gran,
                  min = 1440,
                  hr = 24,
                  # if the data is daily, then we need to bump the period to weekly to get multiple examples
                  day = 7)
  num_obs <- length(x[[2]])

  if(max_anoms < 1/num_obs){
    max_anoms <- 1/num_obs
  }

I'm pretty sure that setting gran <- "min" inside the if(gran == "sec") block would fix the problem.
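For clarity, here is a sketch of what that fix would look like applied to the fragment quoted above (the surrounding code is unchanged from AnomalyDetectionTs; only the gran <- "min" assignment is new):

```r
# Aggregate data to minutely if secondly
if(gran == "sec"){
  x <- format_timestamp(aggregate(x[2], format(x[1], "%Y-%m-%d %H:%M:00"), eval(parse(text="sum"))))
  # Proposed fix: the data is now minutely, so update gran to match.
  # With this, the switch below assigns period = 1440 instead of
  # falling through every case and leaving period NULL.
  gran <- "min"
}

period = switch(gran,
                min = 1440,
                hr = 24,
                # if the data is daily, then we need to bump the period to weekly to get multiple examples
                day = 7)
```

This relies on the behavior of R's switch: when no case matches and no default is given, it returns NULL invisibly, which is exactly why period ends up unset for "sec" data today.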