Marcnuth / AnomalyDetection

Twitter's Anomaly Detection in Pure Python
Apache License 2.0
304 stars 76 forks

Issues with recent commits (args,kwargs, and more) #9

Open triciascully opened 5 years ago

triciascully commented 5 years ago

Hi there -

When running:

anomaly_detect_vec(data, max_anoms=0.02, period=96, direction="both", plot=True)

I keep getting this:

TypeError: __verbose_if() missing 1 required positional argument: 'kwargs'

I changed the assert validation from

assert isinstance(x) == pd.Series, 'x must be pandas series'

to

assert isinstance(x, pd.Series), 'Data must be a series(Pandas.Series)'

because I kept getting an error that the isinstance() needed 2 arguments, and now I'm getting the __verbose error.
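For reference, the corrected assertion behaves like this (a minimal standalone sketch of the two-argument `isinstance()` call, not the library's actual validation code):

```python
import pandas as pd

def check_input(x):
    # isinstance() takes two arguments: the object and the type (or a
    # tuple of types). Calling isinstance(x) with one argument raises
    # "TypeError: isinstance expected 2 arguments, got 1".
    assert isinstance(x, pd.Series), 'Data must be a series(Pandas.Series)'
    return True

check_input(pd.Series([1.0, 2.0, 3.0]))   # passes

try:
    check_input([1.0, 2.0, 3.0])          # a plain list fails the assert
except AssertionError as err:
    print(err)                            # Data must be a series(Pandas.Series)
```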

I have a feeling this might be a Python 2 vs. Python 3 issue, but I'm not familiar enough with Python to really know, and I think continuing on and trying to debug every new issue that pops up won't work out... any suggestions or easily identifiable fixes you can see?

triciascully commented 5 years ago

So I went into the function code and added `*` in front of args and `**` in front of kwargs, which fixed it (I saw that you took those `*`s away earlier this year in an update, @hokiegeek2?), but now I've got new issues that I can't debug. I'm wondering if it's a version mismatch between my environment and your code, or if there's a bug or something in both the anomaly_detect_ts() and anomaly_detect_vec() code.
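The signature bug described above can be illustrated with a standalone sketch (the real helper in the library is named `__verbose_if`; these two functions just mirror the problem and the fix):

```python
def verbose_if_broken(condition, msg, kwargs):
    # Without the ** prefix, 'kwargs' is an ordinary positional parameter,
    # so any call that does not pass it explicitly raises
    # "TypeError: ... missing 1 required positional argument: 'kwargs'".
    if condition and kwargs.get('verbose'):
        print(msg)

def verbose_if_fixed(condition, msg, **kwargs):
    # With **, extra keyword arguments are collected into a dict
    # automatically, and callers may omit them entirely.
    if condition and kwargs.get('verbose'):
        print(msg)

verbose_if_fixed(True, 'anomaly detected', verbose=True)  # prints the message
verbose_if_fixed(True, 'anomaly detected')                # silently does nothing
```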

For more context, this is what my data looks like:

test.head()

Out[241]:
timeblock
2018-08-13 00:00:00    10.607767
2018-08-13 00:15:00    11.221798
2018-08-13 00:30:00    11.112807
2018-08-13 00:45:00    10.220628
2018-08-13 01:00:00    12.900123
dtype: float64

There are 2783 records at 15-minute intervals (the example data is minute-level, so it's a little different, and I manually changed the hard-coded minimum period setting from 1440 to 96 in the ts code).
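The 1440 vs. 96 distinction follows directly from the sampling interval: `period` counts observations per seasonal cycle, so one-day seasonality means 1440 points for minute-level data and 96 for 15-minute data. A quick sanity check (`points_per_day` is an illustrative helper, not part of the library):

```python
def points_per_day(interval_minutes):
    # Number of observations per day at a fixed sampling interval --
    # the value to use for 'period' when the seasonality is daily.
    minutes_per_day = 24 * 60
    return minutes_per_day // interval_minutes

print(points_per_day(1))    # 1440, the repo's minute-level example data
print(points_per_day(15))   # 96, the 15-minute data described above
```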

When I run it, it only spits out 3 records from the 2783 that it would call "outliers":


anomaly_detect_ts(test, max_anoms=0.04, direction="both")

{'anoms': 2018-08-15 11:45:00    31.482831
          2018-08-17 13:30:00    25.210884
          2018-08-15 09:15:00     4.685535
          dtype: float64,
 'expected': None,
 'plot': None}

I honestly believe it should be picking up a lot more than 3 outliers in this full dataset, based on some plots. Also, note that I didn't use plot=TRUE in the call -- it throws the error "NameError: name 'TRUE' is not defined". I tried different ways of writing true ("TRUE", "True", True), and plot=True worked, but then I got a different error: "UnboundLocalError: local variable 'num_days_per_line' referenced before assignment".
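The NameError itself is straightforward: Python's boolean literal is spelled exactly `True`, so `TRUE` and `true` are just undefined names. A minimal reproduction (using `eval` only to simulate typing the bad literal):

```python
flag = True                  # valid: the capitalized literal

try:
    flag = eval("TRUE")      # simulates writing plot=TRUE in a call
except NameError as err:
    print(err)               # name 'TRUE' is not defined
```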

I was going into a rabbit hole of errors at that point, so I decided to back out and just work with the output for the time being.

When I take that same Series and run it through the anomaly_detection_vec() function, I get the following error:

anomaly_detect_vec(test, period=96)

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 anomaly_detect_vec(test, period=96)

in anomaly_detect_vec(x, max_anoms, direction, alpha, period, only_last, threshold, e_value, longterm_period, plot, y_log, xlabel, ylabel, title, verbose)
    174     data_decomp = tmp['stl']
    175
--> 176     anoms = ts.loc[s_h_esd_timestamps]
    177     if threshold:
    178         end = longterm_period - 1 if longterm_period else x.size - 1

~/anaconda3/lib/python3.6/site-packages/pandas/core/indexing.py in __getitem__(self, key)
   1476
   1477         maybe_callable = com._apply_if_callable(key, self.obj)
-> 1478         return self._getitem_axis(maybe_callable, axis=axis)

~/anaconda3/lib/python3.6/site-packages/pandas/core/indexing.py in _getitem_axis(self, key, axis)
   1899             raise ValueError('Cannot index with multidimensional key')
   1900
-> 1901         return self._getitem_iterable(key, axis=axis)

~/anaconda3/lib/python3.6/site-packages/pandas/core/indexing.py in _getitem_iterable(self, key, axis)
   1141         if labels.is_unique and Index(keyarr).is_unique:
   1142             indexer = ax.get_indexer_for(key)
-> 1143             self._validate_read_indexer(key, indexer, axis)

~/anaconda3/lib/python3.6/site-packages/pandas/core/indexing.py in _validate_read_indexer(self, key, indexer, axis)
   1204             raise KeyError(
   1205                 u"None of [{key}] are in the [{axis}]".format(
-> 1206                     key=key, axis=self.obj._get_axis_name(axis)))

KeyError: 'None of [2018-08-15 11:45:00    17.587259\n2018-08-17 13:30:00    11.658043\ndtype: float64] are in the [index]'

So I tried the following:

print(test.loc['2018-08-15 11:45:00'])
print(test.loc['2018-08-17 13:30:00'])

Out[254]:
31.482830599160348
25.21088365335222

These DateTime values ARE in the index, but they're paired with different values than what the error is showing, and I'm very perplexed by this. Also, the two values in this error are two of the three anomalies that were detected in the TS algo.
I've tried reindexing the Series, recreating the dataframe then recreating the Series from the dataframe a few different ways, then setting the index as the DateTime column a few different ways, but nothing seems to fix the error that's being thrown. Any ideas?
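One plausible reading of the traceback (a guess at the mechanism with made-up values; `anoms` below stands in for the internal `s_h_esd_timestamps` Series): `.loc` is being handed a Series of decomposed anomaly values keyed by timestamp, and pandas treats the Series *values*, not its index, as the lookup labels. That would explain why the error repr shows the right timestamps paired with different values:

```python
import pandas as pd

idx = pd.to_datetime(['2018-08-15 11:45:00', '2018-08-17 13:30:00'])
ts = pd.Series([31.482831, 25.210884], index=idx)     # the raw data

# Hypothetical detector output: decomposed residuals keyed by timestamp,
# with values that differ from ts -- as in the KeyError above.
anoms = pd.Series([17.587259, 11.658043], index=idx)

# Indexing with the Series itself uses its float *values* as labels,
# none of which exist in ts's DatetimeIndex:
try:
    ts.loc[anoms]
except Exception as err:        # KeyError in the pandas version from the traceback
    print(type(err).__name__)

# Indexing with the Series' *index* looks up the timestamps and works:
print(ts.loc[anoms.index])
```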
hokiegeek2 commented 5 years ago

@triciascully I am working on these and a couple of other issues I encountered when working with anomaly_detect_vec. I plan to get these wrapped up and a patch submitted soon.

hokiegeek2 commented 5 years ago

@triciascully @Marcnuth Okay, fixed the following:

  1. the index issue @triciascully reported and was unsure how to fix
  2. the *args / **kwargs signatures (sorry about the latter, I introduced that bug)
  3. import statements that prevented running this code as an installed package (as opposed to running the scripts directly)
  4. added a return of anoms so anomaly_detect_vec actually returns the detected anomalies

Just submitted pull request

hokiegeek2 commented 5 years ago

@triciascully this is implemented and tested in the master branch of my fork

hokiegeek2 commented 5 years ago

merged, thanks @Marcnuth

hokiegeek2 commented 5 years ago

@triciascully please close this, thanks