LeeDoYup / RobustSTL

Unofficial Implementation of RobustSTL: A Robust Seasonal-Trend Decomposition Algorithm for Long Time Series (AAAI 2019)
MIT License

Any idea about this error? #9

Open Math9999 opened 4 years ago

Math9999 commented 4 years ago

Hello.

It would be great if you could help.

C:\XYZ\XYZ\RobustSTL.py:54: RuntimeWarning: invalid value encountered in double_scalars
  season_value = np.sum(weight_sample * weights)/np.sum(weights)
[!] 2 iteration will strat

Intel MKL ERROR: Parameter 7 was incorrect on entry to DGELS.
Traceback (most recent call last):
  File "", line 2, in
  File "", line 16, in main
  File "C:\XYZ\XYZ\RobustSTL.py", line 121, in RobustSTL
    return _RobustSTL(input, season_len, reg1, reg2, K, H, dn1, dn2, ds1, ds2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 97, in _RobustSTL
    trend_extraction(denoise_sample, season_len, reg1, reg2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 36, in trend_extraction
    delta_trends = l1(P,q)
  File "C:\XYZ\XYZ\l1.py", line 41, in l1
    lapack.gels(+P, uls)
ValueError: -7

All the best

A.B.

LeeDoYup commented 4 years ago

Did you check whether the weights contain np.inf or NaN? Another possibility is that the error occurs when np.sum(weights) = 0.
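
A quick sanity check (just a sketch; `weights` here means the array used on line 54 of RobustSTL.py, right before the division shown in the warning above):

import numpy as np

def check_weights(weights):
    # NaN or inf anywhere in the weights will poison the weighted average
    print("any NaN:", np.isnan(weights).any())
    print("any inf:", np.isinf(weights).any())
    # a zero sum makes np.sum(weight_sample * weights) / np.sum(weights) return NaN
    print("sum of weights:", np.sum(weights))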

anirudhasundaresan commented 4 years ago

@LeeDoYup I also get the same error. Yes, you are right, the weights are close to 0.

If there is a large level shift from one time period to the next, i.e. |y_j - y_t| is large, then the bilateral-filter denoising involves products of exponential terms that are close to 0, which drives the weights to zero. The problem is also amplified when the seasonal period is long. Do you have a workaround for dealing with this issue?
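
For reference, a minimal sketch of the underflow (the kernel form follows the bilateral filter described in the paper; the function and parameter names here are illustrative, not the repo's exact code):

import numpy as np

def bilateral_weight(t, j, y, delta1=1.0, delta2=1.0):
    # weight of neighbour j for target index t:
    #   exp(-(t - j)^2 / (2*delta1^2)) * exp(-(y[t] - y[j])^2 / (2*delta2^2))
    time_term = np.exp(-((t - j) ** 2) / (2.0 * delta1 ** 2))
    value_term = np.exp(-((y[t] - y[j]) ** 2) / (2.0 * delta2 ** 2))
    return time_term * value_term

y = np.array([0.0, 0.1, 0.2, 100.0])   # large level shift at the end
print(bilateral_weight(3, 0, y))        # exp(-4.5) * exp(-5000.0) -> underflows to 0.0

If every neighbour's weight underflows like this, np.sum(weights) is exactly 0 and the seasonal average becomes NaN.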

LeeDoYup commented 4 years ago

@anirudhasundaresan I think the problem comes from the algorithm itself... (I am not the author, but I implemented it). At the moment I am busy with another submission (until next week).

I will look into the issue after I finish my work. If you solve the problem before then, please open a pull request!

chuckcoleman commented 4 years ago

I've encountered a similar error, probably for the same reason:

main:54: RuntimeWarning: invalid value encountered in double_scalars
Traceback (most recent call last):
  File "", line 1, in
    RobustSTL(y,12)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 121, in RobustSTL
    return _RobustSTL(input, season_len, reg1, reg2, K, H, dn1, dn2, ds1, ds2)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 97, in _RobustSTL
    trend_extraction(denoise_sample, season_len, reg1, reg2)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/RobustSTL.py", line 36, in trend_extraction
    delta_trends = l1(P,q)
  File "/Users/Common 1/SA/RobustSTL/RobustSTL/l1.py", line 56, in l1
    primalstart={'x': x0, 's': s0}, dualstart={'z': z0})
  File "/Users/Shared/anaconda3/lib/python3.7/site-packages/cvxopt/coneprog.py", line 1033, in conelp
    W = misc.compute_scaling(s, z, lmbda, dims, mnl = 0)
  File "/Users/Shared/anaconda3/lib/python3.7/site-packages/cvxopt/misc.py", line 285, in compute_scaling
    W['d'] = base.sqrt( base.div( s[mnl:mnl+m], z[mnl:mnl+m] ))
ValueError: domain error

I printed the value of s in coneprog.py just before line 1033 and found that it was all NaNs.

salman087 commented 3 years ago

I am getting the same error for most of my time series. Has anyone solved this, or does anyone have an idea? (screenshot attached)

LeeDoYup commented 3 years ago

I think the error comes from the l1 optimizer, which is part of the cvxopt library. I didn't implement l1.py from scratch; it uses cvxopt. Can you debug which value triggers the -7?
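
If it helps, here is a small check to run before the l1() call (a sketch; P and q are the arrays built in trend_extraction(), as in the tracebacks above). LAPACK uses a negative code to flag which argument it rejected, which matches the MKL message "Parameter 7 was incorrect on entry to DGELS"; the discussion above suggests the usual cause here is NaN/inf propagated from the zero-weight division:

import numpy as np

def check_l1_inputs(P, q):
    # print shape and NaN/inf status of each input before calling l1(P, q)
    for name, arr in (("P", np.asarray(P)), ("q", np.asarray(q))):
        print(name, "shape:", arr.shape,
              "NaN:", np.isnan(arr).any(),
              "inf:", np.isinf(arr).any())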

SeungHyunAhn commented 3 years ago

I'm not sure, but I think it happens when sol['x'] is NoneType at return sol['y'][:n] in l1.py, line 57. (screenshot attached)

david-waterworth commented 2 years ago

I'm seeing this as well, and (for me at least) the root cause is that the weights returned by bilateral_filter all go to zero, causing NaNs on the following line due to division by zero:

season_value = np.sum(weight_sample * weights)/np.sum(weights)

Edit: In my case the issue was fixed by tuning the bilateral_filter hyper-parameters, in particular ds2. Since |y_j - y_t| is squared in the exponent, large level differences drive the weights to zero (and hence produce NaNs), so ds2 needs to be large enough to scale this back.
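
A rough way to pick ds2 from the data (just a heuristic sketch, not something from the paper or this repo; it only assumes ds2 is the value-kernel bandwidth discussed above): choose it on the scale of the level differences you actually see between samples one period apart, so the exponential term stays well above underflow.

import numpy as np

def suggest_ds2(y, season_len):
    y = np.asarray(y, dtype=float)
    # typical magnitude of same-phase differences one seasonal period apart
    diffs = np.abs(y[season_len:] - y[:-season_len])
    return max(float(np.std(diffs)), 1e-6)

# e.g. result = RobustSTL(y, 48, ds2=suggest_ds2(y, 48))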

I have energy consumption data with daily and weekly seasonality. If I set T to daily (48 samples), this issue occurs because there is a large difference between the weekday and weekend levels at the same time of day. I think there are other issues too: if I train on a largish dataset in Jupyter, the kernel crashes.