Math9999 opened 4 years ago
Did you check whether the weights contain np.inf or NaN? Another possibility for the error is that np.sum(weights) = 0.
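For reference, a minimal check along those lines (check_weights is a hypothetical helper, not part of the repo; it just inspects whatever array bilateral_filter returned):

import numpy as np

def check_weights(weights):
    # Sanity-check the bilateral filter weights before they are used in a division.
    weights = np.asarray(weights, dtype=float)
    if np.any(np.isnan(weights)):
        print("weights contain NaN")
    if np.any(np.isinf(weights)):
        print("weights contain inf")
    if np.sum(weights) == 0:
        print("weights sum to zero; the season-extraction division will produce NaN")

check_weights(np.zeros(10))  # example: fully underflowed weights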
@LeeDoYup I also get the same error. Yes, you are right, the weights are close to 0.
If there is a huge level shift from one time period to the next, i.e. |y_j - y_t| is large, then the denoising through bilateral filters involves computing a product of terms that are close to 0 because of the exponential factors. This drives the weights to zero. The problem is also amplified when the seasonal period is set high. Do you have a workaround for dealing with this issue?
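To illustrate the underflow: the bilateral weight is a product of two exponential factors, roughly exp(-|j-t|^2/(2*delta1^2)) * exp(-|y_j-y_t|^2/(2*delta2^2)) in the paper's notation, and with a large level shift the second factor is already zero in double precision. A rough sketch, not the repo's code, with made-up numbers:

import numpy as np

delta1, delta2 = 1.0, 1.0
offsets = np.arange(-3, 4)                  # j - t over a 7-sample window
level_shift = np.full(offsets.shape, 50.0)  # |y_j - y_t| after a big jump

# exp(-1250) underflows to 0.0 in float64, so every weight collapses to zero
weights = (np.exp(-offsets**2 / (2 * delta1**2))
           * np.exp(-level_shift**2 / (2 * delta2**2)))
print(weights)          # [0. 0. 0. 0. 0. 0. 0.]
print(np.sum(weights))  # 0.0 -> 0/0 = NaN in the season_value line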
@anirudhasundaresan I think the problem is from the algorithm itself... (I am not the author, but I implemented it). These days I am busy with another submission (until next week).
I will look into the issue after I finish that work. If you solve the problem before I start, please make a pull request!
I've encountered a similar error, probably for the same reason:
main:54: RuntimeWarning: invalid value encountered in double_scalars
Traceback (most recent call last):
File "
ValueError: domain error
I printed the value of s in coneprog.py just before line 1033 and found that it was all NaNs.
I am getting the same error for most of my time series. Has anyone solved this, or does anyone have an idea?
I think the error comes from the l1 optimizer, which is part of the cvxopt library. I didn't implement l1.py manually; that part comes from cvxopt. Can you debug which values trigger the -7?
I'm not sure, but I think it happens when sol['x'] is NoneType at the return sol['y'][:n] on line 57 of l1.py.
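For debugging, one way to confirm that NaNs from the denoising step are what reach the solver (assert_finite is a hypothetical helper, not part of the repo or cvxopt):

import numpy as np

def assert_finite(name, arr):
    # Hypothetical debug helper: fail loudly if NaN/inf would be passed on to cvxopt.
    arr = np.asarray(arr, dtype=float)
    bad = np.flatnonzero(~np.isfinite(arr))
    if bad.size:
        raise ValueError("%s has %d non-finite entries (first at index %d)"
                         % (name, bad.size, bad[0]))

# e.g. call assert_finite("denoise_sample", denoise_sample) right before
# trend_extraction, and assert_finite("q", q) right before l1(P, q).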
I'm seeing this as well, and (for me at least) the root cause is that the weights returned by bilateral_filter all go to zero, causing NaNs on the following line due to a divide by zero:
season_value = np.sum(weight_sample * weights)/np.sum(weights)
Edit: The issue in my case was fixed by tuning the bilateral_filter hyper-parameters, in particular ds2. Since the value difference |y_j - y_t| is squared inside the exponential, a large level shift drives the weights to zero and produces NaNs, so ds2 has to be increased to scale this back.
I have energy consumption data with daily and weekly seasonality. If I set T to daily (48 samples) then this issue occurs, since there is a large difference between the weekday and weekend levels at the same time of day. I think there are other issues as well: if I train on a largish dataset in Jupyter, the kernel crashes.
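For reference, here are the two workarounds discussed above as a sketch (safe_season_value is my own guard, not something in the repo; the ds2 comment assumes ds2 plays the role of the value-difference scale delta2 in the exponent):

import numpy as np

def safe_season_value(weight_sample, weights, eps=1e-12):
    # Guarded version of the division quoted above: fall back to a plain
    # mean when the bilateral weights have all underflowed to zero.
    denom = np.sum(weights)
    if denom < eps:
        return np.mean(weight_sample)
    return np.sum(weight_sample * weights) / denom

# The other option is simply increasing ds2 so that
# exp(-|y_j - y_t|**2 / (2 * ds2**2)) does not underflow: with a level shift
# of ~50 units and ds2 = 50 the factor is exp(-0.5) ~= 0.61 instead of 0.

print(safe_season_value(np.array([1.0, 2.0, 3.0]), np.zeros(3)))  # 2.0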
Hello. It would be great if you could help with the following error.
C:\XYZ\XYZ\RobustSTL.py:54: RuntimeWarning: invalid value encountered in double_scalars
  season_value = np.sum(weight_sample * weights)/np.sum(weights)
[!] 2 iteration will strat
Intel MKL ERROR: Parameter 7 was incorrect on entry to DGELS.
Traceback (most recent call last):
  File "", line 2, in
  File "", line 16, in main
  File "C:\XYZ\XYZ\RobustSTL.py", line 121, in RobustSTL
    return _RobustSTL(input, season_len, reg1, reg2, K, H, dn1, dn2, ds1, ds2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 97, in _RobustSTL
    trend_extraction(denoise_sample, season_len, reg1, reg2)
  File "C:\XYZ\XYZ\RobustSTL.py", line 36, in trend_extraction
    delta_trends = l1(P,q)
  File "C:\XYZ\XYZ\l1.py", line 41, in l1
    lapack.gels(+P, uls)
ValueError: -7
All the best
A.B.