Closed aws-haoyuli closed 6 years ago
How big is your dataset?
Can you try minimal settings on your dataset?
My dataset contains 80 instances, and every instance contains 2001 rows, so about 160,000 rows in total. On Linux I tried a few thousand rows and got correct results, but with 20, 40, or 80 instances I never get any results. Could the RAM be too small? I only allocated 2 GB. On Windows I never get any results on a dataset of any size.
None of the developers has access to a Windows machine, so we can only give rudimentary support for tsfresh on Windows.
Do you have any logs or error messages that pop up?
As mentioned, please try running your feature extraction with MinimalFCParameters or EfficientFCParameters instead of ComprehensiveFCParameters. The comprehensive set contains some features that may not converge, or that take a very long time to calculate, for certain types of time series.
I just wanted to say that I was having the same issue on a windows machine within an Anaconda environment, and what solved the issue for me was uninstalling tsfresh using pip and installing with
conda install -c conda-forge tsfresh
Thanks, this worked for me too. Side note: there is no need to uninstall with pip first; conda simply overwrites the previous installation.
I ran this code with Python 3.6 on Windows 10 and it always hangs, whichever dataset I use. After a long time, more than 10 hours, it raises a MemoryError. With Python 2.7 on Ubuntu 16.04 LTS it finishes correctly on the robot_execution_failures dataset, but it gets stuck at 80% on my own dataset.
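One Windows-specific pitfall worth ruling out: tsfresh parallelizes extraction with a process pool, and on Windows the `spawn` start method re-imports your script in every worker, so any multiprocessing call must sit under an `if __name__ == "__main__":` guard or the run can hang. A minimal stdlib sketch of the pattern (`slow_feature` is a hypothetical stand-in for a per-series feature calculation, not a tsfresh function):

```python
# Sketch of the __main__ guard Windows requires for any
# multiprocessing-based work, such as tsfresh's parallel extraction.
from multiprocessing import Pool

def slow_feature(x):
    # Placeholder for a real per-series feature computation.
    return x * x

def main():
    # On Windows, workers are spawned by re-importing this module;
    # without the guard below, top-level code would re-run in every
    # worker and the pool could hang or crash.
    with Pool(2) as pool:
        return pool.map(slow_feature, range(5))

if __name__ == "__main__":
    print(main())  # prints [0, 1, 4, 9, 16]
```

The same structure applies to a script that calls `extract_features`: keep the call inside the guarded block (or pass `n_jobs=0` to avoid multiprocessing entirely).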
For further questions, you can also use our Gitter chatroom.