26medias / timeseries-analysis

NPM Package - Timeseries analysis, noise removal, stats, ...

coeffs [null,null,null,null,null] #9

Open ujjwalgarg1995 opened 6 years ago

ujjwalgarg1995 commented 6 years ago

Initialized t as:

var t = new timeseries.main(data);
console.log(t);

The result is:

{
  "options": {},
  "data": [
    [ "2017-08-01T00:00:00.000Z", "38.62" ], [ "2017-07-01T00:00:00.000Z", "38.96" ], [ "2017-06-01T00:00:00.000Z", "21.64" ], [ "2017-05-01T00:00:00.000Z", "22.96" ],
    [ "2017-04-01T00:00:00.000Z", "24.60" ], [ "2017-03-01T00:00:00.000Z", "25.87" ], [ "2017-02-01T00:00:00.000Z", "24.02" ], [ "2017-01-01T00:00:00.000Z", "22.47" ],
    [ "2016-12-01T00:00:00.000Z", "20.32" ], [ "2016-11-01T00:00:00.000Z", "18.26" ], [ "2016-10-01T00:00:00.000Z", "28.92" ], [ "2016-09-01T00:00:00.000Z", "28.19" ],
    [ "2016-08-01T00:00:00.000Z", "27.64" ], [ "2016-07-01T00:00:00.000Z", "26.99" ], [ "2016-06-01T00:00:00.000Z", "27.90" ], [ "2016-05-01T00:00:00.000Z", "31.42" ],
    [ "2016-04-01T00:00:00.000Z", "34.21" ], [ "2016-03-01T00:00:00.000Z", "34.73" ], [ "2016-02-01T00:00:00.000Z", "36.54" ], [ "2016-01-01T00:00:00.000Z", "38.06" ],
    [ "2015-12-01T00:00:00.000Z", "40.19" ], [ "2015-11-01T00:00:00.000Z", "38.33" ], [ "2015-10-01T00:00:00.000Z", "38.43" ], [ "2015-09-01T00:00:00.000Z", "36.88" ]
  ],
  "original": [
    [ "2017-08-01T00:00:00.000Z", "38.62" ], [ "2017-07-01T00:00:00.000Z", "38.96" ], [ "2017-06-01T00:00:00.000Z", "21.64" ], [ "2017-05-01T00:00:00.000Z", "22.96" ],
    [ "2017-04-01T00:00:00.000Z", "24.60" ], [ "2017-03-01T00:00:00.000Z", "25.87" ], [ "2017-02-01T00:00:00.000Z", "24.02" ], [ "2017-01-01T00:00:00.000Z", "22.47" ],
    [ "2016-12-01T00:00:00.000Z", "20.32" ], [ "2016-11-01T00:00:00.000Z", "18.26" ], [ "2016-10-01T00:00:00.000Z", "28.92" ], [ "2016-09-01T00:00:00.000Z", "28.19" ],
    [ "2016-08-01T00:00:00.000Z", "27.64" ], [ "2016-07-01T00:00:00.000Z", "26.99" ], [ "2016-06-01T00:00:00.000Z", "27.90" ], [ "2016-05-01T00:00:00.000Z", "31.42" ],
    [ "2016-04-01T00:00:00.000Z", "34.21" ], [ "2016-03-01T00:00:00.000Z", "34.73" ], [ "2016-02-01T00:00:00.000Z", "36.54" ], [ "2016-01-01T00:00:00.000Z", "38.06" ],
    [ "2015-12-01T00:00:00.000Z", "40.19" ], [ "2015-11-01T00:00:00.000Z", "38.33" ], [ "2015-10-01T00:00:00.000Z", "38.43" ], [ "2015-09-01T00:00:00.000Z", "36.88" ]
  ],
  "buffer": [],
  "saved": []
}

Then I declared coeffs as var coeffs = t.ARMaxEntropy({data: t.data}); and also as var coeffs = t.ARMaxEntropy();

In both cases coeffs is logged as [null,null,null,null,null]

Laurix1983 commented 6 years ago

I have the same problem, but I get coeffs as [NaN, NaN, NaN, ...]

Laurix1983 commented 6 years ago

It seems that in my case #2 has the solution: my dataset is not large enough.
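
For reference, a minimal sketch of what that looks like (the five nulls above suggest the default degree is 5, so a very short series leaves the solver without enough points):

const timeseries = require("timeseries-analysis");

// Only three data points: with the default degree the coefficients can come
// back as null/NaN, while a degree below the number of points has a chance.
const shortData = [
    ['2019-01-01T00:00:00.000Z', 2],
    ['2019-02-01T00:00:00.000Z', 4],
    ['2019-03-01T00:00:00.000Z', 5]
];
const shortSeries = new timeseries.main(shortData);

console.log(shortSeries.ARMaxEntropy());              // default degree, larger than the series
console.log(shortSeries.ARMaxEntropy({ degree: 2 })); // degree kept below the number of points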

lauri108 commented 5 years ago

@ujjwalgarg1995 @Laurix1983 you can experiment with a smaller degree, or use ARLeastSquare instead of ARMaxEntropy, which extracts the coefficients with the method of least squares. Code sample as follows:

const timeseries = require("timeseries-analysis");
const data = [
      ['2019-04-15T03:00:00.000Z', 0],
      ['2019-04-16T03:00:00.000Z', 1],
      ['2019-04-17T03:00:00.000Z', 3],
      ['2019-04-18T03:00:00.000Z', 6],
      ['2019-04-19T03:00:00.000Z', 8],
      ['2019-04-20T03:00:00.000Z', 9]
   ];
const currentTimeSeries = new timeseries.main(data);
const coeffs = currentTimeSeries.ARLeastSquare({degree: 3});
console.log('coeffs', coeffs);
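
One more thing that may be worth checking in the original example, though I can't confirm it's the cause of the nulls: the values in the logged data are strings such as "38.62" rather than numbers. A quick, hypothetical sketch of coercing them before constructing the series, just to rule that out:

// `data` here stands for the original [date, value] pairs from the first comment,
// where the values are strings; Number() converts them to numbers first.
const numericData = data.map(([date, value]) => [date, Number(value)]);
const numericSeries = new timeseries.main(numericData);
console.log(numericSeries.ARMaxEntropy({ degree: 3 }));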

I've also found that, to make the forecast meaningful with real-world data, it's better to use a subset of the data close to the value being estimated, as mentioned in #10:

const coeffsslice = currentTimeSeries.ARMaxEntropy({
      data: currentTimeSeries.data.slice(0, 4),
      degree: 3
});
console.log('slicedcoeff', coeffsslice);

Keep in mind that the way the coefficients are used to construct the forecast differs between ARMaxEntropy and ARLeastSquare. You can check the source code for this, around line 801 of timeseries-analysis.js.
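
For completeness, a sketch of how the ARMaxEntropy coefficients turn into a forecast, adapted from the forecasting example in the project README (note the minus sign; ARLeastSquare combines its coefficients differently, which is what the code around line 801 shows):

// Fit the AR coefficients on the first 4 points, then estimate the 5th.
const backpoints = 4;
const forecastCoeffs = currentTimeSeries.ARMaxEntropy({
    data: currentTimeSeries.data.slice(0, backpoints),
    degree: 3
});
let forecast = 0;
for (let i = 0; i < forecastCoeffs.length; i++) {
    forecast -= currentTimeSeries.data[backpoints - 1 - i][1] * forecastCoeffs[i];
}
console.log('forecast', forecast, 'actual', currentTimeSeries.data[backpoints][1]);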