Closed antoinecarme closed 3 years ago
```
~/dev/python/packages/timeseries/pyaf$ grep mComplexity pyaf/TS/SignalDecomposition*.py
pyaf/TS/SignalDecomposition_AR.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_AR.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = 1;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = 2;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 2;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 3;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 3;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 1;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 1;
```
```
~/dev/python/packages/timeseries/pyaf$ grep mComplexity pyaf/TS/Scikit_Models*.py
self.mComplexity = len(self.mInputNamesAfterSelection)
self.mComplexity = P;
self.mComplexity = 2*P;
self.mComplexity = 2*P;
self.mComplexity = 2*P;
```
Old overall weighting (the fixed weights are arbitrary; the weights should all be 1, with each component defining its own complexity):

```python
def getComplexity(self):
    lComplexity = 32 * self.mTransformation.mComplexity + 16 * self.mTrend.mComplexity + 4 * self.mCycle.mComplexity + 1 * self.mAR.mComplexity;
```
New values
```
~/dev/python/packages/timeseries/pyaf$ grep mComplexity pyaf/TS/SignalDecomposition*.py
pyaf/TS/SignalDecomposition_AR.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_AR.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = len(self.mEncodedValueDict.keys())
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = 0
pyaf/TS/SignalDecomposition_Cycle.py: self.mComplexity = len(lDict.keys())
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = None;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 0;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 2;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = iWindow;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = iWindow;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 1;
pyaf/TS/SignalDecomposition_Trend.py: self.mComplexity = 3;
```
New weighting
```python
def getComplexity(self):
    lComplexity = self.mTransformation.mComplexity + self.mTrend.mComplexity + self.mCycle.mComplexity + self.mAR.mComplexity;
    return lComplexity;
```
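With the new scheme, each component reports its own complexity and the model total is a plain sum with weight 1 everywhere. A minimal standalone sketch of the idea (the `Component` and `Model` classes below are illustrative stand-ins, not the actual PyAF classes):

```python
class Component:
    """Stand-in for any PyAF component (transformation, trend, cycle, AR)."""
    def __init__(self, complexity):
        self.mComplexity = complexity

class Model:
    def __init__(self, transformation, trend, cycle, ar):
        self.mTransformation = transformation
        self.mTrend = trend
        self.mCycle = cycle
        self.mAR = ar

    def getComplexity(self):
        # New additive weighting: every component contributes with weight 1.
        return (self.mTransformation.mComplexity
                + self.mTrend.mComplexity
                + self.mCycle.mComplexity
                + self.mAR.mComplexity)

# e.g. no transformation (0), moving-average trend with window 2,
# a 1-value cycle, and an AR model with 5 selected inputs:
model = Model(Component(0), Component(2), Component(1), Component(5))
print(model.getComplexity())  # 8
```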
PyAF uses a notion of complexity for each component of the model (signal transformation, trend, cycle/seasonal, AR, ...).
The complexity value is used when two models have almost the same performance (MAPE); in that case, the less complex model is kept (Occam's razor).
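This tie-break can be sketched as follows (the `select_model` helper, the candidate tuple layout, and the relative tolerance are assumptions for illustration only; PyAF's actual selection logic is more involved):

```python
def select_model(candidates, rel_tol=0.01):
    """Among candidates whose MAPE is within rel_tol of the best one,
    keep the least complex (Occam's razor).
    Each candidate is a (mape, complexity, name) tuple in this sketch."""
    best_mape = min(mape for mape, _, _ in candidates)
    near_best = [c for c in candidates if c[0] <= best_mape * (1 + rel_tol)]
    return min(near_best, key=lambda c: c[1])

candidates = [
    (0.1000, 12, "XGBoost"),       # best MAPE, but complex
    (0.1005, 4,  "Linear+Cycle"),  # nearly as good, much simpler
    (0.1500, 2,  "ConstantTrend"), # simplest, but clearly worse
]
print(select_model(candidates))  # (0.1005, 4, 'Linear+Cycle')
```

The simpler model wins because its MAPE is within the tolerance of the best one, so performance alone cannot separate them.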
With the introduction of new models (Croston, XGBoost, LightGBM), the notion of complexity has to depend on each component, and the fixed overall weighting is no longer relevant.
This change should not impact the output of PyAF much, as the complexity is not always used in model selection.