You are correct, smooth functions only report the holdout sample performance of models, because the in-sample one is typically not useful. Another thing is that the accuracy() method is not supported by smooth functions and I don't plan on implementing it. But there is a measures() function from greybox that you can use to get several error measures for the provided holdout, forecasts and actuals. You can substitute the holdout and forecasts with the respective in-sample actuals and fitted values to get a set of in-sample error measures.
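For illustration, a minimal sketch of that substitution, assuming measures() takes the holdout, forecasts and actuals in that order (as described above) and using AirPassengers as a placeholder series:

```r
library(smooth)
library(greybox)

y   <- AirPassengers                # placeholder series for illustration
fit <- auto.msarima(y, h = 12)      # fit on the full series, no holdout

# In-sample ("training set") error measures: pass the fitted values in
# place of the forecasts and the in-sample actuals in place of the holdout.
measures(y, fitted(fit), y)
```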
Thanks for the tip. Well, now I'm a bit confused by the arguments of the measures() function: I thought the holdout and actual inputs were the same in my attempt? However, in the accuracy given by msarima (I assume the accuracy is calculated through measures()), those two should be different: the holdout is the actual values of the test set, while the actual is the training data, used for the relative error calculations? So far so good?
That really depends on what you want to measure. If you want the in-sample performance, then your actuals will be the same as the holdout. If you are interested in the holdout performance, they will differ.
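To make the distinction concrete, a rough sketch with placeholder data, again assuming the holdout, forecasts and actuals argument order:

```r
library(smooth)
library(greybox)

y      <- AirPassengers                   # placeholder series
h      <- 12
yTrain <- window(y, end = c(1959, 12))    # training part
yTest  <- window(y, start = c(1960, 1))   # 12-month holdout

fit <- auto.msarima(yTrain, h = h)
fc  <- forecast(fit, h = h)

# Holdout performance: holdout and actuals differ
# (the actuals here are the training data, used e.g. for scaled measures).
measures(yTest, fc$mean, yTrain)

# In-sample performance: holdout and actuals are the same series.
measures(yTrain, fitted(fit), yTrain)
```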
Perfectly solved. Thanks.
This is not a bug report (I've done some searching on Google and StackExchange but had no luck). I am using auto.msarima in the smooth package, but I find that the accuracy the function returns focuses on the holdout sample, i.e. the forecast / testing set, in my view. I'd also like to know the accuracy on the training set. There is an s2 value (the variance of the residuals). When I apply forecast and forecast::accuracy, like I usually do with arima or tbats, I get the error: Error in NextMethod(.Generic) : cannot assign 'tsp' to zero-length vector. Is there anything more I can do to get the accuracy of the training set, or have I just missed the point and misunderstood?
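For reference, roughly what the failing attempt described above looks like (AirPassengers stands in for the real data):

```r
library(smooth)
library(forecast)

y   <- AirPassengers                 # placeholder series
fit <- auto.msarima(y, h = 12, holdout = TRUE)
fc  <- forecast(fit, h = 12)

# The usual forecast::accuracy() call on the forecast object fails here,
# since accuracy() has no method for smooth objects:
# accuracy(fc)
# Error in NextMethod(.Generic) : cannot assign 'tsp' to zero-length vector
```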