Closed camontanezp closed 6 years ago
The paper (https://peerj.com/preprints/3190v2.pdf) attempts to describe this in Section 3.1.4. The code is here (in R): https://github.com/facebook/prophet/blob/master/R/R/prophet.R#L1312
Basically the procedure is:

1. Each future time point is chosen to be a changepoint with probability equal to the historical frequency of changepoints (number of changepoints divided by the length of the history).
2. At each sampled future changepoint, a rate change is drawn from a Laplace(0, lambda) distribution, where lambda is the maximum-likelihood estimate of the scale of the fitted historical rate changes.
3. The trend is then extended into the future using those sampled changepoints and rate changes.

That gives one simulated trend. This is done many (by default 1000) times, and quantiles across the simulated trends give the uncertainty intervals.
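The procedure above can be sketched roughly as follows. This is a minimal illustration assuming a piecewise-linear trend; all names and the exact per-point changepoint probability are illustrative, not Prophet's actual API.

```python
import numpy as np

def simulate_future_trend(history_t, future_t, k, m, changepoints, deltas, rng):
    """One draw of a simulated future trend (piecewise-linear model, sketch).

    history_t / future_t : time values (history scaled roughly to [0, 1]).
    k, m                 : fitted base growth rate and offset.
    changepoints, deltas : fitted historical changepoint times and rate changes.
    """
    # Probability that any future point is a changepoint: the historical
    # per-point frequency of changepoints.
    p_change = len(changepoints) / len(history_t)
    is_cp = rng.random(len(future_t)) < p_change
    new_cps = future_t[is_cp]
    # Laplace scale: MLE from the fitted historical rate changes.
    lam = np.mean(np.abs(deltas)) + 1e-8
    new_deltas = rng.laplace(0.0, lam, size=len(new_cps))
    # Combine historical and sampled future changepoints.
    all_cps = np.concatenate([changepoints, new_cps])
    all_deltas = np.concatenate([deltas, new_deltas])
    # Piecewise-linear trend: base rate k plus accumulated rate changes,
    # with offsets (gamma = -cp * delta) keeping the trend continuous.
    trend = k * future_t + m
    for cp, d in zip(all_cps, all_deltas):
        idx = future_t >= cp
        trend[idx] += d * future_t[idx] - d * cp
    return trend

# Drawing many simulated trends and taking percentiles across draws
# gives the uncertainty band, e.g.:
# sims = np.stack([simulate_future_trend(...) for _ in range(1000)])
# band = np.percentile(sims, [10, 90], axis=0)
```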
The role of the changepoints prior scale: As you can see in the generative process outlined above, it does not appear directly. However, shrinking the changepoints prior scale will shrink the fitted trend change magnitudes. This will in turn shrink the magnitudes of the simulated future trend changes and so leads to narrower uncertainty bands.
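The shrinkage effect can be seen numerically: smaller fitted rate changes imply a smaller Laplace scale, and so a tighter spread of simulated future rate changes. A toy illustration (the delta values here are made up):

```python
import numpy as np

rng = np.random.default_rng(42)

def interval_width(deltas, n_sims=5000):
    # Laplace scale is the mean absolute fitted rate change (plus a tiny
    # constant to avoid a zero scale when all deltas are zero).
    lam = np.mean(np.abs(deltas)) + 1e-8
    draws = rng.laplace(0.0, lam, size=n_sims)  # simulated future rate changes
    lo, hi = np.percentile(draws, [10, 90])
    return hi - lo

# Large fitted changes (loose prior) vs. the same changes shrunk by a
# small changepoints prior scale:
wide = interval_width(np.array([0.5, -0.8, 0.3]))
narrow = interval_width(np.array([0.05, -0.08, 0.03]))
```

Here `narrow` comes out roughly ten times smaller than `wide`, which is the mechanism behind the narrower uncertainty bands.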
Does that clarify things?
That is very clear @bletham, thanks a lot!
I think the source of doubt was that in the paper the changepoints prior scale is not (directly) mentioned.
Will I find, in the code provided, the (mathematical) relation between lambda and this rate?
Many thanks again!
It is in the code here: https://github.com/facebook/prophet/blob/master/R/R/prophet.R#L1347
deltas are the fitted changepoint magnitudes.
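If it helps, the relation at the linked line amounts to taking the mean absolute value of the fitted deltas as the Laplace scale (a Python sketch of that one line; the delta values are just examples):

```python
import numpy as np

# lambda = mean(|delta|): the maximum-likelihood estimate of the Laplace
# scale, with a small constant guarding against lambda = 0 when all the
# fitted changepoint magnitudes are zero.
deltas = np.array([0.12, -0.05, 0.30, 0.0])  # example fitted magnitudes
lam = np.mean(np.abs(deltas)) + 1e-8
```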
Dear all,
Can someone share with me how Prophet generates simulations of the forecast in order to calculate the uncertainty intervals? For example, is it done by sampling errors around the fitted curve and then computing percentiles?
The paper explains something, as does the documentation, but without much detail. For example, the documentation says that various assumptions are made. Which ones?
Also, I'd like to know what the generative process is. Does it depend only on the changepoints prior scale? I noticed that when this value is high, the uncertainty grows into the future, but when it is low, the uncertainty stays roughly constant.
Many thanks!