Closed: wiseodd closed this 2 months ago
I don't think this is a bug, because both are initialized correctly further down. The reason for the hardcoding is that one could pass a prior precision whose shape fits the last layer, but that would fail the super call to `ParametricLaplace`, so a scalar works fine there. It is subsequently corrected: either the last layer is retrieved directly if it's given, or otherwise the prior is set to `None` and filled in later, once the last layer has been extracted during `fit`. So the hardcoding is not a bug, but there could be one in the subsequent prior-precision setting. Have you experienced that happening?
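In other words, the control flow looks roughly like this. This is a minimal sketch of the pattern described above, not the library's actual code; the helper names and the `_requested_*` attributes are hypothetical:

```python
class ParametricLaplace:
    def __init__(self, model, prior_precision=1.0, prior_mean=0.0):
        # Stand-in for the parent constructor: assume it validates that
        # the prior's shape fits the full parameter vector, so a prior
        # shaped for the last layer only would be rejected here.
        self.model = model
        self.prior_precision = prior_precision
        self.prior_mean = prior_mean


class LLLaplace(ParametricLaplace):
    def __init__(self, model, prior_precision=1.0, prior_mean=0.0,
                 last_layer=None):
        # Hardcoded scalars in the super call, so its shape checks
        # cannot fail before the last layer is known.
        super().__init__(model, prior_precision=1.0, prior_mean=0.0)
        self._requested_precision = prior_precision
        self._requested_mean = prior_mean
        if last_layer is not None:
            # Last layer given: the user-supplied prior fits, apply it.
            self.prior_precision = prior_precision
            self.prior_mean = prior_mean
        else:
            # Otherwise mark the prior as unset and defer to fit().
            self.prior_precision = None
            self.prior_mean = None

    def fit(self, train_loader):
        # Once the last layer has been extracted, correct the
        # placeholder prior with the values the user asked for.
        self._find_last_layer(train_loader)  # hypothetical helper
        self.prior_precision = self._requested_precision
        self.prior_mean = self._requested_mean

    def _find_last_layer(self, train_loader):
        pass  # stand-in for the real last-layer extraction
```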
Oh, I see it now: the `prior_precision` and `prior_mean` args are still used, just further down. Nvm then!
I'm not sure why this is the case: https://github.com/aleximmer/Laplace/blob/76a04ebc36f9a56c7ae9bea5d62bc64a53909756/laplace/lllaplace.py#L81-L91
The git blame traces back to the early days of this library. @aleximmer @runame, do you remember why?