JuliaTrustworthyAI / LaplaceRedux.jl

Effortless Bayesian Deep Learning through Laplace Approximation for Flux.jl neural networks.
https://juliatrustworthyai.github.io/LaplaceRedux.jl/
MIT License

Rounding the predicted mean to the order of the most significant digit of the variance #102

Closed · pasq-cat closed this issue 1 week ago

pasq-cat commented 1 month ago

As mentioned on MS Teams, I think we could round the variance to the first 1 or 2 significant digits. We could introduce this change either in the `.predict()` function or directly in `glm_predictive_distribution`.
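
A minimal sketch of the idea (the helper name and signature are made up for illustration; this is not the current API):

```julia
# Illustrative helper: round a predicted variance to a few significant digits
# and round the mean to the decimal order of the variance's leading digit.
function round_to_uncertainty(μ::Real, v::Real; sigdigits::Int = 1)
    v <= 0 && return (μ, v)               # degenerate variance: nothing to round against
    vr = round(v; sigdigits = sigdigits)  # keep 1 or 2 significant digits of the variance
    order = floor(Int, log10(vr))         # decimal order of the leading digit
    return (round(μ; digits = -order), vr)
end

round_to_uncertainty(10.7654321, 1.234134523543264)  # (11.0, 1.0)
```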

pat-alt commented 1 month ago

I'm not sure about this one: I think I would still return the value at the precision we've computed it at, but print a rounded value in the console. As I understand it, you're mostly bothered by this from a user perspective? I don't see why a more precise value would be mathematically less correct, as I believe you suggested on MS Teams.
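
For illustration, that could be a custom `show` method on the returned prediction, so the object keeps full precision and only the console output is rounded (the wrapper type below is hypothetical, not the package's actual return type):

```julia
# Hypothetical wrapper: full precision is stored, rounding happens only at print time.
struct GaussianPrediction
    μ::Float64
    var::Float64
end

function Base.show(io::IO, p::GaussianPrediction)
    if p.var > 0
        order = floor(Int, log10(p.var))  # decimal order of the variance's leading digit
        print(io, "N(μ ≈ ", round(p.μ; digits = -order),
              ", σ² ≈ ", round(p.var; sigdigits = 1), ")")
    else
        print(io, "N(μ = ", p.μ, ", σ² = ", p.var, ")")
    end
end

GaussianPrediction(10.7654321, 0.0123)  # displays as N(μ ≈ 10.77, σ² ≈ 0.01)
```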

pasq-cat commented 1 month ago

https://web.ics.purdue.edu/~lewicki/physics218/significant#:~:text=(1)%20The%20number%20of%20significant,when%20stating%20the%20experimental%20uncertainty.
https://en.wikipedia.org/wiki/Significant_figures#Writing_uncertainty_and_implied_uncertainty

@pat-alt

It just doesn't make sense to ask the computer to produce a probability distribution whose variance carries a huge number of digits when you are already uncertain about the first one. If I tell you "the distance is 10 meters ± 1.234134523543264363636347", what's the point of giving you the final "7" when I am already uncertain at the scale of whole meters?

pasq-cat commented 1 month ago

@pat-alt I will just create an example of what I think should be done in this branch.

pat-alt commented 1 month ago

Good idea!

pasq-cat commented 1 month ago

@pat-alt OK, it requires a major overhaul and the juice isn't worth the squeeze. I guess we can close this issue.

pat-alt commented 1 month ago

Sure? Rounding doesn't seem like it should require too much of an overhaul. I'm happy to keep this open and just park it for now (it could be a good one to pick up during my absence, if you run out of things to do or hit roadblocks elsewhere). But I'm also OK with closing it if you want to abandon it.

pasq-cat commented 1 month ago

> Sure? Rounding doesn't seem like it should require too much of an overhaul. I'm happy to keep this open and just park it for now (it could be a good one to pick up during my absence, if you run out of things to do or hit roadblocks elsewhere). But I'm also OK with closing it if you want to abandon it.

Rounding makes the sum of the probability vectors slightly different from 1, which creates problems when we try to return categorical distributions. Additionally, the PyTorch comparison tests fail, because the PyTorch side does not implement any rounding.
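
For example, assuming the categorical outputs are built with Distributions.jl:

```julia
using Distributions

# Softmax probabilities rounded to 4 significant digits no longer sum to exactly 1.
p = round.([0.211941, 0.576117, 0.211942]; sigdigits = 4)
sum(p)          # 0.9999 ≠ 1.0
Categorical(p)  # throws an ArgumentError: p is not a valid probability vector
```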

pat-alt commented 1 week ago

Got it! Maybe we can just print the rounded values? That might be overkill though, so I'm also happy to close this.

pasq-cat commented 1 week ago

> Got it! Maybe we can just print the rounded values? That might be overkill though, so I'm also happy to close this.

If someone uses it on a huge dataset, the print statements will become annoying. I think we can close.