Closed — mlkrock closed this issue 2 years ago
Hi @mlkrock, I suspect this is due to our implementation clipping the sigmoid function to prevent overflow or underflow from producing NaNs during optimization:
https://github.com/facebookincubator/flowtorch/blob/main/flowtorch/bijectors/sigmoid.py#L18 https://github.com/facebookincubator/flowtorch/blob/coupling/flowtorch/ops/__init__.py#L19
I.e. it's by design :)
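To illustrate the idea (this is a minimal sketch in plain Python, not flowtorch's actual code; the function name and the `eps` value are made up for illustration), clipping the input of the inverse sigmoid away from 0 and 1 turns would-be infinities into large but finite values:

```python
import math

def clipped_logit(y, eps=1e-7):
    """Inverse sigmoid with the input clamped to (eps, 1 - eps).

    Illustrative sketch: inputs at or beyond the boundaries of the
    sigmoid's range are pulled back inside, so the result is a large
    finite number instead of +/-inf or NaN.
    """
    y = min(max(y, eps), 1.0 - eps)
    return math.log(y) - math.log1p(-y)

# Without clamping, logit(0.0) would be -inf; with clamping it is
# merely a large negative finite value.
print(clipped_logit(0.0))
```

This is why evaluating the density outside the flow's support yields an extremely negative log-probability rather than NaN.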
Issue Description
Hello, I think I stumbled across a potential issue while trying to compare against TensorFlow; you can see it here: https://github.com/tensorflow/probability/issues/1556
Steps to Reproduce
setup
Expected Behavior
`flow.log_prob(torch.zeros(2))` gives -7280.1641, but I think it should give NaN, since sigmoid(sigmoid(x)) is in [0.5, 0.73]. `flow.log_prob(torch.zeros(2) + 0.5)` gives -7452.0645, but `flow.log_prob(torch.zeros(2) + 0.51)` gives -2.6445. So the magnitudes of these log-probabilities do reflect the domain [0.5, 0.73], but should the calls outside the domain give NaNs, or are these extremely small probabilities not a bug?
System Info
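For reference, the interval quoted above follows from composing two sigmoids; a quick check in plain Python (not flowtorch code) confirms the endpoints:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# sigmoid maps the whole real line onto (0, 1), so the composition
# sigmoid(sigmoid(x)) is confined to (sigmoid(0), sigmoid(1)).
lo = sigmoid(0.0)  # = 0.5
hi = sigmoid(1.0)  # ≈ 0.7311
print(lo, hi)
```

Mathematically, points outside (lo, hi) have zero density under the transformed distribution, so log_prob there would be -inf (or NaN) rather than a finite value.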