zhulingchen / tfp-tutorial

TensorFlow Probability Tutorial

Analogy between example script in TFP library and tfp_bnn #2

Open patelmiteshn opened 4 years ago

patelmiteshn commented 4 years ago

Hi Zhulingchen,

This is not an issue with your script; it's more of a discussion.

Thanks for the tutorials on BNNs; they are really helpful for a starter like myself.

I tried tfp_bnn.ipynb and it works fine. I spent time trying to draw an analogy between the Bayesian neural network example code (https://github.com/tensorflow/probability/blob/master/tensorflow_probability/examples/bayesian_neural_network.py) provided in the tensorflow_probability library and the tfp_bnn.ipynb notebook. Below are my observations; I wanted to see if your understanding is the same as mine.

  1. Loss comparison
    In the tfp_bnn.ipynb notebook, the loss function is defined as the negative log-likelihood, while the KL divergence part is attached to the layers through get_kernel_divergence_fn() in boxes 25 and 26.

Question is: is this similar to what the example code is doing at this line (https://github.com/tensorflow/probability/blob/master/tensorflow_probability/examples/bayesian_neural_network.py#L264)?
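To make the comparison concrete, here is a minimal sketch of how I understand the two pieces fit together. The num_train_examples value, the layer choice, and the loss helper are placeholders I made up for illustration, not code from either script:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

num_train_examples = 60000  # placeholder dataset size

# KL divergence between the weight posterior q and prior p, scaled by
# the training-set size so the total loss approximates -ELBO / N.
def get_kernel_divergence_fn(train_size):
    def kernel_divergence_fn(q, p, _):
        return tfd.kl_divergence(q, p) / tf.cast(train_size, tf.float32)
    return kernel_divergence_fn

layer = tfp.layers.DenseFlipout(
    10, kernel_divergence_fn=get_kernel_divergence_fn(num_train_examples))

# Negative log-likelihood of integer class labels under a categorical
# distribution parameterized by the network's output logits.
def neg_log_likelihood(y_true, y_pred_logits):
    return -tf.reduce_mean(
        tfd.Categorical(logits=y_pred_logits).log_prob(y_true))
```

If I read it correctly, Keras adds the layers' KL terms (collected in model.losses) to this NLL during training, which would make the total the same NLL + KL objective the example builds explicitly.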

  2. Inference
    In box 28 of your notebook, y_pred_logits_list provides predictions over 100 runs. Looking closely, the numbers in the list are negative and positive logits, which are then converted into probabilities using the softmax function.

Question is: is this similar to the probabilities derived from the categorical label_distribution defined at line 316 (https://github.com/tensorflow/probability/blob/master/tensorflow_probability/examples/bayesian_neural_network.py#L316)?
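To partially answer my own question: as far as I can tell, a Categorical distribution parameterized by logits applies softmax internally, so the two should agree. A quick sketch with made-up logits:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

logits = tf.constant([[-1.2, 0.3, 2.5]])  # made-up logits, mixed signs

probs_softmax = tf.nn.softmax(logits)
probs_categorical = tfd.Categorical(logits=logits).probs_parameter()

# Both tensors should agree, roughly [[0.022, 0.098, 0.880]] here.
```

If that holds, averaging the per-run softmax probabilities over the 100 Monte Carlo passes should give the same predictive distribution as averaging the probabilities of the categorical distribution in the example.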

Thanks once again for putting this script together. It's very helpful. Hopefully a brief discussion on drawing analogies between the example script and your script will give a better understanding to people using it as starters.

purva98 commented 4 years ago

Yes, thanks for the script. It's very interesting.