yaringal / BayesianRNN

Code for the paper "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks"
http://mlg.eng.cam.ac.uk/yarin/publications.html#Gal2015Theoretically
MIT License

Variational Dropout in Keras #5

Open cjnolet opened 6 years ago

cjnolet commented 6 years ago

I noticed that your README states the variational dropout algorithm has been implemented in Keras' RNN library.

I want to verify that it is implemented exactly as described in your paper (i.e. that the same dropped-out connections are reused at every time step, rather than being resampled at each step).

I'm implementing the encoder-decoder framework from Uber's time-series anomaly prediction paper.

Thank you.

yaringal commented 6 years ago

It should be - I helped with the implementation!
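For reference, a minimal sketch (not code from this repo; the layer size, dropout rates, and input shape are placeholders) of where this lives in the Keras API: the `dropout` and `recurrent_dropout` arguments of the recurrent layers apply the masks described in the paper, sampling each mask once per batch and reusing it at every time step.

```python
# Minimal sketch of variational recurrent dropout via the Keras 2 API.
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential([
    # `dropout` masks the layer inputs, `recurrent_dropout` masks the
    # recurrent state; each mask is sampled once and reused at every time step.
    LSTM(64, dropout=0.25, recurrent_dropout=0.25, input_shape=(100, 8)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```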

rohitash-chandra commented 6 years ago

go ahead now :)

Namaste

Rohit


cjnolet commented 6 years ago

Awesome! Any ideas on how I might implement Monte Carlo dropout with it?

I know this might be the wrong place to ask, but given that you helped with the implementation, I figure it's best to get it from the source. Is there a way to keep the recurrent dropout active in the forward pass at test time?

Thanks again!


yaringal commented 6 years ago

Keras has a new `training` flag (I think) that you can pass when calling the layer - this should keep dropout enabled at test time as well. Otherwise, have a look at some of my recent repos for how to compile a Keras model to do sampling at test time (e.g. the acquisition function example).
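A minimal sketch of that idea (not code from this repo; the shapes, sizes, and dropout rates are placeholders): passing `training=True` when calling the layer keeps its dropout masks active at prediction time, so repeated calls to `predict` give Monte Carlo samples.

```python
# Minimal sketch of MC dropout at test time with the Keras 2 functional API.
import numpy as np
from keras.models import Model
from keras.layers import Input, LSTM, Dense

inputs = Input(shape=(100, 8))
# training=True forces the (recurrent) dropout masks to be sampled in the
# forward pass at prediction time as well, not only during training.
h = LSTM(64, dropout=0.25, recurrent_dropout=0.25)(inputs, training=True)
outputs = Dense(1)(h)
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# After fitting, run several stochastic forward passes and use the spread of
# the predictions as an uncertainty estimate.
x = np.random.rand(32, 100, 8).astype("float32")  # dummy batch
samples = np.stack([model.predict(x) for _ in range(20)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```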

rutagara commented 5 years ago

Hi @cjnolet,

I am also trying to reproduce the framework from Uber's paper in Python with Keras and a TensorFlow backend, but I'm not sure my implementation is correct. Did you manage to code it, and is the code available somewhere?

Thanks a lot