Feature
Add the ability to create "always-on" dropout layers as proposed here.
Current behavior/state.
tf.keras.layers.Dropout operates differently at train and test time: it randomly drops nodes during training while simply passing the inputs through during testing.
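For example, a quick check of this behavior with the stock layer (a minimal sketch; the exact units dropped vary from run to run):

```python
import tensorflow as tf

x = tf.ones((1, 4))
layer = tf.keras.layers.Dropout(0.5)

# Training time: roughly half the units are zeroed and the survivors are
# scaled by 1 / (1 - rate) to preserve the expected sum.
print(layer(x, training=True))   # e.g. [[2. 0. 2. 0.]]

# Test time: the inputs pass through unchanged.
print(layer(x, training=False))  # [[1. 1. 1. 1.]]
```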
Currently, implementing the suggested feature requires writing a custom dropout layer:
```python
import tensorflow as tf


class Dropout(tf.keras.layers.Layer):
    """Always-on dropout layer.

    Unlike tf.keras.layers.Dropout, this layer does not respect the training
    flag (set to True by model.fit and False by model.predict): it does not
    return the input unchanged when training=False, but always randomly drops
    a fraction self.rate of the input nodes.
    """

    def __init__(self, rate, **kwargs):
        super().__init__(**kwargs)
        self.rate = rate

    def call(self, inputs):
        # Applied unconditionally, so dropout stays active at inference time.
        return tf.nn.dropout(inputs, rate=self.rate)

    def get_config(self):
        """Enables model.save and restoration through tf.keras.models.load_model."""
        config = super().get_config()
        config["rate"] = self.rate
        return config
```
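With that layer in place, repeated forward passes at prediction time sample different subnetworks, which is what Monte Carlo dropout needs. A minimal usage sketch (the model architecture and data here are placeholder assumptions):

```python
import numpy as np
# Assumes tensorflow is imported as tf and the always-on Dropout layer
# defined above is in scope.

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    Dropout(0.5),            # stays stochastic at prediction time
    tf.keras.layers.Dense(1),
])

x = np.random.normal(size=(10, 8)).astype("float32")

# Each call samples a different dropout mask, so stacking many passes
# gives a Monte Carlo estimate of the predictive distribution.
samples = np.stack([model(x).numpy() for _ in range(100)])
mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # predictive uncertainty
```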
Who will benefit from this feature?
Everyone using dropout to run Bayesian neural networks. Hence tfp users in particular may benefit from this feature.
I would like to help implement this if everyone is good with that. I have been looking at the source code, and I think the approach should be:
1. Fork.
2. Create a MonteCarloDropout layer at ./tensorflow_probability/python/layers/monte_carlo_dropout.py.
3. Create tests for the MonteCarloDropout layer at ./tensorflow_probability/python/layers/monte_carlo_dropout_test.py. I'm not quite sure what kinds of tests I should implement for a layer like this; any ideas? (A possible starting point is sketched after this list.)
4. Add the MonteCarloDropout import and name to _allowed_symbols in ./tensorflow_probability/python/layers/__init__.py.
5. Add new entries for the layer and its test to ./tensorflow_probability/python/layers/BUILD.
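Regarding the tests, natural properties to check are that the layer stays stochastic regardless of the training flag, that the expected value of the output is preserved, and that the config round-trips. A minimal sketch (MonteCarloDropout refers to the proposed layer; the test structure is an assumption, not TFP's actual conventions):

```python
import numpy as np
import tensorflow as tf


class MonteCarloDropoutTest(tf.test.TestCase):

    def test_dropout_applied_regardless_of_training_flag(self):
        layer = MonteCarloDropout(rate=0.5)  # the proposed layer under test
        x = tf.ones((100, 100))
        for training in (True, False, None):
            y = self.evaluate(layer(x, training=training))
            # Some units must be zeroed even when training=False.
            self.assertGreater(np.sum(y == 0.0), 0)

    def test_expected_value_preserved(self):
        layer = MonteCarloDropout(rate=0.5)
        y = self.evaluate(layer(tf.ones((1000, 1000))))
        # tf.nn.dropout rescales survivors by 1 / (1 - rate), so mean ~ 1.
        self.assertAllClose(np.mean(y), 1.0, atol=0.05)

    def test_config_round_trip(self):
        layer = MonteCarloDropout(rate=0.3)
        restored = MonteCarloDropout.from_config(layer.get_config())
        self.assertEqual(restored.rate, 0.3)


if __name__ == "__main__":
    tf.test.main()
```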
Additional info
This issue is a follow-up to tf#28484.