Hi,
To calculate the values of weight_regularizer and dropout_regularizer, the dataset size N is used. What if we don't know the dataset size in advance, as in lifelong reinforcement learning, where re-training happens periodically as new data arrives?
In such cases I am not able to figure out how to use this Concrete Dropout implementation.
Could you please suggest how to handle such situations with Concrete Dropout? I would really appreciate it.
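To make the question concrete, here is a minimal sketch of what I have in mind. It assumes the scaling from the Concrete Dropout paper (weight_regularizer = l²/(τN), dropout_regularizer = 2/(τN)); the function name, the default length scale, and the idea of treating N as a running count of examples seen so far are my own guesses, not anything from the official implementation:

```python
def concrete_dropout_regularizers(N, length_scale=1e-2, tau=1.0):
    """Return (weight_regularizer, dropout_regularizer) for dataset size N.

    Uses the scaling from the Concrete Dropout paper:
        weight_regularizer  = l^2 / (tau * N)
        dropout_regularizer = 2 / (tau * N)
    """
    weight_regularizer = length_scale ** 2 / (tau * N)
    dropout_regularizer = 2.0 / (tau * N)
    return weight_regularizer, dropout_regularizer


# Lifelong setting (assumption): treat N as the running count of examples
# seen so far and refresh the regularizer scales before each periodic
# re-training round on the newly arrived data.
N = 0
for chunk_size in [1000, 500, 2000]:  # hypothetical incoming data chunks
    N += chunk_size
    wr, dr = concrete_dropout_regularizers(N)
    # ...re-build / re-train the ConcreteDropout layers with wr, dr here...
```

Is periodically recomputing the scales from a running N like this a reasonable way to use the implementation, or does the derivation require N to be fixed up front?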
Thanks in advance,
Nikhil