Closed 6 years ago
The other method to fix this would be to add an alternate path in backprop for the non-training case (which seems like an oxymoron).
If the forward pass during training multiplied by the dropRatio, I'd assume you'd divide by it in the backward pass (though again, I'm not sure this is a worthwhile change, as it seems to defeat the purpose of a Dropout layer). This would bypass loading the weights entirely.
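To make the training/non-training split concrete, here is a minimal sketch of a dropout layer with separate paths in forward and backward. The class name, method names, and the use of inverted dropout (scaling by 1/(1-dropRatio) at train time so inference is a plain pass-through) are my assumptions for illustration, not the library's actual API:

```cpp
// Hypothetical sketch: dropout with a training flag controlling both passes.
#include <cassert>
#include <cmath>
#include <random>
#include <vector>

class Dropout {
public:
    explicit Dropout(double dropRatio) : dropRatio_(dropRatio) {}

    void setTraining(bool t) { training_ = t; }

    std::vector<double> forward(const std::vector<double>& in) {
        if (!training_) return in;  // non-training path: identity, no mask needed
        mask_.resize(in.size());
        std::vector<double> out(in.size());
        std::bernoulli_distribution keep(1.0 - dropRatio_);
        for (size_t i = 0; i < in.size(); ++i) {
            // Inverted dropout: kept units are scaled up so inference needs no rescaling.
            mask_[i] = keep(rng_) ? 1.0 / (1.0 - dropRatio_) : 0.0;
            out[i] = in[i] * mask_[i];
        }
        return out;
    }

    std::vector<double> backward(const std::vector<double>& grad) {
        if (!training_) return grad;  // alternate path: pass the gradient straight through
        std::vector<double> out(grad.size());
        for (size_t i = 0; i < grad.size(); ++i)
            out[i] = grad[i] * mask_[i];  // reuse the mask from the forward pass
        return out;
    }

private:
    double dropRatio_;
    bool training_ = true;
    std::vector<double> mask_;
    std::mt19937 rng_{42};
};
```

With this shape the non-training backward never touches the mask, so it cannot fail when weights were never generated.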
Thanks! :)
A runtime error occurred when backprop was run without first generating weights.
Placing
generateWeights()
with the variable declarations in setBatchSize()
should always guarantee that weights will be available. Though, it is recommended to call
net->setTraining(true)
before forward/back prop to ensure freshly generated weights for each training input.
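The fix above can be sketched as follows. This is a minimal illustration under my own assumptions (the class layout, a `hasWeights()` helper, and mask contents are hypothetical, not the library's real code); the point is only that generating weights inside setBatchSize() means backprop can never run before they exist:

```cpp
// Hypothetical sketch: generate dropout weights inside setBatchSize()
// so a backward pass can never observe a missing mask.
#include <cassert>
#include <random>
#include <vector>

class DropoutLayer {
public:
    explicit DropoutLayer(double dropRatio) : dropRatio_(dropRatio) {}

    // Generating weights here guarantees they exist before any forward/backward pass.
    void setBatchSize(size_t batchSize) {
        batchSize_ = batchSize;
        generateWeights();
    }

    // Regenerating on setTraining(true) gives a fresh mask for each training input.
    void setTraining(bool t) {
        training_ = t;
        if (t) generateWeights();
    }

    bool hasWeights() const { return !mask_.empty(); }

private:
    void generateWeights() {
        std::bernoulli_distribution keep(1.0 - dropRatio_);
        mask_.assign(batchSize_, 0.0);
        for (auto& m : mask_) m = keep(rng_) ? 1.0 : 0.0;
    }

    double dropRatio_;
    size_t batchSize_ = 1;
    bool training_ = false;
    std::vector<double> mask_;
    std::mt19937 rng_{7};
};
```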