hughperkins / DeepCL

OpenCL library to train deep convolutional neural networks
Mozilla Public License 2.0

Dropout Backprop Runtime Error Fix #130

Closed 0StackOverflow0 closed 6 years ago

0StackOverflow0 commented 6 years ago

A runtime error occurred when backprop was run without first generating weights.

Calling generateWeights() alongside the variable declaration in setBatchSize() should always guarantee that weights are available, as in the sketch below.
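
A minimal standalone sketch of that idea (not DeepCL's actual class; member names and the mask layout are illustrative):

```cpp
#include <cstdlib>
#include <vector>

// Sketch of the proposed fix: generate the dropout mask as soon as the buffer
// for it is sized in setBatchSize(), so backprop can never run against
// missing weights.
class DropoutSketch {
    float dropRatio;                    // probability of dropping an element
    std::vector<unsigned char> masks;   // one keep/drop byte per output element
public:
    explicit DropoutSketch(float dropRatio) : dropRatio(dropRatio) {}

    // analogous to DropoutLayer::generateWeights(): draw a fresh random mask
    void generateWeights() {
        for (auto &m : masks) {
            m = (std::rand() / static_cast<float>(RAND_MAX)) >= dropRatio ? 1 : 0;
        }
    }

    // analogous to setBatchSize(): size the mask buffer, then fill it right
    // away, which is the guarantee described above
    void setBatchSize(int batchSize, int outputPerExample) {
        masks.assign(static_cast<size_t>(batchSize) * outputPerExample, 1);
        generateWeights();
    }
};

int main() {
    DropoutSketch dropout(0.5f);
    dropout.setBatchSize(128, 1024);    // weights exist immediately afterwards
    return 0;
}
```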

That said, it is still recommended to call net->setTraining(true) before forward/backprop, so that a fresh set of weights is generated for each training input.
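
For example, the intended usage would be something along these lines (illustrative fragment only; the exact DeepCL call signatures may differ):

```cpp
// Illustrative training-step fragment, not verbatim DeepCL API:
net->setTraining(true);            // regenerate the dropout weights for this input
net->forward(inputBatch);          // forward pass uses the freshly drawn mask
net->backward(expectedOutputs);    // backward pass sees matching weights
```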

0StackOverflow0 commented 6 years ago

The other way to fix this would be to add an alternate path in backprop for the non-training case (which sounds like an oxymoron).

If the non-training forward pass multiplies by the dropRatio, I'd assume you'd divide by it in reverse (but again, I'm not sure this is a worthwhile change, as it seems to defeat the purpose of a dropout layer). This would bypass loading the weights entirely.
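
If that alternative were pursued, the non-training backward path might look roughly like the sketch below (standalone, hypothetical code, not DeepCL's API; it assumes the non-training forward scales its output by dropRatio, in which case the chain rule multiplies the incoming gradient by that same factor rather than dividing):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical helper: a separate backward path for the non-training case
// that only rescales the gradient, so the stored dropout mask ("weights")
// is never loaded at all.
void dropoutBackward(const std::vector<float> &gradOutput,
                     const std::vector<unsigned char> &masks,
                     float dropRatio,
                     bool training,
                     std::vector<float> &gradInput) {
    gradInput.resize(gradOutput.size());
    if (training) {
        // training path: gradient flows only through elements the mask kept
        for (std::size_t i = 0; i < gradOutput.size(); ++i) {
            gradInput[i] = masks[i] ? gradOutput[i] : 0.0f;
        }
    } else {
        // non-training path: mirror the scaling the non-training forward
        // applied, so the mask is not needed here
        for (std::size_t i = 0; i < gradOutput.size(); ++i) {
            gradInput[i] = gradOutput[i] * dropRatio;
        }
    }
}
```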

hughperkins commented 6 years ago

Thanks! :)