Hey,
When using k-fold cross validation with the ReduceLROnPlateau callback, there is an issue: after training on the first fold finishes, the callback's reduced learning rate carries over to the next fold. Because of that, from the second fold onward, training starts with a lower learning rate and takes longer. So I updated the code of the cross_validation function a little to create a fresh callback (and a fresh model) for each fold.
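Here is a minimal, dependency-free sketch of the problem and the fix. The `PlateauScheduler` class is a toy stand-in for Keras's `ReduceLROnPlateau` (not the real API), and `run_folds` is a hypothetical loop standing in for the cross_validation function; it only tracks the learning rate each fold starts with, pretending every fold hits one plateau:

```python
class PlateauScheduler:
    """Toy stand-in for Keras's ReduceLROnPlateau: halves the LR on a plateau."""

    def __init__(self, lr=1e-3, factor=0.5):
        self.lr = lr
        self.factor = factor

    def on_plateau(self):
        self.lr *= self.factor


def run_folds(n_folds, fresh_per_fold):
    """Return the learning rate each fold starts with.

    fresh_per_fold=False reuses one scheduler across folds (the bug);
    fresh_per_fold=True builds a new one per fold (the fix).
    """
    start_lrs = []
    sched = PlateauScheduler()
    for _ in range(n_folds):
        if fresh_per_fold:
            sched = PlateauScheduler()  # the fix: re-create the callback per fold
        start_lrs.append(sched.lr)
        sched.on_plateau()  # pretend training hit a plateau during this fold
    return start_lrs


print(run_folds(3, fresh_per_fold=False))  # [0.001, 0.0005, 0.00025]: LR leaks across folds
print(run_folds(3, fresh_per_fold=True))   # [0.001, 0.001, 0.001]: every fold starts fresh
```

In real Keras code the same idea applies: instantiate `ReduceLROnPlateau` (and compile a fresh model) inside the fold loop rather than once before it, so no optimizer or callback state survives from one fold to the next.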
Cheers, Jim