Closed hewitta closed 10 months ago
Hi, it is by design. Each layer may specify its own dropout rate.
Yes, but even when the hidden layer builder has its own dropout rate specified, it doesn't seem to be passed from the HiddenLayerBuilder to the HiddenLayer. The build method of HiddenLayerBuilder reads:
```java
public HiddenLayer build(int p) { return new HiddenLayer(neurons, p, activation); }
```
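A minimal self-contained sketch of the problem and the likely fix, with simplified stand-in classes (the field names, the `String` activation, and the constructor signatures here are assumptions for illustration, not the library's actual source): the builder stores a dropout rate, but `build` must forward it to the layer's constructor, otherwise the layer falls back to no dropout.

```java
// Simplified stand-ins for HiddenLayer and HiddenLayerBuilder.
// All names and signatures are illustrative assumptions.
class HiddenLayer {
    final int neurons;
    final int inputSize;
    final String activation;
    final double dropout;

    HiddenLayer(int neurons, int inputSize, String activation, double dropout) {
        this.neurons = neurons;
        this.inputSize = inputSize;
        this.activation = activation;
        this.dropout = dropout;
    }
}

class HiddenLayerBuilder {
    final int neurons;
    final String activation;
    final double dropout; // specified by the user, e.g. via "relu(50, 0.2)"

    HiddenLayerBuilder(int neurons, String activation, double dropout) {
        this.neurons = neurons;
        this.activation = activation;
        this.dropout = dropout;
    }

    // Buggy version omitted the last argument; the fix is to
    // forward the builder's dropout rate to the layer:
    HiddenLayer build(int p) {
        return new HiddenLayer(neurons, p, activation, dropout);
    }
}
```

With this change, `new HiddenLayerBuilder(50, "relu", 0.2).build(50)` yields a layer whose dropout is 0.2 rather than the default.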
Fixed. Thanks for reporting.
Thank you much!
Describe the bug The HiddenLayerBuilder class does not pass the dropout rate to the HiddenLayer in its build method.
Expected behavior Using the Layer class's .of method and specifying the layers "input(50, 0.2)|relu(50, 0.2)" should apply a 0.2 dropout rate to each layer.
Actual behavior The dropout rate is applied to the input layer but not to any subsequent layers.