Closed: schrum2 closed this issue 7 years ago
Regarding the ReLU feature, I think each HyperNEAT substrate should be able to indicate which activation function all of its nodes use. A value of -1 can serve as a sentinel meaning the default ftype is used. Hold off on adding this until others are not actively fiddling with the substrate code.
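A minimal sketch of what this could look like, assuming a simple `Substrate` class; the field and constant names (`ftype`, `DEFAULT_FTYPE`) are hypothetical, not the actual MM-NEAT API:

```java
// Sketch: a substrate that records the activation function its nodes use,
// with -1 as a sentinel meaning "fall back on the global default ftype".
public class Substrate {
    /** Sentinel value: use the globally configured default activation function. */
    public static final int DEFAULT_FTYPE = -1;

    private final String name;
    // Activation function id shared by all nodes in this substrate,
    // or DEFAULT_FTYPE to defer to the global default.
    private final int ftype;

    public Substrate(String name) {
        this(name, DEFAULT_FTYPE); // default: no per-substrate override
    }

    public Substrate(String name, int ftype) {
        this.name = name;
        this.ftype = ftype;
    }

    /** Resolve this substrate's activation function, falling back when unset. */
    public int getFtype(int defaultFtype) {
        return ftype == DEFAULT_FTYPE ? defaultFtype : ftype;
    }
}
```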
Add support for zero padding and for using ReLU (or other activation functions?) in convolutional layers.
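A hedged sketch of how zero padding plus ReLU could work in a convolutional layer; this is not code from the repository, and the class and method names (`ConvUtil`, `convolveSameRelu`) are made up for illustration:

```java
// Sketch: 2D convolution with zero padding ("same" output size) followed by ReLU.
public final class ConvUtil {
    /** Convolve input with kernel, treating out-of-bounds positions as zero, then apply ReLU. */
    public static double[][] convolveSameRelu(double[][] input, double[][] kernel) {
        int h = input.length, w = input[0].length;
        int kh = kernel.length, kw = kernel[0].length;
        int padY = kh / 2, padX = kw / 2; // padding needed to keep output the same size
        double[][] out = new double[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                double sum = 0;
                for (int ky = 0; ky < kh; ky++) {
                    for (int kx = 0; kx < kw; kx++) {
                        int iy = y + ky - padY;
                        int ix = x + kx - padX;
                        // Zero padding: positions outside the input contribute nothing.
                        if (iy >= 0 && iy < h && ix >= 0 && ix < w) {
                            sum += input[iy][ix] * kernel[ky][kx];
                        }
                    }
                }
                out[y][x] = Math.max(0, sum); // ReLU; other activations could be swapped in here
            }
        }
        return out;
    }
}
```

Swapping `Math.max(0, sum)` for a call through the substrate's configured ftype would let convolutional layers reuse the same per-substrate activation mechanism described above.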