Our framework currently does not accept hyperparameters when constructing activation layers. As a result, layers that need per-instance settings, such as Leaky ReLU's negative slope, are implemented with a fixed constant. If passing hyperparameters is technically feasible but simply not wired up yet, we should update the existing activation layers (Leaky ReLU, SELU, and so on) to accept them.
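As a rough sketch of the direction (all class and parameter names here are hypothetical, not the framework's actual API), the change would let each activation layer take its hyperparameters at construction time, with the current fixed constants kept as defaults:

```python
import numpy as np

# Hypothetical sketch: activation layers that accept hyperparameters
# at construction instead of hard-coding them as fixed constants.

class LeakyReLU:
    def __init__(self, negative_slope: float = 0.01):
        # Previously a fixed constant; now a per-layer hyperparameter.
        self.negative_slope = negative_slope

    def forward(self, x: np.ndarray) -> np.ndarray:
        return np.where(x >= 0, x, self.negative_slope * x)

class SELU:
    # Standard SELU constants used as defaults, but overridable.
    def __init__(self, alpha: float = 1.6732632423543772,
                 scale: float = 1.0507009873554805):
        self.alpha = alpha
        self.scale = scale

    def forward(self, x: np.ndarray) -> np.ndarray:
        # scale * x for x >= 0, scale * alpha * (exp(x) - 1) otherwise
        return self.scale * np.where(x >= 0, x, self.alpha * np.expm1(x))
```

Existing code that relies on the current fixed constants would keep working unchanged, since the defaults match them; only callers that need different values would pass hyperparameters explicitly.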