In the text we say: "Activation layers apply an element-wise non-linear activation function rescaling their inputs onto the range [−1, 1]" (lines 200–202).
On inspection, the ELU function we use maps its input onto the range (−α, ∞), where α is a fixed constant (α = 1 by default), so it does not rescale onto [−1, 1]. It is worth generalizing the text to match our case.