Hello.
I wonder if it is possible to use different activation functions within one layer? For example, suppose a layer has 10 neurons, and I want the first 5 neurons to use tanh and the remaining 5 to use a sigmoid function. I tried to add a piecewise function directly in Lux.Chain, but it becomes a WrappedFunction rather than a regular layer of the form 'Lux.Dense(10 => 10, activation_func)'. How can I solve this problem?
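For reference, here is a minimal sketch of the kind of thing I mean (the split indices and the use of `sigmoid`, which I believe Lux re-exports from NNlib, are my assumptions):

```julia
using Lux

# Split the 10 outputs: tanh on the first 5 rows, sigmoid on the last 5.
# x is (features, batch), so we index rows.
split_activation(x) = vcat(tanh.(x[1:5, :]), sigmoid.(x[6:10, :]))

model = Chain(
    Dense(10 => 10),                    # no activation on the Dense layer itself
    WrappedFunction(split_activation),  # this shows up as a WrappedFunction,
)                                       # not a regular layer like Dense
```

This works in the sense that it runs, but the activation split lives in a WrappedFunction instead of being part of a single Dense-style layer, which is what I was asking about.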
Thanks!