LuxDL / Lux.jl

Elegant & Performant Scientific Machine Learning in Julia
https://lux.csail.mit.edu/
MIT License

Different activation functions in one layer #680

Closed CatYukino closed 1 month ago

CatYukino commented 1 month ago

Hello.

I wonder if it is possible to use different activation functions in one layer. For example, in a layer with 10 neurons, I want to apply tanh to the first 5 neurons and a sigmoid function to the remaining 5. I tried to add a piecewise function directly in `Lux.Chain`, but that turns into a `WrappedFunction` rather than an ordinary layer of the form `Lux.Dense(10 => 10, activation_func)`. How can I solve this problem?

Thanks!

avik-pal commented 1 month ago

`WrappedFunction(x -> vcat(tanh.(x[1:5]), sigmoid.(x[6:end])))`.
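
Roughly, a full model might then look like the sketch below; the layer sizes, the batched (features × batch) input, and the use of identity activation in the `Dense` layer are illustrative assumptions, not part of the original question:

```julia
using Lux, NNlib, Random

# A 10-output Dense layer (identity activation), followed by a wrapper that
# applies tanh to the first 5 features and sigmoid to the remaining 5.
# Indexing with `[1:5, :]` keeps this working for batched (features × batch) input.
model = Chain(
    Dense(10 => 10),
    WrappedFunction(x -> vcat(tanh.(x[1:5, :]), sigmoid.(x[6:end, :]))),
)

rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

x = rand(Float32, 10, 4)   # 10 features, batch of 4
y, _ = model(x, ps, st)    # y is 10 × 4
```

If the two groups of neurons should also have their own weights, an alternative would be two smaller `Dense` layers combined with `Parallel(vcat, Dense(10 => 5, tanh), Dense(10 => 5, sigmoid))`.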

Please use Discourse/GH Discussions/Julia Slack for questions regarding usage. Issues are for reporting bugs or feature requests.