Version
main
On which installation method(s) does this occur?
Pip, Source
Describe the issue
In `activation_mapping`, `Activation.GELU` maps to `F.gelu`, but in `module_activation_mapping`, `Activation.GELU` maps to `nn.GLU`. I guess these two functions are completely different: GLU is the "gated linear unit", while GELU is the "Gaussian Error Linear Unit". Is there a special reason for this, or is it just a typo?
https://github.com/NVIDIA/modulus-sym/blob/357b119ec993ec4ace327ed9bca7f801f3a91b90/modulus/sym/models/activation.py#L93-L122
Minimum reproducible example
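A minimal sketch of the mismatch, assuming `Activation`, `activation_mapping`, and `module_activation_mapping` are importable from the linked file and that `module_activation_mapping` stores module classes:

```python
import torch

from modulus.sym.models.activation import (
    Activation,
    activation_mapping,
    module_activation_mapping,
)

x = torch.randn(4, 8)

# Functional path: F.gelu is applied element-wise, so the shape is preserved.
fn = activation_mapping[Activation.GELU]
print(fn(x).shape)  # torch.Size([4, 8])

# Module path: nn.GLU splits the last dimension in half and gates one
# half with the sigmoid of the other, so the feature dimension halves.
mod = module_activation_mapping[Activation.GELU]()
print(mod(x).shape)  # torch.Size([4, 4])
```

If the two mappings were consistent, both calls would apply the same element-wise GELU and produce outputs of identical shape and values.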
Relevant log output
No response
Environment details
No response
Other/Misc.
No response