NVIDIA / modulus-sym

Framework providing Pythonic APIs, algorithms, and utilities to be used with Modulus core to physics-inform model training, as well as higher-level abstractions for domain experts
https://developer.nvidia.com/modulus
Apache License 2.0

🐛[BUG]: Confusion of GLU or GELU in activation.py #100

Open HydrogenSulfate opened 6 months ago

HydrogenSulfate commented 6 months ago

Version

main

On which installation method(s) does this occur?

Pip, Source

Describe the issue

In `activation_mapping`, `Activation.GELU` maps to `F.gelu`, but in `module_activation_mapping`, `Activation.GELU` maps to `nn.GLU`. I guess these two functions are completely different: GLU means "gated linear unit", while GELU means "Gaussian error linear unit". So is there some special reason for this mapping, or is it just a typo?

https://github.com/NVIDIA/modulus-sym/blob/357b119ec993ec4ace327ed9bca7f801f3a91b90/modulus/sym/models/activation.py#L93-L122
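For reference, here is a minimal sketch (not part of the original report) showing why the two are not interchangeable under standard PyTorch semantics: `F.gelu` is an elementwise nonlinearity that preserves the input shape, while `nn.GLU` splits the input in half along a dimension and gates one half with the sigmoid of the other, halving the feature dimension.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4, 8)

# GELU: elementwise Gaussian Error Linear Unit; output shape matches input
print(F.gelu(x).shape)           # torch.Size([4, 8])

# GLU: Gated Linear Unit; splits the last dim in half and gates it,
# so the output has half as many features
print(nn.GLU(dim=-1)(x).shape)   # torch.Size([4, 4])
```

So substituting one for the other would silently change both the math and the output shape of any model built with that mapping.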

Minimum reproducible example

No response

Relevant log output

No response

Environment details

No response

Other/Misc.

No response