Closed FilipKubackiSoton closed 1 year ago
I realized that implementing only iNALU is not flexible enough, and given the citation threshold I would like to close this issue. Instead, I want to raise a new issue for NALU itself, with configurable parameters that can incorporate the iNALU improvements.
Community, can I do this without causing too much disruption, or would it be better to modify this thread instead?
Please use a single issue to track one topic.
TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down
Please consider sending feature requests / contributions to other repositories in the TF community with charters similar to TFA's: Keras, Keras-CV, Keras-NLP
Describe the feature and the current behavior/state. NALU (Neural Arithmetic Logic Unit) is a layer that learns addition, subtraction, multiplication, and division in a fully explainable/transparent way. Due to the mathematical nature of the layer, inference works well on OOD (out-of-distribution) inputs. Relevant information
CONTRIBUTING.md
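For context, the NALU computation combines an additive path with a log-space multiplicative path through a learned sigmoid gate. A minimal NumPy sketch of the forward pass follows; variable names are illustrative and not part of any TFA API:

```python
import numpy as np

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    """Sketch of a NALU forward pass for a batch of inputs x.

    W_hat, M_hat: unconstrained weight matrices of shape (in_dim, out_dim).
    G: gating weights of shape (in_dim, out_dim).
    """
    # Effective weights are pushed toward {-1, 0, 1} by tanh * sigmoid.
    W = np.tanh(W_hat) * (1.0 / (1.0 + np.exp(-M_hat)))
    a = x @ W                                 # additive path: add/subtract
    m = np.exp(np.log(np.abs(x) + eps) @ W)   # multiplicative path in log-space
    g = 1.0 / (1.0 + np.exp(-(x @ G)))        # learned gate between the two paths
    return g * a + (1.0 - g) * m
```

With near-saturated weights, a strongly positive gate recovers addition of the inputs and a strongly negative gate recovers their product, which is what makes the layer transparent.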
Which API type would this fall under (layer, metric, optimizer, etc.)? layer
Who will benefit from this feature? Everyone
Any other info. To increase the robustness of NALU, I can incorporate the iNALU improvements using the following arguments:
- `gate_as_vector` (bool): If True, use the gate as a vector; otherwise as a scalar.
- `clipping` (float): Clipping value for the weights; if None, do not use clipping.
- `force_operation` (`"add"`, `"mul"`, or None): Force the final result to the add/subtract path, the multiply/divide path, or allow both.
- `input_gate_dependance` (bool): If True, make the gate input-dependent; otherwise independent.
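To illustrate how these arguments could fit together, here is a hedged NumPy sketch. The argument names come from this issue; the exact semantics (e.g. how an input-independent gate is parameterized, or how a scalar gate is derived) are assumptions for illustration, not a final design:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def inalu_forward(x, W_hat, M_hat, G, gate_as_vector=True, clipping=None,
                  force_operation=None, input_gate_dependance=True, eps=1e-7):
    """Sketch of a NALU forward pass with the proposed iNALU-style options."""
    if clipping is not None:
        # iNALU-style weight clipping before building the effective weights.
        W_hat = np.clip(W_hat, -clipping, clipping)
        M_hat = np.clip(M_hat, -clipping, clipping)
    W = np.tanh(W_hat) * sigmoid(M_hat)
    a = x @ W                                 # add/subtract path
    m = np.exp(np.log(np.abs(x) + eps) @ W)   # multiply/divide path
    if force_operation == "add":              # bypass the gate entirely
        return a
    if force_operation == "mul":
        return m
    if input_gate_dependance:
        z = x @ G                             # gate logits depend on the input
    else:
        # Simplification: an input-independent gate driven only by parameters.
        z = G.sum(axis=0, keepdims=True)
    g = sigmoid(z)
    if not gate_as_vector:
        g = g.mean(axis=-1, keepdims=True)    # collapse to a single scalar gate
    return g * a + (1.0 - g) * m
```

Forcing the operation removes the gate from the computation graph, which should make training more stable when the target operation is known in advance.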