tensorflow / addons

Useful extra functionality for TensorFlow 2.x maintained by SIG-addons
Apache License 2.0

Improved Neural Arithmetic Logic Modules - a layer to learn +, -, *, / operations in a transparent way #2768

Closed: FilipKubackiSoton closed this issue 1 year ago

FilipKubackiSoton commented 1 year ago

Describe the feature and the current behavior/state. NALU (Neural Arithmetic Logic Modules) is a layer that learns addition, subtraction, multiplication, and division in a fully explainable/transparent way. Due to the mathematical nature of the layer, inference also works well on OOD (out-of-distribution) inputs.
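
For context, the original NALU (Trask et al., 2018) gets its transparency from effective weights that are pushed towards {-1, 0, 1} and a learned gate that blends an additive path with a multiplicative (log-space) path. A minimal sketch of that forward pass, with assumed variable names, looks like this:

```python
import tensorflow as tf

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    """Minimal sketch of the original NALU forward pass (Trask et al., 2018)."""
    W = tf.tanh(W_hat) * tf.sigmoid(M_hat)                    # effective weights biased towards -1, 0, 1
    a = tf.matmul(x, W)                                       # additive path: addition / subtraction
    m = tf.exp(tf.matmul(tf.math.log(tf.abs(x) + eps), W))    # multiplicative path: multiplication / division in log space
    g = tf.sigmoid(tf.matmul(x, G))                           # learned gate blending the two paths
    return g * a + (1.0 - g) * m
```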

Relevant information

- Which API type would this fall under (layer, metric, optimizer, etc.): layer
- Who will benefit from this feature? Everyone
- Any other info: To increase the robustness of NALU I can incorporate the iNALU improvements through the following arguments (a rough sketch follows below):
  - `gate_as_vector` (bool): if True, use the gate as a vector; otherwise as a scalar.
  - `clipping` (float): clipping value for the weights; if None, do not use clipping.
  - `force_operation` (`"add"`, `"mul"`, or None): force the final result to use addition/subtraction, multiplication/division, or both.
  - `input_gate_dependance` (bool): if True, make the gate input-dependent; otherwise independent.
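
A rough sketch of how these arguments could be wired into a Keras layer is shown below. The class name, weight names, and the exact handling of each argument are assumptions for illustration, not a final design:

```python
import tensorflow as tf

class NALU(tf.keras.layers.Layer):
    """Hypothetical configurable NALU layer with iNALU-style options (sketch only)."""

    def __init__(self, units, gate_as_vector=True, clipping=None,
                 force_operation=None, input_gate_dependance=True, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.gate_as_vector = gate_as_vector
        self.clipping = clipping                  # e.g. 20.0 to clip W_hat / M_hat; None disables clipping
        self.force_operation = force_operation    # "add", "mul", or None
        self.input_gate_dependance = input_gate_dependance

    def build(self, input_shape):
        in_dim = int(input_shape[-1])
        self.W_hat = self.add_weight(name="W_hat", shape=(in_dim, self.units))
        self.M_hat = self.add_weight(name="M_hat", shape=(in_dim, self.units))
        if self.force_operation is None:
            gate_dim = self.units if self.gate_as_vector else 1
            if self.input_gate_dependance:
                # gate computed from the input (original NALU style)
                self.G = self.add_weight(name="G", shape=(in_dim, gate_dim))
            else:
                # input-independent gate (iNALU-style)
                self.g_hat = self.add_weight(name="g_hat", shape=(gate_dim,))

    def call(self, x):
        W_hat, M_hat = self.W_hat, self.M_hat
        if self.clipping is not None:
            W_hat = tf.clip_by_value(W_hat, -self.clipping, self.clipping)
            M_hat = tf.clip_by_value(M_hat, -self.clipping, self.clipping)
        W = tf.tanh(W_hat) * tf.sigmoid(M_hat)
        a = tf.matmul(x, W)                                        # add / subtract path
        m = tf.exp(tf.matmul(tf.math.log(tf.abs(x) + 1e-7), W))    # multiply / divide path
        if self.force_operation == "add":
            return a
        if self.force_operation == "mul":
            return m
        if self.input_gate_dependance:
            g = tf.sigmoid(tf.matmul(x, self.G))
        else:
            g = tf.sigmoid(self.g_hat)
        return g * a + (1.0 - g) * m
```

With `force_operation=None` the layer learns through the gate whether to behave additively or multiplicatively; forcing `"add"` or `"mul"` bypasses the gate entirely and constrains the layer to one family of operations.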

FilipKubackiSoton commented 1 year ago

I realized that implementing only iNALU is not flexible enough. Therefore, I would like to close this issue due to the citation threshold. I want to raise a new issue related to NALU itself with configurable parameters that can utilize iNALU improvements.

Community, can I do this without causing too much disruption, or is it better to modify this thread?

AakashKumarNain commented 1 year ago

Please use a single issue for tracking everything related to one topic.

seanpmorgan commented 1 year ago

TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down

Please consider sending feature requests / contributions to other repositories in the TF community with a similar charter to TFA: Keras, Keras-CV, Keras-NLP.