Deep learning seems to contain every algorithm imaginable! ChatGPT mentions the following kinds of slightly less conventional "neural networks":
Capsule Networks (CapsNets)
Neural Turing Machines (NTMs)
Neural Cellular Automata
Neural Architecture Search (NAS)
Graph Neural Networks (GNNs)
Reservoir Computing
Echo State Networks (ESNs)
Transformers (not so unconventional :))
Radial Basis Function Networks (RBFNs)
Neural Differential Equations (ODE-Nets)
Self-Organizing Maps (SOMs)
Extreme Learning Machines (ELMs)
Holographic Neural Networks
Kohonen Networks (Self-Organizing Feature Maps)
Evolving Neural Networks (ENN)
Fuzzy Neural Networks
Counterpropagation Networks (CPNs)
Retrograde Neural Networks (Retros)
Hierarchical Temporal Memory (HTM)
Neural Programmer-Interpreters (NPIs)
And then there are GANs, Autoencoders, Siamese Networks and more.
It is clear, then, that we cannot create components tailored to each network; instead, we must build a system that lets the user write any program they want while adhering to a layer-based paradigm (at least as long as Keras adheres to it). This is not a trivial task: it will span the entire development of DNNCASE (in other words, it will continue into next semester as well), but we must ensure that most, if not all, of the networks listed above can be implemented in DNNCASE, even if the user has to define custom artefacts.
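To make the "custom artefacts within a layer-based paradigm" idea concrete, here is a minimal sketch of how such extensibility could look. This is an illustration, not DNNCASE's actual API: the `Layer`, `Dense`, `RBFLayer`, and `Sequential` names are hypothetical, and the point is simply that a user-defined layer (here, a radial basis function layer, one of the less conventional networks listed above) plugs into the same container as a built-in one.

```python
import numpy as np

class Layer:
    """Minimal layer interface: anything with a forward() method plugs in."""
    def forward(self, x):
        raise NotImplementedError

class Dense(Layer):
    """A stand-in for a conventional, built-in fully connected layer."""
    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((in_dim, out_dim)) * 0.1
        self.b = np.zeros(out_dim)
    def forward(self, x):
        return x @ self.w + self.b

class RBFLayer(Layer):
    """A user-defined 'custom artefact': a radial basis function layer."""
    def __init__(self, centers, gamma=1.0):
        self.centers = np.asarray(centers)  # shape: (n_centers, in_dim)
        self.gamma = gamma
    def forward(self, x):
        # Gaussian activation on squared distances to each centre
        d2 = ((x[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-self.gamma * d2)

class Sequential:
    """Keras-style container: composes any layers honouring the interface."""
    def __init__(self, layers):
        self.layers = layers
    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

# Custom and built-in layers compose freely inside one model.
model = Sequential([RBFLayer(centers=np.eye(3)), Dense(3, 2)])
out = model.forward(np.zeros((4, 3)))
print(out.shape)  # (4, 2)
```

The design choice being sketched is that the system constrains only the *interface* between layers, never their internals, which is what lets one container accommodate RBF networks, capsule layers, or anything else the user cares to define.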
We can achieve this by producing a logical design of DNNCASE's functionality, checking whether anything is unimplementable, and then revising the design to accommodate it.
Note: this is not a very complex task. All we need to do is keep a proper record of which component of the system does what, and how the components interact with each other. We will revisit this when working with artefacts as well.