Closed: nec4 closed this 4 years ago
As for the structure of the filter generating network, here is the diagram from the paper:
This block is characterized by the following sequential operations:
I am thinking:

- `FilterGeneratorNetwork` class in `nnet.py`
- `RBFLayer` class in `layers.py`
- `ShiftedSoftplus` class in `layers.py`
- `CFilterConv` class in `layers.py`

Maybe a class for wrapping the interaction block too. Or perhaps it is easier to adapt from the SchNet code? Not sure if they have a repo somewhere.
See #35.
I made a PR with some pseudo-code to outline the SchNet implementation; see #42.
I think that, now that most of the tools are already implemented via #47 and #53, we don't need this issue anymore.
Further discussions about the implementation/integration continue in issue #80.
see https://aip.scitation.org/doi/10.1063/1.5019779
SchNet is a variant of deep tensor neural networks (DTNNs). The main characteristic of SchNet is its interaction block, which involves the following operations:

![schnet_arch](https://user-images.githubusercontent.com/42926839/59898256-8d453f00-93b5-11e9-82df-f7502229f0d8.png)
The interaction block is characterized by a residual branch:
The output of this residual branch is added back to the input features (x1...xn) from the previous block, which is then forwarded to the next block of the network. Essentially, there are two residual connections to account for: the first connects back to the Cartesian coordinates through the filter generator, while the second connects to the feature inputs from the previous block.
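A minimal NumPy sketch of that double connection (function and argument names here are assumptions, and the atom-wise dense layers of the full interaction block are omitted for brevity):

```python
import numpy as np

def cfconv_residual(x, r, filter_gen):
    # x: (n_atoms, n_features) feature inputs from the previous block
    # r: (n_atoms, n_atoms) interatomic distances, derived from the Cartesian coordinates
    # filter_gen: maps r to per-pair filters of shape (n_atoms, n_atoms, n_features)
    W = filter_gen(r)                   # first connection: back to geometry via the filter generator
    y = np.einsum('jf,ijf->if', x, W)   # continuous-filter convolution: sum_j x_j * W(r_ij)
    return x + y                        # second connection: residual add onto the input features
```

With all-zero filters the block reduces to the identity, which is exactly what the residual formulation buys at initialization.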
Regarding the filter generator: it is essentially used in place of a fixed filter tensor (much like those found in convolutional networks). More precisely, the filter-generating network is a special case of the factorized tensor layer found in traditional DTNN architectures.
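To make that concrete, here is a hedged sketch of a filter-generating network as described in the paper: RBF-expanded distances passed through two dense layers with shifted-softplus nonlinearities. The layer sizes, RBF grid, and random placeholder weights are assumptions, not values from any existing code:

```python
import numpy as np

def make_filter_generator(n_rbf=32, n_features=16, seed=0):
    # Random weights stand in for learned parameters.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.1, size=(n_rbf, n_features))
    W2 = rng.normal(scale=0.1, size=(n_features, n_features))
    mu = np.linspace(0.0, 5.0, n_rbf)   # RBF centers

    def ssp(x):
        # shifted softplus: ln(1 + e^x) - ln 2
        return np.logaddexp(0.0, x) - np.log(2.0)

    def filter_gen(r):
        # RBF expansion of distances, then dense -> ssp -> dense -> ssp
        e = np.exp(-10.0 * (np.asarray(r, dtype=float)[..., None] - mu) ** 2)  # (..., n_rbf)
        return ssp(ssp(e @ W1) @ W2)    # filters W(r_ij), shape (..., n_features)

    return filter_gen
```

Unlike a fixed filter tensor indexed by discrete offsets, this maps any continuous distance to a filter, which is what makes the convolution "continuous-filter".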