keiserlab / keras-neural-graph-fingerprint

Keras implementation of Neural Graph Fingerprints as proposed by Duvenaud et al., 2015
MIT License
46 stars 24 forks

Add support for multiple internal layers. #5

Closed tivaro closed 8 years ago

tivaro commented 8 years ago

Currently, NeuralGraphHidden and NeuralGraphOutput rely on a single internal layer, which can be specified by providing the class dense_layer_type at initialisation.

Internally, a TimeDistributed() wrapper is applied around this class. In order to apply batch normalisation or dropout, it needs to be possible to provide more than one layer.

Dropout, for example, would need to be called just before the Dense layer and pass its input through one-to-one.

There thus needs to be support for multiple internal layers, such as dropout or batchnorm.

The easiest solution I can see right now is to provide a list of layers (initialised, but not built) that are called sequentially. An advantage of this is that the NeuralGraph layers do not have to deal with the kwargs of the internal layers.
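A minimal sketch of the list-of-layers idea, with plain Python callables standing in for Keras layers so it stays dependency-free (the names `apply_inner_layers`, `Scale`, and `Shift` are hypothetical, not part of the library):

```python
class Scale:
    """Stand-in for an inner layer such as Dense (hypothetical)."""
    def __init__(self, factor):
        self.factor = factor

    def __call__(self, x):
        return [v * self.factor for v in x]


class Shift:
    """Stand-in for a second inner layer such as Dropout or BatchNormalization."""
    def __init__(self, offset):
        self.offset = offset

    def __call__(self, x):
        return [v + self.offset for v in x]


def apply_inner_layers(inner_layers, x):
    """Call each user-provided layer in sequence, as the NGF layer would internally."""
    for layer in inner_layers:
        x = layer(x)
    return x


# The caller configures each inner layer itself; the NeuralGraph layer
# never has to forward or interpret the inner layers' kwargs.
out = apply_inner_layers([Scale(2), Shift(1)], [1, 2, 3])
```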

For NeuralGraphOutput this would definitely work, but for NeuralGraphHidden we would somehow have to copy the layers for each timestep, since they are already initialised.
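One way to copy an initialised-but-unbuilt layer is the Keras get_config / from_config convention, which yields a fresh layer with the same configuration but its own (not yet created) weights. A sketch, with a hypothetical `DummyDense` standing in for a real Keras layer:

```python
class DummyDense:
    """Stand-in for keras.layers.Dense, implementing the Keras config protocol."""
    def __init__(self, units, activation="relu"):
        self.units = units
        self.activation = activation

    def get_config(self):
        return {"units": self.units, "activation": self.activation}

    @classmethod
    def from_config(cls, config):
        return cls(**config)


def copy_layer(layer):
    """Create a fresh, unbuilt copy of a layer with the same configuration."""
    return layer.__class__.from_config(layer.get_config())


template = DummyDense(64)
max_degree = 5
# One independent copy per timestep/degree, each getting its own weights once built.
copies = [copy_layer(template) for _ in range(max_degree)]
```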

tivaro commented 8 years ago

This can be arranged outside of NeuralGraphHidden (e.g. as is the case in NGF/models.py).