GateNLP / gate-lf-pytorch-json

PyTorch wrapper for the LearningFramework GATE plugin
Apache License 2.0

Calculate / override LSTM hidden layer size, override architecture details #2

Open johann-petrak opened 6 years ago

johann-petrak commented 6 years ago

The default size of the LSTM hidden layer should be calculated from the LSTM input size and possibly other factors. It should also be overridable via the modelwrapper constructor's config settings.

This requires that every layer we create can be identified uniquely by the name shown when the network is printed, and that this name can then be used for setting the layer's parameters, e.g. "{lstm:{hiddenunits=200, bidirectional=True}}"
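A minimal sketch of what this could look like, assuming the config arrives as a nested dict keyed by layer name (the `lstm` key and the `hiddenunits`/`bidirectional` parameters follow the example above; the fallback heuristic for the default size is hypothetical and not from this repo):

```python
import torch.nn as nn


def lstm_hidden_size(input_size, config=None):
    """Pick an LSTM hidden size: use an explicit override from the config
    if present, otherwise fall back to a simple heuristic derived from the
    input size (here: clamp 2*input_size into the range [32, 200])."""
    config = config or {}
    override = config.get("lstm", {}).get("hiddenunits")
    if override is not None:
        return int(override)
    # Hypothetical default heuristic based on the LSTM input size.
    return max(32, min(2 * input_size, 200))


def build_lstm(input_size, config=None):
    """Create the LSTM layer, letting the config override its parameters."""
    config = config or {}
    lstm_cfg = config.get("lstm", {})
    return nn.LSTM(
        input_size=input_size,
        hidden_size=lstm_hidden_size(input_size, config),
        bidirectional=bool(lstm_cfg.get("bidirectional", False)),
        batch_first=True,
    )


# Override the defaults from a config dict keyed by layer name, mirroring
# the issue's example setting {lstm:{hiddenunits=200, bidirectional=True}}:
lstm = build_lstm(
    input_size=100,
    config={"lstm": {"hiddenunits": 200, "bidirectional": True}},
)
# Printing the network shows the layer under its unique name, which is what
# would let a user target it for configuration.
print(lstm)
```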

johann-petrak commented 6 years ago

Depends on #3