AntreasAntoniou / HowToTrainYourMAMLPytorch

The original code for the paper "How to train your MAML" along with a replication of the original "Model Agnostic Meta Learning" (MAML) paper in Pytorch.
https://arxiv.org/abs/1810.09502

How to add an LSTM layer as a meta-learner #18

Open boy-be-ambitious opened 5 years ago

boy-be-ambitious commented 5 years ago

Most MAML code uses CNNs and relies on nn.functional to implement the forward pass, but sometimes we need an LSTM or another module that has no nn.functional equivalent. I tried repeatedly loading the network parameters in the inner loop, but the accuracy doesn't improve.

AntreasAntoniou commented 5 years ago

You would need to build an LSTM in the same flavour as the 'Meta' layers. This can be achieved by diving into the nn.LSTM layer and creating a new layer that has the same functionality but can also receive parameters to use for its internal functions.
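To make the idea concrete, here is a minimal sketch of what such a layer could look like. It is not part of this repository; the class name, the `params` dictionary keys, and the inner-loop snippet further down are illustrative assumptions. The key point is that the forward pass accepts an optional set of externally supplied (fast) weights instead of always reading its own attributes, which is what lets the inner loop run with adapted parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaLSTMCell(nn.Module):
    """A single LSTM cell in the spirit of the repo's Meta* layers:
    the forward pass can optionally use externally supplied parameters
    (e.g. fast weights produced by the inner loop)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Same layout as nn.LSTMCell: the four gates (input, forget,
        # cell, output) are stacked along dim 0.
        self.weight_ih = nn.Parameter(0.01 * torch.randn(4 * hidden_size, input_size))
        self.weight_hh = nn.Parameter(0.01 * torch.randn(4 * hidden_size, hidden_size))
        self.bias_ih = nn.Parameter(torch.zeros(4 * hidden_size))
        self.bias_hh = nn.Parameter(torch.zeros(4 * hidden_size))

    def forward(self, x, state, params=None):
        # Use the adapted (fast) weights if the inner loop provides them,
        # otherwise fall back to the module's own (slow) weights.
        if params is None:
            w_ih, w_hh = self.weight_ih, self.weight_hh
            b_ih, b_hh = self.bias_ih, self.bias_hh
        else:
            w_ih, w_hh = params['weight_ih'], params['weight_hh']
            b_ih, b_hh = params['bias_ih'], params['bias_hh']

        h, c = state
        # Standard LSTM cell equations, written functionally so the weights
        # can come from anywhere.
        gates = F.linear(x, w_ih, b_ih) + F.linear(h, w_hh, b_hh)
        i, f, g, o = gates.chunk(4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        c_next = f * c + i * g
        h_next = o * torch.tanh(c_next)
        return h_next, c_next
```

An inner-loop step would then compute gradients with respect to the current fast weights and pass the updated dictionary back into the cell (again a hedged sketch, with `inner_lr` and the dictionary construction as assumptions):

```python
# Second-order MAML needs create_graph=True so the outer loop can
# backpropagate through the inner update.
params = dict(cell.named_parameters())
grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
params = {name: p - inner_lr * g for (name, p), g in zip(params.items(), grads)}
h, c = cell(x_t, (h, c), params=params)
```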

tk1363704 commented 4 years ago

> You would need to build an LSTM in the same flavour as the 'Meta' layers. This can be achieved by diving into the nn.LSTM layer and creating a new layer that has the same functionality but can also receive parameters to use for its internal functions.

I am facing the same problem, but I don't understand what 'diving into the nn.LSTM layer and creating a new layer that has the same functionality but can also receive parameters to use for its internal functions' means. If it's not too much trouble, could you please explain it in more detail? Cheers!