schelotto / Neural_Speed_Reading_via_Skim-RNN_PyTorch

PyTorch implementation of "Neural Speed Reading via Skim-RNN"
MIT License

Introduction

This is a PyTorch implementation of Neural Speed Reading via Skim-RNN, published at ICLR 2018.

The IMDB dataset is used by default and stored in the ./data folder. In addition, the 300-dimensional GloVe word embeddings trained on a corpus of 840 billion tokens are used.

Unlike Skip RNN or Jump LSTM, whose skipping decisions are discrete and non-differentiable, Skim RNN uses the Gumbel-softmax reparameterization trick to make the skimming objective differentiable:

Gumbel-softmax: y_i = exp((log p_i + g_i) / τ) / Σ_j exp((log p_j + g_j) / τ), where the g_i are i.i.d. Gumbel(0, 1) samples and τ is the temperature.
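As a minimal sketch (using PyTorch's built-in `torch.nn.functional.gumbel_softmax`, not this repository's code; the logits here are illustrative), the binary read/skim decision at one time step can be relaxed like this:

```python
import torch
import torch.nn.functional as F

# Hypothetical logits for the two choices at one time step:
# index 0 = update the large LSTM (read), index 1 = the small LSTM (skim).
logits = torch.tensor([[1.2, -0.3]])

# Soft, differentiable sample used during training; each row sums to 1.
soft = F.gumbel_softmax(logits, tau=0.5, hard=False)

# hard=True emits a one-hot decision in the forward pass while the
# backward pass uses the soft sample's gradients (straight-through).
hard = F.gumbel_softmax(logits, tau=0.5, hard=True)
```

Lowering the temperature `tau` pushes the soft sample closer to a discrete one-hot choice, at the cost of higher-variance gradients.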

Usage

python main.py [arguments]

Arguments

-h, --help                  show this help message
-large_cell_size            size of the large LSTM
-small_cell_size            size of the small LSTM
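For example, a run with explicit cell sizes (the flag names come from the argument list above; the values are illustrative, not documented defaults):

```shell
# Large LSTM of size 100, small (skimming) LSTM of size 5 -- illustrative values
python main.py -large_cell_size 100 -small_cell_size 5
```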