harvardnlp / seq2seq-attn

Sequence-to-sequence model with LSTM encoder/decoders and attention
http://nlp.seas.harvard.edu/code
MIT License
1.26k stars 278 forks

Can prune.lua apply to the model trained by OpenNMT #91

Closed IdiosyncraticDragon closed 7 years ago

IdiosyncraticDragon commented 7 years ago

The README of this project says that the OpenNMT project is a rewrite of seq2seq-attn, so the two projects seem to be related. Is it possible to directly apply prune.lua from this project to models trained by OpenNMT? I tried, but it failed. Here is the error message:

```
➜ seq2seq-attn git:(master) : th prune.lua -model ../OpenNMT/model_epoch13_861.69.t7 -savefile test -ratio 0.6 -prune blind -gpuid 1
using CUDA on GPU 1...
loading model ../OpenNMT/model_epoch13_861.69.t7
/home/gitProject/torch/install/bin/luajit: /home/gitProject/torch/install/share/lua/5.1/torch/File.lua:343: unknown Torch class <Dict>
stack traceback:
	[C]: in function 'error'
	/home/gitProject/torch/install/share/lua/5.1/torch/File.lua:343: in function 'readObject'
	/home/gitProject/torch/install/share/lua/5.1/torch/File.lua:369: in function 'readObject'
	/home/gitProject/torch/install/share/lua/5.1/torch/File.lua:369: in function 'readObject'
	/home/gitProject/torch/install/share/lua/5.1/torch/File.lua:369: in function 'readObject'
	/home/gitProject/torch/install/share/lua/5.1/torch/File.lua:409: in function 'load'
	prune.lua:87: in function 'main'
	prune.lua:122: in main chunk
	[C]: in function 'dofile'
	...ject/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
	[C]: at 0x00406670
```

guillaumekln commented 7 years ago

The two projects are indeed related but not compatible with each other.
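For context, a minimal sketch of why the load fails. Torch's serializer can only deserialize objects whose classes have been registered with `torch.class`; the OpenNMT checkpoint contains serialized vocabulary objects of a class named `Dict` that seq2seq-attn never defines, so `File.lua` raises `unknown Torch class <Dict>` inside `torch.load`. The snippet below illustrates the mechanism only; the class body and field names are hypothetical, not OpenNMT's actual layout:

```lua
local torch = require('torch')

-- Register a class named 'Dict' with Torch's serializer. A checkpoint
-- containing instances of 'Dict' can only be read after some definition
-- of the class exists; without one, torch.load errors with
-- "unknown Torch class <Dict>".
local Dict = torch.class('Dict')

function Dict:__init()
  self.idxToLabel = {}  -- illustrative field, not OpenNMT's real structure
end

-- With the class registered, deserialization would proceed past the
-- error, though the restored object is only usable if the definition
-- mirrors the one used when the model was saved:
-- local checkpoint = torch.load('../OpenNMT/model_epoch13_861.69.t7')
```

In practice this is why the two codebases are incompatible: even if the class were registered, the checkpoint layouts and model code differ, so pruning an OpenNMT model would need OpenNMT's own tooling.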

IdiosyncraticDragon commented 7 years ago

@guillaumekln I see. Thank you!