Closed: Tiiiger closed this issue 4 years ago.
Hi @Tiiiger, thanks for the interest in our project.
You can find the Gigaword experiments here: https://github.com/ddkang/OpenNMT-py/tree/ll-compute
Please note that the parameter c in that branch corresponds to 1 - c in this repository.
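To make the mapping concrete: since the two conventions are complements, a value passed in the branch translates to one minus that value here. A minimal shell sketch of the conversion (the variable names are illustrative, not from the codebase):

```shell
# Convert the branch's c to this repository's convention: c_repo = 1 - c_branch.
c_branch=0.999   # e.g. the value passed via --expc in the example run below
c_repo=$(awk -v c="$c_branch" 'BEGIN { printf "%.3f", 1 - c }')
echo "$c_repo"   # prints 0.001
```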
As an example run:
time python ../OpenNMT-py/train.py -gpu_ranks 0 -world_size 1 -data sumdata/train/GIGA \
-save_model $OUTDIR/giga_new \
-encoder brnn \
-bridge \
-train_steps $NBTRAIN \
--expc 0.999 --min_count 10000 \
--use_dropper True \
--use_sent_norm False \
--learning_rate 0.1 \
--train_from "./models/hotstart_base/start_1x.pt" \
--dropc 0.4
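Note that the command above assumes $OUTDIR and $NBTRAIN are already set in the environment; neither value is stated in the thread, so the ones below are placeholders you would adjust to your own setup:

```shell
# Placeholder values, not taken from the thread; adjust to your setup.
OUTDIR=./models/giga     # directory where checkpoints will be written
NBTRAIN=200000           # total number of training steps (hypothetical)
echo "Saving to $OUTDIR/giga_new for $NBTRAIN steps"
```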
Thank you!
Hi @ddkang,
Thank you for making the code available!
Quick question: do you plan to release the code for the experiments on Gigaword and E2E?