ddkang / loss_dropper


Code Release for Experiments #1

Closed. Tiiiger closed this issue 4 years ago.

Tiiiger commented 4 years ago

Hi @ddkang,

Thank you for making the code available!

Quick question: do you plan to release the code for the experiments on Gigaword and E2E?

ddkang commented 4 years ago

Hi @Tiiiger, thanks for your interest in our project.

You can find the Gigaword experiments here: https://github.com/ddkang/OpenNMT-py/tree/ll-compute

Please note that the `c` in that branch is `1 - c` in this repository.
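
As a quick illustration of that mapping (the values here are arbitrary, not from the paper):

    # A value of c in the ll-compute branch corresponds to 1 - c here.
    c_branch = 0.6
    c_repo = 1.0 - c_branch  # 0.4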

ddkang commented 4 years ago

Here is an example run:

    time python ../OpenNMT-py/train.py -gpu_ranks 0 -world_size 1 -data sumdata/train/GIGA \
      -save_model $OUTDIR/giga_new \
      -encoder brnn \
      -bridge \
      -train_steps $NBTRAIN \
      --expc 0.999 --min_count 10000 \
      --use_dropper True \
      --use_sent_norm False \
      --learning_rate 0.1 \
      --train_from "./models/hotstart_base/start_1x.pt" \
      --dropc 0.4
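
In the command, `$OUTDIR` and `$NBTRAIN` are shell variables for the output directory and the number of training steps. On the loss side, the dropper from this repository is wired in roughly as follows. This is a minimal sketch following the README's usage pattern rather than the exact experiment code: `truncated_loss` is a placeholder name, the tensor shapes are illustrative, and the `LossDropper` keywords are assumed to correspond to the `--dropc` and `--min_count` flags above.

    import torch.nn as nn
    from loss_dropper import LossDropper

    # dropc: fraction of highest-loss sequences to drop (0.4 = drop 40%).
    # min_count: number of losses observed before truncation kicks in.
    criterion = nn.NLLLoss(reduction='none')  # keep per-token losses
    dropper = LossDropper(dropc=0.4, min_count=10000)

    def truncated_loss(log_probs, targets, batch_size):
        # log_probs: (seq_len * batch, vocab); targets: (seq_len * batch,)
        loss = criterion(log_probs, targets)
        loss = loss.view(-1, batch_size)      # (seq_len, batch)
        loss = loss.mean(dim=0)               # per-sequence loss
        mask = dropper(loss)                  # 0 where a sequence is dropped
        return (loss * mask).mean()           # truncated loss to backprop

Dropped sequences contribute zero to the final mean, which is what removes their gradient from the update.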
Tiiiger commented 4 years ago

Thank you!