nlpyang / PreSumm

Code for the EMNLP 2019 paper *Text Summarization with Pretrained Encoders*
MIT License

How does the loss back-propagate when using multiple GPUs for the extractive task? #215

Open gaozhiguang opened 3 years ago

gaozhiguang commented 3 years ago

Hi, I read your code and I have a question: how does the loss back-propagate when using multiple GPUs for the extractive task? Here is your code: [code screenshot]. Here, the back-propagation is only the local back-propagation within one process; the distributed module is not used to combine the losses (gradients) of the processes running on the different GPUs. This is a little different from your code for the abstractive task.
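
For comparison, if I read the abstractive trainer correctly, after `loss.backward()` it calls `distributed.all_reduce_and_rescale_tensors` on the gradients, which sums them across processes before the optimizer step. Below is a minimal sketch (my own placeholder code, not the repo's actual implementation; `model`, `all_reduce_gradients`, and the loop comments are illustrative) of what that gradient all-reduce amounts to with plain `torch.distributed`:

```python
# Sketch of per-step gradient synchronization across GPUs, assuming
# torch.distributed has already been initialized (one process per GPU).
import torch
import torch.distributed as dist

def all_reduce_gradients(model: torch.nn.Module, world_size: int) -> None:
    # Sum each parameter's gradient over all processes, then rescale
    # so every rank holds the average gradient over the global batch.
    for p in model.parameters():
        if p.grad is not None:
            dist.all_reduce(p.grad.data, op=dist.ReduceOp.SUM)
            p.grad.data.div_(world_size)

# Inside each process's training loop:
#   loss.backward()                                   # local gradients only
#   all_reduce_gradients(model, dist.get_world_size())  # sync across GPUs
#   optimizer.step()                                  # identical update on all ranks
```

Note that only the gradients need to be averaged; the loss values themselves never have to be exchanged for the update to be correct. Without such an all-reduce after `backward()`, each process would step its optimizer using only the gradients from its local shard of the batch, and the model copies on different GPUs could diverge.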