lesliejackson / PyTorch-Distributed-Training
Example of PyTorch DistributedDataParallel
About batch_size #3 (Open)

qiusg opened this issue 3 years ago

qiusg commented 3 years ago:
If the batch_size passed in is 100 and there are two GPUs, does each GPU get 50 or 100?
lesliejackson commented 3 years ago:
The batch size is split evenly across the GPUs.
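To make the answer concrete, here is a minimal sketch of how a global batch size can be divided evenly across ranks in a DistributedDataParallel setup, as the reply above describes. The function name `build_loader` and its arguments are illustrative assumptions, not necessarily what this repository uses.

```python
import torch.distributed as dist
from torch.utils.data import DataLoader, DistributedSampler

def build_loader(dataset, global_batch_size, num_workers=4):
    # Assumes torch.distributed has already been initialized (one process per GPU).
    world_size = dist.get_world_size()               # e.g. 2 GPUs
    per_gpu_batch = global_batch_size // world_size  # 100 // 2 = 50 samples per GPU
    sampler = DistributedSampler(dataset)            # each rank reads a disjoint shard
    return DataLoader(dataset,
                      batch_size=per_gpu_batch,      # each rank loads 50 samples per step
                      sampler=sampler,
                      num_workers=num_workers,
                      pin_memory=True)
```

Note the alternative convention: if a script instead passes `batch_size=100` directly to each rank's DataLoader, every GPU processes 100 samples per step and the effective global batch becomes 100 × world_size. The maintainer's reply indicates this repository follows the first convention, where 100 is the total across both GPUs.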