bigscience-workshop/Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Add option to normalize loss per target #326
Status: Closed. Muennighoff closed this issue 1 year ago.