liuwei1206 / LEBERT

Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"

When running with fp16, some variables are not defined. #34

Closed lvjiujin closed 2 years ago

lvjiujin commented 2 years ago
    if args.fp16 and _use_native_amp:
        scaler.scale(loss).backward()

It seems you forgot to define `scaler`. It can be defined as follows: `scaler = torch.cuda.amp.GradScaler()`
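For context, here is a minimal sketch of the native-AMP training step this snippet belongs to, showing where the `GradScaler` should be created. The model, optimizer, and data here are placeholders, not from this repo; the sketch is guarded with `enabled=torch.cuda.is_available()` so it also runs on CPU:

```python
import torch

# Placeholder model and optimizer (not from LEBERT).
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# The missing definition: create the GradScaler once, before the training loop.
use_amp = torch.cuda.is_available()
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

inputs = torch.randn(8, 4)
targets = torch.randn(8, 2)

for _ in range(2):
    optimizer.zero_grad()
    # Forward pass under autocast so eligible ops run in fp16.
    with torch.cuda.amp.autocast(enabled=use_amp):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()  # the line from the issue
    scaler.step(optimizer)         # unscales gradients, then optimizer.step()
    scaler.update()                # adjusts the loss scale for the next step
```

Without the `scaler = torch.cuda.amp.GradScaler()` line, `scaler.scale(loss)` raises a `NameError` exactly as reported.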

liuwei1206 commented 2 years ago

Hi,

Thanks for your correction and sharing!