Closed: essential-gx closed this issue 2 years ago
Error: FP16_Optimizer is not an Optimizer
My code raises this error at the following line:

    lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
        optimizer.optimizer, cfg.TRAIN.LR_STEP, cfg.TRAIN.LR_FACTOR, last_epoch=last_epoch
    )

The 'optimizer.optimizer' attribute is missing.
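A minimal sketch of the usual workaround, assuming the older apex.fp16_utils.FP16_Optimizer wrapper (which stores the wrapped optimizer in its .optimizer attribute). The toy model, the SGD optimizer, and the milestone/gamma values below are placeholders standing in for the project's cfg.TRAIN.* settings, not code from this repository:

```python
# Sketch only: assumes apex.fp16_utils.FP16_Optimizer (older apex API), which keeps
# the wrapped optimizer in its .optimizer attribute.
import torch
from apex.fp16_utils import FP16_Optimizer

model = torch.nn.Linear(10, 2).cuda().half()          # placeholder model
base_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# FP16_Optimizer wraps the real optimizer but is not a torch.optim.Optimizer subclass,
# which is why MultiStepLR rejects it with "FP16_Optimizer is not an Optimizer".
optimizer = FP16_Optimizer(base_optimizer, static_loss_scale=128.0)

# Build the scheduler on the wrapped optimizer; fall back to the optimizer itself
# when fp16 is disabled and there is no .optimizer attribute.
scheduler_target = getattr(optimizer, "optimizer", optimizer)
lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
    scheduler_target,
    milestones=[90, 110],   # placeholder for cfg.TRAIN.LR_STEP
    gamma=0.1,              # placeholder for cfg.TRAIN.LR_FACTOR
    last_epoch=-1,          # placeholder for last_epoch
)
```

If fp16 is disabled and optimizer is a plain torch.optim optimizer, it has no .optimizer attribute, which may be why 'optimizer.optimizer' appears to be missing; the getattr fallback keeps the same scheduler call working in both cases.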