Closed: Liupei-Luna closed this issue 1 year ago
By default, we finetune our model based on the previous optimizer weights. If you are using our provided weights, which do not include the optimizer state, you can comment out this line: https://github.com/ewrfcas/MVSFormer/blob/a635b1714a51948f6e808766e9dafa73d05ed51a/train.py#L117
OK, but another error appears at this line: self.scaler.step(self.optimizer)
    assert self._scale is not None, "Attempted {} but _scale is None.".format(funcname)
AssertionError: Attempted step but _scale is None. This may indicate your script did not use scaler.scale(loss or outputs) earlier in the iteration.
Strange problem.
If you have set fp16, scaler.scale should be run here: https://github.com/ewrfcas/MVSFormer/blob/77abe395cc32844478733e00aed9f73ad40ba05f/trainer/mvsformer_trainer.py#L134-L137
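For context on the assertion itself: GradScaler lazily initialises its internal _scale the first time scaler.scale() is called, so calling scaler.step() in an iteration where scale() never ran trips exactly this check. A toy model of that contract (ToyScaler and its scale value are illustrative, not PyTorch internals):

```python
class ToyScaler:
    """Toy model of the GradScaler step() precondition.
    NOT PyTorch code; it only mirrors the assertion's logic."""

    def __init__(self):
        self._scale = None  # lazily initialised by the first scale() call

    def scale(self, loss):
        if self._scale is None:
            self._scale = 2.0 ** 16  # illustrative initial scale factor
        return loss * self._scale

    def step(self, optimizer):
        # Mirrors the reported check: step() without a prior scale() fails.
        assert self._scale is not None, "Attempted step but _scale is None."
        optimizer.step()
```

So the assertion is a symptom, not the bug: it means the scale/backward side and the step side of the training loop disagreed about whether AMP was in use.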
The default setting is fp16 = true; can it be changed to false? My code is the same as yours:

    if self.fp16:
        self.scaler.scale(loss).backward()
    else:
        loss.backward()

but the error still appears.
fp16 can be set to False; it is only used to save memory and speed up training.
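A common way to hit the reported assertion is gating only the backward pass on fp16 while still calling scaler.step() unconditionally later in the loop. A minimal sketch of a consistently gated step (training_step is a hypothetical helper, not the repo's code):

```python
def training_step(loss, optimizer, scaler, fp16):
    """Run one optimisation step, gating ALL scaler calls on the same flag."""
    optimizer.zero_grad()
    if fp16:
        scaler.scale(loss).backward()  # scale() must precede scaler.step()
        scaler.step(optimizer)
        scaler.update()
    else:
        loss.backward()                # no scaler anywhere on this path
        optimizer.step()
```

If the else branch above still went through scaler.step(optimizer), the scaler would never have seen a scale() call and would raise the exact "Attempted step but _scale is None" error from this thread.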
The same error has occurred: AssertionError: Attempted step but _scale is None. This may indicate your script did not use scaler.scale(loss or outputs) earlier in the iteration. @Liupei-Luna I would like to ask if the problem has been solved? Thanks. @ewrfcas I use the default config "config_mvsformer-p.json"; only the batch size is changed to 2 (because I only have 2 RTX 3090 cards with 24 GB memory each).
When training with is_finetune = True, a KeyError: 'optimizer' appears.
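That KeyError comes from restoring the optimizer state out of a checkpoint dict that only contains model weights, which matches the note at the top of this thread that the provided pretrained weights ship without the optimizer. A hedged sketch of a guarded restore (load_optimizer_state is a hypothetical helper; checkpoint is assumed to be the dict returned by torch.load):

```python
def load_optimizer_state(optimizer, checkpoint):
    """Restore optimizer state only if the checkpoint actually contains it.

    Returns True when state was restored, False when the checkpoint holds
    model weights only and the optimizer should start fresh.
    """
    state = checkpoint.get("optimizer")  # .get() avoids the KeyError
    if state is not None:
        optimizer.load_state_dict(state)
        return True
    return False
```

With a guard like this, finetuning from weights-only checkpoints works without commenting out the restore line by hand.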