drkp4 closed this issue 2 years ago
@drkp4 Yes, either should work, because calling self(inputs) simply passes the input through the forward method. That said, the PyTorch Lightning documentation suggests:

"In Lightning we suggest separating training from inference. The training_step defines the full training loop. We encourage users to use the forward to define inference actions. Of course, nothing is preventing you from using forward from within the training_step. It really comes down to your application. We do, however, recommend that you keep both intents separate. Use forward for inference (predicting). Use training_step for training."
So, it's more of a guideline than a rule.
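For completeness, here is a minimal sketch of the separation the docs describe; the SimpleRegressor module, its layer sizes, and the loss are made up for illustration. forward() holds the prediction path, and training_step() reuses it through self(inputs).

```python
import torch
from torch import nn
import pytorch_lightning as pl


class SimpleRegressor(pl.LightningModule):
    """Illustrative module: forward() defines inference, training_step() reuses it."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(8, 1)

    def forward(self, x):
        # Inference / prediction path.
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        inputs, targets = batch
        # self(inputs) dispatches to forward() via nn.Module.__call__;
        # self.forward(inputs) would give the same outputs here.
        outputs = self(inputs)
        loss = nn.functional.mse_loss(outputs, targets)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```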
Closing as no further response was received.
Shouldn't it be

def training_step(self, batch, batch_idx):
    outputs = self.forward(inputs)

instead of outputs = self(inputs)?