Closed: etetteh closed this issue 3 years ago
Dear @etetteh,
Here is the correct signature for training_step_end. It takes the outputs from training_step so you can do further processing on them; batch_idx shouldn't be in the signature.
```python
import torch

def training_step_end(self, outputs):
    # only used when on dp
    outputs = torch.cat(outputs, dim=1)
    softmax = torch.softmax(outputs, dim=1)
    out = softmax.mean()
    return out
```
A small piece of advice: when implementing a model with Lightning, use your IDE's auto-completion. It will automatically insert the right function signature.
Best, T.C
Thanks for the advice on the IDE. The idx issue got resolved, but the training_step_end function is not performing the expected job. My goal is to get the loss and y0 from training_step so I can perform the operations in training_step_end.
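One way to move extra values from training_step into training_step_end is to return them together in a dict; Lightning passes whatever training_step returns on to training_step_end. Below is a minimal pure-Python sketch of that data flow, assuming that convention. The class name LitSketch, the plain-float "loss", and "y0" are placeholders; real code would subclass pl.LightningModule and return tensors.

```python
# Sketch: pass loss and an extra output y0 from training_step to
# training_step_end via a dict. Plain floats stand in for tensors so
# the data flow is easy to follow; LitSketch is a hypothetical name.
class LitSketch:
    def training_step(self, batch, batch_idx):
        # hypothetical loss and extra output computed from the batch
        loss = sum(batch) / len(batch)
        y0 = batch[0]
        # return everything training_step_end will need, under named keys
        return {"loss": loss, "y0": y0}

    def training_step_end(self, outputs):
        # receives the dict returned by training_step
        # (on dp it would be a per-GPU collection instead)
        loss, y0 = outputs["loss"], outputs["y0"]
        # ... do the extra processing with y0 here ...
        return {"loss": loss}

model = LitSketch()
step_out = model.training_step([1.0, 2.0, 3.0], batch_idx=0)
final = model.training_step_end(step_out)
print(final["loss"])  # → 2.0
```

The key point is that training_step's return value is the only channel into training_step_end, so anything you need there (y0 included) must be part of that returned dict.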
This is my first time using Lightning. Basically, I am trying to convert the following code into Lightning format:
What I have done so far is this:
When I run the code, I get the error: