Hi, I'd like to implement a hook that saves hard training samples, choosing them by their loss value. Training my model in batches returns either a cumulative loss (mean/sum) or, when using reduction='none', raw loss tensors for my 5 feature maps, flattened into (n_batches*n_anchors, 1) shape. To be clear, I'm getting these from a forward pass calling losses = runner.model(**runner.data_batch).
My question is: how do I unpack these flattened tensors to get the sample-wise losses? (e.g. a tensor of shape (n_batches, n_anchors) or (n_batches, n_anchors_row, n_anchors_column), or similar).
@marselap This requires a lot of code changes. The most important thing is that you need to return the sample-wise loss in the loss calculation function.
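In case it helps, here is a minimal sketch of the general idea, independent of this codebase: compute the loss with reduction='none', reshape the flat tensor back to (batch, anchors), and reduce over the anchor dimension to get one loss per sample. The shapes and the use of cross-entropy here are illustrative assumptions, not the actual loss of the model discussed above.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: per-anchor losses that arrive flattened as
# (batch_size * n_anchors,), as described in the question.
batch_size, n_anchors, n_classes = 4, 6, 3

logits = torch.randn(batch_size * n_anchors, n_classes)
targets = torch.randint(0, n_classes, (batch_size * n_anchors,))

# reduction='none' keeps one loss value per anchor instead of a scalar.
flat_loss = F.cross_entropy(logits, targets, reduction='none')  # (batch*anchors,)

# Unflatten back to (batch_size, n_anchors), then reduce per sample.
per_anchor = flat_loss.view(batch_size, n_anchors)
per_sample = per_anchor.mean(dim=1)  # one loss value per image in the batch
```

This only works if the flattening order is known (here, anchors contiguous per sample); if the framework interleaves samples and anchors differently, the view() call must match that layout.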