Closed paul-bd closed 2 years ago

Hi!

Maybe this is of my poor understanding, but I would find it more "logical" to unindent lines 148 to 157 in blob_loss.py, so that the element_blob_loss is only added once for each blob in the batch. Is there a specific reason why it is at this indentation level?

Best
Thanks for your interest in blob loss.
"Maybe this is of my poor understanding"
It might also be my poor communication. This is actually where the looping through the blobs in the batch starts (line 110).
Now, a batch can consist of multiple elements. In the example from the paper, an element might be an MS brain with n blobs. We loop through all the elements/items in the batch and through each blob in each item. In the end, we compute the mean loss over all elements in the batch.
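A minimal sketch of that aggregation scheme. The names here (criterion, predictions, blob_masks_per_element) are illustrative assumptions, not the actual blob_loss.py API:

```python
import torch

def batch_blob_loss(criterion, predictions, blob_masks_per_element):
    """Mean over elements of the mean per-blob loss within each element.

    criterion: any per-blob loss, e.g. a Dice loss (hypothetical signature).
    predictions: list of per-element predictions.
    blob_masks_per_element: list of lists of binary blob masks, one list per element.
    """
    element_losses = []
    for prediction, blob_masks in zip(predictions, blob_masks_per_element):
        # one loss per blob / connected component ("ula") in this element
        blob_losses = [criterion(prediction, mask) for mask in blob_masks]
        # per-element mean is taken once, AFTER the loop over blobs
        element_losses.append(torch.stack(blob_losses).mean())
    # batch loss: mean over all elements
    return torch.stack(element_losses).mean()
```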
Maybe mean_element_blob_loss should be renamed to batch_blob_loss so it becomes clearer?
Well, indeed. But let's take a quick example of a brain (batch of 1) with 5 multiple sclerosis lesions (5 blobs / connected components / ulas), with the Dice loss as criterion.

For each ula, let's say the respective blob losses are label_loss = [0.9, 0.8, 0.7, 0.6, 0.5].

With the actual implementation, element_blob_loss is a list appended at each ula with the running mean, giving [0.9, 0.85, 0.8, 0.75, 0.7], so mean_element_blob_loss = 0.8. With my suggested implementation it would be 0.7 (the plain average of the blob losses). So presently, aren't the first ulas weighted to impact the final loss more?
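A few lines that reproduce these numbers, assuming the bug is that the mean is taken inside the blob loop so a running mean is appended after every ula (variable names mirror the ones above but are illustrative):

```python
import statistics

label_loss = [0.9, 0.8, 0.7, 0.6, 0.5]  # per-blob Dice losses for the 5 ulas

# buggy aggregation: mean taken INSIDE the blob loop,
# so the running mean is appended after every ula
element_blob_loss = []
for i in range(1, len(label_loss) + 1):
    element_blob_loss.append(statistics.mean(label_loss[:i]))
print(element_blob_loss)                   # ≈ [0.9, 0.85, 0.8, 0.75, 0.7]
print(statistics.mean(element_blob_loss))  # ≈ 0.8 -> early blobs weigh more

# suggested aggregation: one mean over the per-blob losses, after the loop
print(statistics.mean(label_loss))         # ≈ 0.7
```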
Sorry for the late response, I was busy submitting my PhD thesis and starting a new job. You are completely right, this should not be part of the for loop, and luckily it was not during the trainings.

This seems to have somehow happened during the copy-pasting for the GitHub release. I looked at the code in my IDE and everything looked fine, but now I see what you mean when looking at the GitHub version.

Lesson learned for me: releasing code through the GitHub web editor is dangerous.
Thanks once again for reporting!
closing here, feel free to reopen :)