Hi, thank you for your U-net implementation.
I am using your code to segment the aortic lumen from CTA images. I've tried both cross-entropy and Dice losses, with poor results. I think the reason is that the classes are highly imbalanced.
I have implemented my own loss function to separately handle the class errors in this way:
I have used layers=4, features_root=64 and 400 iterations with batch_size = 6.
After the first epoch I get this result, where the last column shows the output map thresholded at values > 0.7:
The average loss after the first epoch is low (0.12).
I think the network is segmenting the aorta based only on the gray levels, without considering the contextual information. Do you think the problem is in my loss function?
```python
elif cost_name == "my_cost":
    eps = 1e-5
    # Define error in aorta identification
```
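The snippet above is truncated, so to be clear about the idea: I average the error over each class's own pixels and then combine them, so the small aorta class is not swamped by the background. Here is a simplified NumPy sketch of that idea (not my actual TensorFlow code; the function name and weighting are just illustrative):

```python
import numpy as np

def class_balanced_loss(probs, labels, eps=1e-5):
    """Handle the aorta and background errors separately:
    average the cross-entropy over each class's own pixels, then sum,
    so each class contributes equally regardless of pixel count.
    probs:  predicted foreground probabilities in (0, 1)
    labels: binary ground truth (1 = aorta, 0 = background)
    """
    fg = labels == 1
    bg = labels == 0
    # error on aorta pixels (missed lumen)
    fg_err = -np.log(probs[fg] + eps).mean() if fg.any() else 0.0
    # error on background pixels (false positives)
    bg_err = -np.log(1.0 - probs[bg] + eps).mean() if bg.any() else 0.0
    return fg_err + bg_err
```

A confident, correct prediction gives a loss near zero, while a uniform 0.5 prediction gives about 1.39, independent of how few aorta pixels there are.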
Thanks for your help,
Alice