Nikronic / CoarseNet

Modified version of U-net based on "Convolutional Networks for Biomedical Image Segmentation (Ronneberger et al., 2015)" paper.
https://arxiv.org/abs/1505.04597
MIT License

Loss function has been implemented wrongly #7

Closed Nikronic closed 5 years ago

Nikronic commented 5 years ago

The loss function of CoarseNet is adapted from the VGG-based style-transfer loss, which means we first have to map the Ground Truth image and the Estimated image to low-spatial-resolution feature tensors, then compute the loss using the Gram matrix and the L2 (MSE) error.
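As a rough sketch of what that Gram-matrix loss could look like (a PyTorch sketch with hypothetical function names; the feature tensors are assumed to come from a pretrained VGG feature extractor, which is not shown here):

```python
import torch
import torch.nn.functional as F


def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Gram matrix of a feature map, (N, C, H, W) -> (N, C, C), size-normalized."""
    n, c, h, w = feat.shape
    f = feat.view(n, c, h * w)
    return f.bmm(f.transpose(1, 2)) / (c * h * w)


def style_loss(feat_est: torch.Tensor, feat_gt: torch.Tensor) -> torch.Tensor:
    """MSE between the Gram matrices of the estimated and ground-truth features."""
    return F.mse_loss(gram_matrix(feat_est), gram_matrix(feat_gt))
```

Both inputs are feature maps of the same shape; the Gram matrix discards spatial layout and keeps channel correlations, which is what the style-transfer loss compares.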

We have a similar situation in DetailsNet.

Please revise these lines in the DetailsNet implementation: loss.py:58, vgg.py


Nikronic commented 5 years ago

Which hidden layer should be chosen as the low-spatial latent representation to pass to the loss function? In DetailsNet, layers 2 and 5 of VGG19BN were chosen, but here we are dealing with VGG16BN, which has fewer hidden layers.

Note: the last layers of the VGG networks form the classifier, so we cannot use them; they output a 1-D vector matched to an object class rather than a spatial feature map.

Nikronic commented 5 years ago

We merge maxpool2 and maxpool5 in the loss function, as in https://github.com/geonm/EnhanceNet-Tensorflow/blob/master/train_SR.py#L90 Note: before combining them, we should first split them into patches, then calculate the loss on the merged ones.
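A minimal sketch of that patch-then-merge step (PyTorch, hypothetical names, and an assumed non-overlapping 4×4 patch size; the two feature maps would be the maxpool2 and maxpool5 activations):

```python
import torch
import torch.nn.functional as F


def to_patches(feat: torch.Tensor, size: int = 4) -> torch.Tensor:
    """Split an (N, C, H, W) feature map into non-overlapping size x size patches."""
    p = feat.unfold(2, size, size).unfold(3, size, size)  # N, C, nH, nW, size, size
    return p.contiguous().view(feat.size(0), -1, size, size)


def patched_feature_loss(feats_est, feats_gt, size: int = 4) -> torch.Tensor:
    """Patch each chosen feature map (e.g. maxpool2/maxpool5 outputs) and sum the MSE terms."""
    return sum(
        F.mse_loss(to_patches(fe, size), to_patches(fg, size))
        for fe, fg in zip(feats_est, feats_gt)
    )
```

Summing the per-layer MSE terms is one simple way to "merge" the two pooling layers into a single scalar loss.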

Nikronic commented 5 years ago

Note: we should not apply a patch-wise loss here. The patch-wise loss is only used in DetailsNet; the loss function of CoarseNet is motivated by neural style transfer.

Note: in the reference paper on Neural Style Transfer, they used different layers from EnhanceNet.
Note: the EnhanceNet patch loss, which uses VGG, is required for DetailsNet.
Note: the Neural Style Transfer loss (not patched), which also uses VGG, is required for CoarseNet.
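The distinction above can be summarized in one dispatching sketch (hypothetical names; `patched=True` mimics the EnhanceNet-style patch loss for DetailsNet, `patched=False` the Gram-matrix NST loss for CoarseNet):

```python
import torch
import torch.nn.functional as F


def gram(f: torch.Tensor) -> torch.Tensor:
    """Size-normalized Gram matrix, (N, C, H, W) -> (N, C, C)."""
    n, c, h, w = f.shape
    v = f.view(n, c, h * w)
    return v.bmm(v.transpose(1, 2)) / (c * h * w)


def perceptual_loss(fe, fg, patched: bool, size: int = 4) -> torch.Tensor:
    """Patch-wise VGG-feature MSE (DetailsNet) vs. Gram-matrix MSE (CoarseNet)."""
    if patched:
        pe = fe.unfold(2, size, size).unfold(3, size, size)
        pg = fg.unfold(2, size, size).unfold(3, size, size)
        return F.mse_loss(pe, pg)
    return F.mse_loss(gram(fe), gram(fg))
```

Both branches consume the same VGG feature maps; only the reduction differs, which is exactly the point of the notes above.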