jcjohnson / fast-neural-style

Feedforward style transfer

modified fast neural net losses #124

Open mmkqasim opened 7 years ago

mmkqasim commented 7 years ago

Hi,

I have pruned the fast-neural-style network based on the magnitude of its weights. Now I want to evaluate the prediction loss. I have gone through slow_neural_style.lua and train.lua to figure out how the losses are computed, and I have tried to do the same, but with no success. Is there a simple script I could write to evaluate the perceptual loss or total loss without training? I don't want to train; all I want is to evaluate the losses after pruning/modification.

Greatly appreciate your response.

Thanks!

htoyryla commented 7 years ago

If I am not mistaken, fast-neural-style calculates losses only during training. When the trained model is used to process an image, loss calculation is not needed and not used. Furthermore, during training the losses are evaluated using a VGG network, which fast-neural-style does not use after training. So if you want to evaluate losses for a fast-neural-style result, you would have to write a script that takes the content, style and result images and uses a VGG network to capture targets and calculate the losses. I am not sure how useful this is in absolute terms, but it could serve as a comparative indicator, provided the same script with the same layer settings is used to compare the losses for an image from the original net and for one from the pruned net.

I would use slow_neural_style.lua or the original neural-style as a basis. Take the content and style images and run them through the VGG network with the loss modules in capture mode. Then, instead of starting the optimization, replace the content image with a result from fast-neural-style and run it through VGG again, this time with the loss modules in loss mode. This gives the losses for that result image, e.g. one from the unmodified network. Then get the corresponding losses for a result image from the pruned network, using the same style image and layer settings every time.
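Not a script from this repo, but a minimal standalone sketch of that procedure in Torch/Lua, assuming a VGG-16 caffemodel loaded via loadcaffe. The file paths, image names and layer choices are illustrative placeholders, and the losses here are plain MSE on feature maps (content) and on Gram matrices (style), which only approximates what the training-time perceptual criterion computes:

```lua
-- eval_losses.lua: compare content/style losses of a result image against
-- targets captured from the content and style images. Paths and layer
-- choices below are placeholders; adjust to your setup.
require 'torch'
require 'nn'
require 'image'
require 'loadcaffe'

local proto = 'models/VGG_ILSVRC_16_layers_deploy.prototxt'
local caffemodel = 'models/VGG_ILSVRC_16_layers.caffemodel'
local cnn = loadcaffe.load(proto, caffemodel, 'nn'):float()

-- Standard VGG preprocessing: RGB -> BGR, [0,1] -> [0,255], subtract ImageNet mean.
local function preprocess(img)
  local mean = torch.FloatTensor{103.939, 116.779, 123.68}
  local perm = torch.LongTensor{3, 2, 1}
  img = img:index(1, perm):mul(255)
  return img:add(-1, mean:view(3, 1, 1):expandAs(img))
end

-- Forward an image through VGG, collecting activations at the named layers.
local function get_features(img, layer_names)
  local wanted, remaining = {}, #layer_names
  for _, name in ipairs(layer_names) do wanted[name] = true end
  local feats, x = {}, img
  for i = 1, #cnn.modules do
    local layer = cnn.modules[i]
    x = layer:forward(x):clone()
    if wanted[layer.name] then
      feats[layer.name] = x
      remaining = remaining - 1
      if remaining == 0 then break end  -- stop before the fully-connected layers
    end
  end
  return feats
end

-- Normalized Gram matrix of a CxHxW feature map, for the style comparison.
local function gram(f)
  local c, h, w = f:size(1), f:size(2), f:size(3)
  local flat = f:view(c, h * w)
  return torch.mm(flat, flat:t()):div(c * h * w)
end

local content_layers = {'relu3_3'}                                  -- illustrative
local style_layers   = {'relu1_2', 'relu2_2', 'relu3_3', 'relu4_3'} -- illustrative

-- result.jpg is the output of fast-neural-style for content.jpg;
-- content and result are assumed to have the same resolution.
local content = preprocess(image.load('content.jpg', 3, 'float'))
local style   = preprocess(image.load('style.jpg', 3, 'float'))
local result  = preprocess(image.load('result.jpg', 3, 'float'))

-- "Capture" pass: targets from the content and style images.
local content_targets = get_features(content, content_layers)
local style_targets = {}
for name, f in pairs(get_features(style, style_layers)) do
  style_targets[name] = gram(f)
end

-- "Loss" pass: compare the result image against the captured targets.
local mse = nn.MSECriterion():float()
local content_loss, style_loss = 0, 0
local result_c = get_features(result, content_layers)
for _, name in ipairs(content_layers) do
  content_loss = content_loss + mse:forward(result_c[name], content_targets[name])
end
local result_s = get_features(result, style_layers)
for _, name in ipairs(style_layers) do
  style_loss = style_loss + mse:forward(gram(result_s[name]), style_targets[name])
end

print(string.format('content loss: %f   style loss: %f', content_loss, style_loss))
```

The absolute numbers will not match what train.lua prints (different weighting and normalization, and no total variation term), but as long as you run the same script with the same style image and layer settings on outputs of the original and the pruned network, the comparison should still be meaningful.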