DmitryUlyanov / texture_nets

Code for "Texture Networks: Feed-forward Synthesis of Textures and Stylized Images" paper.
Apache License 2.0

Assign different weights to different style layers #65

Open · michaelhuang74 opened this issue 7 years ago

michaelhuang74 commented 7 years ago

@DmitryUlyanov Currently a single weight is assigned to all style layers during the training process. Is it possible to assign different weights to different style layers?

For example, if "style_layers" = "relu1_2,relu2_2,relu3_2,relu4_2", is it possible to use "style_weight" = "2,4,5,3" so that the weights of the four layers (relu1_2,relu2_2,relu3_2,relu4_2) are 2, 4, 5, 3, respectively, in training? Thanks.
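For concreteness, here is a minimal sketch of what such a per-layer weighted style loss could look like. It is written in PyTorch purely for illustration (texture_nets itself is Torch/Lua), and the function names `gram_matrix` and `weighted_style_loss` are hypothetical, not part of this repo:

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) activations from one style layer
    b, c, h, w = feat.size()
    f = feat.view(b, c, h * w)
    # (B, C, C) Gram matrix, normalized by the number of elements
    return f.bmm(f.transpose(1, 2)) / (c * h * w)

def weighted_style_loss(gen_feats, target_grams, layer_weights):
    """Per-layer weighted style loss.

    gen_feats:     activations of the generated image at each style layer
    target_grams:  precomputed Gram matrices of the style image
    layer_weights: one weight per layer, e.g. [2, 4, 5, 3] for
                   relu1_2, relu2_2, relu3_2, relu4_2
    """
    loss = 0.0
    for feat, target, w in zip(gen_feats, target_grams, layer_weights):
        loss = loss + w * F.mse_loss(gram_matrix(feat), target)
    return loss
```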

Vladkryvoruchko commented 7 years ago

Actually, the relu layers are just the activation layers whose responses you include in the overall image reconstruction, and in this code you cannot set a weight for each of them independently. As an alternative, you can experiment with different combinations of relu layers, e.g. relu1_1,relu2_1,relu3_2,relu4_3,relu5_2, and compare the output results; even something as small as relu1_1,relu4_4 can work.

michaelhuang74 commented 7 years ago

@Vladkryvoruchko Thanks for your response.

I asked this question because Justin Johnson's similar "fast-neural-style" project allows specifying an individual weight for each style layer. Following is the explanation of the parameter from the fast-neural-style documentation.

-style_weights: Weights to use for style reconstruction terms. Either a single number, in which case the same weight is used for all style reconstruction terms, or a comma-separated list of weights of the same length as -style_layers.
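The "single number or comma-separated list" convention quoted above could be handled as in the following minimal Python sketch; the helper name `parse_style_weights` is hypothetical and is not fast-neural-style's actual (Lua) implementation:

```python
def parse_style_weights(spec, num_layers):
    """Parse a -style_weights value: either one number (broadcast to
    every style layer) or a comma-separated list matching -style_layers."""
    weights = [float(w) for w in spec.split(",")]
    if len(weights) == 1:
        return weights * num_layers  # e.g. "5.0" -> [5.0, 5.0, 5.0, 5.0]
    if len(weights) != num_layers:
        raise ValueError("need 1 weight or one per style layer")
    return weights

# e.g. parse_style_weights("2,4,5,3", 4) -> [2.0, 4.0, 5.0, 3.0]
```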

DmitryUlyanov commented 7 years ago

Hey @michaelhuang74, it is not possible with this code; to be honest, I cut it for simplicity. Have you found cases where these per-layer weights are important?

michaelhuang74 commented 7 years ago

@DmitryUlyanov Thanks for the response. I asked mainly out of curiosity. Sometimes I found it difficult to get good results by simply excluding some layers, so I thought it might help to include those layers but with lower weights.
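In a weighted formulation like the hypothetical sketch above, excluding a layer is just the special case of giving it weight zero, so a small weight lets a layer contribute without dominating:

```python
# Using the hypothetical weighted_style_loss sketched earlier:
weights_excluded = [2.0, 4.0, 5.0, 0.0]  # equivalent to dropping relu4_2
weights_damped   = [2.0, 4.0, 5.0, 0.5]  # keep it, but let it contribute less
```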