Closed: spot92 closed this issue 4 years ago
It's not the same thing as -normalize_gradients, and so far no one has been able to get that feature working in PyTorch.
By default, the style and content weight values you choose are applied uniformly to all of the style and content layers. The -normalize_weights
parameter instead adjusts the weights on a per-layer basis, dividing each layer's weight by that layer's number of channels. Lower layers have fewer channels and higher layers have more channels, so this scaling reduces the relative influence of the higher layers.
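As a minimal sketch of the per-layer scaling described above — the layer names and channel counts below follow VGG-19 conventions, but the function and dictionary are illustrative, not the actual neural-style-pt implementation:

```python
# Sketch: divide a user-supplied weight by each layer's channel count,
# so layers with more channels receive proportionally smaller weights.
# Channel counts are those of the usual VGG-19 style layers.

layer_channels = {
    "relu1_1": 64,
    "relu2_1": 128,
    "relu3_1": 256,
    "relu4_1": 512,
    "relu5_1": 512,
}

def normalize_weights(weight, channels):
    """Return a per-layer weight: the base weight divided by channel count."""
    return {name: weight / n for name, n in channels.items()}

per_layer = normalize_weights(1000.0, layer_channels)
for name, w in sorted(per_layer.items()):
    print(f"{name}: {w:.4f}")
```

With a base style weight of 1000, relu1_1 (64 channels) ends up with a weight of 15.625, while relu5_1 (512 channels) gets about 1.95, roughly an 8x difference between the lowest and highest layers.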
The feature was inspired by what Leon Gatys does in: NeuralImageSynthesis
@spot92 I finally figured out how to recreate the -normalize_gradients
parameter in PyTorch, and I've added it to neural-style-pt!
Does normalize_weights work the same as normalize_gradients from jcjohnson? I have seen you post about setting content_weight to 0 to achieve the same effect as normalize_gradients, though. If that is not the case, what does normalize_weights actually do?