Open prabindh opened 2 years ago
Compared with other style-transfer approaches, this repo seems to lack an option to set the amount (%) of style applied to the input content. Where can this be added?

I did not find an option for the mentioned control in their hub model at https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2. You could use a different hub model that exposes this control as an input option. Alternatively, as an easy workaround (though not a complete match for what you want), you can linearly interpolate (or alpha-blend) between the source image A and the style-transferred image B: C = alpha * A + (1 - alpha) * B, where alpha controls how close the result looks to the original image.
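A minimal sketch of that blend, assuming the content image and the stylized output are already the same size (the file names and the alpha value are placeholders):

```python
# Pixel-space interpolation: C = alpha * A + (1 - alpha) * B.
# Assumes both images have the same dimensions; file names are placeholders.
import numpy as np
from PIL import Image

alpha = 0.4  # 1.0 -> original content, 0.0 -> fully stylized

content = np.asarray(Image.open("content.jpg"), dtype=np.float32)    # A
stylized = np.asarray(Image.open("stylized.jpg"), dtype=np.float32)  # B

blended = alpha * content + (1.0 - alpha) * stylized  # C
Image.fromarray(blended.astype(np.uint8)).save("blended.jpg")
```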
If we take the style-transferred image as-is, it already combines style and content, so any alpha we apply affects both — not what I was looking for. I think the style amount will have to be fed into a loss metric instead; will check further.
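For reference, that is how such a knob usually enters optimization-based style transfer (Gatys et al.): as the relative weighting of the content and style terms in the loss. This does not directly apply to the feed-forward hub model above; the sketch below assumes the usual VGG-feature / Gram-matrix setup, and `style_fraction` plus the function names are illustrative, not part of this repo:

```python
import tensorflow as tf

def gram_matrix(features):
    # features: [batch, height, width, channels] activations from one VGG layer.
    result = tf.linalg.einsum('bijc,bijd->bcd', features, features)
    num_locations = tf.cast(tf.shape(features)[1] * tf.shape(features)[2], tf.float32)
    return result / num_locations

def total_loss(content_feats, target_content_feats,
               style_feats, target_style_feats,
               style_fraction=0.5):
    # style_fraction is the "% of style" knob (illustrative name):
    # 0.0 -> reproduce the content only, 1.0 -> match the style only.
    content_loss = tf.reduce_mean(tf.square(content_feats - target_content_feats))
    style_loss = tf.reduce_mean(tf.square(
        gram_matrix(style_feats) - gram_matrix(target_style_feats)))
    return (1.0 - style_fraction) * content_loss + style_fraction * style_loss
```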