Open chenhaibin2019 opened 5 years ago
Maybe you can increase the receptive field of your discriminator (add one conv layer, for example). There is a texture-synthesis work related to your task; you may want to have a look.
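To see how much one extra conv layer buys you, here is a small, self-contained receptive-field calculator. It is a sketch assuming the common PatchGAN layout (a stack of stride-2 4x4 convs followed by two stride-1 4x4 convs, as in the pix2pix/CycleGAN `NLayerDiscriminator`); the helper names are mine, not from the repo.

```python
def receptive_field(layers):
    """Receptive field of a conv stack.

    layers: list of (kernel_size, stride) tuples, input-to-output order.
    """
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump  # each layer widens the field by (k-1) input-space steps
        jump *= s             # stride compounds the step size for deeper layers
    return rf


def patchgan_layers(n_layers):
    # n_layers stride-2 convs, then two stride-1 convs (kernel 4 throughout) --
    # my reading of the usual NLayerDiscriminator structure.
    return [(4, 2)] * n_layers + [(4, 1)] * 2


print(receptive_field(patchgan_layers(3)))  # 70  (the default 70x70 PatchGAN)
print(receptive_field(patchgan_layers(4)))  # 142 (one extra stride-2 conv layer)
```

So going from the default to one more layer roughly doubles the patch size each discriminator output sees, which is what lets it judge longer-range texture.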
Thank you. The reference is exactly what I want, though I haven't wrapped my head around how to train it with CycleGAN, since I need the A->B generator to produce the texture correctly. Any suggestions on how it can be implemented with CycleGAN?
Your implementation with CycleGAN looks good; you probably just need to tweak some small details. In their paper, Figure 15 shows some texture-transfer results with pix2pix.
I added one conv layer to the discriminator (nlayer = 4). The training images look great and capture the long-range pattern. However, the inference images (Fake A) lost clarity, so it is hard to tell whether they capture the long-range textile pattern. Is there a way CycleGAN can tune the sharpness of the images, particularly at inference time?
There is no explicit way of doing that. Maybe you can add your test images to your training set; in your case, it is fine to train and test on the same dataset.
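Concretely, because CycleGAN is unpaired, mixing the test images in just means copying the real A test images into the trainA folder; no matching real B is needed. A minimal sketch assuming the repo's usual `datasets/<name>/trainA|trainB|testA|testB` layout (the dataset name and `.jpg` extension here are illustrative):

```shell
# Mix inference-time real A images into the training pool.
# CycleGAN is unpaired, so trainA can grow without touching trainB.
DATASET=datasets/textile   # hypothetical dataset root
mkdir -p "$DATASET/trainA" "$DATASET/testA"
for f in "$DATASET"/testA/*.jpg; do
  if [ -e "$f" ]; then cp "$f" "$DATASET/trainA/"; fi
done
```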
Thanks for the advice, I will try your suggestion, but I'm not sure I get it. Unfortunately, I don't have ground truth for my inference images (real B); I only have the inference real A images. The objective of this work is to use CycleGAN to synthesize inference Fake A images close to the ground truth. If I only mix the inference real A into the training real A, without providing the inference real B, does that help?
If you had ground truth (real B), you should use pix2pix rather than CycleGAN. Otherwise, you don't need the corresponding real B.
Using CycleGAN, I am able to generate images close to the real ones. However, the generated images miss the non-local textile patterns. For example, in the images below, the generated image misses the long-range wave pattern of the textile present in the ground truth. I wonder whether this is caused by the capacity of the discriminator/generator. How could I generate images that include such non-local patterns? That would make the results much more promising.