SpaceCowboy850 opened this issue 7 years ago
Hi
Thanks a lot for the comment; I may take a look and see what settings work best for that texture. It is indeed sometimes trial and error to choose the right values for the patch size, the image size of the texture, and the network depth (which determines the receptive field).
As a side note, please also take a look at https://github.com/ubergmann/psgan ; our latest algorithm can handle much more diverse texture images.
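To make the depth/receptive-field relationship mentioned above concrete, here is a small sketch (not from the repo; `receptive_field` is a hypothetical helper, and kernel size 5 with stride 2 is only an assumed example configuration):

```python
# Hypothetical helper, not part of the repo: receptive field of a stack of
# layers that all use the same kernel size and stride, via the usual
# recurrence r_l = r_{l-1} + (k - 1) * jump_{l-1}, with jump_l = jump_{l-1} * s.
def receptive_field(depth, kernel=5, stride=2):
    r, jump = 1, 1
    for _ in range(depth):
        r += (kernel - 1) * jump
        jump *= stride
    return r

for depth in range(3, 7):
    print(depth, receptive_field(depth))
# depth 3 -> 29, depth 4 -> 61, depth 5 -> 125, depth 6 -> 253 pixels
```

So each extra layer roughly doubles how large a structure the generator can reproduce coherently, which is why depth matters for textures with large features.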
That's great to hear! We've implemented Gatys on C++ Windows Caffe, but we're now looking to get the speed and texture size up, so this looks promising if we can get it to work.
Some of the artifacts in the images may be a general issue with deconvolution filter implementations; see http://distill.pub/2016/deconv-checkerboard/
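For reference, the fix discussed in that Distill article is to replace a strided transposed convolution with plain upsampling followed by an ordinary convolution. A minimal sketch of that idea (written in PyTorch purely for illustration; this repo itself uses Theano/Lasagne, and the channel counts here are made up):

```python
import torch.nn as nn

def upsample_block(in_ch, out_ch, kernel=5):
    # Nearest-neighbour upsampling followed by a plain convolution avoids the
    # uneven kernel overlap that produces checkerboard artifacts with
    # stride-2 transposed convolutions.
    return nn.Sequential(
        nn.Upsample(scale_factor=2, mode="nearest"),
        nn.Conv2d(in_ch, out_ch, kernel_size=kernel, padding=kernel // 2),
        nn.ReLU(inplace=True),
    )
```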
I changed the parameters a bit and tried them on your texture:
nz = 20                         # num of dim for Z at each field position
zx = 7                          # number of spatial dimensions in Z
batch_size = 20
epoch_iters = batch_size * 500
I also used 4 mirrored versions of your texture for data augmentation, since it is a bit small for a good texture; a sketch of that step follows below.
This is what I got; looks a bit better than yours?
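For anyone wanting to reproduce the augmentation step, here is a minimal sketch of one reading of "4 mirrored versions": the texture plus its horizontal flip, vertical flip, and 180-degree rotation (the filenames are made up):

```python
from PIL import Image

src = Image.open("texture.jpg")  # hypothetical input filename
variants = {
    "orig": src,
    "flip_lr": src.transpose(Image.FLIP_LEFT_RIGHT),
    "flip_tb": src.transpose(Image.FLIP_TOP_BOTTOM),
    "rot180": src.transpose(Image.ROTATE_180),
}
for name, img in variants.items():
    img.save("texture_%s.jpg" % name)  # one training image per variant
```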
Ah, okay, that's good to know. I'll play around with it some more. I'm aware of the checkerboard link, so I'll take a look at that after we've played around with parameterization a bit. Thank you!
Hello,
I was interested in checking this out because it proposes to preserve larger features, create larger textures, and generate faster than Gatys. But outside of the model that you've uploaded, I cannot produce good results on textures that I supply myself.
This is the source texture I used, a 512x512 image:
After over an hour and a half of training (on a Titan X Pascal), this is the snapshot:
It doesn't seem like the default parameterization does very well. We've trained for over 20 hours and it pretty much stays at this level of quality.