RaveryNet opened this issue 8 years ago
I'm getting ~1400px or so, and then using waifu2x to double the final image size, which works a treat.
In addition to waifu2x, there is also NIN Upres. That technique works best for traditional-art styles (like van Gogh's Starry Night) where you can see some grunge/noise. For smooth styles (anime/sharp-edged) you're better off using waifu2x.
1400 seems a bit high imho.
On 4 GB (Amazon EC2) one gets around 800px. Since an 8 GB card has twice the RAM, and VRAM usage scales with the number of pixels (side²), the expected limit is 800 * sqrt(2) ≈ 1131px.
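A quick sanity check of that scaling argument on the command line (this just assumes VRAM use grows linearly with pixel count, so side length grows with the square root of the memory ratio):

```sh
# 800px fits in 4 GB; with twice the VRAM the side length should grow by sqrt(2)
echo "800 * sqrt(2)" | bc -l   # -> 1131.37...
```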
...are you 100% sure? It would be amazing if that's the resolution.
On my 4 GB GTX 970 (where ~300 MB are taken up by the desktop) I can do 920x920 images using ADAM (try -normalize_gradients and -learning_rate 2.0 for starters). Maybe it also has something to do with better memory compression in later NVIDIA cards, since the GPUs on AWS are quite old? Also, if you increase -style_scale, RAM usage will go up accordingly.
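For reference, a starting point along these lines might look like the following (flag names as in the neural-style README; the learning rate is just the suggestion above, not a tuned value):

```sh
th neural_style.lua -content_image examples/inputs/brad_pitt.jpg \
  -style_image examples/inputs/frida_kahlo.jpg \
  -image_size 920 -backend cudnn -cudnn_autotune \
  -optimizer adam -normalize_gradients -learning_rate 2.0
```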
With ADAM it sounds feasible then, but I always read that ADAM was a really bad choice.
Does it work well for you?
It's really not that bad. I suppose most people saying that didn't turn on gradient normalization, which is even recommended in the README.md.
GTX 1080
th neural_style.lua -content_image ./examples/inputs/brad_pitt.jpg \
  -style_image ./examples/inputs/frida_kahlo.jpg \
  -image_size 1500 -backend cudnn -cudnn_autotune \
  -optimizer adam -num_iterations 100 -init image -style_scale 1
1187x1500 with ADAM; 871x1100 without ADAM.
Thanks pocketmoon for that information! So may I assume that, since the 1070 has the same amount of VRAM, results would be similar?
Thank you for the info! Pocketmoon, how long does learning + rendering take with images this size?
I am considering getting a GTX 1080 or 1070 (8 GB VRAM).
Does anyone know how big the DeepArt pictures I could create on it would be?
I tried it on a GTX 570 (1280 MB VRAM) but could not create any images, because I ran out of RAM immediately (even when I tried 64x64 pictures). Any experiences or educated guesses?
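One thing worth checking before blaming the card: the VGG-19 model itself claims a sizeable chunk of VRAM before a single pixel is processed, and the desktop takes its share on top of that, so a 1280 MB card can run out of memory even at tiny image sizes. To see what is actually free right before launching (standard nvidia-smi query flags):

```sh
# show total/used/free VRAM per GPU
nvidia-smi --query-gpu=name,memory.total,memory.used,memory.free --format=csv
```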