Open ArtlyStyles opened 7 years ago
It can transfer style between any two images. Have you used fast-neural-style, or only AdaIN style transfer?
How fast is the style transfer, considering the README claims a 17-second execution time on an i7?
On an iPhone 6, it takes about 24 seconds to process two 512x512 images. The image size limit is due to memory constraints; it should be faster and able to process larger images on more recent devices.
Why not run the transfer on the GPU? If an app needs 24 seconds to process a photo, I think most people will delete it...
It is already using the GPU.
Is it based on Gatys' neural style?
Hi, I built an app that can transfer style between any two images. It uses fast-neural-style and AdaIN style transfer (https://github.com/xunhuang1995/AdaIN-style), plus some of my own modifications. The rendering is implemented in pure native iOS code; no server or internet connection is needed when you run the style transfer.
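For context, the AdaIN operation the app builds on is just a per-channel statistics alignment on feature maps. Here is a minimal NumPy sketch of that operation (the function name and shapes are my own illustration, not code from the app or the linked repo):

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: shift the per-channel
    mean/std of the content features to match the style features.

    content, style: feature maps of shape (C, H, W).
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    # Normalize content statistics, then re-scale/shift to style statistics.
    return s_std * (content - c_mean) / c_std + s_mean

# Example: the output adopts the style features' channel statistics.
rng = np.random.default_rng(0)
content = rng.normal(0.0, 1.0, size=(3, 8, 8))
style = rng.normal(5.0, 2.0, size=(3, 8, 8))
out = adain(content, style)
```

In the full method this runs on VGG encoder features rather than raw pixels, and a decoder maps the aligned features back to an image; since it is a single feed-forward pass, it can stylize with arbitrary style images at runtime.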
Currently I am looking for beta testers for the app. If you are interested in trying it, please email csong@smallpixelinc.com.