anishathalye / neural-style

Neural style in TensorFlow! 🎨
https://anishathalye.com/an-ai-that-can-mimic-any-artist/
GNU General Public License v3.0

Is there a way to make it work with bigger images? #96

Closed: ErfolgreichCharismatisch closed this issue 6 years ago

ErfolgreichCharismatisch commented 7 years ago

Anything beyond 640 x 480 pixels leads to an out-of-memory error.

Is there a way to make it work with bigger images?

Steven-N-Hart commented 6 years ago

As a workaround, you can decrease the image size on the fly by modifying the imread function:

    img = scipy.misc.imread(path).astype(np.float)
    if len(img.shape) == 2:
        # grayscale
        img = np.dstack((img, img, img))
    elif img.shape[2] == 4:
        # PNG with alpha channel
        img = img[:, :, :3]
    return scipy.misc.imresize(img, size=(255, 255))  # changed: resize before returning
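
If a hard-coded 255 x 255 output is too small (it also ignores the original aspect ratio), a minimal variation on the same idea, assuming the same scipy.misc helpers, is to cap only the longest side and pass imresize a fractional scale, which shrinks height and width proportionally; max_dim below is just an illustrative value:

    # hypothetical cap on the longest side; pick whatever fits in GPU memory
    max_dim = 640.0
    scale = max_dim / max(img.shape[0], img.shape[1])
    if scale < 1.0:
        # a float passed to scipy.misc.imresize is interpreted as a
        # fractional scale, so the aspect ratio is preserved
        img = scipy.misc.imresize(img, scale)
    return img

Images already below the cap are returned untouched.
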
anishathalye commented 6 years ago

Or get a GPU with more memory: I can handle ~1000x1000 images on my Titan X.

vitalykovalgit commented 6 years ago

Is it possible to make it work on even bigger images, like 3000x3000, 5000x4000, or larger? I don't care how long it takes to process; I just want to avoid the out-of-memory exception. Right now it handles around 1000x1000 on my PC, but that is small.

anishathalye commented 6 years ago

You can do it on CPU using system memory, but it will take a very long time 😛
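
If anyone wants to try that, one way that is standard TensorFlow behavior (not specific to this repo) is to hide the GPU with CUDA_VISIBLE_DEVICES so the computation falls back to the CPU and system RAM; a rough sketch:

    import os

    # hide all CUDA devices so TensorFlow runs on the CPU and uses system RAM;
    # this must be set before tensorflow is imported anywhere in the process
    os.environ['CUDA_VISIBLE_DEVICES'] = ''

    import tensorflow as tf  # imported after the environment variable is set

Setting the same variable in the shell when launching the script achieves the same thing.
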

Alternatively, you could try fast style transfer, which uses an entirely different algorithm: once you train a model on a particular style, all processing happens in a single feedforward pass.