coobird / thumbnailator

Thumbnailator - a thumbnail generation library for Java
MIT License

Code improvement large images - OutOfMemoryError #187

Open JayJayBinks opened 2 years ago

JayJayBinks commented 2 years ago

Expected behavior

Resizing of multiple large images (5000x7000) succeeds without running out of memory.

Actual behavior

When uploading multiple large images (5000x7000), resizing fails with the OutOfMemoryError described in #1.

Environment

Extended description

Hi,

I am asking for help with memory optimization for my code.

I have a Spring Boot application where users can upload images with a maximum file size of 5 MB. Somehow someone managed to upload an image that is only 1.2 MB but has dimensions of 5000x7000. The image is resized to several dimensions after uploading.

I am experiencing the memory issue described in #1 and applied the thumbnailator.conserveMemoryWorkaround.
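For reference, here is roughly how I enabled the workaround. This is only a sketch: the Application class stands in for my actual Spring Boot entry point, and I am assuming the property just needs to be set before the first resize; passing -Dthumbnailator.conserveMemoryWorkaround=true as a JVM flag should work as well.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        // Enable Thumbnailator's conserve-memory workaround (see issue #1).
        // Alternative: pass -Dthumbnailator.conserveMemoryWorkaround=true to the JVM.
        System.setProperty("thumbnailator.conserveMemoryWorkaround", "true");
        SpringApplication.run(Application.class, args);
    }
}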

So now the upload finally works for one image, but if multiple images with these dimensions are uploaded at once, the application fails with an OutOfMemoryError again.

Thanks in advance!

public void saveBildInSizes(long id, byte[] uploadedImage) {
    bildRepository.updateSmallestByImageId(id, imageResizeService.resize(uploadedImage, BildSizeRequest.SMALLEST));
    bildRepository.updateSmallByImageId(id, imageResizeService.resize(uploadedImage, BildSizeRequest.SMALL));
    bildRepository.updateMediumByImageId(id, imageResizeService.resize(uploadedImage, BildSizeRequest.MEDIUM));
    bildRepository.updateLargeByImageId(id, imageResizeService.resize(uploadedImage, BildSizeRequest.LARGE));
    bildRepository.updateLargestByImageId(id, imageResizeService.resize(uploadedImage, BildSizeRequest.LARGEST));
}

private byte[] resize(byte[] image, BildSizeRequest bildSizeRequest) throws IOException {
    Thumbnails.Builder builder = Thumbnails.of(new ByteArrayInputStream(image));
    ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    switch (bildSizeRequest) {
        case SMALLEST:
            builder.height(62);
            break;
        case SMALL:
            builder.height(124);
            break;
        case MEDIUM:
            builder.height(240);
            break;
        case LARGE:
            builder.height(366);
            break;
        case LARGEST:
            builder.height(810);
            break;
    }
    builder
        .keepAspectRatio(true)
        .imageType(BufferedImage.TYPE_INT_RGB)
        .outputFormat("jpg")
        .toOutputStream(byteArrayOutputStream);

    return byteArrayOutputStream.toByteArray();
}
coobird commented 2 years ago

@JayJayBinks, while the thumbnailator.conserveMemoryWorkaround can help, it has a hardcoded lower bound: it will use at least a 600 x 600 image as the source. (The workaround uses "subsampling" settings to read the source image at smaller dimensions to start with, then does the actual resize.)
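To illustrate the idea, here is a rough sketch using plain ImageIO rather than Thumbnailator's actual implementation; readSubsampled is just a name for this example, and the subsampling factor calculation is simplified.

import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Iterator;

public static BufferedImage readSubsampled(byte[] image, int targetHeight) throws IOException {
    try (ImageInputStream iis = ImageIO.createImageInputStream(new ByteArrayInputStream(image))) {
        Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
        if (!readers.hasNext()) {
            throw new IOException("No ImageReader found for the given data");
        }
        ImageReader reader = readers.next();
        try {
            reader.setInput(iis);
            // Decode only every n-th pixel so the in-memory image is already
            // close to the target size, instead of the full 5000x7000.
            int factor = Math.max(1, reader.getHeight(0) / targetHeight);
            ImageReadParam param = reader.getDefaultReadParam();
            param.setSourceSubsampling(factor, factor, 0, 0);
            return reader.read(0, param);
        } finally {
            reader.dispose();
        }
    }
}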

In this particular case, it looks like you're resizing the original image (byte[] uploadedImage) multiple times, which may be using more memory than necessary: the full-size image is decoded from the compressed bytes again on every call.

One suggestion -- it depends on the size of the images you're working with, but if you're always going to produce the LARGEST through SMALLEST sizes, you could resize to the LARGEST size first and hold onto it as a BufferedImage, then use that as the source for the LARGE through SMALLEST images (see the sketch below). That way you reduce the amount of processing needed, since the subsequent resizes start from a much smaller image. The image quality will likely suffer a little for the LARGE size (because you're starting with fewer pixels), so you'll need to make a judgment call on whether that's a compromise you can accept.
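A minimal sketch of that approach, reusing the names from your snippet; the toJpegBytes helper is something I made up for this example, and error handling is omitted.

public void saveBildInSizes(long id, byte[] uploadedImage) throws IOException {
    // Decode and downscale the full-size upload only once.
    BufferedImage largest = Thumbnails.of(new ByteArrayInputStream(uploadedImage))
        .height(810)
        .keepAspectRatio(true)
        .imageType(BufferedImage.TYPE_INT_RGB)
        .asBufferedImage();

    bildRepository.updateLargestByImageId(id, toJpegBytes(largest, 810));
    // Reuse the in-memory LARGEST image as the source for all smaller sizes.
    bildRepository.updateLargeByImageId(id, toJpegBytes(largest, 366));
    bildRepository.updateMediumByImageId(id, toJpegBytes(largest, 240));
    bildRepository.updateSmallByImageId(id, toJpegBytes(largest, 124));
    bildRepository.updateSmallestByImageId(id, toJpegBytes(largest, 62));
}

// Hypothetical helper: encodes a resized copy of the given image as JPEG bytes.
private byte[] toJpegBytes(BufferedImage source, int height) throws IOException {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    Thumbnails.of(source)
        .height(height)
        .keepAspectRatio(true)
        .outputFormat("jpg")
        .toOutputStream(out);
    return out.toByteArray();
}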

Also note that file size is not a good indicator of a compressed image's dimensions: JPEGs can be very heavily compressed, and PNGs with little color variation compress to very small sizes. (I just saved a blank 1920x1080 image to a 9 KB PNG.)
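If you want to reject such uploads before doing any resizing, you can read just the image header to get the dimensions without decoding the pixels. Another sketch using plain ImageIO (same imports as the earlier snippet; exceedsMaxDimensions is just a name for this example):

public static boolean exceedsMaxDimensions(byte[] image, int maxWidth, int maxHeight) throws IOException {
    try (ImageInputStream iis = ImageIO.createImageInputStream(new ByteArrayInputStream(image))) {
        Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
        if (!readers.hasNext()) {
            return true; // unreadable input -- reject it
        }
        ImageReader reader = readers.next();
        try {
            reader.setInput(iis);
            // getWidth/getHeight only parse the header, so even a 5000x7000
            // image costs almost no memory here.
            return reader.getWidth(0) > maxWidth || reader.getHeight(0) > maxHeight;
        } finally {
            reader.dispose();
        }
    }
}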