kean / Nuke

Image loading system
https://kean.blog/nuke
MIT License

Gaussian blur without border artifacts (CoreImage bug) #308

Open mtrezza opened 5 years ago

mtrezza commented 5 years ago

Issue

The Gaussian blur CIFilter naturally creates gray artifacts at the borders of the output image. A common requirement when blurring an image is to avoid these artifacts.

Proposed solution

Add a new filter that extends the edge pixels indefinitely before blurring and then crops the image back to its original extent.

/// A Gaussian blur filter that clamps to extent to avoid the gray border artefact.
struct GaussianBlurClampedToExtent: ImageProcessing, Hashable, CustomStringConvertible {    
    private let radius: Int

    /// Initializes the receiver with a blur radius.
    public init(radius: Int = 8) {
        self.radius = radius
    }

    /// Applies `CIGaussianBlur` filter to the image.
    public func process(image: Image, context: ImageProcessingContext?) -> Image? {

        // Get CI image
        let ciImageOptional: CoreImage.CIImage? = {
            if let image = image.ciImage {
                return image
            }
            if let image = image.cgImage {
                return CoreImage.CIImage(cgImage: image)
            }
            return nil
        }()

        // Ensure CI image was retrieved
        guard let ciImage = ciImageOptional else { return nil }

        // Remember original image extent
        let extent = ciImage.extent

        // Create image with infinitely extended border pixels to prevent gray edges from blur filter
        let inputImage: CIImage = ciImage.clampedToExtent()

        // Create blur filter
        let filter = CIFilter(name: "CIGaussianBlur", parameters: [kCIInputRadiusKey: radius, kCIInputImageKey: inputImage])

        // Get filtered image
        guard let filteredImage = filter?.outputImage else { return nil }

        // Create a Core Image context for rendering with low priority
        let context = CIContext(options: [.priorityRequestLow: true])

        // Render a CGImage cropped to the original extent
        guard let imageRef: CGImage = context.createCGImage(filteredImage, from: extent) else { return nil }
        return UIImage(cgImage: imageRef, scale: image.scale, orientation: image.imageOrientation)
    }

    public var identifier: String {
        return "com.github.kean/nuke/gaussian_blur_clamped_to_extent?radius=\(radius)"
    }

    public var hashableIdentifier: AnyHashable {
        return self
    }

    public var description: String {
        return "GaussianBlurClampedToExtend(radius: \(radius))"
    }
}

Alternative solutions

Extend the existing GaussianBlur filter with a Boolean clampToExtent option: "com.github.kean/nuke/gaussian_blur?radius=\(radius)&clampToExtent=\(clampToExtent)"

kean commented 5 years ago

Hi, @mtrezza. Thanks for raising this issue, it is something I overlooked.

In case the only difference is the "extent", I think adding an option to the existing processor should be fine. I would probably go as far as making clamping to the original image extent the default, what do you think?

mtrezza commented 5 years ago

@kean sounds good to me.

I've implemented my previous suggestion in a live app and am getting these crash reports, not always but quite frequently:

Crashed: NSOperationQueue 0x281e53fc0 (QOS: UNSPECIFIED)
0  CoreImage                      0x18f1aa534 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2660
1  CoreImage                      0x18f1aa534 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2660
2  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
3  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
4  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
5  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
6  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
7  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
8  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
9  CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
10 CoreImage                      0x18f1a9f74 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 1188
11 CoreImage                      0x18f1aa9d0 CI::Context::render(CI::ProgramNode*, CGRect const&) + 116
12 CoreImage                      0x18f1c59b8 CI::create_cgimage(CI::Context*, CI::Image*, CGRect, CGColorSpace*, CI::PixelFormat, bool, unsigned long) + 2404
13 CoreImage                      0x18f160360 -[CIContext(Internal) _createCGImage:fromRect:format:colorSpace:deferred:textureLimit:] + 1076
14 CoreImage                      0x18f15efe4 -[CIContext createCGImage:fromRect:] + 196
15 MyApp                         0x1030bcefc specialized ImageProcessor.GaussianBlurClampedToExtent.process(image:context:) + 235 (RemoteMediaManager.swift:235)
16 Nuke                           0x103d879c4 $s4Nuke13ImagePipelineC07processB033_64FA2D582EFB9D605832F3421CEE67ADLL_11isCompleted3for9processor3jobyAA0B8ResponseC_SbAA0B7RequestVAA0B10Processing_pAA4TaskC3JobCyAkC5ErrorO_GtFyycfU_So7UIImageCSgAWXEfU_TA + 112
17 Nuke                           0x103da448c $s4Nuke13ImageResponseC3mapyACSgSo7UIImageCSgAGXEFAEyXEfU_ + 40
18 Nuke                           0x103d87a14 $s4Nuke13ImageResponseCSgs5Error_pIgozo_ADsAE_pIegrzo_TRTA + 24
19 libswiftObjectiveC.dylib       0x1036decec autoreleasepool(invoking:) + 4392201452
20 Nuke                           0x103d803dc $s4Nuke13ImagePipelineC07processB033_64FA2D582EFB9D605832F3421CEE67ADLL_11isCompleted3for9processor3jobyAA0B8ResponseC_SbAA0B7RequestVAA0B10Processing_pAA4TaskC3JobCyAkC5ErrorO_GtFyycfU_ + 1252
21 Nuke                           0x103d7b7a8 $sIeg_IeyB_TR + 28
22 Foundation                     0x18a4ee82c __NSBLOCKOPERATION_IS_CALLING_OUT_TO_A_BLOCK__ + 16
23 Foundation                     0x18a3f6a28 -[NSBlockOperation main] + 72
24 Foundation                     0x18a3f5efc -[__NSOperationInternal _start:] + 740
25 Foundation                     0x18a4f0700 __NSOQSchedule_f + 272
26 libdispatch.dylib              0x1894996c8 _dispatch_call_block_and_release + 24
27 libdispatch.dylib              0x18949a484 _dispatch_client_callout + 16
28 libdispatch.dylib              0x189470e14 _dispatch_continuation_pop$VARIANT$armv81 + 404
29 libdispatch.dylib              0x1894704f8 _dispatch_async_redirect_invoke + 592
30 libdispatch.dylib              0x18947cafc _dispatch_root_queue_drain + 344
31 libdispatch.dylib              0x18947d35c _dispatch_worker_thread2 + 116
32 libsystem_pthread.dylib        0x18967c17c _pthread_wqthread + 472
33 libsystem_pthread.dylib        0x18967ecec start_wqthread + 4
Crashed: com.github.kean.Nuke.ImagePipeline
0  libswiftFoundation.dylib       0x1b2cddf24 specialized __DataStorage.init(bytes:length:copy:deallocator:offset:) + 448
1  libswiftFoundation.dylib       0x1b2c000fc Data.InlineSlice.ensureUniqueReference() + 188
2  libswiftFoundation.dylib       0x1b2cde574 specialized Data.InlineSlice.append(contentsOf:) + 48
3  libswiftFoundation.dylib       0x1b2cdf4d4 specialized Data._Representation.append(contentsOf:) + 320
4  Nuke                           0x1036cc8c8 $s4Nuke13ImagePipelineC19imageDataLoadingJob33_64FA2D582EFB9D605832F3421CEE67ADLL_7context010didReceiveE08response8signpostyAA4TaskC0G0Cy10Foundation0E0V_So13NSURLResponseCSgtAC5ErrorO_G_AC08OriginalbE12FetchContextAELLCAprA8SignpostCtF + 1008
5  Nuke                           0x1036cf6b8 $s4Nuke13ImagePipelineC04loadB4Data33_64FA2D582EFB9D605832F3421CEE67ADLL3for7context6finishyAA4TaskC3JobCy10Foundation0E0V_So13NSURLResponseCSgtAC5ErrorO_G_AC08OriginalbE12FetchContextAELLCyyctFyAO_AQtcfU_yycfU_TA + 36
6  Nuke                           0x1036c37a8 $sIeg_IeyB_TR + 28
7  libdispatch.dylib              0x18483ca38 _dispatch_call_block_and_release + 24
8  libdispatch.dylib              0x18483d7d4 _dispatch_client_callout + 16
9  libdispatch.dylib              0x1847e6324 _dispatch_lane_serial_drain$VARIANT$mp + 592
10 libdispatch.dylib              0x1847e6e40 _dispatch_lane_invoke$VARIANT$mp + 428
11 libdispatch.dylib              0x1847ef4ac _dispatch_workloop_worker_thread + 596
12 libsystem_pthread.dylib        0x184a1e114 _pthread_wqthread + 304
13 libsystem_pthread.dylib        0x184a20cd4 start_wqthread + 4

I am using Nuke to load images into a UIImageView in a UICollectionViewCell. Any idea why?

kean commented 5 years ago

Are there any messages in the console from Core Image aside from the stack trace?

I found similar crash reports online https://github.com/pinterest/PINRemoteImage/issues/109, but so far no solutions.

mtrezza commented 4 years ago

@kean I did not see any other CoreImage-related errors in the logs.

I've reached out to Apple and the engineer said that it may either be a threading issue or indeed a CoreImage bug.

kean commented 4 years ago

By default, ImagePipeline allows up to two image processing operations to run in parallel (self.imageProcessingQueue.maxConcurrentOperationCount = 2). All the processors must be thread-safe. When I was working on these Core Image based filters, I made sure that they are.

CIContext and CIImage objects are immutable, which means each can be shared safely among threads. Multiple threads can use the same GPU or CPU CIContext object to render CIImage objects. However, this is not the case for CIFilter objects, which are mutable. A CIFilter object cannot be shared safely among threads. If your app is multithreaded, each thread must create its own CIFilter objects. Otherwise, your app could behave unexpectedly.

From https://developer.apple.com/documentation/coreimage/ciimage
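
In practice, that means the CIContext used by these processors can be created once and shared, while each process call creates its own CIFilter. A minimal sketch of that pattern (the names are illustrative, not Nuke's actual internals):

import CoreImage

// Immutable and thread-safe: one shared context is enough (illustrative name)
private let sharedCIContext = CIContext()

func blurred(_ ciImage: CIImage, radius: Int) -> CGImage? {
    // CIFilter is mutable, so create a fresh instance per call instead of sharing it across threads
    let filter = CIFilter(name: "CIGaussianBlur", parameters: [
        kCIInputImageKey: ciImage,
        kCIInputRadiusKey: radius
    ])
    guard let output = filter?.outputImage else { return nil }
    return sharedCIContext.createCGImage(output, from: ciImage.extent)
}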

mtrezza commented 4 years ago

I filed a bug report with Apple. An Apple engineer suggested that the recursive stack trace is likely an indication that the indefinitely extended edges may cause the crash in CIImage. I’ll update here.

kean commented 4 years ago

Hi, @mtrezza. Did you receive any updates from Apple regarding this issue?

mtrezza commented 4 years ago

@kean Unfortunately not.

I think an alternative to clampedToExtent() for solving the gray frame problem could serve as a workaround. Maybe it would be enough to manually extend the pixels in each direction by n pixels, then blur, then cut off the extension.

The challenge would be to find a correct n depending on the image size. The larger the image, the larger the gray border becomes. So maybe n = image.width / 2 would be enough, or maybe it only needs 1/4 of the image width. That's something to find out empirically, I guess, unless you have more insight into how and why the gray border is created by Gaussian blur?

mtrezza commented 4 years ago

For the blur matrix of CIGaussianBlur, my assumption was that the radius defined in kCIInputRadiusKey is the number of surrounding pixels taken into account. That would lead to an n x n matrix where n = kCIInputRadiusKey * 2 + 1.

But that does not seem to be the case. The extent change of CIImage actually shows that kCIInputRadiusKey=50 changes an image of (origin, size) (0.0, 0.0, 100.0, 100.0) to (-150.0, -150.0, 400.0, 400.0). A radius of 100 creates an image of (-300.0, -300.0, 700.0, 700.0).

So for CIGaussianBlur it seems that n = kCIInputRadiusKey * 3 * 2 + 1, but I don't know where the factor of 3 comes from. The references I found online for Gaussian blur all seem to indicate that the "radius" is the number of pixels in all directions from the center pixel that are included in the calculation.
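
As a quick sanity check, the extents above can be reproduced with a few lines (a sketch; the growth factor of 3 is an observation on the tested OS version, not documented behavior):

import CoreImage

// A 100x100 solid-color image as test input
let input = CIImage(color: .gray).cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))

// Blur with radius 50 and inspect how far the output extent grows
let blur = CIFilter(name: "CIGaussianBlur",
                    parameters: [kCIInputImageKey: input, kCIInputRadiusKey: 50])
print(blur?.outputImage?.extent ?? .null) // observed: (-150.0, -150.0, 400.0, 400.0)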

Anyway, instead of clampedToExtent() I will extend the image by kCIInputRadiusKey * 3:

import CoreImage

extension CIImage {

    /// Returns a new image created by extending the edge pixel colors by a given number of pixels in all directions.
    /// - Parameter pixels: The number of pixels to extend, e.g. a value of 2 adds 2 pixels in each direction.
    /// - Returns: The extended image.
    func clamped(byPixels pixels: CGFloat) -> CIImage {

        // Extend the image indefinitely in all directions
        let indefiniteImage = clampedToExtent()

        // Calculate the expanded extent
        let expandedExtent = extent.insetBy(dx: -pixels, dy: -pixels)

        // Crop the image to the expanded extent
        let clampedImage = indefiniteImage.cropped(to: expandedExtent)
        return clampedImage
    }
}

After the new app release I will update here if that fixed the crashes.

mtrezza commented 4 years ago

The crashes still occur with the solution above, specifically:

Crashed: NSOperationQueue 0x159999e00 (QOS: UNSPECIFIED)
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

That looks like the app is trying to access an already deallocated object. Either the issue still occurs because clampedToExtent() is still used (inside clamped(byPixels:)), or there is another issue unrelated to the image manipulation.

As a next step, I am removing any modification of the image before blurring it, by removing the line let inputImage: CIImage = ciImage.clampedToExtent() from the filter described above, to see whether the issue is related to clamping. I am also changing .priorityRequestLow to false, just in case.

mtrezza commented 4 years ago

With the changes described in my previous comment, the crashes still occur. Therefore I assume the issue is unrelated to CIFilter and CIImage. Closing for now.

mtrezza commented 4 years ago

Sorry, instead of closing I think we can still address the original issue of gray border artifacts when blurring. Since the crashes are unrelated to clampedToExtent(), we can add it as an optional parameter to the existing filter, like:

com.github.kean/nuke/gaussian_blur_clamped_to_extent?radius=\(radius)&clamped=\(true|false)

@kean what do you think?
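
A minimal sketch of what that could look like, modeled on the processor above (the parameter name `clamped` and its default value are assumptions, not a final API):

import CoreImage
import Nuke
import UIKit

/// A Gaussian blur that optionally clamps to the image extent to avoid the gray border artifact.
struct GaussianBlurClamped: ImageProcessing, Hashable, CustomStringConvertible {
    private let radius: Int
    private let clamped: Bool

    public init(radius: Int = 8, clamped: Bool = true) {
        self.radius = radius
        self.clamped = clamped
    }

    public func process(image: Image, context: ImageProcessingContext?) -> Image? {
        guard let ciImage = image.ciImage ?? image.cgImage.map(CIImage.init(cgImage:)) else { return nil }
        let extent = ciImage.extent
        // Clamp only when requested; otherwise blur the original image directly
        let input = clamped ? ciImage.clampedToExtent() : ciImage
        let filter = CIFilter(name: "CIGaussianBlur",
                              parameters: [kCIInputRadiusKey: radius, kCIInputImageKey: input])
        guard let output = filter?.outputImage else { return nil }
        let ciContext = CIContext(options: [.priorityRequestLow: true])
        guard let cgImage = ciContext.createCGImage(output, from: extent) else { return nil }
        return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
    }

    public var identifier: String {
        "com.github.kean/nuke/gaussian_blur_clamped_to_extent?radius=\(radius)&clamped=\(clamped)"
    }

    public var hashableIdentifier: AnyHashable { self }

    public var description: String {
        "GaussianBlurClamped(radius: \(radius), clamped: \(clamped))"
    }
}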

mtrezza commented 4 years ago

Update after a call regarding the issue with an Apple engineer who works on the CoreImage team:

The issue has been observed across different applications, which suggests a CoreImage bug, but Apple hasn't been able to nail it down yet. The crash seems almost certainly related to resource constraints, especially when many CoreImage contexts are created in parallel (for example when blurring images on the fly as they appear in a UICollectionView) or when the images to process are large. The workaround I am trying now is to significantly downsample images to a few pixels and then blur them with a very low radius - visually it leads to the same result, and no crashes have been reported yet.
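
For completeness, one way to express that workaround with Nuke's built-in processors (using the ImageProcessors API of newer Nuke versions; the URL, target size, and radius are illustrative values, not tuned recommendations):

import Nuke
import UIKit

// Downsample aggressively first, then blur with a small radius; processors run in order
let url = URL(string: "https://example.com/image.jpg")!
let request = ImageRequest(
    url: url,
    processors: [
        ImageProcessors.Resize(size: CGSize(width: 40, height: 40)),
        ImageProcessors.GaussianBlur(radius: 4)
    ]
)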

kean commented 3 years ago

Hey, @mtrezza. I would appreciate it if you could add a "clamp to extent" option to the existing GaussianBlur processor.

kean commented 3 years ago

Since it appears to be a Core Image bug, I'm going to close this. If you have other ideas, please let me know.

Bluestart83 commented 3 years ago

This is not a CoreImage bug; the method is just not correct. A Gaussian blur natively loses the image border in proportion to the blur radius. It is the library's filter that is buggy: you need to clamp the input and crop the result back to the original extent, which can be done with CIAffineClamp:

//
//  NukeImageProcessor+gaussianCropped.swift
//

import Foundation
import Nuke

extension ImageProcessors {
    /// Blurs an image using `CIGaussianBlur` filter.
    public struct GaussianBlurFixed: ImageProcessing, Hashable, CustomStringConvertible {
        private let radius: Int

        /// Initializes the receiver with a blur radius.
        public init(radius: Int = 8) {
            self.radius = radius
        }

        /// Applies `CIGaussianBlur` filter to the image.
        public func process(_ image: PlatformImage) -> PlatformImage? {
            /*
             let filter = CIFilter(name: "CIGaussianBlur", parameters: ["inputRadius": radius])
             return CoreImageFilter.apply(filter: filter, to: image)
             */

            // Avoid force-unwrapping the source image
            guard let ciimage = CIImage(image: image) else { return nil }

            // "CIAffineClamp" extends the edge pixels so the blur produces no gray border
            let affineClampFilter = CIFilter(name: "CIAffineClamp")!
            affineClampFilter.setDefaults()
            affineClampFilter.setValue(ciimage, forKey: kCIInputImageKey)
            let resultClamp = affineClampFilter.value(forKey: kCIOutputImageKey)

            // resultClamp is used as input for the "CIGaussianBlur" filter
            let filter = CIFilter(name: "CIGaussianBlur")!
            filter.setDefaults()
            filter.setValue(resultClamp, forKey: kCIInputImageKey)
            filter.setValue(radius, forKey: kCIInputRadiusKey)

            // Render and crop back to the original (unclamped) image extent
            let ciContext = CIContext(options: nil)
            guard let result = filter.value(forKey: kCIOutputImageKey) as? CIImage,
                  let cgImage = ciContext.createCGImage(result, from: ciimage.extent) else { return nil }

            return UIImage(cgImage: cgImage)
        }

        public var identifier: String {
            "com.github.kean/nuke/gaussian_blur_fixed?radius=\(radius)&clamped=false"
        }

        public var hashableIdentifier: AnyHashable { self }

        public var description: String {
            "GaussianBlurFixed(radius: \(radius))"
        }
    }
}

kean commented 3 years ago

Cool, makes sense. I would appreciate a pull request. I'm using this processor myself, so the help is welcome. If not, I can test and merge it later myself. Thanks.

JAGUAR108 commented 4 months ago

Hello @kean, is this case still not progressing? Are there any plans to address it?

MartinStrambach commented 2 months ago

Any plans to fix this issue? I wanted to add a glow to my image by repeating and blurring the same image in a ZStack. Unfortunately, GaussianBlur is unusable for me without repeating the edge pixels outside of the image area.