daprice / Variablur

Variable blur effects for SwiftUI, powered by Metal
https://swiftpackageindex.com/daprice/Variablur
MIT License

Mitigating 'streaks' near mask edges #2

Closed AlexGingell closed 6 months ago

AlexGingell commented 6 months ago

Your comments indicate: /// The two-pass approach is better for performance as it scales linearly rather than exponentially with radius / sample count, but can result in "streak" artifacts where blurred areas meet unblurred areas.

I was working on a similar shader to overcome CIMaskedVariableBlur's limited inputRadius (capped at 100) and ended up with an implementation very similar to yours. I was wondering if you ever explored ways to mitigate the streaking near mask edges? As I need to use mask images of an arbitrary nature, it's actually quite a difficult problem to solve. CIMaskedVariableBlur uses a lot of optimisation techniques and works very well, so I'm thinking that applying it multiple times to overcome the inputRadius cap is probably the best option overall, but I'm interested in your thoughts! Thanks.

daprice commented 6 months ago

Well, for one thing, you can sample horizontally and vertically in a single pass and eliminate these artifacts entirely. It’s more demanding on the GPU since it requires a lot more samples (x sample count * y sample count instead of x sample count + y sample count), but depending on your use case it might be ok. I think I had an early version like this and it worked fine for smaller or non-animating views – I’ll see if I can find it.
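To make that scaling concrete, here's a rough Python sketch (illustrative only; the interval formula mirrors the shader posted below, and the exact sample counts depend on radius and quality) comparing the sample budgets of the single-pass and two-pass approaches:

```python
def samples_per_axis(radius, quality):
    """Mirror the shader's stepping: the interval grows with radius so the
    per-axis sample count stays roughly constant, scaled down by quality."""
    interval = max(1.0, radius / 10.0) / max(0.1, min(1.0, quality))
    # samples taken from -radius to +radius at the computed interval
    return int(2 * radius / interval) + 1

radius, quality = 50.0, 1.0
n = samples_per_axis(radius, quality)

single_pass = n * n   # one pass over a full square neighborhood: n * n samples
two_pass = n + n      # separate horizontal and vertical passes: n + n samples

print(n, single_pass, two_pass)
```

At radius 50 and full quality this works out to 21 samples per axis, i.e. 441 samples for the single pass versus 42 for the two-pass version, which is the performance trade-off described above.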

I would guess that doing extra passes by applying an implementation like ours multiple times with the same mask would also reduce the artifacts. No idea whether that would be more or less performant than applying CIMaskedVariableBlur multiple times, though.

daprice commented 6 months ago

Here’s the basic single-pass blur implementation I started with. This version didn’t take a mask yet, but I don’t think you’d get any of the mask-edge artifacts by taking all the samples in one pass like this.

#include <metal_stdlib>
#include <SwiftUI/SwiftUI_Metal.h>
using namespace metal;

[[ stitchable ]] half4 boxBlur(float2 pos, SwiftUI::Layer layer, float radius, float quality) {
    half4 total = half4(0.0h);
    int count = 0;

    half maxX = pos.x + radius;
    half maxY = pos.y + radius;
    // Step size grows with radius so the sample count stays roughly constant;
    // `quality` (clamped to 0.1...1) shrinks the step to take more samples.
    half interval = max(1.0h, half(radius) / 10.0h) * (1.0h / max(0.1h, min(1.0h, half(quality))));
    // Sample a full square neighborhood around `pos` in a single pass.
    for (half x = pos.x - radius; x <= maxX; x += interval) {
        for (half y = pos.y - radius; y <= maxY; y += interval) {
            total += layer.sample(float2(x, y));
            count += 1;
        }
    }

    return total / half(count);
}
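For anyone who wants to poke at the sampling logic outside of Metal, here's a rough pure-Python transliteration of that kernel (a sketch only; it operates on a grayscale image stored as a 2D list, and the clamp-to-edge handling for out-of-bounds samples is my assumption, since layer.sample's boundary behavior isn't shown here):

```python
def box_blur_at(img, px, py, radius, quality):
    """Single-pass box blur of a 2D grayscale image at pixel (px, py),
    mirroring the Metal kernel's square sampling loop."""
    h, w = len(img), len(img[0])
    interval = max(1.0, radius / 10.0) / max(0.1, min(1.0, quality))
    total, count = 0.0, 0
    x = px - radius
    while x <= px + radius:
        y = py - radius
        while y <= py + radius:
            # Clamp to the image bounds (assumption: clamp-to-edge sampling).
            sx = min(max(int(round(x)), 0), w - 1)
            sy = min(max(int(round(y)), 0), h - 1)
            total += img[sy][sx]
            count += 1
            y += interval
        x += interval
    return total / count

# Sanity check: blurring a constant image returns the same constant.
flat = [[0.5] * 16 for _ in range(16)]
print(box_blur_at(flat, 8, 8, 4.0, 1.0))
```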
AlexGingell commented 6 months ago

Thanks, I appreciate your response. I should have added more context: my use case is part of an arbitrary Core Image filter chain for photo/video editing, and it needs to handle quite large blur radii with high performance. An approach that scales multiplicatively, such as sampling with a square kernel (which does produce an artifact-free result, as you say), is prohibitively slow unless the sample count is too low to produce a smooth enough result. I've a few things still to try, though. I'll close this with a few links others may also find worth reading:

http://www.sunsetlakesoftware.com/2013/10/21/optimizing-gaussian-blurs-mobile-gpu/index.html
https://www.rastergrid.com/blog/2010/09/efficient-gaussian-blur-with-linear-sampling/
https://patents.google.com/patent/US7397964
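(A footnote for anyone landing on this thread later: the reason the two-pass approach only streaks near mask edges is that, for a uniform radius, a horizontal box pass followed by a vertical box pass is exactly equivalent to the full 2D square kernel; the artifacts arise when the radius varies between where the two passes sample. A rough Python sketch of that equivalence, on a small grayscale array with clamp-to-edge sampling, is below; all function names here are illustrative, not from Variablur.)

```python
def box_blur_1d(row, r):
    """1D box blur with clamp-to-edge, integer pixel radius r."""
    n = len(row)
    return [sum(row[min(max(i + d, 0), n - 1)] for d in range(-r, r + 1)) / (2 * r + 1)
            for i in range(n)]

def blur_two_pass(img, r):
    """Horizontal pass, then vertical pass over the result."""
    horiz = [box_blur_1d(row, r) for row in img]
    cols = [box_blur_1d(list(col), r) for col in zip(*horiz)]
    return [list(row) for row in zip(*cols)]

def blur_single_pass(img, r):
    """Full 2D square box kernel in one pass (the artifact-free approach)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s, c = 0.0, 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    s += img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    c += 1
            out[y][x] = s / c
    return out

# With a *uniform* radius the two results match everywhere;
# only a spatially varying radius (i.e. a mask) breaks this equivalence.
img = [[(x * 7 + y * 13) % 11 / 10 for x in range(8)] for y in range(8)]
a, b = blur_two_pass(img, 2), blur_single_pass(img, 2)
print(max(abs(a[y][x] - b[y][x]) for y in range(8) for x in range(8)))
```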