michaellevin / kuwahara-filter-nuke

Blink script code: Anisotropic Kuwahara filtering

Usage of A and D in Nuke #1

Open sharktacos opened 5 months ago

sharktacos commented 5 months ago

Hi thanks for providing this! Could you elaborate on how kuwahara_filter_buffer_A.blink is meant to be used with kuwahara_filter_buffer_D.blink in Nuke? I have both A and D loaded into BlinkScript nodes. Connecting the src input of D to an image produces the painterly effect. Unsure what to do with A. Thanks!

michaellevin commented 5 months ago

Hi Derek!

I apologise for the late reply. I also don't understand exactly how this edge detection (filter A) is used in the final shader. I didn't have enough time to implement buffer B from the original shadertoy implementation.

[image]

So we have buffer A doing edge detection with gradient colouring (green for horizontal edges, red for vertical edges); buffer B, which I have no idea what it does; and buffer D, which creates the most important part.
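
To illustrate what buffer A outputs: it's essentially Sobel gradients packed into the red/green channels. A rough Blink-style sketch of the idea (not the actual kernel from this repo) would be something like:

```cpp
// Rough sketch (not the actual kernel in this repo): Sobel gradients packed
// into channels, red = x gradient (responds to vertical edges),
// green = y gradient (responds to horizontal edges).
kernel SobelEdgesSketch : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessRanged2D, eEdgeClamped> src;
  Image<eWrite> dst;

  void init() {
    src.setRange(-1, -1, 1, 1);  // 3x3 neighbourhood
  }

  float luma(float4 p) {
    return 0.2126f * p.x + 0.7152f * p.y + 0.0722f * p.z;
  }

  void process() {
    float tl = luma(src(-1,  1)), tc = luma(src(0,  1)), tr = luma(src(1,  1));
    float ml = luma(src(-1,  0)),                         mr = luma(src(1,  0));
    float bl = luma(src(-1, -1)), bc = luma(src(0, -1)), br = luma(src(1, -1));

    // Sobel taps
    float gx = (tr + 2.0f * mr + br) - (tl + 2.0f * ml + bl);  // vertical edges
    float gy = (tl + 2.0f * tc + tr) - (bl + 2.0f * bc + br);  // horizontal edges

    dst() = float4(fabs(gx), fabs(gy), 0.0f, 1.0f);
  }
};
```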

Since I needed a working solution, and mine didn't work with the alpha channel, I found another good one online: https://community.foundry.com/discuss/topic/158570/kuwahara-filter The code in the last post works like a charm, thanks to the author! Sorry I couldn't answer your main maths question - if you find the answer I'd be very happy to hear it, or vice versa, if I come back to this problem more thoroughly I'll post here what I've understood.

sharktacos commented 4 months ago

I think your implementation is quite nice. The ones on the Foundry forums look kind of jagged in comparison.

The main issue I see is that the anisotropy (via the eccentricity or "alpha" parameter) does not appear to be working, so the result is essentially a classic Kuwahara filter. I tried implementing the Blender code in Blink and got pretty much identical results, i.e. the anisotropy was not working. What I get is this:

[image: result without working anisotropy]

when what we want is this:

[image: expected anisotropic result]

I believe this has something to do with how Blink handles rotations, but I'm afraid this is a bit over my head. I'd be happy to share my Blink code for the Blender version if you'd like to take a look.
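
For what it's worth, my understanding of how the eccentricity ("alpha") parameter is supposed to enter, going by the anisotropic Kuwahara paper, is roughly this (a sketch of the maths only, written as a helper you could drop into a Blink kernel, not code from either implementation):

```cpp
// lambda1 >= lambda2 are the eigenvalues of the blurred 2x2 structure tensor
// at the current pixel; radius is the base brush size. Returns the semi-axes
// (a, b) of the sampling ellipse. Sketch only, not from either implementation.
float2 ellipseAxes(float lambda1, float lambda2, float alpha, float radius)
{
  // anisotropy in [0, 1]: 0 in flat regions, approaching 1 on strong edges
  float A = 0.0f;
  if (lambda1 + lambda2 > 0.0f)
    A = (lambda1 - lambda2) / (lambda1 + lambda2);

  float a = radius * (alpha + A) / alpha;  // major axis, along the local edge
  float b = radius * alpha / (alpha + A);  // minor axis, across the edge

  return float2(a, b);
}
```

If the anisotropy A comes out as zero everywhere (for example because the tensor never actually reaches this point), then a == b, the ellipse is just a circle, and the result is exactly the classic isotropic look I'm getting.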

michaellevin commented 4 months ago

Interesting. Yes, I see what you mean now. I'd be happy to take a look if you share your Blink version of the Blender implementation.

sharktacos commented 4 months ago

I was actually able to get it working! The trick was a combination of my 2x2 matrix being in the wrong order and needing to blur the structure tensor.
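
For anyone reading along, the relevant piece is roughly this (a sketch of the idea, not the exact code in the gizmo):

```cpp
// Sketch (not the exact gizmo code) of the eigen-decomposition of the
// *blurred* 2x2 structure tensor. The tensor is symmetric,
//
//   | Jxx  Jxy |
//   | Jxy  Jyy |
//
// and the component ordering here is the kind of thing that was wrong in my
// first attempt. Returns (lambda1, lambda2, tx, ty), where (tx, ty) is the
// direction along the local edge (eigenvector of the smaller eigenvalue),
// i.e. the direction of the ellipse's major axis.
float4 tensorEigen(float Jxx, float Jxy, float Jyy)
{
  float diff = Jxx - Jyy;
  float disc = sqrt(diff * diff + 4.0f * Jxy * Jxy);
  float lambda1 = 0.5f * (Jxx + Jyy + disc);
  float lambda2 = 0.5f * (Jxx + Jyy - disc);

  float tx = lambda1 - Jxx;
  float ty = -Jxy;
  float len = sqrt(tx * tx + ty * ty);
  if (len > 0.0f) {
    tx /= len;
    ty /= len;
  } else {
    tx = 0.0f;  // flat region: pick an arbitrary orientation
    ty = 1.0f;
  }
  return float4(lambda1, lambda2, tx, ty);
}
```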

I also added the ability to use the alpha channel to control the brush size, so you can shuffle a depth map into it and get more abstract renderings instead of a defocus.

I put it all into a gizmo you can check out here: https://github.com/sharktacos/VFX-software-prefs/blob/main/Nuke/df_kuwahara_map.gizmo

michaellevin commented 4 months ago

Hi Derek!

I ran your test in Nuke and the result is excellent. Wow! Thank you very much!

Did you happen to find any bugs in the shadertoy source code, which I tried to reproduce in BlinkScript? I wasn't really able to match it to the original paper.

If you have the same solution for the Blender compositor and can share it so I can look at it there, that would be awesome!

sharktacos commented 4 months ago

I'm really only stumbling through all of this, so please take everything I say as just sharing in that stumbling journey.

As far as I can tell, in the ShaderToy, buffer A does a Sobel filter, though I don't understand why it's needed. Buffer B does a structure tensor, with Gaussian smoothing applied before it. I thought the paper said to apply the Gaussian after the structure tensor, so I'm puzzled that it comes before.

In buffer D, the line `vec4 t = texture(iChannel1, uv);` is where the structure tensor is read in, and iChannel1 is black/empty. I would think it should read in the result of buffer B, but I don't really get how this works in ShaderToy. Presumably, with a black tensor the eigenvalues are zero, so the anisotropy term is zero and the sampling region stays circular. At any rate, the resulting image is isotropic rather than anisotropic Kuwahara, and this is the part I suspect is causing the ShaderToy code to not be anisotropic. The image shown is isotropic:

[image: isotropic result]

as opposed to this anisotropic one which it should be showing:

[image: anisotropic result]

What I have in my code is two BlinkScripts: one for the structure tensor, which I pass through a Blur node and then feed into a second BlinkScript for the Kuwahara. I imagine the same could be done with yours (using buffer B rather than A), but I haven't had the opportunity to test this.
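
For reference, the first of the two BlinkScripts is conceptually something like this (a sketch of the idea, not my exact code; it uses central differences where Sobel would also work). Its RGB output is what goes through the Blur node before the Kuwahara BlinkScript:

```cpp
// Sketch of a structure tensor pass (not my exact code): per-pixel tensor
// components packed into RGB as (Jxx, Jyy, Jxy). Blur this output with a
// Blur node, then feed it to the Kuwahara BlinkScript.
kernel StructureTensorSketch : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessRanged2D, eEdgeClamped> src;
  Image<eWrite> dst;

  void init() {
    src.setRange(-1, -1, 1, 1);
  }

  float luma(float4 p) {
    return 0.2126f * p.x + 0.7152f * p.y + 0.0722f * p.z;
  }

  void process() {
    // central differences; Sobel gradients would work just as well
    float fx = 0.5f * (luma(src(1, 0)) - luma(src(-1, 0)));
    float fy = 0.5f * (luma(src(0, 1)) - luma(src(0, -1)));

    dst() = float4(fx * fx, fy * fy, fx * fy, 1.0f);
  }
};
```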

The code from Blender is here: https://projects.blender.org/blender/blender/src/commit/92525d6c27323c7dbe765f86d447457a6bc099dc/source/blender/compositor/realtime_compositor/shaders/compositor_kuwahara_anisotropic.glsl and the docs are here: https://docs.blender.org/manual/en/latest/compositing/types/filter/kuwahara.html

sharktacos commented 4 months ago

Update: I think I get how ShaderToy is using the buffers. Buffer A reads in the image of London and does Sobel edge detection on it.

[image: buffer A output (Sobel edges)]

Buffer B reads in A and computes a structure tensor.

[image: buffer B output (structure tensor)]

For comparison, here is my structure tensor (from the Blender code) with the blur:

[image: my structure tensor with blur]

Buffer D reads in the image, and outputs this:

[image: buffer D output (Kuwahara)]

Buffer D has the line `vec4 t = texture(iChannel1, uv);`, and iChannel1 is again empty/black. I think buffer B should be read in here.

Finally, the Image pass does a "Line Integral Convolution", with buffer B (the structure tensor) input into iChannel0 and buffer D (the Kuwahara result) into iChannel1.
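
In Blink terms, my rough reading of what that final pass does (a sketch of the idea, not the ShaderToy code) is: trace a short streamline along the local edge orientation from the tensor and average the Kuwahara result along it:

```cpp
// Sketch of a line-integral-convolution style smoothing pass (not the
// ShaderToy code): step back and forth along the edge orientation from the
// blurred structure tensor and average the Kuwahara output along the path.
// Nearest-neighbour sampling, to keep the sketch short.
kernel LICSketch : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessRandom, eEdgeClamped> tensor;    // blurred tensor (Jxx, Jyy, Jxy)
  Image<eRead, eAccessRandom, eEdgeClamped> kuwahara;  // output of the Kuwahara pass
  Image<eWrite> dst;

  param:
    int steps;

  void define() {
    defineParam(steps, "Steps", 8);
  }

  // edge-aligned direction: eigenvector of the smaller tensor eigenvalue
  float2 flowDir(float2 p) {
    float4 t = tensor(int(p.x), int(p.y));
    float Jxx = t.x, Jyy = t.y, Jxy = t.z;
    float diff = Jxx - Jyy;
    float lambda1 = 0.5f * (Jxx + Jyy + sqrt(diff * diff + 4.0f * Jxy * Jxy));
    float vx = lambda1 - Jxx;
    float vy = -Jxy;
    float len = sqrt(vx * vx + vy * vy);
    if (len > 0.0f)
      return float2(vx / len, vy / len);
    return float2(0.0f, 1.0f);  // flat region: arbitrary direction
  }

  void process(int2 pos) {
    float4 sum = kuwahara(pos.x, pos.y);
    float count = 1.0f;

    // trace forwards and backwards along the flow field
    for (int dir = -1; dir <= 1; dir += 2) {
      float2 p = float2(pos.x, pos.y);
      for (int i = 0; i < steps; i++) {
        p += float(dir) * flowDir(p);
        sum += kuwahara(int(p.x), int(p.y));
        count += 1.0f;
      }
    }
    dst() = sum / count;
  }
};
```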

Hope that helps.