CaffeineViking / vkhr

Real-Time Hybrid Hair Rendering using Vulkan™
MIT License

Create Proxy-Geometry for the Volume Rendering #36

Closed CaffeineViking closed 5 years ago

CaffeineViking commented 5 years ago

When the hair style is far away, rasterizing line segments becomes expensive and mostly pointless (we only see a little of the detail anyway). Instead, we can use our fairly fast strand voxelization (which only runs once per frame, not twice like the line rasterization does) to find the strand density. We can use this density to find a surface on the volume by raymarching through it, and also to find normals from the gradient ∇D of the density field D.
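The normal-from-gradient idea can be sketched on the CPU with central differences over the voxel grid. `DensityVolume` and its memory layout are assumptions for illustration, not vkhr's actual types; in the renderer this would run in a shader sampling the voxelization's 3D texture:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

// Hypothetical density grid: strand density D sampled on a regular
// voxel grid, stored x-major (illustrative, not the repo's API).
struct DensityVolume {
    int nx, ny, nz;
    std::vector<float> data; // nx * ny * nz densities
    float at(int x, int y, int z) const {
        x = std::clamp(x, 0, nx - 1); // clamp-to-edge sampling
        y = std::clamp(y, 0, ny - 1);
        z = std::clamp(z, 0, nz - 1);
        return data[(z * ny + y) * nx + x];
    }
};

// Surface normal from the density gradient: n = -∇D / |∇D|, with ∇D
// estimated by central differences. The negation makes the normal
// point toward decreasing density, i.e. out of the hair volume.
std::array<float, 3> density_normal(const DensityVolume& v, int x, int y, int z) {
    float gx = (v.at(x + 1, y, z) - v.at(x - 1, y, z)) * 0.5f;
    float gy = (v.at(x, y + 1, z) - v.at(x, y - 1, z)) * 0.5f;
    float gz = (v.at(x, y, z + 1) - v.at(x, y, z - 1)) * 0.5f;
    float len = std::sqrt(gx * gx + gy * gy + gz * gz);
    if (len < 1e-8f) return {0.0f, 0.0f, 0.0f}; // flat region: no normal
    return {-gx / len, -gy / len, -gz / len};
}
```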

Before doing any of this we need to set up a proxy-geometry so that the fragment shader can be invoked; that will be the raymarch starting point. A simple choice is to use the AABB of the hair style (the same one used in the voxelization) as the proxy-geometry. For this we'll need to add a new Drawable, the Volume object, to our abstraction, which has the correct volume rendering pipeline bound and the correct world transforms as well.
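To illustrate why the AABB works as proxy-geometry: rasterizing the box gives the fragment shader an entry point for free, and the exit point (and hence the march length) can come from a standard slab-method ray/AABB test. This is a generic sketch, not vkhr's actual code:

```cpp
#include <algorithm>
#include <array>

struct AABB { std::array<float, 3> min, max; };

// Slab-method ray/AABB intersection. On a hit, [t_near, t_far] is the
// parametric interval along the ray inside the box, i.e. the segment
// the raymarcher should step through.
bool intersect_aabb(const std::array<float, 3>& origin,
                    const std::array<float, 3>& dir,
                    const AABB& box, float& t_near, float& t_far) {
    t_near = -1e30f;
    t_far  =  1e30f;
    for (int i = 0; i < 3; ++i) {
        float inv = 1.0f / dir[i]; // IEEE inf handles axis-aligned rays
        float t0 = (box.min[i] - origin[i]) * inv;
        float t1 = (box.max[i] - origin[i]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        t_near = std::max(t_near, t0);
        t_far  = std::min(t_far, t1);
    }
    // Hit only if the interval is non-empty and not entirely behind us.
    return t_near <= t_far && t_far >= 0.0f;
}
```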

(attached image: volume_rendering)

CaffeineViking commented 5 years ago

Done! I've used an AABB as the proxy-geometry. Raycasting through the volume from fs_in.position toward fs_in.position + normalize(fs_in.position - camera.position) * volume.max, and discarding fragments along the way, gives me these results. I've used around 100 steps per fragment, and discard when the density stays below 0.005. Performance, as expected, varies a lot with the number of fragments we are evaluating. When the volume is very close (covering the entire screen) I get around 0.4ms for the raymarching, while in a more realistic scenario for what we are going to be using it for (far-away hair) we can get away with 0.01–0.08ms. So that's pretty nice :-).
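The raymarch described here, reduced to a CPU sketch (the names and the density callback are hypothetical; in vkhr this is a fragment shader sampling the voxelized density): march max_distance in a fixed number of steps and treat the first sample at or above the threshold as the surface; otherwise the fragment would be discarded.

```cpp
#include <array>
#include <functional>

// Fixed-step raymarch: step from `start` along the normalized view ray
// `dir`, sampling a density function at each point. Returning false
// corresponds to `discard` in the fragment shader: no sample ever
// reached the density threshold.
bool raymarch_hit(const std::array<float, 3>& start,
                  const std::array<float, 3>& dir,
                  float max_distance, // e.g. the volume.max extent
                  const std::function<float(std::array<float, 3>)>& density,
                  std::array<float, 3>& hit,
                  int steps = 100, float threshold = 0.005f) {
    float dt = max_distance / steps;
    for (int i = 0; i < steps; ++i) {
        std::array<float, 3> p{start[0] + dir[0] * dt * i,
                               start[1] + dir[1] * dt * i,
                               start[2] + dir[2] * dt * i};
        if (density(p) >= threshold) { // first dense-enough sample
            hit = p;                   // treat it as the hair surface
            return true;
        }
    }
    return false; // below threshold everywhere: discard this fragment
}
```

With 100 steps the cost scales linearly with covered fragments, which matches the 0.4ms full-screen vs. 0.01–0.08ms far-away numbers above.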

(attached screenshot: 2019-01-23 16-04-46)