CaffeineViking / vkhr

Real-Time Hybrid Hair Rendering using Vulkan™

Implement the Screen-Space Strand Density Metric #31

Open · CaffeineViking opened this issue 5 years ago

CaffeineViking commented 5 years ago

In order to guide the transparency and AA method we'll probably want to find the screen-space hair density, and classify fragments (or tiles) as having low or high density with some sort of threshold. We'll use this data to adapt the PPLL and AA algorithms (exactly how is still TBD, but we have some good ideas already rolling). I can see two ways to do this: use the volume we already have, or accumulate the rasterized lines onto a buffer, i.e. do more-or-less what we already do with the volume, but project and increment it onto a plane instead. A rough sketch of the second approach is below.
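For the buffer-accumulation variant, a minimal sketch of what the counting could look like on the fragment side, assuming a screen-sized `r32ui` storage image (here called `strand_density`) that is cleared to zero every frame while the strands are rasterized with color writes disabled; the image name, binding and setup are hypothetical, not anything that already exists in vkhr:

```glsl
#version 460

// Hypothetical screen-sized counter image, cleared to 0 at the start of each frame.
layout(binding = 0, r32ui) uniform coherent uimage2D strand_density;

void main() {
    // Count how many strand fragments land on this pixel; the per-pixel
    // total after rasterization is the screen-space strand density.
    imageAtomicAdd(strand_density, ivec2(gl_FragCoord.xy), 1u);
}
```

A later pass (or the shading pass itself) could then read the counter back and classify each pixel or tile as low- or high-density against the threshold.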

Some ideas would be to conditionally insert fragments into the PPLL, vary the number of fragments we actually sort properly (instead of just squashing them together), or decide how many of the fragments should use a "high-quality" version of the shading. Most of our time is now spent shading the strands, which is a drastic change from the previous non-OIT version of the shader, which only needed to shade the top part of the hair style (the rest of the fragments were culled). Maybe the SSHD could be used here somehow? A sketch of how that gating could look follows below.
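One hedged illustration of the gating idea, assuming the density image from the sketch above and a hypothetical `DENSITY_THRESHOLD`; whether the high-density path skips PPLL insertion, sorts fewer layers, or just uses a cheaper shading model is exactly the part that is still TBD, so the two shading functions here are placeholders, not vkhr's actual shading code:

```glsl
#version 460

// Density image produced by the accumulation pass sketched above (hypothetical binding).
layout(binding = 0, r32ui) uniform readonly uimage2D strand_density;

layout(location = 0) out vec4 color;

// Hypothetical cutoff separating "sparse" from "dense" pixels.
const uint DENSITY_THRESHOLD = 8u;

// Stand-ins for the real strand shading paths.
vec4 shade_strand_high_quality() { return vec4(1.0); } // e.g. properly sorted PPLL + full shading
vec4 shade_strand_cheap()        { return vec4(0.5); } // e.g. squashed fragments + simpler shading

void main() {
    uint density = imageLoad(strand_density, ivec2(gl_FragCoord.xy)).r;

    if (density < DENSITY_THRESHOLD) {
        // Few overlapping strands here: individual fragments are visible,
        // so spend the budget on the "high-quality" path.
        color = shade_strand_high_quality();
    } else {
        // Many overlapping strands: individual fragments matter less,
        // so a cheaper approximation should be good enough.
        color = shade_strand_cheap();
    }
}
```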