CaffeineViking / vkhr

Real-Time Hybrid Hair Rendering using Vulkan™
MIT License

Implement the Raytraced Reference AO Solution #25

Closed: CaffeineViking closed this issue 5 years ago

CaffeineViking commented 5 years ago

We would like to check whether our volume-based AO implementation approaches the raytraced reference AO, which serves as the ground truth. The raytracer is far too slow for interactive use, so we use it only to cross-check that the results we get from the rasterizer are valid. In a nutshell, here is what the raytracer should be doing:

  1. Shoot rays from the camera into the scene, finding each intersection with a surface point.
  2. At each intersection, shoot random rays in the hemisphere around the normal (pointing outward from the hair style) and count how many of them intersect other hair strands. The fraction of occluded rays is the amount of ambient occlusion at that point. This is what is generally done, as in the thesis: here. A minimal sketch of the estimator follows below.
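
For reference, here is a minimal sketch of that Monte Carlo estimator on the CPU. The Vec3 helpers and the occluded() intersection query are hypothetical stand-ins, not the actual vkhr API:

```cpp
#include <cmath>
#include <random>

// Hypothetical minimal types and helpers; the real raytracer has its own
// scene and intersection structures. This is only an illustrative sketch.
struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Assumed to exist: returns true if a ray from 'origin' along 'direction'
// intersects any other hair strand in the scene.
bool occluded(Vec3 origin, Vec3 direction);

// Uniformly sample a direction on the unit sphere (normalized Gaussians),
// then flip it into the hemisphere around the surface normal.
Vec3 sample_hemisphere(Vec3 normal, std::mt19937& rng) {
    std::normal_distribution<float> gauss(0.0f, 1.0f);
    Vec3 d = normalize({ gauss(rng), gauss(rng), gauss(rng) });
    return dot(d, normal) < 0.0f ? Vec3{ -d.x, -d.y, -d.z } : d;
}

// Monte Carlo AO estimate at a surface point: the fraction of hemisphere
// rays that hit another strand is the amount of occlusion.
float ambient_occlusion(Vec3 point, Vec3 normal, int ray_count,
                        std::mt19937& rng) {
    int hits = 0;
    for (int i = 0; i < ray_count; ++i)
        if (occluded(point, sample_hemisphere(normal, rng)))
            ++hits;
    return static_cast<float>(hits) / ray_count; // 0 = open, 1 = fully occluded
}
```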
CaffeineViking commented 5 years ago

Here are some results after cleaning up the implementation. I've raytraced a 1280x720 image of the ponytail, launching 64 rays in random directions (within a full sphere, for now) at each hit. As you may imagine, that's not possible at interactive frame rates :-). On the Threadripper I can get real-time AO when using 8 rays.
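
Since the rays are launched within a full sphere for now rather than the hemisphere, the sampler simply drops the flip; a sketch of that variant, reusing the hypothetical helpers from the sketch above:

```cpp
// Full-sphere variant ("for now", per the above): skip the hemisphere flip,
// so directions pointing back into the surface are also counted.
Vec3 sample_sphere(std::mt19937& rng) {
    std::normal_distribution<float> gauss(0.0f, 1.0f);
    return normalize({ gauss(rng), gauss(rng), gauss(rng) });
}
```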

I've started implementing #27, and should hopefully have something running towards the end of the day. It will be very interesting to see what we get!

[image: ambient-occlusion]

Figure 1: raytraced AO from launching 64 rays in random directions within a sphere; this is NOT real-time.

[image: combined-shadow-map-densities]

Figure 2: combined ADSM and inverse voxelized density occlusion, using a 256^3 voxelized ponytail.

[image: shadow-map]

Figure 3: the ADSM shadow map term from a rotating spotlight (with smoothing and Gaussian 3x3 PCF).

[image: densities]

Figure 4: inverse voxelized density of a 256^3 voxelization of Lara Croft's ponytail (not normalized yet).