CaffeineViking / vkhr

Real-Time Hybrid Hair Rendering using Vulkan™
MIT License

Integrate Raytracer into the Swapchain #9

Closed CaffeineViking closed 6 years ago

CaffeineViking commented 6 years ago

The raytracer now produces the correct rendered output into a vkhr::Image, but we need a way to get it into the Vulkan framebuffer somehow. My current strategy is to render to the vkhr::Image, create a vkpp::DeviceImage (i.e. a staged vkpp::Image) from that data, and create a vkpp::ImageView from it. I'm still a bit uncertain about the best way to update it, since the results may change every frame or so.
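The host-to-device flow above can be sketched like this. The types below are hypothetical simplifications standing in for vkhr::Image and vkpp::DeviceImage (the real upload would map a staging buffer, memcpy the pixels, and record a vkCmdCopyBufferToImage with the appropriate layout transitions; here that staging copy is modeled directly):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Host-side raytraced output, standing in for vkhr::Image (hypothetical).
struct HostImage {
    std::uint32_t width, height;
    std::vector<std::uint8_t> pixels; // RGBA8, width * height * 4 bytes.
};

// Device-side image, standing in for vkpp::DeviceImage (hypothetical).
// In real Vulkan the copy would go through a staging buffer and a
// vkCmdCopyBufferToImage with layout transitions on each side.
struct DeviceImage {
    std::uint32_t width = 0, height = 0;
    std::vector<std::uint8_t> device_memory;

    // Re-uploads the host pixels; called whenever the raytraced
    // output changes, since the image view samples this memory.
    void upload(const HostImage& image) {
        width  = image.width;
        height = image.height;
        device_memory = image.pixels; // stands in for the staged GPU copy
    }
};
```

The open question in the comment is exactly how often upload() needs to run: per-frame re-uploads are simple but cost a full host-to-device transfer each time.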

In my initial implementation I plan to have two vkhr::Image objects for double buffering. The backbuffer is updated (i.e. raytraced, in this case) and then uploaded to GPU memory along with the descriptor sets. The frontbuffer is displayed by binding the correct descriptors for the combined image sampler and then using my billboard shader. Once the render into the backbuffer is done, it becomes the new frontbuffer.
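The double-buffering scheme above reduces to a two-slot index swap; a minimal sketch (my own names, not the repo's API):

```cpp
#include <cassert>
#include <cstddef>

// Two-slot double buffer: the raytracer writes into back() while the
// billboard shader samples front(); once a raytraced frame has been
// uploaded and its descriptors are set, the roles swap.
class DoubleBuffer {
public:
    std::size_t front() const { return front_; }
    std::size_t back()  const { return 1 - front_; }

    // Called after the render into the backbuffer (plus its upload and
    // descriptor update) completes: the backbuffer becomes the frontbuffer.
    void swap() { front_ = 1 - front_; }

private:
    std::size_t front_ = 0;
};
```

Usage would be: raytrace into images[b.back()], upload it, then b.swap() so the next draw samples the freshly finished image.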

Still not completely 100% sure this approach is the best one out there, but we can come back and fix it later.

CaffeineViking commented 6 years ago

After the latest round of bug fixing, everything runs a lot more smoothly. I'll close this issue now. I tracked down a set of operations that were taking a LOT of CPU time, and was able to eliminate them. We can now render the raytraced output at around 20-30 FPS on my shitty laptop. I'll try it on the AMD setup with the Threadripper and report my results; that should be fun :-). Of course, once we start sending off more than 1 ray per pixel, or bouncing rays for transparency, things will very quickly become non-interactive, so in the end we'll have to "freeze" a frame of the raytraced output and display that instead.
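A rough back-of-envelope for why extra rays kill interactivity (my own estimate, assuming cost scales roughly linearly with rays per pixel, which ignores bounce-dependent work):

```cpp
#include <cassert>

// If one primary ray per pixel runs at fps_at_one_rpp, and total cost
// grows roughly linearly with rays per pixel, the estimated frame rate
// at rays_per_pixel samples is simply the ratio. E.g. ~25 FPS at 1 rpp
// drops to ~5 FPS at 5 rpp, well below interactive rates.
double estimated_fps(double fps_at_one_rpp, double rays_per_pixel) {
    return fps_at_one_rpp / rays_per_pixel;
}
```

This is exactly why freezing a raytraced frame and displaying it, rather than re-tracing every frame, becomes the fallback once the sample count grows.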