ingowald / optix7course

Apache License 2.0

GPU memory leak in example 2 #24

Closed. XinShengHUST closed this issue 3 years ago.

XinShengHUST commented 3 years ago

Dear ingowald, thank you for your course code; it is really helpful to beginners. I have tested ex02 on my computer, and the rendering ran well. At the same time, I tried to trace memory usage, so I printed the available GPU memory before and after using the SampleRenderer:

PrintGpuMemory();
{
    SampleRenderer sample;
    ...
    sample.render();
    ...
} // `sample` goes out of scope here, so its destructor runs
PrintGpuMemory();

The available GPU memory decreases by about 300 MB, even though the destructor should have been called inside the braces. How should I clean up the OptiX-related memory? Hope to get some advice from you. Thanks a lot.
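The `PrintGpuMemory()` helper itself is not shown in the post; a minimal sketch of such a function (a hypothetical implementation, not taken from the original code) can be written with the CUDA runtime call `cudaMemGetInfo`:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

// Hypothetical sketch of the PrintGpuMemory() used above:
// queries the current device's free/total memory via cudaMemGetInfo.
void PrintGpuMemory()
{
  size_t freeBytes = 0, totalBytes = 0;
  cudaError_t err = cudaMemGetInfo(&freeBytes, &totalBytes);
  if (err != cudaSuccess) {
    fprintf(stderr, "cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
    return;
  }
  printf("GPU memory: %.1f MB free of %.1f MB total\n",
         freeBytes / (1024.0 * 1024.0),
         totalBytes / (1024.0 * 1024.0));
}
```

One caveat when interpreting the numbers: the first CUDA call in a process creates a CUDA context, which itself occupies GPU memory and is not released until the process exits (or `cudaDeviceReset()` is called), so part of the observed difference may be context overhead rather than leaked allocations.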

ingowald commented 3 years ago

This example was mostly intended to teach how to get started, so I left out all kinds of things, like safety checks and memory deallocation, that one would usually have in a more complete application. If you do want to hunt the leaks down: all "large" allocations in OptiX (BVH memory, vertex arrays, the SBT, etc.) are handled by the user, using regular CUDA allocations (cudaMalloc/cudaMallocManaged); if you find any that do not have a corresponding cudaFree in the destructor, then there is indeed a memory leak (and I'm sure there are multiple). If you do find and fix them, I'd be happy to take a PR!
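The fix ingowald describes follows a simple pattern: every device buffer the renderer allocates needs a matching `cudaFree` in the destructor. A minimal sketch of that pattern (the member names here are illustrative, not necessarily the ones used in the example code):

```cpp
#include <cuda_runtime.h>

// Sketch of the "matching cudaFree in the destructor" pattern.
// Buffer names are hypothetical; the real SampleRenderer allocates
// several such buffers (BVH memory, vertex arrays, SBT records, ...).
class SampleRenderer {
  void *vertexBuffer = nullptr; // e.g., allocated with cudaMalloc when building the BVH
  void *asBuffer     = nullptr; // acceleration-structure (BVH) memory
  void *sbtRecords   = nullptr; // shader binding table records

public:
  ~SampleRenderer()
  {
    // Without these matching frees, each allocation above leaks.
    if (vertexBuffer) cudaFree(vertexBuffer);
    if (asBuffer)     cudaFree(asBuffer);
    if (sbtRecords)   cudaFree(sbtRecords);
  }
};
```

A fuller cleanup would also destroy the OptiX handles themselves (e.g., `optixPipelineDestroy`, `optixProgramGroupDestroy`, `optixModuleDestroy`, `optixDeviceContextDestroy`) and any CUDA stream the renderer created.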

Alternatively, if you're ready to start using OptiX for a real application, you might also want to have a look at my OWL project: it will not only make writing your OptiX programs much easier (e.g., it can build BVHs and SBTs with a single function call), it will also be more "finished" in terms of releasing memory.

XinShengHUST commented 3 years ago

Thank you for your reply. The OWL project seems to have a good memory management mechanism. I will look into this awesome project. :)