Closed — l1346792580123 closed this issue 2 years ago
11 GB should be fine, but you will want to double-check your scene resolution and the spp, sppe, and sppse settings. Take a look at config.py for bunny_env_2: the default value sppse=64 might be too high for 11 GB of memory. I suggest lowering it to 4, or even less, for spp, sppe, and sppse alike. You can always use fewer spp/sppe/sppse samples per pass and increase npass to still get a clean gradient estimate:

```
"spp"   : 4,
"sppe"  : 4,
"sppse" : 4,
```
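To make the trade-off concrete, here is a minimal sketch of that kind of edit. The dict layout and the `bunny_env_2` variable below are assumptions for illustration; only the key names spp/sppe/sppse/npass come from the advice above.

```python
# Hypothetical config fragment; the dict structure is an assumption,
# not the exact layout of psdr-cuda's config.py.
bunny_env_2 = {
    "spp":   4,   # reduced from a higher default
    "sppe":  4,   # primary-edge samples per pass
    "sppse": 4,   # secondary-edge samples per pass (default 64 may not fit in 11 GB)
    "npass": 16,  # more passes compensate for fewer samples per pass
}

# The effective sample budget per term is roughly samples_per_pass * npass,
# so estimator variance stays comparable while peak GPU memory per pass drops.
effective_sppse = bunny_env_2["sppse"] * bunny_env_2["npass"]
print(effective_sppse)  # 64 — same total as the old single-pass default
```

The point of the design is that memory scales with the per-pass sample count, while gradient quality scales with the total, so splitting work across passes trades runtime for peak memory.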
Thanks for your explanation.
When I try to run the example code, specifically the bunny_env_2 test case, it raises the following error:

```
Traceback (most recent call last):
  File "psdr_test.py", line 102, in <module>
    process(name, psdr_tests[name])
  File "psdr_test.py", line 77, in process
    direct_test(name, args)
  File "psdr_test.py", line 63, in direct_test
    run_ad(integrator, sc, dirname + "/direct_AD.exr", args)
  File "/home/llx/psdr-cuda/examples/run_test.py", line 129, in run_ad
    img = integrator.renderD(sc, sensor_id)
RuntimeError: cuda_malloc(): out of memory!
```
I am using an NVIDIA 2080 Ti, which has 11 GB of memory. How much GPU memory does this test case need, and is there a way to reduce the required memory?