Originally posted by **BennetLeff** August 22, 2023
Howdy! I got everything building and running on my 3080 Ti, so that's another card confirmed working.
The truck demo scene worked fine. When I later tried to import my own scene, though, torch reserved too much GPU memory. I don't hit this problem with the same dataset in the original Python implementation. This project would be more accessible if there were config options to control this! I don't have time to fix it at the moment, but I might soon :)
Discussed in https://github.com/MrNeRF/gaussian-splatting-cuda/discussions/20
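One possible shape for such a config option: cap how much of the card libtorch's caching allocator is allowed to reserve per process. The sketch below is only an illustration under assumptions — the `MemoryConfig` struct and its fields are hypothetical and not part of this repo — but `c10::cuda::CUDACachingAllocator::setMemoryFraction` / `emptyCache` are the same libtorch C++ hooks that `torch.cuda.set_per_process_memory_fraction` and `torch.cuda.empty_cache` use on the Python side.

```cpp
// Hypothetical sketch: cap how much GPU memory libtorch's caching allocator
// may reserve, driven by a config option. MemoryConfig is made up for this
// example; the allocator calls come from c10/cuda/CUDACachingAllocator.h.
#include <c10/cuda/CUDACachingAllocator.h>
#include <torch/torch.h>

#include <iostream>

struct MemoryConfig {
    // Fraction of total device memory the allocator may use (0.0, 1.0].
    double max_gpu_memory_fraction = 0.9;  // hypothetical default
    int device_index = 0;
};

void apply_memory_config(const MemoryConfig& cfg) {
    if (!torch::cuda::is_available()) {
        std::cerr << "CUDA not available; skipping memory cap\n";
        return;
    }
    // Limit the caching allocator so it frees cached blocks (or errors out)
    // instead of grabbing the whole card.
    c10::cuda::CUDACachingAllocator::setMemoryFraction(
        cfg.max_gpu_memory_fraction, cfg.device_index);
}

int main() {
    MemoryConfig cfg;
    cfg.max_gpu_memory_fraction = 0.7;  // e.g. leave headroom for the display
    apply_memory_config(cfg);

    // Releasing cached-but-unused blocks between stages can also help
    // when the working set shrinks.
    c10::cuda::CUDACachingAllocator::emptyCache();
    return 0;
}
```

As a stopgap that needs no code change, the caching allocator can also be tuned at startup through the `PYTORCH_CUDA_ALLOC_CONF` environment variable (for example `max_split_size_mb`), which the c10 allocator reads in libtorch as well as in Python.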
TODO:
Maybe make sure that it works synchronously first; it might be a hassle to get it working asynchronously without blocking or introducing bugs. See https://github.com/MrNeRF/gaussian-splatting-cuda/discussions/20. A rough sketch of the sync-first, then-overlap idea is below.
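To illustrate the "synchronous first" approach: keep the blocking step as a plain loop until it is known to be correct, and only then prefetch the next item with `std::async` so it overlaps with the current work. Everything here is a stand-in — `load_image` and `train_step` are hypothetical placeholders, not functions from this repo.

```cpp
// Hypothetical illustration: verify the synchronous path first, then
// overlap loading and processing with std::async.
#include <future>
#include <iostream>
#include <string>
#include <vector>

struct Image { std::string path; };          // placeholder payload

Image load_image(const std::string& path) {  // stand-in blocking loader
    return Image{path};
}

void train_step(const Image& img) {          // stand-in GPU work
    std::cout << "training on " << img.path << "\n";
}

int main() {
    std::vector<std::string> paths = {"a.png", "b.png", "c.png"};

    // Step 1: plain synchronous loop -- easy to verify for correctness.
    for (const auto& p : paths) {
        train_step(load_image(p));
    }

    // Step 2: once the synchronous path is trusted, prefetch the next
    // image while the current one is being processed.
    std::future<Image> next = std::async(std::launch::async, load_image, paths[0]);
    for (size_t i = 0; i < paths.size(); ++i) {
        Image current = next.get();          // wait for the prefetched image
        if (i + 1 < paths.size()) {
            next = std::async(std::launch::async, load_image, paths[i + 1]);
        }
        train_step(current);                 // overlaps with the next load
    }
    return 0;
}
```

The point of the two-step structure is that the async version only changes *when* `load_image` runs, not *what* it does, so any bug that shows up afterwards is in the scheduling, not the loader.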