Thank you for open-sourcing your code; it's excellent work. I have a question about its usage: I used your code to render my own dataset, but rendering is relatively slow, taking about 2 minutes per image. Do you have any suggestions on how to accelerate rendering with the GPU?
IIRC, it should automatically use the GPU. Did you watch nvidia-smi? Is the GPU completely idle during rendering? Feel free to reopen if you still have issues.
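A quick way to check the maintainer's suggestion is to sample GPU utilization while a render is in progress. This is a minimal sketch assuming an NVIDIA card; the `check_gpu_tool` helper is just illustrative, not part of the project:

```shell
#!/bin/sh
# Hedged sketch: verify nvidia-smi is available, then take one utilization
# sample. Run this in a second terminal while rendering is in progress.
check_gpu_tool() {
    # Print whether the given tool exists on PATH; return 0 if it does.
    if command -v "$1" >/dev/null 2>&1; then
        echo "found $1"
        return 0
    else
        echo "$1 not found: install the NVIDIA driver to monitor GPU usage"
        return 1
    fi
}

if check_gpu_tool nvidia-smi; then
    # One sample of GPU utilization and memory; repeat it (e.g. via
    # `watch -n 1`) during rendering. Utilization stuck near 0% suggests
    # the renderer is falling back to the CPU.
    nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv
fi
```

If utilization stays at 0% for the whole render, the GPU path likely isn't being used and the environment (driver, CUDA runtime) is worth checking.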