Closed: houshuaipeng closed this issue 1 year ago.
Hi Houshuai,
Thanks for your interest in our work. Currently, this codebase only supports single-GPU training; we conducted our experiments primarily on NVIDIA A6000s or A100s. The GPU memory requirement varies per setting, usually capped at about 23 GB when everything is enabled, but 16 GB is enough for most settings.
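As a convenience, here is a minimal sketch (not part of the EmerNeRF codebase) for checking whether the single visible GPU has enough memory before launching training. The 16 GB / 23 GB thresholds are taken from the reply above; the helper name and defaults are made up for illustration.

```python
# Hypothetical pre-flight check; thresholds follow the figures quoted in this thread.
import torch


def check_gpu_memory(min_gb: float = 16.0, full_feature_gb: float = 23.0) -> None:
    """Warn if GPU 0 likely has too little memory for training."""
    if not torch.cuda.is_available():
        raise RuntimeError("No CUDA device found; this codebase expects a single GPU.")
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024 ** 3
    print(f"GPU 0: {props.name}, {total_gb:.1f} GB total memory")
    if total_gb < min_gb:
        print(f"Warning: under {min_gb:.0f} GB; most settings may not fit.")
    elif total_gb < full_feature_gb:
        print(f"Note: under {full_feature_gb:.0f} GB; enabling every feature may run out of memory.")


if __name__ == "__main__":
    check_gpu_memory()
```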
Dear author,
I hope this message finds you well. First and foremost, I would like to express my appreciation for the work you've done on EmerNeRF. It's a fantastic project, and I'm eager to explore it further.
I am currently interested in training models with your project, and I was wondering if you could provide some guidance on the minimum GPU requirements for training. Specifically, I would like to know:
1. The minimum number of GPUs recommended for efficient training.
2. Any specific GPU models or specifications that you have found to work well during your development.
Knowing these details will greatly help me plan the hardware resources for my own experiments. I understand that requirements can vary with the dataset and model complexity, but having a general idea would be incredibly helpful.
Thank you for your time, and I look forward to hearing from you.