Closed — hanajibsa closed this issue 1 year ago
Thank you for your interest in our work and for raising this issue. Our models are optimized for the NVIDIA RTX 3090 (24 GB) but were originally intended to run on 16 GB GPUs. If you're encountering out-of-memory errors, consider reducing the model's size by adjusting the ch_mult or num_res_blocks settings in the configuration.
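To illustrate why those two settings matter: in typical diffusion U-Net configs, ch_mult scales the channel width at each resolution level and num_res_blocks sets how many residual blocks run per level, so both multiply into parameter count and activation memory. A minimal sketch (the key names mirror common diffusion configs; the exact keys and defaults in this repo may differ):

```python
def channels_per_level(base_ch, ch_mult):
    """Channel width at each U-Net resolution level for a given ch_mult."""
    return [base_ch * m for m in ch_mult]

# Hypothetical default vs. a slimmer variant for 16 GB GPUs.
# Shrinking the top-level multipliers cuts memory where feature maps
# are widest; lowering num_res_blocks (e.g. 2 -> 1) also helps.
wide = channels_per_level(128, (1, 2, 4, 8))    # [128, 256, 512, 1024]
slim = channels_per_level(128, (1, 2, 2, 4))    # [128, 256, 256, 512]
print(wide, slim)
```

Halving the widest levels roughly quarters the cost of the largest convolutions, which is usually the quickest way to fit a smaller GPU without touching the training loop.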
We trust that this codebase will prove valuable for your research endeavors. If you have any other questions or concerns, please feel free to ask.
I also think something is accumulating memory, but I can't find it. Please check.
During training, a CUDA out-of-memory error occurred. I think something is accumulating memory, but I can't find it.
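One common cause of memory that grows across training steps (as opposed to a one-time OOM from model size) is accumulating loss tensors that still carry the autograd graph. A hedged sketch of the pattern and its fix — the model and loop here are placeholders, not this repo's actual training code:

```python
import torch

# Toy stand-ins for the real model and loss; shapes are arbitrary.
model = torch.nn.Linear(8, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

running_loss = 0.0
for step in range(10):
    x = torch.randn(4, 8)
    y = torch.randn(4, 1)
    loss = criterion(model(x), y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # BUG pattern: `running_loss += loss` keeps every step's graph alive,
    # so GPU memory climbs each iteration until OOM.
    # FIX: convert to a Python float so the graph can be freed.
    running_loss += loss.item()

print(f"mean loss over 10 steps: {running_loss / 10:.4f}")
```

If the logged quantity is still needed as a tensor, `loss.detach()` works too; `torch.cuda.memory_allocated()` printed each step is a quick way to confirm whether memory really grows per iteration.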