Open matigekunstintelligentie opened 4 years ago
You can convert the weights without using GPU memory. However, the convert_weight.py script generates a few samples as a sanity check, and that step can cause the OOM error. The converted .pt checkpoint will still be written either way.
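As a workaround, the checkpoint can be inspected entirely on the CPU. This is only a minimal sketch (not the repo's actual conversion script, and `dummy_ckpt.pt` is a stand-in filename): `torch.load` with `map_location="cpu"` forces every tensor onto host memory, so the sanity check never allocates CUDA memory.

```python
import torch

# Hypothetical stand-in for a converted checkpoint; the real file comes
# from convert_weight.py.
state = {"generator": torch.randn(4, 4), "discriminator": torch.randn(4, 4)}
torch.save(state, "dummy_ckpt.pt")

# map_location="cpu" remaps all stored tensors to host memory, even if
# the checkpoint was saved from a GPU run, so no CUDA allocation occurs.
ckpt = torch.load("dummy_ckpt.pt", map_location="cpu")
assert all(t.device.type == "cpu" for t in ckpt.values())
print(sorted(ckpt.keys()))
```

The same idea applies to any post-conversion check: keep verification on the CPU and reserve GPU memory for training.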
I was able to train with a batch size of 8 on 24 GB of memory.
I get a CUDA_ERROR_OUT_OF_MEMORY error when trying to convert a .pkl file. I only have 8 GB of memory (GTX 1080). What is the minimum requirement for converting a 1024x1024 model?
The NVIDIA repository mentions 16 GB, but 16 GB GPUs aren't in my budget, unfortunately, so I'm considering buying a 2080 Ti with 11 GB of memory. I would also like to know what the maximum batch size is for GPUs with more memory. I've been able to train with a batch size of 2.