rrrtype opened this issue 5 years ago
Hi, No.
Thank you, I misunderstood! With my tiny-yolov3 model, batch = subdivision = 24 immediately consumes all of the GPU memory, but batch = 64, subdivision = 16 works fine. Is this normal behavior?
@rrrtype batch = 64, subdivision = 16 is a reasonable setting if you have about 12 GB of GPU memory; if not, try increasing subdivision to 32.
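For reference, a minimal sketch of the relevant part of the [net] section in the darknet .cfg file (note the key is spelled subdivisions, plural; the other [net] fields are omitted here). As I understand it, only batch/subdivisions images are loaded onto the GPU at a time, which is why a larger subdivisions value reduces memory use:

```
[net]
# Number of images accumulated per weight update.
batch=64
# The batch is split into batch/subdivisions mini-batches;
# roughly batch/subdivisions = 64/16 = 4 images are on the GPU at once.
subdivisions=16
```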
Increase subdivision until you no longer get the CUDA out-of-memory error.
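Concretely, that means editing the same keys in the .cfg and retrying; a sketch, assuming the 32 and 64 values mentioned above are the usual next steps:

```
[net]
batch=64
# batch/subdivisions = 64/32 = 2 images per GPU pass.
# If this still runs out of memory, try subdivisions=64 (one image per pass).
subdivisions=32
```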
Hi, I am training on my own dataset. If the image size used for training is large, will I get a "CUDA out of memory" error?