Hello everyone,
I am working on a medical image segmentation task using a semantic segmentation approach.
I load my dataset with a data generator, load my model, and start training with a small batch_size. After a few epochs, memory usage starts climbing, and by epoch 14 memory is exhausted and the kernel dies. I don't understand why this is happening.
I have shared my code for data loading and model loading below; please help me find where I am making a mistake. Any help will be appreciated.
Have you considered that fit_generator can be memory-hungry and may be loading too much training data at once? You could try lowering the number of workers, but that's just a guess.
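One common cause of memory growing across epochs is a generator that caches or accumulates arrays instead of loading each batch on demand. Below is a minimal sketch (not the poster's actual code, since it isn't shown here) of a lazy, batch-wise loader that only keeps one batch in memory at a time. The class name `LazyBatchLoader` and the simulated loading with `np.zeros` are illustrative assumptions; in a real Keras setup this logic would typically live in a `keras.utils.Sequence` subclass with the same `__len__`/`__getitem__` shape.

```python
import numpy as np

class LazyBatchLoader:
    """Illustrative sketch: yields one batch at a time, so only
    batch_size samples live in memory at once and nothing is cached
    between epochs.
    """

    def __init__(self, paths, batch_size):
        # `paths` would normally be image file paths on disk.
        self.paths = paths
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(len(self.paths) / self.batch_size))

    def __getitem__(self, idx):
        # Slice out only the paths for this batch.
        batch_paths = self.paths[idx * self.batch_size:(idx + 1) * self.batch_size]
        # Load each sample on demand (simulated here with zero arrays so
        # the sketch is self-contained); no per-epoch accumulation.
        x = np.stack([np.zeros((128, 128, 1), dtype=np.float32) for _ in batch_paths])
        y = np.stack([np.zeros((128, 128, 1), dtype=np.float32) for _ in batch_paths])
        return x, y

loader = LazyBatchLoader([f"img_{i}.png" for i in range(10)], batch_size=4)
x, y = loader[0]
print(len(loader), x.shape)  # 3 batches; first batch has shape (4, 128, 128, 1)
```

If your generator instead appends loaded images to a list (or keeps references to them) across `__getitem__` calls, memory will grow every epoch until the kernel dies, which matches the symptom you describe.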