qubvel / segmentation_models

Segmentation models with pretrained backbones. Keras and TensorFlow Keras.
MIT License

Massive dataset loading and training #583

Open davidvct opened 10 months ago

davidvct commented 10 months ago

If I have a massive dataset of images and am unable to fit all of them into memory, how do I load and train them in batches? My folder structure is:

datasets
|---- Train
|     |---- images
|     |---- masks
|---- Val
      |---- images
      |---- masks

Vipin2705 commented 1 month ago

If you are using a GPU, instead of loading all images at once, try using a data generator; it lets you feed images to the GPU in batches. I used to get OOM errors, and now I am able to train my model.
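
For reference, here is a minimal sketch of such a batch-wise generator built on `keras.utils.Sequence`. It assumes the folder layout from the question, OpenCV for image I/O, a fixed 256x256 target size, and that each image and its mask share the same file name; none of this comes from the segmentation_models library itself, so adjust it to your own data.

```python
import os
import numpy as np
import cv2
import keras


class SegmentationGenerator(keras.utils.Sequence):
    """Loads (image, mask) batches from disk instead of keeping everything in memory."""

    def __init__(self, images_dir, masks_dir, batch_size=8, img_size=(256, 256), shuffle=True):
        self.images_dir = images_dir
        self.masks_dir = masks_dir
        self.batch_size = batch_size
        self.img_size = img_size
        self.shuffle = shuffle
        # Assumes every image has a mask with the same file name.
        self.filenames = sorted(os.listdir(images_dir))
        self.on_epoch_end()

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(len(self.filenames) / self.batch_size))

    def __getitem__(self, idx):
        batch = self.filenames[idx * self.batch_size:(idx + 1) * self.batch_size]
        images, masks = [], []
        for name in batch:
            img = cv2.imread(os.path.join(self.images_dir, name))
            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
            img = cv2.resize(img, self.img_size)
            mask = cv2.imread(os.path.join(self.masks_dir, name), cv2.IMREAD_GRAYSCALE)
            mask = cv2.resize(mask, self.img_size, interpolation=cv2.INTER_NEAREST)
            images.append(img / 255.0)
            masks.append(np.expand_dims(mask, axis=-1))
        return np.array(images, dtype=np.float32), np.array(masks, dtype=np.float32)

    def on_epoch_end(self):
        # Reshuffle file order between epochs so batches vary.
        if self.shuffle:
            np.random.shuffle(self.filenames)


# Usage sketch: paths follow the folder layout from the question.
train_gen = SegmentationGenerator("datasets/Train/images", "datasets/Train/masks")
val_gen = SegmentationGenerator("datasets/Val/images", "datasets/Val/masks", shuffle=False)
# model.fit(train_gen, validation_data=val_gen, epochs=10)
```

Passing the generators directly to `model.fit` means only one batch at a time is decoded and held in memory, which is what avoids the OOM errors mentioned above. `tf.data.Dataset` pipelines or `ImageDataGenerator.flow_from_directory` are alternative ways to achieve the same thing.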