Closed: gabrielleyr closed this issue 5 years ago
This code uses data feed method 1, placeholders + feed_dict, as shown in the `__init__` function of the `Unet` object in unet.py (and discussed in issue #227: https://github.com/jakeret/tf_unet/issues/227).
This was my initial answer, and it was wrong: the ImageDataProvider is actually a native Python generator, which makes it much better suited to large training datasets that cannot all be loaded into memory at once!
Does the ImageDataProvider's implementation avoid creating additional copies of the image database, similar to the Native Python Generators section of this blog post: https://medium.com/tensorflow/an-introduction-to-biomedical-image-analysis-with-tensorflow-and-dltk-2c25304e7c13 — unlike approaches that load the entire dataset into TensorFlow before training? It appears to load data during training, which is desirable for large datasets, provided each file is closed after its contents are read for training.
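For intuition, here is a rough sketch of the lazy-loading pattern in question. This is not tf_unet's actual ImageDataProvider code — the generator name and the stand-in files are hypothetical — but it shows the key property: a native Python generator opens one file per iteration and closes it before the next, so only one item's data is in memory at a time and no second copy of the dataset is ever created.

```python
import os
import tempfile

def image_batch_generator(paths):
    """Yield one image's raw bytes at a time, closing each file after reading.

    Sketch only (hypothetical helper, not tf_unet's ImageDataProvider):
    because this is a generator, nothing is read until iteration, and only
    the current file's contents are held in memory per step.
    """
    for path in paths:
        with open(path, "rb") as f:   # file is closed when this block exits
            yield f.read()            # tf_unet would yield a decoded image array here

# Tiny demonstration with stand-in "image" files.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    p = os.path.join(tmpdir, "img_%d.raw" % i)
    with open(p, "wb") as f:
        f.write(bytes([i]) * 4)
    paths.append(p)

batches = []
for data in image_batch_generator(paths):
    # In a placeholders + feed_dict pipeline, this is where each batch
    # would be consumed (e.g. passed to a session run) before the next
    # file is even opened.
    batches.append(data)
```

The point is that iteration, not construction, triggers the I/O: creating the generator object is free, and each file handle lives only for the duration of its own loop step.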