Shathe / Semantic-Segmentation-Tensorflow-Eager

An example of semantic segmentation using TensorFlow in eager execution.

Move to Dataset API #6

Open Shathe opened 5 years ago

Shathe commented 5 years ago

Change the custom loader you are using to the Dataset API. Keep the same functionality, such as data augmentation. Change the rest of the code to integrate it.
https://www.tensorflow.org/tutorials/eager/eager_basics#datasets
https://www.tensorflow.org/performance/datasets_performance
https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/contrib/eager/python/examples/generative_examples/image_captioning_with_attention.ipynb
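A minimal sketch of what the loader could look like with the Dataset API in eager execution, assuming paired image/mask PNG files; the file lists, image size, and batch size below are placeholders, not the repo's actual values:

```python
import tensorflow as tf

# Hypothetical lists of paired image / segmentation-mask files.
image_paths = ["images/0001.png", "images/0002.png"]
label_paths = ["labels/0001.png", "labels/0002.png"]

def load_pair(image_path, label_path):
    # Decode image and mask; resize the mask with nearest neighbour to keep class ids intact.
    image = tf.image.decode_png(tf.io.read_file(image_path), channels=3)
    image = tf.image.convert_image_dtype(image, tf.float32)
    image = tf.image.resize(image, [256, 256])
    label = tf.image.decode_png(tf.io.read_file(label_path), channels=1)
    label = tf.image.resize(label, [256, 256], method="nearest")
    return image, label

def augment(image, label):
    # Example augmentation: flip image and mask together so they stay aligned.
    flip = tf.random.uniform([]) > 0.5
    image = tf.cond(flip, lambda: tf.image.flip_left_right(image), lambda: image)
    label = tf.cond(flip, lambda: tf.image.flip_left_right(label), lambda: label)
    return image, label

dataset = (tf.data.Dataset.from_tensor_slices((image_paths, label_paths))
           .shuffle(len(image_paths))
           .map(load_pair, num_parallel_calls=tf.data.experimental.AUTOTUNE)
           .map(augment, num_parallel_calls=tf.data.experimental.AUTOTUNE)
           .batch(4)
           .prefetch(tf.data.experimental.AUTOTUNE))

# In eager execution the dataset is a plain Python iterable.
for images, labels in dataset:
    pass  # forward pass / loss / optimizer step goes here
```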

Once the Dataset API is integrated, move to the TFRecord data format: http://warmspringwinds.github.io/tensorflow/tf-slim/2016/12/21/tfrecords-guide/
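A rough sketch of the TFRecord step, assuming the encoded PNG bytes are stored directly and decoded at read time (feature names and file names are placeholders):

```python
import tensorflow as tf

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))

def write_tfrecord(image_paths, label_paths, out_path):
    # Store the already-encoded PNG bytes; decoding happens when the record is read.
    with tf.io.TFRecordWriter(out_path) as writer:
        for img_path, lbl_path in zip(image_paths, label_paths):
            example = tf.train.Example(features=tf.train.Features(feature={
                "image": _bytes_feature(open(img_path, "rb").read()),
                "label": _bytes_feature(open(lbl_path, "rb").read()),
            }))
            writer.write(example.SerializeToString())

def parse_example(serialized):
    # Mirror of the schema used in write_tfrecord.
    features = tf.io.parse_single_example(serialized, {
        "image": tf.io.FixedLenFeature([], tf.string),
        "label": tf.io.FixedLenFeature([], tf.string),
    })
    image = tf.image.decode_png(features["image"], channels=3)
    label = tf.image.decode_png(features["label"], channels=1)
    return image, label

# write_tfrecord(image_paths, label_paths, "train.tfrecords")
# dataset = tf.data.TFRecordDataset("train.tfrecords").map(parse_example)
```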

bhack commented 5 years ago

Remember that pre-processed file storage is a double-edged sword. It is generally faster, but realtime augmentation, if performant enough, lets you change the augmentation hypothesis on the fly without having to pre-process a large dataset again.
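To illustrate the trade-off: when augmentation runs inside the tf.data pipeline, trying a different augmentation hypothesis only means mapping a different function, with no re-export of stored data. A small hypothetical sketch, assuming a `raw_dataset` of (image, label) pairs:

```python
import tensorflow as tf

def light_augment(image, label):
    # One augmentation hypothesis: mild brightness jitter only.
    return tf.image.random_brightness(image, max_delta=0.1), label

def heavy_augment(image, label):
    # A stronger hypothesis: brightness plus contrast jitter.
    image = tf.image.random_brightness(image, max_delta=0.3)
    image = tf.image.random_contrast(image, lower=0.7, upper=1.3)
    return image, label

# Switching hypotheses is a one-line change; the stored dataset never has to be regenerated.
# train_ds = raw_dataset.map(light_augment)   # or .map(heavy_augment)
```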

Shathe commented 5 years ago

Starting to code it: ef739ba2a6f042506aa7e873c9d6aaaac9a75037. I want to learn about these APIs (I hope TensorFlow 2.0 will maintain them), so I will take my time to do it well, understand it, and code it :)