Shathe / Semantic-Segmentation-Tensorflow-Eager

An example of semantic segmentation using TensorFlow in eager execution.

Multiprocess load batch #5

Closed — Shathe closed this issue 5 years ago

Shathe commented 5 years ago

Add to the Loader's _getbatch function the possibility of loading the batch in another process.

If it is the first time it is called, load a batch as normal and create a new process to load the next batch.

If it is not the first call, create a new process to load the next batch and wait for the previously loaded batch in the queue.

https://stackoverflow.com/questions/2046603/is-it-possible-to-run-function-in-a-subprocess-without-threading-or-writing-a-se

Compare the performance of both approaches (see the sketch below).
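A minimal sketch of the idea using Python's multiprocessing module; `Loader`, `get_batch`, and `load_batch_sync` are hypothetical stand-ins for the repo's actual loading and augmentation code:

```python
import multiprocessing as mp

def _load_batch_worker(queue, load_fn):
    # Runs in a child process: load/augment one batch and put it on the queue.
    queue.put(load_fn())

class Loader:
    """Hypothetical wrapper; load_batch_sync stands in for the existing
    synchronous loading + augmentation code (must be picklable, e.g. a
    top-level function)."""

    def __init__(self, load_batch_sync):
        self.load_batch_sync = load_batch_sync
        self._queue = mp.Queue(maxsize=1)
        self._worker = None

    def get_batch(self):
        if self._worker is None:
            # First call: load this batch as usual.
            batch = self.load_batch_sync()
        else:
            # Later calls: wait for the batch the previous worker prepared.
            batch = self._queue.get()
            self._worker.join()
        # Start a new process that prepares the next batch in the background.
        self._worker = mp.Process(target=_load_batch_worker,
                                  args=(self._queue, self.load_batch_sync))
        self._worker.start()
        return batch
```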

bhack commented 5 years ago

Generally the dataset API is already highly optimized.

Shathe commented 5 years ago

Yeah, you're right. I am currently using my custom Loader for loading and performing data augmentation, but... I think it's about time to move to the tf.data.Dataset API, haha. I am going to open another issue for that.

bhack commented 5 years ago

https://www.tensorflow.org/performance/datasets_performance

Shathe commented 5 years ago

Nevertheless... I just saw it and... Do you know if you can use it when the data does not fit into RAM? I mean, if you cannot load the entire dataset at once, can you still use the Dataset API?

bhack commented 5 years ago

Yes, of course.

bhack commented 5 years ago

If you want to continue in Eager mode check also https://www.tensorflow.org/tutorials/eager/eager_basics#datasets
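A minimal sketch of the pattern that tutorial covers, assuming TF 1.x with eager execution enabled — a tf.data.Dataset can be iterated with a plain Python for-loop:

```python
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x style, as used in this repo

# Toy in-memory dataset, just to show the iteration pattern.
dataset = tf.data.Dataset.from_tensor_slices(tf.random_uniform([8, 4]))
dataset = dataset.batch(2)

for batch in dataset:
    print(batch.shape)  # each element is an EagerTensor
```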

Shathe commented 5 years ago

Do you know how? I mean, can you provide me with a link or a function name or something? I'll try to search on the internet and in the guide you linked in the previous messages too.

bhack commented 5 years ago

Almost all real datasets don't fit in memory, so you will find many Dataset API examples. Just to mention one: https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/contrib/eager/python/examples/generative_examples/image_captioning_with_attention.ipynb
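For the out-of-memory case, a rough sketch of how this usually looks (the glob pattern and image size here are made up): only the list of file names is held in memory, and the pixel data is read from disk batch by batch.

```python
import tensorflow as tf

def _parse(image_path):
    # Read and decode a single image lazily, when the pipeline needs it.
    image = tf.image.decode_png(tf.read_file(image_path), channels=3)
    image = tf.image.resize_images(image, [256, 256])
    return tf.cast(image, tf.float32) / 255.0

# Only the file names live in memory; pixels are streamed from disk.
dataset = (tf.data.Dataset.list_files('dataset/train/images/*.png')
           .map(_parse, num_parallel_calls=4)
           .batch(8)
           .prefetch(1))
```

With .prefetch, the next batch is prepared while the current one is being trained on, which is roughly what the multiprocess loader above tries to do by hand.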

Shathe commented 5 years ago

I mean, referring to the size of the dataset

bhack commented 5 years ago

That dataset doesn't fit in memory

Shathe commented 5 years ago

Thanks a lot!