bdzyubak / tensorflow-sandbox

A repository for studying applications of Deep Learning across fields and demonstrating samples of my code and project management

Implement module to set batch size to maximum allowable by memory #13

Closed bdzyubak closed 2 years ago

bdzyubak commented 2 years ago

Generally, minibatch optimization is recommended due to its inherent stochasticity. This approach was used by seminal publications such as the original ReLU and Network-in-Network papers:

https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf
https://arxiv.org/pdf/1312.4400v3.pdf
http://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf

Using, e.g., 32 examples rather than all of the available data causes the optimization to oscillate during a given epoch. Although this makes the optimization slower, it is less likely to get stuck in local minima.

On the other hand, it is possible to optimize quickly with large batches and then attempt to bypass local minima. This request is to implement a method that sets the batch size to the maximum allowable by the computer's memory. A multiplier may be used to reserve a small amount of memory for basic system operations.
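
A minimal sketch of the idea, assuming system RAM is the limiting factor and a fixed per-sample memory footprint; the function name `estimate_max_batch_size` and the `memory_fraction`/`activation_multiplier` parameters are illustrative and are not taken from the actual `prep_training_data` module:

```python
import math

import numpy as np
import psutil


def estimate_max_batch_size(input_shape, bytes_per_element=4,
                            memory_fraction=0.9, activation_multiplier=4):
    """Estimate the largest batch size that fits in available system RAM.

    input_shape: shape of one training example, e.g. (256, 256, 3).
    bytes_per_element: 4 for float32 inputs.
    memory_fraction: multiplier that reserves some memory for the OS and
        other basic computer operations.
    activation_multiplier: crude allowance for activations/gradients held
        per sample during training (assumed value, not measured).
    """
    available_bytes = psutil.virtual_memory().available * memory_fraction
    bytes_per_sample = np.prod(input_shape) * bytes_per_element * activation_multiplier
    max_batch = int(available_bytes // bytes_per_sample)
    if max_batch < 1:
        return 1
    # Round down to a power of two for predictable performance.
    return 2 ** int(math.log2(max_batch))


# Example usage with a Keras model (model, x_train, y_train defined elsewhere):
# batch_size = estimate_max_batch_size(x_train.shape[1:])
# model.fit(x_train, y_train, batch_size=batch_size, epochs=10)
```

In practice the true per-sample footprint also depends on the model's activations, gradients, and optimizer state, so a complementary strategy is to double the batch size until an out-of-memory error occurs and then back off by one step.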

bdzyubak commented 2 years ago

Implemented as part of the `prep_training_data` module in commit 86ed3d1d1a94a764d30453f4ec157cc03ed3a75f.