I recently got an M1 MacBook Air and I'm working on running some NN training. The optimiser I would normally use is Nadam, but I've found that it causes RAM usage to grow uncontrollably with each epoch - which looks to me like some sort of memory leak.
This does not happen when running identical code on an Intel CPU, or when using other optimisers such as Adam.
TF version - 0.1alpha3, installed as per the instructions in this repo and issue 153
Python - Python 3.8.6 | packaged by conda-forge
Code to reproduce - I've put together a minimal example, below.
Any help would be appreciated!
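For context, here is a sketch of one way the per-epoch RAM growth can be logged programmatically (using the standard `resource` module; note that `ru_maxrss` is reported in bytes on macOS but kilobytes on Linux):

```python
import resource
import sys

def peak_rss_mb():
    """Return the process's peak resident set size in MB."""
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # macOS reports ru_maxrss in bytes; Linux reports it in kilobytes.
    divisor = 1024 ** 2 if sys.platform == "darwin" else 1024
    return rss / divisor

print(f"current peak RSS: {peak_rss_mb():.1f} MB")
```

This can be hooked into training after each epoch, e.g. with `tf.keras.callbacks.LambdaCallback(on_epoch_end=lambda epoch, logs: print(f"epoch {epoch}: {peak_rss_mb():.1f} MB"))`.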
Log when the optimiser is Nadam:
Output when the optimiser is Adam (all other settings identical):
Code:
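A representative minimal example - synthetic data and a tiny `Sequential` model stand in for my actual setup, so treat the details as illustrative. Swapping `Nadam` for `Adam` in `compile` is the only change between the two runs:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: 256 samples, 32 features, binary labels.
x = np.random.rand(256, 32).astype("float32")
y = np.random.randint(0, 2, size=(256, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Changing Nadam to Adam here is the only difference between the two logs above.
model.compile(optimizer=tf.keras.optimizers.Nadam(), loss="binary_crossentropy")

history = model.fit(x, y, epochs=5, batch_size=32, verbose=0)
```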