I am trying out your example provided here on 2 different local machines; for simplicity's sake let's call them A and B.
On machine A, the code runs (I converted it to a script and removed the debugging output) and I get a reasonable loss and IoU.
On machine B, the code runs, but in the middle of the first epoch the loss and IoU turn NaN.
Does anyone have any idea why this is happening, or perhaps how to debug it?
Both machines run Windows 10 and Python 3.7, with tensorflow-gpu installed via conda.
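In case it helps anyone hitting the same thing: in TensorFlow 2.x there are built-in tools for this (`tf.keras.callbacks.TerminateOnNaN` stops training when the loss goes NaN, and `tf.debugging.enable_check_numerics` flags the first op that produces a non-finite tensor). Below is a minimal framework-agnostic sketch of the same idea, using a hypothetical `check_batch` helper with plain NumPy, that fails fast on the first bad batch and reports which input rows were non-finite:

```python
import numpy as np

def check_batch(batch_idx, loss, inputs):
    # Hypothetical per-batch guard: raise as soon as the loss is
    # non-finite, and report any input rows that are themselves
    # non-finite (a common cause of NaN loss mid-epoch).
    if not np.isfinite(loss):
        bad_rows = np.where(~np.isfinite(inputs).all(axis=1))[0]
        raise FloatingPointError(
            f"non-finite loss at batch {batch_idx}; "
            f"rows with non-finite inputs: {bad_rows.tolist()}"
        )

# A healthy batch passes silently; a NaN loss raises immediately.
check_batch(0, 0.37, np.ones((4, 3)))
try:
    check_batch(1, float("nan"), np.array([[1.0, np.inf], [2.0, 3.0]]))
except FloatingPointError as e:
    print(e)
```

Calling something like this after every training step narrows the failure down to a specific batch, which you can then inspect (inputs, labels, learning rate) instead of only seeing NaN at the epoch summary.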
Closing the issue; I could not find the root cause, but running the samples on Linux gave consistent results, so I moved to a Linux environment instead.