aneesh3108 opened this issue 7 years ago
@aneesh3108 Decrease Batch_size, let me know if it works
@pradyu1993 Decreased the batch size to 10, and it just barely runs fast enough (quite surprising, though!).
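For anyone else hitting this: the fix is consistent with running out of GPU memory, since activation memory grows roughly linearly with batch size. A back-of-the-envelope sketch (the per-sample activation count below is a made-up placeholder, not a number from this repo):

```python
def activation_bytes(batch_size, activations_per_sample, bytes_per_float=4):
    # Rough estimate: train-time activation memory scales linearly
    # with batch size (float32 = 4 bytes per value).
    return batch_size * activations_per_sample * bytes_per_float

# Hypothetical network with 25M float32 activations per sample:
for bs in (32, 10):
    gb = activation_bytes(bs, 25_000_000) / 1e9
    print(f"batch_size={bs}: ~{gb:.1f} GB of activations")
```

So dropping from a batch size of 32 to 10 cuts that term by roughly 3x, which is often enough to fit on a smaller card.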
Now facing a new problem: my loss shows up as 'nan'.
There are a few suggestions out there that try to explain why this happens, but if you have any context on the cause, do let me know.
(Update: currently solved by commenting out all the batch normalization layers.)
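For anyone hitting the same NaN issue: one common mechanism (not necessarily the one in this repo) is that batch normalization divides by the batch standard deviation, which can be vanishingly small for tiny batches with near-constant activations. A pure-Python sketch of the train-time computation:

```python
import math

def batch_norm(batch, eps=1e-5):
    # Core of what a BatchNormalization layer computes at train time:
    # (x - batch_mean) / sqrt(batch_var + eps)
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [(x - mean) / math.sqrt(var + eps) for x in batch]

# Healthy batch: outputs are well scaled.
print(batch_norm([1.0, 2.0, 3.0, 4.0]))

# Degenerate batch statistics: with a tiny, near-constant batch the
# variance is zero, and without the eps term the division blows up.
try:
    batch_norm([5.0, 5.0], eps=0.0)
except ZeroDivisionError:
    print("zero variance -> division blows up without eps")
```

Frameworks add the `eps` term to guard against exactly this, but very noisy batch statistics from small batches can still destabilize training.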
New question: my loss starts at 14.something (with the exact same code and config as yours). Did you do any tweaking?
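As a sanity check on that starting value: if the loss is categorical cross-entropy, a freshly initialized softmax over C classes should start near ln(C), so the initial loss mostly reflects the class count rather than any tweaking. (The class counts below are illustrative, not taken from this repo.)

```python
import math

# Expected initial cross-entropy of a uniform softmax over C classes is ln(C).
for c in (10, 1000, 1_200_000):
    print(f"C={c}: expected initial loss ~ {math.log(c):.2f}")
```

A starting loss around 14 would correspond to roughly e^14 (about 1.2 million) classes; if your class count is much smaller, the high initial loss may point at an initialization or preprocessing difference instead.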
Thanks,
Aneesh
Hello,
When I run this code, it gives me the following error:
Tried a 660 Ti, then went up to a 980 and a 1080; it still doesn't run.
Any solutions?
Also, does this warning have anything to do with it?