I'm running the code in Colab, using the provided `./sample_data/mnist_train_small.csv` (I can't seem to `import mnistdata`). I'm not sure whether the `mnistdata` copy of the data set is normalized differently from the one provided in Colab, but I assume they're similar.
When training the model, the example's `cross_entropy` calculation outputs NaNs, as is documented in [this StackOverflow solution](https://stackoverflow.com/a/33644778). A follow-up [answer by colah](https://stackoverflow.com/a/33645235) suggests that this is a deliberate choice aimed at getting users to interact with the code, but it's dated 2015, and the implementation mentioned there is different from the code in the latest version of `mnist-1.0-softmax.py`.
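For context, here's a minimal NumPy sketch (the values are made up, not taken from the tutorial) of why a naive cross-entropy formula produces NaN and how clamping the probabilities avoids it:

```python
import numpy as np

# One-hot label and a prediction where the model assigns probability 0
# to some class (this can happen once the softmax saturates in float32).
y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.0, 1.0, 0.0])

# Naive cross-entropy: 0 * log(0) evaluates to 0 * -inf = nan,
# which then propagates through the sum/mean and the gradients.
naive = -np.sum(y_true * np.log(y_pred))
print(naive)  # nan

# Common workaround: clamp the probabilities away from 0 before the log.
eps = 1e-10
clipped = -np.sum(y_true * np.log(np.clip(y_pred, eps, 1.0)))
print(clipped)  # finite (~0 here)
```

In TensorFlow the analogous fixes would be wrapping the predicted probabilities in `tf.clip_by_value(Y, 1e-10, 1.0)` inside the log, or computing the loss directly from the logits with `tf.nn.softmax_cross_entropy_with_logits`, which is numerically stable.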