Closed mlkonopelski closed 3 years ago
Hi @mlkonopelski , Thanks for your question. I've copied your code in the following Colab notebook, to debug it: https://colab.research.google.com/drive/1DMeCPC93ot417NNqpCBgEkA4UpEwik1v?usp=sharing
I just called dropna() on housing_df, and now there are no NaNs during training.
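For reference, here is a minimal sketch of that fix; the small DataFrame below is a made-up stand-in for the housing data, not the actual dataset:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the housing DataFrame, with a couple of NaNs.
housing_df = pd.DataFrame({
    "median_income": [8.3, 7.2, np.nan, 5.6],
    "median_house_value": [452600.0, 358500.0, 352100.0, np.nan],
})

housing_df = housing_df.dropna()  # drops every row that contains a NaN
print(len(housing_df))  # 2 rows remain
```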
Indeed, the housing dataset contains a few NaN values, and if you don't handle them before training, things go wrong: any computation involving NaN returns NaN, so as soon as one NaN value is encountered, the model weights get updated to NaN, and after that there's no way to get any other result.
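You can see this poisoning effect with a toy gradient-descent step (a made-up one-weight linear model, not the book's network):

```python
import numpy as np

w = 0.5                                   # single model weight
X = np.array([1.0, 2.0, np.nan, 4.0])     # one missing input value
y = np.array([2.0, 4.0, 6.0, 8.0])

pred = w * X
grad = 2 * np.mean((pred - y) * X)        # MSE gradient w.r.t. w; NaN propagates into the mean
w -= 0.01 * grad                          # the update itself is NaN, so w becomes NaN

print(np.isnan(w))  # True: one NaN in the batch corrupts the weight for good
```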
The other thing you should do is scale the numerical features. Please see Chapter 2 for more details.
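A minimal sketch of that scaling step, assuming Scikit-Learn's StandardScaler as in the book (the arrays here are placeholder data, not the housing features):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Placeholder train/validation features standing in for the housing data.
X_train = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_valid = np.array([[2.5, 250.0]])

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # fit on the training set only
X_valid_scaled = scaler.transform(X_valid)      # reuse the training statistics

print(X_train_scaled.mean(axis=0))  # ~[0, 0]
print(X_train_scaled.std(axis=0))   # ~[1, 1]
```

Fitting the scaler on the training set only, then reusing it for validation and test data, avoids leaking information between the sets.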
Lastly, there's now a second edition of my book, based on TensorFlow 2 instead of TensorFlow 1, which is much simpler to use. The code examples are open source; you can check them out at https://github.com/ageron/handson-ml2
Hope this helps.
Hi,
I tried to implement this model as stated in the book, but during training I get NaNs everywhere. I suspect that the weights go to Inf or that input_dim has the wrong dimension (those are the usual fixes suggested on Stack Overflow), but I'm not sure why.
Here is my code to reproduce the issue:
Result:
Epoch 1/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 2/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 3/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 4/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 5/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 6/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 7/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 8/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 9/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan
Epoch 10/10
581/581 [==============================] - 1s 1ms/step - loss: nan - root_mean_squared_error: nan - val_loss: nan - val_root_mean_squared_error: nan