allenai / deep_qa

A deep NLP library, based on Keras / tf, focused on question answering (but useful for other NLP too)
Apache License 2.0

Getting a nan softmax loss #411

Closed bhavikajalli closed 7 years ago

bhavikajalli commented 7 years ago

I followed the instructions and started training on the SQuAD dataset. It started off well, but then the loss became nan. What could be the possible reason, and how can I correct it?

```
657/1474 [============>.................] - ETA: 6417s - loss: 4.8018 - span_begin_softmax_loss: 2.5148 - span_end_softmax_loss: 2.2870 - span_begin_softmax_acc: 0.3892 - span_end_softmax_acc: 0.4372
658/1474 [============>.................] - ETA: 6406s - loss: 4.8008 - span_begin_softmax_loss: 2.5144 - span_end_softmax_loss: 2.2864 - span_begin_softmax_acc: 0.3892 - span_end_softmax_acc: 0.4371
659/1474 [============>.................] - ETA: 6404s - loss: 4.8027 - span_begin_softmax_loss: 2.5155 - span_end_softmax_loss: 2.2872 - span_begin_softmax_acc: 0.3890 - span_end_softmax_acc: 0.4371
660/1474 [============>.................] - ETA: 6390s - loss: 4.8010 - span_begin_softmax_loss: 2.5145 - span_end_softmax_loss: 2.2865 - span_begin_softmax_acc: 0.3892 - span_end_softmax_acc: 0.4372
661/1474 [============>.................] - ETA: 6388s - loss: 4.8017 - span_begin_softmax_loss: 2.5152 - span_end_softmax_loss: 2.2865 - span_begin_softmax_acc: 0.3892 - span_end_softmax_acc: 0.4372
662/1474 [============>.................] - ETA: 6377s - loss: nan - span_begin_softmax_loss: nan - span_end_softmax_loss: nan - span_begin_softmax_acc: 0.3886 - span_end_softmax_acc: 0.4365
```
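For context (this is not from the deep_qa maintainers): one common cause of a softmax loss suddenly becoming nan is numerical overflow or underflow inside the softmax/log computation once logits grow large. A minimal NumPy sketch of the failure mode and the standard max-subtraction fix, using made-up logit values:

```python
import numpy as np

def softmax_naive(z):
    # exp() overflows to inf for large logits; inf / inf then yields nan
    e = np.exp(z)
    return e / e.sum()

def softmax_stable(z):
    # Subtracting the max is mathematically a no-op for softmax,
    # but it keeps every exp() argument <= 0, avoiding overflow
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical logits; 1000.0 is far beyond float64's exp() range (~709)
logits = np.array([10.0, 1000.0, 20.0])

print(np.isnan(softmax_naive(logits)).any())   # True: overflow produced nan
print(np.isnan(softmax_stable(logits)).any())  # False: stable version is fine
```

Frameworks' built-in cross-entropy-from-logits ops apply this trick internally, which is one reason fused softmax+loss ops are preferred over composing them by hand. Gradient clipping or a lower learning rate can also keep logits from diverging in the first place.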

DeNeutoy commented 7 years ago

Hi, DeepQA is deprecated and no longer actively maintained, so we aren't going to be able to help you, sorry. You might like to try https://github.com/allenai/allennlp, its successor, which we wrote in PyTorch.