cazala / synaptic

architecture-free neural network library for node.js and the browser
http://caza.la/synaptic

LSTM training stopped? #129

Open BBBBlarry opened 8 years ago

BBBBlarry commented 8 years ago

I'm using an LSTM model with 32 input, 1 memory block with 15 memory cells, and 32 outputs. I'm trying to train it with a 5000-sample training set. However, the training, for some reason, stopped at the 4th iteration (similar situations have happened before in other circumstances, too). Each previous iteration took about 50 seconds but the 4th one never finished even after I waited overnight.

Has anyone else encountered this?

olehf commented 8 years ago

Yes, the training set is too big for JS. I have seen a similar issue with a stacked LSTM (60/40/20/10). The approach would be to split your data set into smaller batches.
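A minimal sketch of what "splitting into smaller batches" could look like. The chunking helper is plain JavaScript; the batch size of 500 and the epoch count are arbitrary choices, not from this thread. The commented-out lines assume synaptic's `Architect.LSTM` and `Trainer.train` API and the network shape from the original post (32 inputs, 15 memory cells, 32 outputs).

```javascript
// Split a large training set into smaller batches so a single
// Trainer.train() call never has to process the whole set at once.
function chunk(set, batchSize) {
  const batches = [];
  for (let i = 0; i < set.length; i += batchSize) {
    batches.push(set.slice(i, i + batchSize));
  }
  return batches;
}

// Dummy 5000-sample set standing in for the real {input, output} pairs.
const trainingSet = Array.from({ length: 5000 }, () => ({
  input: new Array(32).fill(0),
  output: new Array(32).fill(0),
}));

const batches = chunk(trainingSet, 500); // 10 batches of 500 samples
console.log(batches.length);

// With synaptic, a loop over the batches would replace one big
// trainer.train(trainingSet) call (epoch count and rate are assumptions):
//
// const { Architect, Trainer } = require('synaptic');
// const net = new Architect.LSTM(32, 15, 32);
// const trainer = new Trainer(net);
// for (let epoch = 0; epoch < 10; epoch++) {
//   for (const batch of batches) {
//     trainer.train(batch, { iterations: 1, rate: 0.1 });
//   }
// }
```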


BBBBlarry commented 8 years ago

Thank you olehf! By splitting the data set, do you mean training on the first batch for a couple of iterations and then moving on to the next, until the entire set has been trained on?