When I run th train.lua -rnn_size 512 -num_layers 2 -dropout 0.5 -print_every 5 -gpuid -1, training stalls at the "cloning criterion" step. I'm on Ubuntu 20.04. This is the output:
loading data files...
cutting off end of data so that the batches/sequences divide evenly
reshaping tensor...
data load done. Number of data batches in train: 423, val: 23, test: 0
vocab size: 65
creating an lstm with 2 layers
setting forget gate biases to 1 in LSTM layer 1
setting forget gate biases to 1 in LSTM layer 2
number of parameters in the model: 3320385
cloning rnn
cloning criterion
Can anyone help?
Quick edit: while it is stuck at "cloning criterion", CPU usage sits at 99%.