Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) with Connectionist Temporal Classification (CTC), implemented in Theano. Includes a toy training example.
$ python train_online.py
Arguments:
FloatX : float32
Num Epochs : 1000
Num Samples : 1000
Scribe:
Alphabet: !"#$%&'()*+,-./0123456789:;=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~
Noise: 0.05
Buffers (vert, horz): 5, 3
Characters per sample: Depends on the random length
Length: Avg:60 Range:(45, 75)
Height: 11
Building the Network
Traceback (most recent call last):
File "train_online.py", line 26, in <module>
ntwk = nn.NeuralNet(scriber.nDims, scriber.nClasses, **nnet_args)
File "/home/rakesha/rnn_ctcs/rnn_ctc/nnet/neuralnet.py", line 23, in __init__
layer3 = CTCLayer(layer2.output, labels, n_classes, use_log_space)
File "/home/rakesha/rnn_ctcs/rnn_ctc/nnet/ctc.py", line 64, in __init__
self._log_ctc()
File "/home/rakesha/rnn_ctcs/rnn_ctc/nnet/ctc.py", line 115, in _log_ctc
outputs_info=[safe_log(_1000)]
File "/home/rakesha/.local/lib/python3.3/site-packages/Theano-0.7.0-py3.3.egg/theano/scan_module/scan.py", line 1044, in scan
scan_outs = local_op(*scan_inputs)
File "/home/rakesha/.local/lib/python3.3/site-packages/Theano-0.7.0-py3.3.egg/theano/gof/op.py", line 600, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/rakesha/.local/lib/python3.3/site-packages/Theano-0.7.0-py3.3.egg/theano/scan_module/scan_op.py", line 550, in make_node
inner_sitsot_out.type.dtype))
ValueError: When compiling the inner function of scan the following error has been encountered: The initial state (`outputs_info` in scan nomenclature) of variable IncSubtensor{Set;:int64:}.0 (argument number 1) has dtype float32, while the result of the inner function (`fn`) has dtype float64. This can happen if the inner function of scan results in an upcast or downcast.
Setting floatX to float32 does not work: as the traceback shows, the initial state passed to scan has dtype float32, but the result of the inner function is upcast to float64, so scan refuses to compile.
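The underlying promotion behaviour can be reproduced outside Theano. The sketch below uses plain NumPy (not the repo's `safe_log` or scan code, which I have not touched) to show how mixing a float32 state with a float64 constant silently upcasts the result, and how casting back to the working dtype restores consistency:

```python
import numpy as np

FLOATX = np.float32  # analogous to theano.config.floatX = 'float32'

# The scan initial state, created in the working precision.
init = np.zeros(3, dtype=FLOATX)

# A constant built without an explicit dtype: np.full and np.log
# default to float64, just like an untyped constant inside scan's
# inner function.
const = np.log(np.full(3, 1e-30))

# float32 + float64 promotes to float64 -- the dtype no longer
# matches the initial state, which is exactly what scan rejects.
result = init + const
print(result.dtype)  # float64

# Fix: cast the inner function's output (or its constants) back to
# floatX so every iteration keeps the initial state's dtype.
result_fixed = (init + const).astype(FLOATX)
print(result_fixed.dtype)  # float32
```

In Theano itself the analogous fix would be to wrap the offending constant or return value with `theano.tensor.cast(x, theano.config.floatX)` inside the function passed to `scan`, so that its output dtype matches the `outputs_info` initial state.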