This PR comes out of #30, in which @cpennington was eager to compose two LSTM networks with a `Concat`.
This required allowing a `RecurrentNetwork` to also be an instance of `RecurrentLayer`, which is an interesting challenge.
This PR changes the kind of recurrent shapes from `Shape` to `*`, but this requires a `Num` instance for usage. It also changes the run functions, tapes, and gradient types to run only a single example at a time, like the recurrent layer.
Also added is a `ConcatRecurrent` layer, which is pretty neat.
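To illustrate the idea (this is a simplified, value-level sketch, not the actual Grenade API — the class, `RecNet`, and `ConcatRec` names here are all hypothetical): if a whole network satisfies the same interface as a single recurrent layer, networks can nest inside other combinators, and a concat combinator can run two of them on the same input and join their outputs.

```haskell
{-# LANGUAGE TypeFamilies #-}

-- Hypothetical sketch: a recurrent layer threads a hidden state through
-- each step, consuming one example at a time.
class RecurrentLayer l where
  type HiddenState l
  runStep :: l -> HiddenState l -> [Double] -> (HiddenState l, [Double])

-- A "network" composed of two recurrent pieces is itself a RecurrentLayer,
-- which is what lets networks be used wherever a layer is expected.
data RecNet a b = RecNet a b

instance (RecurrentLayer a, RecurrentLayer b) => RecurrentLayer (RecNet a b) where
  type HiddenState (RecNet a b) = (HiddenState a, HiddenState b)
  runStep (RecNet f g) (sa, sb) x =
    let (sa', y) = runStep f sa x   -- first sub-network
        (sb', z) = runStep g sb y   -- feed its output to the second
    in ((sa', sb'), z)

-- A concat combinator in the spirit of ConcatRecurrent: run both sides on
-- the same input and concatenate their outputs.
data ConcatRec a b = ConcatRec a b

instance (RecurrentLayer a, RecurrentLayer b) => RecurrentLayer (ConcatRec a b) where
  type HiddenState (ConcatRec a b) = (HiddenState a, HiddenState b)
  runStep (ConcatRec f g) (sa, sb) x =
    let (sa', y) = runStep f sa x
        (sb', z) = runStep g sb x
    in ((sa', sb'), y ++ z)
```

Since `RecNet a b` is itself a `RecurrentLayer`, two full LSTM networks can sit on either side of `ConcatRec`, which is the composition the PR enables.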