In BatchNorm LSTM, keyword arguments are used to pass input/hidden into layers:
```python
@staticmethod
def _forward_rnn(cell, input_, length, hx):
    max_time = input_.size(0)
    output = []
    for time in range(max_time):
        if isinstance(cell, BNLSTMCell):
            h_next, c_next = cell(input_=input_[time], hx=hx, time=time)
        else:
            h_next, c_next = cell(input_=input_[time], hx=hx)
```
If you compile the cell, this does not work, because we don't understand how to deal with kwargs.
One thing that is subtle about fixing this is that you want `cell(input_=input)` to be equivalent to `cell(input)`; however, if we are interposing on a `forward` invocation, we don't actually get any of that info. So you have to look at the function's signature and reverse engineer the positional order. Annoying. But if this shows up in many places we might have to deal with it.
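As a minimal sketch of the "reverse engineer it from the function specification" idea (the helper name `normalize_args` is hypothetical, not an existing API): `inspect.signature(...).bind` can map keyword arguments back onto the parameter order, so an interposer sees `cell(input_=x, hx=h)` and `cell(x, h)` identically. This ignores `*args`/`**kwargs` parameters, which would need extra handling.

```python
import inspect

def normalize_args(fn, args, kwargs):
    # Hypothetical helper: bind positional and keyword arguments against
    # fn's signature, fill in defaults, and return one canonical
    # positional tuple in declared-parameter order.
    sig = inspect.signature(fn)
    bound = sig.bind(*args, **kwargs)
    bound.apply_defaults()
    return tuple(bound.arguments.values())

# Stand-in for a cell's forward with the same parameter names as above.
def cell(input_, hx, time=0):
    return (input_, hx, time)

# Both call styles normalize to the same positional tuple (1, 2, 0):
normalize_args(cell, (), {"input_": 1, "hx": 2})
normalize_args(cell, (1, 2), {})
```

An interposing layer could run this normalization before tracing, so downstream machinery only ever sees positional calls.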