Closed albertz closed 2 years ago
`test_packed_sequence_2` fails now because PyTorch 1.10 seems to require that the input to `torch.nn.utils.rnn.pack_sequence` is really a list, not a tensor. Do we really need this test? Such code would anyway not really be supported in case it is a list, and if it is a tensor, it doesn't make much sense anyway.
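For context, the API contract at issue can be sketched like this (a minimal standalone example, independent of the failing test itself):

```python
import torch
from torch.nn.utils.rnn import pack_sequence

# pack_sequence expects a list of variable-length tensors,
# sorted by length in descending order (or pass enforce_sorted=False).
seqs = [torch.ones(3, 5), torch.ones(2, 5)]
packed = pack_sequence(seqs)

# The packed data concatenates all time steps: 3 + 2 = 5 frames.
print(packed.data.shape)   # torch.Size([5, 5])
print(packed.batch_sizes)  # tensor([2, 2, 1])
```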
This is unfortunately exactly the case we had when using the padertorch model. We used a tensor as input, and in `PermutationInvariantTrainingModel.forward()` they do `pack_sequence` on that, see https://github.com/fgnt/padertorch/blob/7293715f26df0794742b0655f8482b3515cefc9d/padertorch/contrib/examples/source_separation/pit/model.py#L86.
Can we avoid this by wrapping at a different level?
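One way such wrapping could look (a hedged sketch, not the actual padertorch or converter code; `to_sequence_list` is a hypothetical helper name) is to slice the padded batch tensor back into a list before `pack_sequence` ever sees it:

```python
import torch
from torch.nn.utils.rnn import pack_sequence

def to_sequence_list(padded, lengths):
    # Hypothetical helper: slice a padded (batch, time, feature) tensor
    # into the per-sequence list that pack_sequence requires.
    return [padded[b, :length] for b, length in enumerate(lengths)]

padded = torch.zeros(2, 4, 3)  # batch of 2, max length 4, feature dim 3
lengths = [4, 2]
packed = pack_sequence(to_sequence_list(padded, lengths))
print(packed.batch_sizes)  # tensor([2, 2, 1, 1])
```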
But then this code is broken: it will not work with recent PyTorch versions, which require a list, not a tensor.
Their own code uses lists. We just used a tensor because we always have tensor inputs to RETURNN, I guess.
Ah, but then our test is not really testing that. Our test also uses a tensor in the PyTorch case.
So we could extend the converter to convert list inputs into tensors for the RETURNN drop-in?
But how did you do it for this model? I assume you already have some wrapper around it?
Nothing needs to be extended. What you currently have will still work.
We just need to adapt the test to do the same thing you are anyway already doing.
I just pushed such logic to the test now. You anyway need to do the same when you wrap `PermutationInvariantTrainingModel`.
> But how did you do it for this model? I assume you already have some wrapper around it?
I created the input as in their example and provided it to `verify_torch_and_convert_to_returnn` as a batched, padded array.
> Nothing needs to be extended. What you currently have will still work. We just need to adapt the test to do the same thing you are anyway already doing.
So I think the test was covering that, no?
> I just pushed such logic to the test now. You anyway need to do the same when you wrap `PermutationInvariantTrainingModel`.
Alright, thanks.
> I created the input as in their example and provided it to `verify_torch_and_convert_to_returnn` as a batched, padded array.
But this is wrong for recent PyTorch versions. You are missing some logic which converts it to a list. See my change in the test.