Currently, the padding logic is built around padding only the spatial dims in RETURNN. However, PyTorch does not distinguish spatial dims and allows more flexible padding. To support this here, I simplified the padding logic so that only the order of dims in PyTorch matters.
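For reference, PyTorch's `torch.nn.functional.pad` takes a flat padding tuple that is applied starting from the last dim and moving backward, regardless of which dims are "spatial". A minimal sketch of that convention, emulated here with NumPy (the helper name `pad_like_torch` is made up for illustration):

```python
import numpy as np

def pad_like_torch(x, pad):
    # `pad` is a flat sequence (left_last, right_last, left_2nd_last, right_2nd_last, ...)
    # covering only the trailing dims -- the same convention as F.pad in PyTorch.
    assert len(pad) % 2 == 0 and len(pad) // 2 <= x.ndim
    widths = [(0, 0)] * x.ndim
    for i in range(len(pad) // 2):
        # the i-th pair applies to the dim counted from the end
        widths[x.ndim - 1 - i] = (pad[2 * i], pad[2 * i + 1])
    return np.pad(x, widths)

x = np.zeros((2, 3, 4))
# pad last dim by (1, 1) and second-to-last dim by (2, 0); dim 0 is untouched
y = pad_like_torch(x, [1, 1, 2, 0])
print(y.shape)  # (2, 5, 6)
```

Because the mapping is purely positional, no dim needs to be tagged as spatial; only its position in the PyTorch dim order decides which padding pair it gets.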