Closed — anhhuyalex closed this issue 2 years ago
@anhhuyalex Sorry for the late response. The reason for this error is that PyTorch did not support `padding='same'` until version 1.9, so at the time we wrote a custom implementation of same padding.
As a result, pt-aha does not currently work as expected with PyTorch 1.9. We're planning to update it in the near future to bring it up to date with the most recent PyTorch versions.
In the meantime, you could try using PyTorch 1.8 or lower.
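For context, here is a minimal sketch of how "same" padding amounts are typically computed for the stride-1 case (the only case PyTorch's built-in `padding='same'` supports for convolutions); the function name is ours for illustration and is not part of pt-aha or PyTorch:

```python
def same_padding_1d(kernel_size: int) -> tuple[int, int]:
    """Left/right padding so a stride-1 convolution preserves length.

    For stride 1, the total padding needed is kernel_size - 1,
    split as evenly as possible (extra pixel goes on the right).
    """
    total = kernel_size - 1
    left = total // 2
    right = total - left
    return left, right

# A 3-tap kernel needs (1, 1); a 4-tap kernel needs (1, 2).
print(same_padding_1d(3))
print(same_padding_1d(4))
```

Custom implementations written before PyTorch 1.9 computed these amounts manually (e.g. via `F.pad`), which is why they can conflict with newer PyTorch versions that accept the string `'same'` directly.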
Hello,
When I run
python oneshot_cls.py --config definitions/aha_config.json
on PyTorch version 1.9.1+cu111, I get the following error. It appears that the issue is the padding settings for the `ltm` component.
When they are replaced with `"encoder_padding": 0` and `"decoder_padding": 0`, the problem goes away.
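For reference, the workaround presumably looks something like this in `definitions/aha_config.json`. Only the two keys and their values come from the report above; the surrounding structure, including the `ltm` section name, is a guess:

```json
{
  "ltm": {
    "encoder_padding": 0,
    "decoder_padding": 0
  }
}
```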