maxjcohen / transformer

Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
https://timeseriestransformer.readthedocs.io/en/latest/
GNU General Public License v3.0

RuntimeError #17

Closed hongjianyuan closed 4 years ago

hongjianyuan commented 4 years ago

```
Traceback (most recent call last):
  File "D:/transformer-master/transformer-master/training.py", line 81, in <module>
    dataloader_val, epochs=EPOCHS, pbar=pbar, device=device)
  File "D:\transformer-master\transformer-master\src\utils\search.py", line 20, in fit
    netout = net(x.to(device))
  File "C:\anaconda\envs\dl\lib\site-packages\torch\nn\modules\module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\transformer-master\transformer-master\tst\transformer.py", line 129, in forward
    encoding = layer(encoding)
  File "C:\anaconda\envs\dl\lib\site-packages\torch\nn\modules\module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\transformer-master\transformer-master\tst\encoder.py", line 86, in forward
    x = self._selfAttention(query=x, key=x, value=x)
  File "C:\anaconda\envs\dl\lib\site-packages\torch\nn\modules\module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\transformer-master\transformer-master\tst\multiHeadAttention.py", line 97, in forward
    self._scores = self._scores.masked_fill(attention_mask, float('-inf'))
RuntimeError: Expected object of scalar type Byte but got scalar type Bool for argument #2 'mask'
```
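This error typically points to a PyTorch version mismatch: `Tensor.masked_fill` accepts a `bool` mask from PyTorch 1.2 onward, while older releases expect a `uint8` (Byte) mask. A minimal sketch of the failing call (not the repo's exact code), assuming the fix is either upgrading PyTorch or casting the mask to the dtype the installed version expects:

```python
import torch

# Toy stand-in for the attention scores and mask used in multiHeadAttention.py
# (names and shapes here are illustrative, not the library's actual ones).
scores = torch.zeros(2, 3)
attention_mask = torch.tensor([[False, True, False],
                               [True, False, False]])

try:
    # PyTorch >= 1.2: bool masks are supported directly.
    masked = scores.masked_fill(attention_mask, float('-inf'))
except RuntimeError:
    # Assumed fallback for PyTorch < 1.2, which raises the error above on a
    # bool mask: cast the mask to uint8 (Byte) before masked_fill.
    masked = scores.masked_fill(attention_mask.byte(), float('-inf'))

print(masked)
```

Masked positions become `-inf`, so they contribute zero weight after the softmax; unmasked positions keep their original values.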