Coreference resolution seems to work fine for longer texts. But with short texts I get various errors. (I'm on Debian, Python 3.7.0, PyTorch 1.0.1, AllenNLP 0.8.4.)
from allennlp.predictors import Predictor
coref = Predictor.from_path('https://s3-us-west-2.amazonaws.com/allennlp/models/coref-model-2018.02.05.tar.gz')
coref.predict('I am Joe.')
...
~/anaconda3/lib/python3.7/site-packages/allennlp/modules/seq2vec_encoders/cnn_encoder.py in forward(self, tokens, mask)
103 convolution_layer = getattr(self, 'conv_layer_{}'.format(i))
104 filter_outputs.append(
--> 105 self._activation(convolution_layer(tokens)).max(dim=2)[0]
106 )
107
~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
487 result = self._slow_forward(*input, **kwargs)
488 else:
--> 489 result = self.forward(*input, **kwargs)
490 for hook in self._forward_hooks.values():
491 hook_result = hook(self, input, result)
~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/conv.py in forward(self, input)
185 def forward(self, input):
186 return F.conv1d(input, self.weight, self.bias, self.stride,
--> 187 self.padding, self.dilation, self.groups)
188
189
RuntimeError: $ Torch: invalid memory size -- maybe an overflow? at /opt/conda/conda-bld/pytorch_1550784378911/work/aten/src/TH/THGeneral.cpp:188
I get the same error with "I am Bob." but no error with "I am David."
There is a slightly different error too, which also affects short texts:
coref.predict('I have a head.')
...
~/anaconda3/lib/python3.7/site-packages/allennlp/modules/time_distributed.py in forward(self, pass_through, *inputs, **kwargs)
49 reshaped_kwargs[key] = value
50
---> 51 reshaped_outputs = self._module(*reshaped_inputs, **reshaped_kwargs)
52
53 if some_input is None:
~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
487 result = self._slow_forward(*input, **kwargs)
488 else:
--> 489 result = self.forward(*input, **kwargs)
490 for hook in self._forward_hooks.values():
491 hook_result = hook(self, input, result)
~/anaconda3/lib/python3.7/site-packages/allennlp/modules/seq2vec_encoders/cnn_encoder.py in forward(self, tokens, mask)
103 convolution_layer = getattr(self, 'conv_layer_{}'.format(i))
104 filter_outputs.append(
--> 105 self._activation(convolution_layer(tokens)).max(dim=2)[0]
106 )
107
RuntimeError: cannot perform reduction function max on tensor with no elements because the operation does not have an identity
There is no error for "I have hands.", though.
I've done a little debugging. In the second exception the problem seems to be that convolution_layer uses a window of width 5, so when tokens has fewer than 5 elements along the convolved dimension, the convolution output is empty and max(dim=2) has nothing to reduce over.
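If it helps, here is a minimal standalone sketch (my own code, not AllenNLP's) of what I think is going on. I'm assuming the tokens tensor that reaches CnnEncoder.forward holds the character sequence of each token, padded to the length of the longest token in the text; that would also explain why sentences containing a word of five or more characters ("David", "hands") go through while "Joe" and "head" do not. The channel sizes below are made up; only the kernel width matters.

import torch
import torch.nn as nn

# A 1-D convolution like the one in CnnEncoder, with a window of width 5.
conv = nn.Conv1d(in_channels=16, out_channels=8, kernel_size=5)

ok = torch.randn(1, 16, 7)            # 7 positions along the convolved dimension
print(conv(ok).max(dim=2)[0].shape)   # torch.Size([1, 8]) -- works

# With fewer than 5 positions the window cannot slide even once. Depending on
# the PyTorch build, conv1d either fails outright (the "invalid memory size"
# error above) or yields a tensor with a zero-sized dimension, and max() over
# that empty dimension raises the second error:
empty = torch.empty(1, 8, 0)          # stand-in for the empty convolution output
empty.max(dim=2)                      # RuntimeError: cannot perform reduction function max ...

If that reading is right, any input whose longest token is shorter than the widest ngram filter will hit one of these two errors.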