emanjavacas/pie

A fully-fledged PyTorch package for Morphological Analysis, tailored to morphologically rich and historical languages.

context:word & level:char throw an error #28

Open · PonteIneptique opened 5 years ago

PonteIneptique commented 5 years ago

Training on normalized.txt, the following task configuration:

  "tasks": [
    {
      "name": "form",
      "target": true,
      "context": "word",
      "level": "char",
      "decoder": "attentional",
      "settings": {
        "bos": true,
        "eos": true,
        "lower": true,
        "target": "form"
      },
      "layer": -1
    }
  ],

throws an error:

2019-06-07 14:16:56,300 : Starting epoch [1]
Traceback (most recent call last):
  File "/home/thibault/dev/pie/env/bin/pie", line 11, in <module>
    load_entry_point('nlp-pie', 'console_scripts', 'pie')()
  File "/home/thibault/dev/pie/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/thibault/dev/pie/env/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/thibault/dev/pie/env/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/thibault/dev/pie/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/thibault/dev/pie/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/thibault/dev/pie/pie/scripts/group.py", line 86, in train
    pie.scripts.train.run(config_path=config_path)
  File "/home/thibault/dev/pie/pie/scripts/train.py", line 162, in run
    scores = trainer.train_epochs(settings.epochs, devset=devset)
  File "/home/thibault/dev/pie/pie/trainer.py", line 341, in train_epochs
    self.train_epoch(devset, epoch)
  File "/home/thibault/dev/pie/pie/trainer.py", line 294, in train_epoch
    loss = self.model.loss(batch, get_batch_task(self.tasks))
  File "/home/thibault/dev/pie/pie/models/model.py", line 276, in loss
    context = get_context(outs, wemb, wlen, self.tasks[task]['context'])
  File "/home/thibault/dev/pie/pie/models/model.py", line 18, in get_context
    return torch_utils.flatten_padded_batch(wemb, wlen)
  File "/home/thibault/dev/pie/pie/torch_utils.py", line 152, in flatten_padded_batch
    for sent, sentlen in zip(batch.transpose(0, 1), nwords):
AttributeError: 'NoneType' object has no attribute 'transpose'

The same config with context:sentence is fine.
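
For reference, the failure can be reproduced in isolation with a simplified stand-in for pie.torch_utils.flatten_padded_batch (a sketch based on the traceback above, not the library code): passing None as the word embeddings raises the same AttributeError.

    import torch

    def flatten_padded_batch(batch, nwords):
        # Simplified stand-in for pie.torch_utils.flatten_padded_batch
        # (see the traceback above); not the actual implementation.
        flattened = []
        for sent, sentlen in zip(batch.transpose(0, 1), nwords):
            flattened.append(sent[:sentlen])
        return torch.cat(flattened, dim=0)

    # wemb is None when the model has no input word embeddings, which
    # reproduces the AttributeError from the traceback:
    flatten_padded_batch(None, torch.tensor([3, 2]))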

PonteIneptique commented 5 years ago

This seems to be caused by wemb_dim: 0. I would recommend raising an exception when this value is set to 0 while a task's context is word.
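
A sketch of such a check, assuming settings.tasks is the list of task dicts from the config above and settings.wemb_dim is the word-embedding size (hypothetical helper, not part of pie's API):

    def check_word_context(settings):
        # Hypothetical validation helper: context "word" needs input word
        # embeddings, so reject configs that combine it with wemb_dim == 0.
        for task in settings.tasks:
            if task.get('context') == 'word' and settings.wemb_dim == 0:
                raise ValueError(
                    "Task '{}' uses context='word', which requires input "
                    "word embeddings; set wemb_dim > 0.".format(task['name']))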

emanjavacas commented 5 years ago

Yes, you need input word embeddings if you use context: word.
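
Concretely, keeping "context": "word" means the settings also need a nonzero word-embedding size, e.g. (the value 150 below is only illustrative):

    "wemb_dim": 150,

rather than wemb_dim: 0 as in the failing setup; otherwise, switch the task back to "context": "sentence".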