Closed by bmanczak 3 years ago
I've narrowed the problem down to this commit in pytorch-lightning: https://github.com/PyTorchLightning/pytorch-lightning/pull/6207. Try running after installing the previous version of pytorch-lightning with pip install -U pytorch_lightning==1.2.10. I'm working on fixing the code to work with the latest version of pytorch-lightning.
For reference, I found an open issue with this exact problem: https://github.com/PyTorchLightning/pytorch-lightning/issues/7443#issuecomment-836983666
By the way, you're going to need to set strict=False when calling load_from_checkpoint with that model.
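For context, strict loading fails whenever the checkpoint's state-dict keys don't exactly match the model's parameters, which is what happens here after the pytorch-lightning change; strict=False skips the mismatched keys instead. A minimal sketch of that behavior in plain Python (the key names are hypothetical, purely to illustrate):

```python
# Sketch of strict vs. non-strict state-dict loading.
# strict=True fails on any missing or unexpected key;
# strict=False silently loads only the keys both sides share.

def load_state(model_keys, checkpoint_state, strict=True):
    """Loosely mimic how strict / non-strict checkpoint loading behaves."""
    missing = [k for k in model_keys if k not in checkpoint_state]
    unexpected = [k for k in checkpoint_state if k not in model_keys]
    if strict and (missing or unexpected):
        raise RuntimeError(f"missing: {missing}, unexpected: {unexpected}")
    # Non-strict path: copy only the overlapping keys.
    return {k: checkpoint_state[k] for k in model_keys if k in checkpoint_state}

model_keys = ["encoder.weight", "classifier.weight"]
checkpoint = {"encoder.weight": 1.0, "pooler.weight": 2.0}  # extra + missing keys

loaded = load_state(model_keys, checkpoint, strict=False)
print(loaded)  # only the shared key is loaded
```

With the real API this corresponds to passing strict=False to load_from_checkpoint on the model class (the exact class name depends on the repo, e.g. the one defined in src/extractive.py).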
Solved in e36e0331f34427973403b4c340cc364fcf9fe26b
Thank you for a swift patch!
To make it work I had to do one small thing: in src/extractive.py, change nlp.add_pipe(sentencizer) to nlp.add_pipe("sentencizer").
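That change reflects the spaCy v3 API: add_pipe now takes the registered string name of a component rather than a component object, and passing the object raises an error. A minimal sketch using a blank English pipeline (not the repo's actual setup):

```python
import spacy

# spaCy v3+: pipeline components are added by registered string name.
# The old v2 style, nlp.add_pipe(sentencizer_object), no longer works.
nlp = spacy.blank("en")
nlp.add_pipe("sentencizer")

doc = nlp("First sentence. Second sentence.")
sentences = [sent.text for sent in doc.sents]
print(sentences)
```

spaCy constructs the component for you from its registered factory name, which is why the string form is required in v3.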
Hi,
Great work on the repo. I followed the Getting Started page and tried to run the mobilebert-uncased-ext-sum model. Here is a simple code snippet I used. However, I get the following traceback. Any tips on how to solve that?