Closed naoya-i closed 8 years ago
Thanks for reporting this! Are you using the official release (3.6.0), or the GitHub HEAD version of the code? I remember I fixed an error similar to this a while ago, and corenlp.run doesn't crash on the sentence, which means that hopefully it's the same bug. If you're not already on it, you can build the GitHub code with ant jar, and use the resulting jar file instead of the official release. I think the models should be the same, but if they're not, there's a link to download the most recent models from the project homepage.
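For anyone following along, building from HEAD is roughly the following (repository URL per the CoreNLP project; the output jar name may differ depending on the build version, so check the build output):

```shell
# Illustrative build steps, assuming a standard clone of the CoreNLP repo.
git clone https://github.com/stanfordnlp/CoreNLP.git
cd CoreNLP
ant jar
# ant jar writes a jar into the repository root; use that jar on the
# classpath in place of the released stanford-corenlp-3.6.0.jar.
```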
The lack of an -ignore-errors flag is actually kind of deliberate. I'd like to hold OpenIE to a standard of never crashing (after all, the rest of CoreNLP doesn't crash either), and therefore any exception should be treated as a critical bug that should be fixed quickly.
Thanks for your reply! I had only tried the official release (3.6.0) at that time, so I have now tried the GitHub version. Fortunately, the GitHub version did not crash on any of the sentences I mentioned. For the time being, I will work with this version.
> The lack of an -ignore-errors flag is actually kind of deliberate. I'd like to hold OpenIE to a standard of never crashing (after all, the rest of CoreNLP doesn't crash either), and therefore any exception should be treated as a critical bug that should be fixed quickly.
Ok, I understand the philosophy behind CoreNLP ;-) If I encounter some other problems, I’ll come back again!
Thanks!
Naoya
On May 13, 2016, at 06:26, Gabor Angeli notifications@github.com wrote:
Hi,
I use Stanford OpenIE (http://stanfordnlp.github.io/CoreNLP/openie.html) to extract triples from the Gigaword corpus. I call the edu.stanford.nlp.naturalli.OpenIE class from the Stanford CoreNLP jar files as follows:
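(The exact command from the original report was not preserved in this thread; a typical invocation, following the OpenIE documentation page, looks like this, with illustrative filenames:)

```shell
# Hypothetical example invocation; jar and input filenames are placeholders.
java -mx4g \
  -cp "stanford-corenlp-3.6.0.jar:stanford-corenlp-3.6.0-models.jar" \
  edu.stanford.nlp.naturalli.OpenIE gigaword-sentences.txt
```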
However, some sentences from Gigaword corpus crash Stanford OpenIE as follows:
So far, I haven't been able to find any regularity in the sentences that cause this Java exception. For reference, I have also pasted nine other sentences that trigger it.
Of course, I would be happy if the error were fixed. However, an even better solution, in my opinion, would be for Stanford OpenIE to have an "-ignore-errors" option, as implemented in Ollie, the University of Washington's OpenIE system (https://knowitall.github.io/ollie/). The "-ignore-errors" option makes the software more error-tolerant, allowing us to skip a sentence that causes an error and move on to the next one. This would be extremely useful when processing a large file.
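Pending such a flag, the skip-on-error behavior could also be approximated on the caller's side by wrapping the per-sentence extraction in a try/catch. A minimal sketch, where `extract` is a hypothetical stand-in for the real OpenIE call (not the actual CoreNLP API):

```java
import java.util.ArrayList;
import java.util.List;

public class IgnoreErrorsDemo {
    // Hypothetical stand-in for the real per-sentence extraction call;
    // here it fails deliberately on sentences containing "bad".
    static String extract(String sentence) {
        if (sentence.contains("bad")) {
            throw new IllegalStateException("extraction failed");
        }
        return "(" + sentence + ")";
    }

    // Skip-on-error loop: log the failing sentence to stderr and move on,
    // instead of letting one bad sentence abort the whole corpus run.
    static List<String> extractAll(List<String> sentences) {
        List<String> results = new ArrayList<>();
        for (String s : sentences) {
            try {
                results.add(extract(s));
            } catch (RuntimeException e) {
                System.err.println("Skipping sentence: " + s
                        + " (" + e.getMessage() + ")");
            }
        }
        return results;
    }

    public static void main(String[] args) {
        List<String> out = extractAll(
                List.of("good one", "a bad one", "another good one"));
        System.out.println(out);
    }
}
```

This only works when the extractor throws a recoverable exception per sentence; it does not help if the JVM itself dies, which is why a built-in flag would still be preferable.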