dmcc / PyStanfordDependencies

Python interface for converting Penn Treebank trees to Stanford Dependencies and Universal Dependencies
https://pypi.python.org/pypi/PyStanfordDependencies

Error in JPypeBackend when trying example #21

Open YiruS opened 8 years ago

YiruS commented 8 years ago

Hi David,

I'm trying the example that produces dependencies from a sentence parsed with the Stanford Parser. When I use your code, `sd = StanfordDependencies.get_instance(jar_filename="/home/stanford-parser/stanford-parser.jar")`, it raises: `UserWarning: Error importing JPypeBackend, falling back to SubprocessBackend.` followed by `ValueError: Bad exit code from Stanford CoreNLP`.

Any information would be highly appreciated!

Thanks! Yiru

dmcc commented 8 years ago

Thanks for the report. Seems like there might be two things going on:

First is that it's not able to import `JPypeBackend` (which is not necessarily bad, but might make some things slower). Do you have the Python module `JPype1` installed? (Does `import jpype` work?)

Second, for `SubprocessBackend`, it looks like it's getting an error when running Stanford CoreNLP. There's a `debug=True` flag that you can pass to `convert_tree()`/`convert_trees()`. Could you post the output from this?
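The `import jpype` check mentioned above can be scripted as a quick diagnostic (a minimal sketch; note that the PyPI package is named `JPype1`, but the module it installs is `jpype`):

```python
# Diagnostic sketch: does the module behind JPypeBackend import cleanly?
# (The PyPI package is JPype1; the importable module is jpype.)
try:
    import jpype
except ImportError as err:
    jpype_available = False
    print("jpype not importable:", err)
else:
    jpype_available = True
    print("jpype is importable")
```

If this prints an import error, PyStanfordDependencies will fall back to `SubprocessBackend`, which matches the `UserWarning` in the original report.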

YiruS commented 8 years ago

Hi David,

The problem was that I did not install JPype. After installing it, another error appears when calling `sd = StanfordDependencies.get_instance(jar_filename='/home/stanford-parser/stanford-parser.jar')`: `Your Java runtime is too old (must be 1.8+ to use CoreNLP version 3.5.0 or later and 1.6+ to use CoreNLP version 1.3.1 or later)`. Currently my Java version is 1.8.0_71.

However, when I change the code to `sd = StanfordDependencies.get_instance(version='3.5.2')`, everything is okay.

The jar file I pass in the code (`stanford-parser/stanford-parser.jar`) is the 3.5.2 version downloaded from the Stanford Parser website. I'm curious about this error, since both jars are 3.5.2 but the results differ depending on how the parser is loaded.
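The two loading styles being compared can be sketched as follows (this assumes PyStanfordDependencies is installed; the function only defines the calls and nothing is downloaded until it is invoked):

```python
# Sketch of the two ways to obtain a converter discussed in this thread.
def get_converter(jar_path=None):
    import StanfordDependencies  # deferred so this module imports cleanly
    if jar_path is not None:
        # Reuse a jar you already have (e.g. the Stanford Parser jar),
        # which must expose the same converter classes as CoreNLP.
        return StanfordDependencies.get_instance(jar_filename=jar_path)
    # Otherwise let the library fetch a matching Stanford CoreNLP jar.
    return StanfordDependencies.get_instance(version='3.5.2')
```

The `version=` form downloads a known-good CoreNLP jar, which is why it can behave differently from pointing `jar_filename=` at a locally downloaded Stanford Parser jar of nominally the same version.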

Do you have any hint about that?

Thanks! Yiru


dmcc commented 8 years ago

Do you know if you have multiple versions of Java installed? It's possible that JPype was built against a different version than the main `java` binary on your PATH (I've had this issue in some places). I'm hoping this explains why it claims that your Java version is too old when it's not.
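One way to check which Java the shell picks up is to parse the output of `java -version` (which prints to stderr). This helper is purely illustrative and not part of PyStanfordDependencies:

```python
import re
import subprocess

def parse_java_version(version_output):
    """Pull the quoted version string out of `java -version` output."""
    match = re.search(r'version "([^"]+)"', version_output)
    return match.group(1) if match else None

def java_version_on_path():
    """Report the version of the `java` binary found on PATH."""
    result = subprocess.run(['java', '-version'],
                            capture_output=True, text=True)
    return parse_java_version(result.stderr)

# The version reported earlier in this thread parses as expected:
print(parse_java_version('java version "1.8.0_71"'))  # → 1.8.0_71
```

If `java_version_on_path()` disagrees with the version JPype was built against, that mismatch would explain the "Java runtime is too old" error.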

I'm not sure why it works with the Stanford CoreNLP 3.5.2 jar but not the Stanford Parser 3.5.2 jar. In theory, these jars should be similar and have roughly the same APIs for our purposes. I tried `StanfordDependencies.get_instance(jar_filename='/path/to/stanford-parser-full-2015-04-20/stanford-parser.jar', backend='subprocess')` and it was able to convert. I wasn't able to test with JPype, though -- does it work for you with `SubprocessBackend`?