stanfordnlp / CoreNLP

CoreNLP: A Java suite of core NLP tools for tokenization, sentence segmentation, NER, parsing, coreference, sentiment analysis, etc.
http://stanfordnlp.github.io/CoreNLP/

OpenIEDemo run error. Tagger file not found "english-left3words-distsim.tagger" #282

Closed raymondhekk closed 8 years ago

raymondhekk commented 8 years ago

Unable to open "edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger" as class path, filename or URL

raymondhekk commented 8 years ago

I found a similar file, "scripts/pos-tagger/english-left3words-distsim.tagger.props", but it is not in the right format.

gangeli commented 8 years ago

Do you have the CoreNLP models in your classpath? That's the usual cause of this exception.
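For reference, here is a minimal sketch of the kind of pipeline the OpenIE demo sets up (not the exact OpenIEDemo source); the annotator names and classes are standard CoreNLP API, but the jar filenames in the comment are placeholders for whichever release you downloaded. Both the code jar and the models jar need to be on the classpath:

    // Minimal sketch: build a pipeline with the openie annotator and print triples.
    // Run with the models jar on the classpath, e.g. (filenames are placeholders):
    //   java -cp stanford-corenlp-X.Y.Z.jar:stanford-corenlp-X.Y.Z-models.jar:. OpenIESketch
    import edu.stanford.nlp.ie.util.RelationTriple;
    import edu.stanford.nlp.ling.CoreAnnotations;
    import edu.stanford.nlp.naturalli.NaturalLogicAnnotations;
    import edu.stanford.nlp.pipeline.Annotation;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;
    import edu.stanford.nlp.util.CoreMap;

    import java.util.Properties;

    public class OpenIESketch {
      public static void main(String[] args) {
        Properties props = new Properties();
        // The pos annotator (which loads english-left3words-distsim.tagger) and the
        // openie clause splitter model are both resolved from the models jar on the classpath.
        props.setProperty("annotators", "tokenize,ssplit,pos,lemma,depparse,natlog,openie");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        Annotation doc = new Annotation("Obama was born in Hawaii.");
        pipeline.annotate(doc);

        for (CoreMap sentence : doc.get(CoreAnnotations.SentencesAnnotation.class)) {
          for (RelationTriple triple :
              sentence.get(NaturalLogicAnnotations.RelationTriplesAnnotation.class)) {
            System.out.println(triple.confidence + "\t" + triple.subjectLemmaGloss()
                + "\t" + triple.relationLemmaGloss() + "\t" + triple.objectLemmaGloss());
          }
        }
      }
    }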

raymondhekk commented 8 years ago

Ok, I'll download the models file. Thank you!

raymondhekk commented 8 years ago

After I put the models file on the classpath and run OpenIEDemo, a class format conflict error occurs, as below (I use JDK 1.8):

Exception in thread "main" edu.stanford.nlp.io.RuntimeIOException: Could not load clause splitter model at edu/stanford/nlp/models/naturalli/clauseSearcherModel.ser.gz
    at edu.stanford.nlp.naturalli.OpenIE.<init>(OpenIE.java:205)
    at edu.stanford.nlp.pipeline.AnnotatorImplementations.openie(AnnotatorImplementations.java:303)
    at edu.stanford.nlp.pipeline.AnnotatorFactories$20.create(AnnotatorFactories.java:506)
    at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:451)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:154)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:150)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:137)
    at edu.stanford.nlp.naturalli.OpenIE.main(OpenIE.java:750)
Caused by: java.io.InvalidClassException: edu.stanford.nlp.naturalli.ClauseSplitterSearchProblem$8; local class incompatible: stream classdesc serialVersionUID = 4145523451314579506, local class serialVersionUID = -7360029270983346606
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:621)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at edu.stanford.nlp.io.IOUtils.readObjectFromURLOrClasspathOrFileSystem(IOUtils.java:318)
    at edu.stanford.nlp.naturalli.ClauseSplitter.load(ClauseSplitter.java:283)
    at edu.stanford.nlp.naturalli.OpenIE.<init>(OpenIE.java:200)
    ... 8 more

J38 commented 8 years ago

These errors will go away if you use the latest code, download the latest versions of the model jars, and put them on your CLASSPATH. The links for the model jars are available on the main GitHub page. If the code and models jars are out of sync, you will get errors like this. Please let me know if you continue to have issues and I can help out!
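If it is unclear which jars the JVM is actually picking up, one quick way to check (a small illustrative sketch, not part of CoreNLP) is to print the classpath entries and confirm the stanford-corenlp code jar and models jar versions match:

    // Minimal sketch: list the classpath entries the JVM is running with, so you can
    // confirm that the stanford-corenlp code jar and models jar versions actually match.
    public class ClasspathCheck {
      public static void main(String[] args) {
        String classpath = System.getProperty("java.class.path");
        for (String entry : classpath.split(java.io.File.pathSeparator)) {
          if (entry.toLowerCase().contains("stanford")) {
            System.out.println(entry);
          }
        }
      }
    }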