sergey-tihon / Stanford.NLP.NET

Stanford NLP for .NET
http://sergey-tihon.github.io/Stanford.NLP.NET/
MIT License

Relationship between NLP, NER, Segmenter #95

Closed darklajid closed 3 years ago

darklajid commented 5 years ago

I'm currently using the NER NuGet package with good results - but I'm supposed to handle Chinese texts as well.

The official site for NER says

We also provide Chinese models built from the Ontonotes Chinese named entity data. There are two models, one using distributional similarity clusters and one without. These are designed to be run on word-segmented Chinese. So, if you want to use these on normal Chinese text, you will first need to run Stanford Word Segmenter or some other Chinese word segmenter, and then run NER on the output of that!

As far as I can tell, the Segmenter NuGet package is incompatible with the NER one. Do I have to use the NLP one then?

I'm asking because I was informed that the different packages require different license deals if you want to use them commercially - and I am not interested in any NLP features other than NER (and, seemingly as a dependency, segmentation for Chinese).

sergey-tihon commented 5 years ago

Stanford.NLP.CoreNLP is the master package. You should be able to configure a processing pipeline for any NLP task solved by the Stanford NLP Group.
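For illustration, such a pipeline is typically driven by a properties file. The sketch below shows what a Chinese segment-then-NER configuration might look like; the annotator list and model paths are assumptions based on the `StanfordCoreNLP-chinese.properties` file shipped with the CoreNLP Chinese models, so verify them against the version you actually install:

```properties
# Sketch of a Chinese pipeline config (keys/paths assumed, check your distribution).
# For Chinese, word segmentation runs inside the "tokenize" annotator,
# so NER receives word-segmented input, as the NER docs require.
annotators = tokenize, ssplit, pos, ner

# Use the Chinese segmenter instead of the default English tokenizer.
tokenize.language = zh

# Segmenter and NER models from the Chinese models package
# (illustrative paths; confirm against the models jar you reference).
segment.model = edu/stanford/nlp/models/segmenter/chinese/ctb.gz
segment.sighanCorporaDict = edu/stanford/nlp/models/segmenter/chinese
ner.model = edu/stanford/nlp/models/ner/chinese.misc.distsim.crf.ser.gz
```

From C#, a file like this would be loaded into a `java.util.Properties` instance (via the IKVM-generated types) and passed to the `edu.stanford.nlp.pipeline.StanfordCoreNLP` constructor, so the whole segment+NER flow runs through the single CoreNLP package.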