hankcs / hanlp-lucene-plugin

HanLP Chinese word-segmentation plugin for Lucene, supporting Lucene-based systems including Solr
http://www.hankcs.com/nlp/segment/full-text-retrieval-solr-integrated-hanlp-chinese-word-segmentation.html
Apache License 2.0

If the text contains "0.01元", the "." is treated as a sentence end, so the segmentation result becomes [0] [.] [01] [元] — not sure whether this counts as a bug. #48

Closed 468120308 closed 4 years ago

468120308 commented 5 years ago

(screenshot of the segmentation result)
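For illustration only — this is not the plugin's actual code or the code in the linked fix. A minimal sketch of the underlying idea: when deciding whether "." ends a sentence, skip it if it sits between two digits, so a decimal like "0.01" is not split. The class and method names here (`DecimalSafeSplitter`, `isSentenceEnd`) are hypothetical.

```java
// Hypothetical sketch: treat '.' as a sentence terminator only when it is
// NOT surrounded by digits on both sides (i.e. not part of a decimal number).
public class DecimalSafeSplitter {

    static boolean isSentenceEnd(String text, int i) {
        char c = text.charAt(i);
        if (c != '.') {
            // Common Chinese/Western sentence terminators.
            return c == '。' || c == '！' || c == '？' || c == '!' || c == '?';
        }
        boolean digitBefore = i > 0 && Character.isDigit(text.charAt(i - 1));
        boolean digitAfter = i + 1 < text.length() && Character.isDigit(text.charAt(i + 1));
        // '.' inside "0.01" has digits on both sides -> not a sentence end.
        return !(digitBefore && digitAfter);
    }

    public static void main(String[] args) {
        String s = "价格为0.01元。";
        // The '.' at index 4 is inside the decimal "0.01".
        System.out.println(isSentenceEnd(s, s.indexOf('.')));   // false
        // The '。' at the end is a real sentence terminator.
        System.out.println(isSentenceEnd(s, s.length() - 1));   // true
    }
}
```

With a check like this in the sentence splitter, "0.01元" reaches the segmenter intact instead of being broken into [0] [.] [01] [元].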

hankcs commented 4 years ago

Hi, this has been fixed in https://github.com/hankcs/hanlp-lucene-plugin/commit/c0dfdb1de71d9f322f1f131879dbd91d89c562ce.