howl-anderson / Chinese_models_for_SpaCy

SpaCy 中文模型 | Models for SpaCy that support Chinese
MIT License

[Errno 2] No such file or directory: 'python3.7/site-packages/zh_core_web_sm/zh_core_web_sm-0.1.0/tokenizer' #28

Closed: jiminsun closed this issue 4 years ago

jiminsun commented 4 years ago

Hi,

I've been trying to use your package to POS-tag a pre-tokenized Chinese corpus, but I ran into this error while running zh_core_web_sm.load().

I'm working with spaCy 2.2.4, and I've downloaded the most recently released v0.1.0. It seems there are only 'ner', 'parser', 'tagger', and 'vocab' folders, along with a meta.json file, in the zh_core_web_sm-0.1.0 directory.

Thanks in advance! :)
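
For reference, here is roughly what I am trying to run, as a minimal sketch: the sample tokens below are just placeholders, and building a Doc directly from pre-tokenized words is only one way to skip the tokenizer, not necessarily how the demo does it.

```python
import zh_core_web_sm
from spacy.tokens import Doc

# This is the call that raises the "No such file or directory: ... tokenizer" error.
nlp = zh_core_web_sm.load()

# Placeholder pre-tokenized sentence; a Doc built from words bypasses the tokenizer.
words = ["我", "喜欢", "自然", "语言", "处理"]
doc = Doc(nlp.vocab, words=words)

# Run the remaining pipeline components (tagger, parser, ner) over the Doc.
for _, component in nlp.pipeline:
    doc = component(doc)

for token in doc:
    print(token.text, token.pos_, token.tag_)
```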

howl-anderson commented 4 years ago

Hi @jiminsun

Sorry for the late reply. Can you provide a minimal code snippet to reproduce the error?

jiminsun commented 4 years ago

I've been running the notebooks/demo.ipynb you provided, simply importing all the necessary packages and running nlp = zh_core_web_sm.load(). I've just run it again in the same environment, and the problem is fixed even though I haven't changed anything. :) Thanks for the follow-up!
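
In case it helps anyone hitting the same error, here is a quick sketch to check whether the model data actually made it to disk; the directory name below is taken from the error message and may differ on other installs.

```python
import os
import zh_core_web_sm

# List the contents of the versioned model directory to verify that the
# 'tokenizer' folder (and the other components) are present on disk.
pkg_dir = os.path.dirname(zh_core_web_sm.__file__)
model_dir = os.path.join(pkg_dir, "zh_core_web_sm-0.1.0")
print(sorted(os.listdir(model_dir)))
# A complete install should include: meta.json, ner, parser, tagger, tokenizer, vocab
```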