Closed · jiminsun closed this 4 years ago
Hi @jiminsun
Sorry for the late reply. Can you provide a minimal code snippet to reproduce the error?
I've been running the notebooks/demo.ipynb you provided, simply importing all the necessary packages and running `nlp = zh_core_web_sm.load()`.
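For reference, this is essentially all the code involved:

```python
# Exactly what I ran in notebooks/demo.ipynb: just the import and the
# load call, nothing else.
import zh_core_web_sm

nlp = zh_core_web_sm.load()
```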
I've run it again just now in the same environment, and the problem is gone even though I haven't changed anything. :)
Thanks for the follow-up!
Hi,
I've been trying to use your package to POS-tag a pre-tokenized Chinese corpus, but I ran into this error while running `zh_core_web_sm.load()`. I'm working with spaCy 2.2.4, and I've downloaded the most recently released v0.1.0. It seems there are only 'ner', 'parser', 'tagger', and 'vocab' folders, along with a meta.json file, in the zh_core_web_sm-0.1.0 directory.
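In case it helps, here is a sketch of what I'm ultimately trying to do once the model loads. The token list is made up, and I'm assuming the usual spaCy 2.x pattern of building a `Doc` from pre-tokenized words and running only the tagger pipe on it:

```python
# Sketch of the intended workflow; the sentence below is a hypothetical
# stand-in for my pre-tokenized corpus.
import zh_core_web_sm
from spacy.tokens import Doc

nlp = zh_core_web_sm.load()          # this is the call that fails for me

words = ["我", "爱", "北京"]          # hypothetical pre-tokenized sentence
doc = Doc(nlp.vocab, words=words)    # bypass spaCy's own tokenizer

tagger = nlp.get_pipe("tagger")      # run only the tagger component
doc = tagger(doc)
print([(token.text, token.tag_) for token in doc])
```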
Thanks in advance! :)