stanfordnlp / stanza

Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
https://stanfordnlp.github.io/stanza/

No module named 'stanfordnlp.utils'; 'stanfordnlp' is not a package #134

Closed. mzeidhassan closed this issue 5 years ago.

mzeidhassan commented 5 years ago

Hi there,

I am having a very strange issue here. First, I created a new conda environment and installed stanfordnlp, spacy, etc.

From within Jupyter Notebook, trying to import stanfordnlp, I get this error:

```
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input> in <module>
----> 1 import stanfordnlp

~\OneDrive\Anaconda_NLP_3\NLP\stanfordnlp.py in <module>
      2 import argparse
      3 import os
----> 4 from stanfordnlp.utils.resources import DEFAULT_MODEL_DIR
      5 import pyarabic.araby as araby
      6 import pyarabic.number as number

ModuleNotFoundError: No module named 'stanfordnlp.utils'; 'stanfordnlp' is not a package
```

What is really strange is that when I go to the Anaconda terminal of my new environment and start the Python interpreter, I can import stanfordnlp without any problems. I have Python 3.6.7, by the way.

![image](https://user-images.githubusercontent.com/7589948/65011796-19da9d00-d8d2-11e9-9bab-f2d1de924b27.png)

Any idea why? Thanks in advance for your support!
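One detail worth noting in the traceback above: the failing import runs inside a local file, `~\OneDrive\Anaconda_NLP_3\NLP\stanfordnlp.py`, and the wording "'stanfordnlp' is not a package" usually means a plain module of that name was picked up instead of the installed package. As a minimal diagnostic sketch (illustrative, not from this thread), you can ask Python where it would load `stanfordnlp` from without executing it:

```python
# Hypothetical diagnostic (not from the thread): see where Python would
# resolve `import stanfordnlp` from, without executing the module.
import importlib.util

spec = importlib.util.find_spec("stanfordnlp")
print(spec.origin if spec else "stanfordnlp not found on sys.path")
# A path to a single local file such as ...\NLP\stanfordnlp.py, rather than a
# site-packages ...\stanfordnlp\__init__.py, means a local script shadows the
# installed package, matching the "'stanfordnlp' is not a package" wording.
```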
J38 commented 5 years ago

It sounds like your conda environment with stanfordnlp is not active when you start up your Jupyter notebook.

Here is an example Stack Overflow thread discussing this issue:

https://stackoverflow.com/questions/37433363/link-conda-environment-with-jupyter-notebook
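As a quick sanity check that the notebook kernel really is the environment where stanfordnlp was installed, here is a small sketch (illustrative, not from the thread) to run in a notebook cell:

```python
# Run in a Jupyter cell: which interpreter and environment back this kernel?
import sys

print(sys.executable)  # path of the Python interpreter running this kernel
print(sys.prefix)      # root of the environment that interpreter belongs to
# If these do not point into the conda environment where stanfordnlp was
# installed, the notebook is running on a different Python.
```

One commonly suggested fix is to register the environment as its own Jupyter kernel, e.g. `python -m ipykernel install --user --name <env-name>` after installing `ipykernel` inside the activated environment.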

mzeidhassan commented 5 years ago

Thanks @J38! I tried everything in that post, but stanfordnlp still doesn't work. spaCy and all the other packages work just fine, though. I am really puzzled. I launch Jupyter Notebook from the correct activated conda environment where stanfordnlp is installed.

mzeidhassan commented 5 years ago

I uninstalled Anaconda, then installed a fresh Miniconda with Python 3.7 this time, and I am still getting the same error, even when I use Atom. I am working on Windows 10, by the way.

Everything works except for stanfordnlp :-(

yuhaozhang commented 5 years ago

This is really strange. Can you try installing stanfordnlp from source and see if that fixes the issue? You can follow the instructions provided here.

mzeidhassan commented 5 years ago

@yuhaozhang nothing is working, neither from Jupyter Notebook nor from Atom, for example.

Getting this error:

```
from stanfordnlp.utils.resources import DEFAULT_MODEL_DIR
ModuleNotFoundError: No module named 'stanfordnlp.utils'; 'stanfordnlp' is not a package
```

When I run this code:

```python
import stanfordnlp
stanfordnlp.download('en')
nlp = stanfordnlp.Pipeline()
doc = nlp("Barack Obama was born in Hawaii. He was elected president in 2008.")
doc.sentences[0].print_dependencies()
```

Any idea?

qipeng commented 5 years ago

Are you sure the Jupyter you're using is backed by the same Python you installed stanfordnlp with? On the command line, what do `which python` and `which jupyter` show?
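Since the reporter is on Windows, where `which` is usually unavailable (`where` is the cmd.exe counterpart), a cross-platform way to get the same information from Python is shown in this illustrative sketch (not from the thread):

```python
# Cross-platform substitute for `which python` / `which jupyter`.
import shutil
import sys

print(sys.executable)           # the Python actually running this snippet
print(shutil.which("python"))   # first `python` found on PATH, or None
print(shutil.which("jupyter"))  # first `jupyter` found on PATH, or None
```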

mzeidhassan commented 5 years ago

@qipeng I am using Miniconda on Windows 10.

Here are the versions:

`python -V`: Python 3.7.3
`jupyter --version`: 4.4.0

mzeidhassan commented 5 years ago

Creating a new conda environment with Python 3.6 and installing PyTorch using conda, not pip, seems to resolve the issue for now, and it's working fine. Do you plan to provide a conda package in the near future?

yuhaozhang commented 5 years ago

pip should be compatible with conda in most cases, in the sense that you can use pip to install a package and still have it visible in your conda environment. You just need to make sure that the pip you use is the one installed inside your conda environment. However, we understand that this is not a perfect solution.
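To verify that pip and the resulting install both live inside the active conda environment, here is a minimal sketch (illustrative, assuming `import stanfordnlp` now succeeds):

```python
# Check that the interpreter, the active conda env, and the installed
# stanfordnlp package all point to the same environment.
import os
import sys

import stanfordnlp  # assumes the install has succeeded

print(sys.prefix)                      # environment this interpreter belongs to
print(os.environ.get("CONDA_PREFIX"))  # environment conda has activated
print(stanfordnlp.__file__)            # where the package was actually installed
# Invoking pip as `python -m pip install stanfordnlp` ties the install to this
# interpreter and avoids accidentally using a pip from another installation.
```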

Having a conda package is on our roadmap, but we do not have a timeline for it yet. I am closing this issue for now, since we already have another issue discussing this (https://github.com/stanfordnlp/stanfordnlp/issues/10).