minrk / findspark

BSD 3-Clause "New" or "Revised" License
511 stars 72 forks

findspark not working after installation #18

Open behpouriahi opened 6 years ago

behpouriahi commented 6 years ago

Hi, I used pip3 install findspark. After the installation completed, I tried import findspark, but it said No module named 'findspark'. I don't know what the problem is here.

minrk commented 6 years ago

Typically that means that pip3 and your Python interpreter are not the same. Try comparing the output of head -n 1 $(which pip3) with print(sys.executable) in your Python session.
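
The mismatch check above can also be done entirely from inside Python. This is a minimal sketch using only the standard library; it assumes pip3 on PATH is a script wrapper whose shebang line names the interpreter it installs into (true for typical pip installations):

```python
import shutil
import sys

# The interpreter running this session -- findspark must land in THIS environment
print("python:", sys.executable)

# The pip3 wrapper found on PATH; its shebang line names the interpreter
# that pip3 actually installs packages into
pip3_path = shutil.which("pip3")
print("pip3:  ", pip3_path)
if pip3_path:
    with open(pip3_path, errors="replace") as f:
        print("shebang:", f.readline().strip())
```

If the two paths point at different Python installations, anything pip3 installs will never be visible to the interpreter you are running.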

justinnaldzin commented 6 years ago

Make sure your SPARK_HOME environment variable is correctly assigned. Does this work for you? ls $SPARK_HOME
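
The same check can be done from Python without leaving the notebook. A small sketch using only the standard library (it only inspects the variable; it does not verify that the directory is actually a Spark installation):

```python
import os

# findspark uses SPARK_HOME to locate the Spark installation
spark_home = os.environ.get("SPARK_HOME")
if spark_home is None:
    print("SPARK_HOME is not set")
elif not os.path.isdir(spark_home):
    print("SPARK_HOME is set but is not a directory:", spark_home)
else:
    print("SPARK_HOME looks OK:", spark_home)
    print(os.listdir(spark_home))
```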

JunqiLoveCoding commented 5 years ago

I am facing the same issue now. I installed findspark on my laptop but cannot import it in a Jupyter notebook.

nursace commented 4 years ago

I am facing the same issue now. I installed findspark on my laptop but cannot import it in a Jupyter notebook.

Could you solve your issue? I have the same problem.

mazino2d commented 4 years ago

I have the same problem too :(

nmay231 commented 4 years ago

I would suggest using something to keep pip and python/jupyter pointing to the same installation. Pyenv (while it's not its main goal) does this pretty well. Just install jupyter and findspark after installing pyenv and setting a version with pyenv (global | local) VERSION.

You should be able to use python -m pip install ... to install or otherwise interact with pip. Doing this with IPython should work as well.
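
As a concrete illustration of running pip through the exact interpreter you are using (a hedged sketch; the install line is commented out so you can review it before running, and requires network access):

```python
import subprocess
import sys

# Show which pip this interpreter resolves to via "python -m pip"
subprocess.check_call([sys.executable, "-m", "pip", "--version"])

# Install into the same environment (uncomment to actually run):
# subprocess.check_call([sys.executable, "-m", "pip", "install", "findspark"])
```

Because sys.executable is the interpreter currently running, this sidesteps any mismatched pip3 wrapper on PATH.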

If you are using jupyter, run jupyter --paths. I get this:

config:
    /home/nmay/.jupyter
    /home/nmay/.pyenv/versions/3.8.0/etc/jupyter
    /usr/local/etc/jupyter
    /etc/jupyter
data:
    /home/nmay/.local/share/jupyter
    /home/nmay/.pyenv/versions/3.8.0/share/jupyter   <-- This is the important path
    /usr/local/share/jupyter
    /usr/share/jupyter
runtime:
    /home/nmay/.local/share/jupyter/runtime

In my case, it's /home/nmay/.pyenv/versions/3.8.0/share/jupyter (since I use pyenv). The python and pip binaries that run with jupyter will be located at /home/nmay/.pyenv/versions/3.8.0/bin/python and <path>/bin/pip. You could alias these (e.g. jupyter-pip) and install findspark with them.
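
To confirm from inside the notebook which environment the kernel is actually backed by, you can run a minimal check like this in a cell (standard library only):

```python
import sys

# The Python binary backing this kernel; pip must install findspark
# into this environment, not wherever a stray pip3 on PATH points
print("executable:", sys.executable)
print("prefix:    ", sys.prefix)
print("module search path:")
for p in sys.path:
    print("   ", p)
```

If the printed executable is not the one your pip installed into, that explains the ModuleNotFoundError.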

Hope that helps :+1:

D4N005H commented 3 years ago

In case you're using Jupyter, open Anaconda Prompt (Anaconda3) from the start menu. Then use this command to force findspark to be installed specifically into Jupyter's environment: conda install -c conda-forge findspark

rakib06 commented 3 years ago

I installed findspark in the conda base env, and that solved it.

My command history (via doskey /history) was:

    conda deactivate
    conda activate python
    conda list
    pip3 install pyspark
    pip install pyspark
    conda install pyspark
    pip install findspark
    pip3 install findspark
    conda install findspark
    conda deactivate
    conda activate spark_env
    jupyter notebook

Shiva10k commented 1 year ago

Hi, I used pip3 install findspark. After the installation completed, I tried import findspark, but it said No module named 'findspark'. I don't know what the problem is here.

Please restart your Jupyter notebook kernel and it will solve your problem.