behpouriahi opened this issue 6 years ago
Typically that means that `pip3` and your Python interpreter are not the same. Try comparing the output of `head -n 1 $(which pip3)` with `print(sys.executable)` in your Python session.
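A quick way to run that check from inside the notebook itself (a sketch; the shell command is run separately):

```python
import sys

# The interpreter backing this session; compare it against the shebang
# line printed by `head -n 1 $(which pip3)` in a shell.
print(sys.executable)

# If the two point at different Python installations, anything
# installed with pip3 will not be importable from this kernel.
```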
Make sure your SPARK_HOME environment variable is correctly assigned. Does this work for you? `ls $SPARK_HOME`
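For reference, once the import succeeds, findspark uses SPARK_HOME to locate Spark. A minimal sketch of the usual usage, assuming SPARK_HOME points at a valid Spark installation:

```python
import findspark

# init() reads SPARK_HOME from the environment; a path can also be
# passed explicitly, e.g. findspark.init("/opt/spark").
findspark.init()

import pyspark  # resolvable only after init() has run
```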
I'm facing the same issue now. I installed findspark on my laptop but cannot import it in a Jupyter notebook.
Were you able to solve your issue? I have the same problem.
I have the same problem too :(
I would suggest using something to keep pip and python/jupyter pointing to the same installation. pyenv (while it's not its main goal) does this pretty well. Just install jupyter and findspark after installing pyenv and setting a version with `pyenv (global | local) VERSION`.
You should be able to use `python -m pip install ...` to install or otherwise interact with pip. Doing this with IPython should work as well.
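If you want to be certain the install targets the kernel's own interpreter, you can invoke pip from inside the running session (a sketch):

```python
import subprocess
import sys

# sys.executable is the interpreter backing this kernel, so the package
# lands in the environment the notebook actually uses.
subprocess.check_call([sys.executable, "-m", "pip", "install", "findspark"])
```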
If you are using jupyter, run `jupyter --paths`. I get this:
```
config:
    /home/nmay/.jupyter
    /home/nmay/.pyenv/versions/3.8.0/etc/jupyter
    /usr/local/etc/jupyter
    /etc/jupyter
data:
    /home/nmay/.local/share/jupyter
    /home/nmay/.pyenv/versions/3.8.0/share/jupyter  <-- This is the important path
    /usr/local/share/jupyter
    /usr/share/jupyter
runtime:
    /home/nmay/.local/share/jupyter/runtime
```
In my case, it's `/home/nmay/.pyenv/versions/3.8.0/share/jupyter` (since I use pyenv). The python and pip binaries that run with jupyter will be located at `/home/nmay/.pyenv/versions/3.8.0/bin/python` and `<path>/bin/pip`. You could alias these (e.g. `jupyter-pip`) and install findspark with those.
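To double-check which interpreter and data directory your notebook actually resolves to, something like this works from a notebook cell (a sketch; the pyenv paths above are just my machine's layout):

```python
import sys
from jupyter_core.paths import jupyter_data_dir

print(sys.executable)      # e.g. ~/.pyenv/versions/3.8.0/bin/python
print(jupyter_data_dir())  # the data directory this installation uses
```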
Hope that helps :+1:
In case you're using Jupyter, open Anaconda Prompt (Anaconda3) from the Start menu, then use this command to force findspark to be installed into Jupyter's environment:

```
conda install -c conda-forge findspark
```
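After the install finishes, restart the kernel and confirm the module resolves from the conda environment you expect (a sketch):

```python
import findspark

# Should point into the conda environment's site-packages,
# e.g. .../Anaconda3/lib/site-packages/findspark.py
print(findspark.__file__)
```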
I installed findspark in the conda base env, and that solved it for me.
```bash
conda deactivate
conda activate python
conda list
pip3 install pyspark
pip install pyspark
conda install pyspark
pip install findspark
pip3 install findspark
conda install findspark
conda deactivate
conda activate spark_env
jupyter notebook
doskey /history
```
Hi, I used `pip3 install findspark`. After the installation completed, I tried `import findspark`, but it said `No module named 'findspark'`. I don't know what the problem is here.
Please restart your Jupyter notebook kernel and it will solve your problem.