vestuto opened this issue 8 years ago
UPDATE: Regardless of the content of `/etc/hosts`, I cannot get notebook 06 to run the following cell without the same error shown above:

```python
sc = SparkContext('spark://schedulers:7077')
```
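One way to narrow this down (a minimal sketch, assuming `pyspark` imports cleanly in the same conda environment): check whether a purely local context starts at all. If it does, the JVM setup and the Python bindings are fine, and the failure is in reaching the standalone master.

```python
from pyspark import SparkContext

# Sanity check: a local-mode context needs no cluster at all.
# If this works, the problem is specific to the connection to
# spark://schedulers:7077 rather than to Java or pyspark itself.
sc = SparkContext('local[*]')
print(sc.parallelize(range(100)).sum())  # expect 4950
sc.stop()
```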
According to @quasiben and @ahmadia, setting your JAVA_HOME and JRE_HOME environment variables may be part of the fix.
Setting the following in my `~/.bashrc` allows me to run pyspark from the conda environment I've set up...

```bash
export JAVA_HOME=$(/usr/libexec/java_home)
export JRE_HOME=$JAVA_HOME/jre
```

... but I still get the same error from notebook 06.
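One thing worth ruling out here (a sketch; it assumes the notebook server was started outside a login shell): Jupyter kernels don't always inherit `~/.bashrc` exports, so the variables may be unset inside notebook 06 even though `pyspark` works from the terminal. From a notebook cell:

```python
import os
import subprocess

# Check what the kernel actually sees; these print None if the
# ~/.bashrc exports never reached this process.
print(os.environ.get('JAVA_HOME'))
print(os.environ.get('JRE_HOME'))

# If unset, set them for this kernel directly
# (/usr/libexec/java_home is the macOS convention used above).
os.environ.setdefault(
    'JAVA_HOME',
    subprocess.check_output(['/usr/libexec/java_home']).decode().strip())
os.environ.setdefault('JRE_HOME', os.environ['JAVA_HOME'] + '/jre')
```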
This was a VPN issue for me. Are you running one?
Nope, just trying to run it locally on my laptop.
Students in your tutorial may encounter this. Without updating `/etc/hosts`, the following code cell generates an error:

```python
sc = SparkContext('spark://schedulers:7077')
```

Adding the following to the end of `/etc/hosts` enabled me to run the cell successfully, where `<hostname>` is the output from calling `hostname` from the shell.
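The exact entry isn't quoted above, but the usual fix for this class of Spark error is a loopback mapping of the machine's hostname, i.e. a line of the form `127.0.0.1 <hostname>` (an assumption on my part). A quick way to check whether the mapping is in place, since Spark's driver needs the local hostname to resolve:

```python
import socket

# Assumed form of the /etc/hosts addition:
#     127.0.0.1    <hostname>
# where <hostname> is the output of running `hostname` in a shell.

# This raises socket.gaierror when the hostname does not resolve,
# which is the situation the /etc/hosts entry fixes.
name = socket.gethostname()
print(name, '->', socket.gethostbyname(name))
```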