IBM / elasticsearch-spark-recommender

Use Jupyter Notebooks to demonstrate how to build a Recommender with Apache Spark & Elasticsearch
https://developer.ibm.com/code/patterns/build-a-recommender-with-apache-spark-and-elasticsearch/
Apache License 2.0

PYSPARK_DRIVER_PYTHON is not defined #36

Closed RENEEGAILP closed 4 years ago

RENEEGAILP commented 6 years ago

PYSPARK_DRIVER_PYTHON="jupyter"`` PYSPARK_DRIVER_PYTHON_OPTS="notebook" ../spark-2.2.0-bin-hadoop2.7/bin/pyspark --driver-memory 4g --driver-class-path ../../elasticsearch-hadoop-5.3.0/dist/elasticsearch-spark-20_2.11-5.3.0.jar

This gives an error in the command prompt:

PYSPARK_DRIVER_PYTHON is not recognized as an internal or external command, operable program or batch file
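
The VAR=value prefix in front of a command is POSIX-shell syntax, so cmd.exe treats PYSPARK_DRIVER_PYTHON="jupyter" as the name of a program to run, which is exactly the error above. A minimal sketch of the same variable setup on Windows cmd.exe (for the current session only):

```bat
REM cmd.exe does not support the POSIX "VAR=value command" prefix form;
REM set the variables first (they apply to the current cmd.exe session only)
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook
```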

MLnick commented 6 years ago

Which platform are you running on?

RENEEGAILP commented 6 years ago

Windows 8.1. I've changed the path variables manually in the environment variables. I'm not sure how to add the JAR files for Elasticsearch.

[screenshot: capture]

MLnick commented 6 years ago

You should be able to use --driver-class-path, but make sure you set it to the fully qualified path of the location where you unzipped the elasticsearch-hadoop JAR.

I am not familiar with running Spark on Windows, but I assume you should set the path using Windows-style backslashes.
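
For illustration, one way to keep that long path manageable is to drop the fully qualified JAR location into a variable and pass it to --driver-class-path. A minimal sketch, where both the C:\elasticsearch-hadoop-5.3.0 location and the ES_SPARK_JAR name are placeholders of my own, not anything defined by the project:

```bat
REM Placeholder absolute path: substitute the directory where the elasticsearch-hadoop
REM download was actually unzipped; ES_SPARK_JAR is only an illustrative variable name
set ES_SPARK_JAR=C:\elasticsearch-hadoop-5.3.0\dist\elasticsearch-spark-20_2.11-5.3.0.jar
```

Then pass %ES_SPARK_JAR% as the value of --driver-class-path.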

RENEEGAILP commented 6 years ago

[screenshot: capture2]

After opening the .ipynb in Jupyter, this is the error.

MLnick commented 6 years ago

Can you try running the Windows .cmd version of pyspark: ..\spark-2.2.0-bin-hadoop2.7\bin\pyspark.cmd?
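
Putting the suggestions together, a launch from the same working directory as the original command could look like the sketch below. It assumes PYSPARK_DRIVER_PYTHON, PYSPARK_DRIVER_PYTHON_OPTS and ES_SPARK_JAR are already set in the same cmd.exe session, as in the sketches above:

```bat
REM Launch via the Windows wrapper script; pyspark.cmd picks up the Jupyter driver settings
REM and the connector JAR from the variables set earlier in this cmd.exe session
..\spark-2.2.0-bin-hadoop2.7\bin\pyspark.cmd --driver-memory 4g --driver-class-path %ES_SPARK_JAR%
```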

yangjax commented 4 years ago

> [screenshot: capture2]
>
> After opening the .ipynb in Jupyter, this is the error.

I'm hitting the same error. Have you solved it?