IBMDataScience / DSx-Desktop

IBM Data Science Experience Desktop was built for those who want to download and play locally. Analyze, learn, and build with the tools you love, right on your desktop.

NameError: name 'spark' is not defined #33

Closed ghost closed 6 years ago

ghost commented 7 years ago

Hi, I have installed DSX Desktop and I was assuming that Spark would be installed along with it, but it sounds like I was wrong. Should I install Spark on my computer separately? And if I do, would it work fine with DSX?

Thanks, Asghar
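
For context, the NameError in the issue title is what a notebook raises when the preconfigured `spark` session variable does not exist in the kernel. A quick check along these lines (a sketch, not an official DSX diagnostic) shows whether Spark is available at all in the running image:

```python
# Run in a notebook cell: is Spark installed, and is the usual
# preconfigured `spark` session defined in this kernel?
try:
    import pyspark  # fails if the notebook image has no Spark libraries
    print("pyspark version:", pyspark.__version__)
except ImportError:
    print("pyspark is not installed in this notebook image")

try:
    print("spark session is available:", spark.version)  # noqa: F821
except NameError:
    print("no preconfigured `spark` session in this kernel")
```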

Debbani commented 7 years ago

@am8042 Have you installed Anaconda + Spark or one of the other notebook images? Can you please go to the Settings -> General tab to find out which notebook image you've installed? You won't need to install Spark separately. If you have one of the other non-Spark images downloaded, you can uninstall just the notebook image and install the one you're looking for.

ghost commented 7 years ago

Hi, yes, I have Anaconda (Python 2.7) installed, but there is no Spark there.


Debbani commented 7 years ago

@am8042 That's correct, this notebook image does not have Spark. Please follow the steps described above to uninstall the notebook image that you have, and install the Anaconda + Spark image instead.

[screenshot: screen shot 2017-08-17 at 6 27 48 am]
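
For anyone finding this issue later: once a Spark-enabled image (or a standalone pyspark install) is in place, the `spark` object that the error message refers to can also be created explicitly. A minimal sketch, with an illustrative app name rather than anything DSX-specific:

```python
# Minimal sketch: build a SparkSession equivalent to the notebook's
# preconfigured `spark` variable and run a tiny sanity check.
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("dsx-desktop-check") \
    .getOrCreate()

print(spark.version)   # confirm the Spark runtime responds
spark.range(5).show()  # tiny sanity-check DataFrame

spark.stop()
```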