nchammas / flintrock

A command-line tool for launching Apache Spark clusters.
Apache License 2.0

Automatically add Spark and HDFS executables to `$PATH` #119

Closed: nchammas closed this issue 8 years ago

nchammas commented 8 years ago

I think it makes sense to add some (maybe all?) of the executables in spark/bin/ and hadoop/bin/ to the default $PATH so the user can invoke them immediately after logging in.

I'd probably do this by symlinking the executables we want into /usr/local/bin/ or some other directory that's already on the default $PATH.
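For illustration, here's a minimal sketch of the symlink approach. The install locations ~/spark and ~/hadoop are assumptions; the actual paths depend on where Flintrock places the installs:

# Link each executable into a directory that's already on $PATH.
# ~/spark and ~/hadoop are assumed install locations, not confirmed ones.
for exe in ~/spark/bin/* ~/hadoop/bin/*; do
    sudo ln -sf "$exe" /usr/local/bin/
done

An alternative with the same effect would be to append both bin/ directories to $PATH in the login shell's profile (e.g. ~/.bashrc).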

BenFradet commented 8 years ago

Definitely agree with this one, since I tend to write my submit scripts in the form of

spark-submit \
  ...

before launching my cluster, and I just expect them to work.
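For concreteness, a fleshed-out version of such a script might look like the following; the master URL, class name, and jar name are hypothetical placeholders:

#!/usr/bin/env bash
# Hypothetical submit script; master URL, class, and jar are placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --class com.example.Main \
  my-app.jar

A script like this only works on a fresh login if spark-submit is already resolvable via $PATH, which is exactly what this issue proposes.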

I'll work on this one once #118 gets merged.