Closed: nchammas closed this issue 8 years ago
Definitely agree with this one since I tend to write my submit scripts in the form of
spark-submit \
...
prior to launching my cluster and just expect them to work.
I'll work on this one once #118 gets merged.
I think it makes sense to add some (maybe all?) of the executables in spark/bin/ and hadoop/bin/ to the default $PATH so the user can invoke them right on login. I'd probably do this by linking the executables we want to /usr/local/bin/ or something.
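A minimal sketch of that linking approach might look like the following. The install paths (/opt/spark, /opt/hadoop) and the helper name `link_bins` are assumptions for illustration, not Flintrock's actual layout:

```shell
#!/usr/bin/env bash
set -euo pipefail

# link_bins SRC_DIR DEST_DIR
# Symlink every executable file in SRC_DIR into DEST_DIR so it lands
# on the default $PATH. DEST_DIR would typically be /usr/local/bin.
link_bins() {
  local src="$1" dest="$2"
  mkdir -p "$dest"
  for exe in "$src"/*; do
    # Skip anything that isn't an executable regular file.
    [ -f "$exe" ] && [ -x "$exe" ] || continue
    ln -sf "$exe" "$dest/$(basename "$exe")"
  done
}

# Hypothetical install locations; adjust to wherever Spark/Hadoop live.
# link_bins /opt/spark/bin  /usr/local/bin
# link_bins /opt/hadoop/bin /usr/local/bin
```

With links in /usr/local/bin, commands like spark-submit would be invocable immediately on login without sourcing any profile changes.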