abronte / PysparkProxy

Seamlessly execute pyspark code on remote clusters

Start the server with a single command via pip install #18

Closed: abronte closed this issue 6 years ago

abronte commented 6 years ago

Getting this up and running on a cluster should be as simple as:

pip install pysparkproxy
pyspark-proxy-server start
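One way to make `pip install pysparkproxy` put a `pyspark-proxy-server` command on the PATH is a `console_scripts` entry point in the package's setup.py. The excerpt below is a minimal sketch of that approach; the module path `pyspark_proxy.cli:main` and the version number are assumptions for illustration, not PysparkProxy's confirmed layout.

```python
# setup.py (excerpt) -- hypothetical packaging config that exposes the CLI.
from setuptools import setup, find_packages

setup(
    name="pysparkproxy",
    version="0.1.0",                      # placeholder version
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # After `pip install pysparkproxy`, pip generates a
            # `pyspark-proxy-server` executable that calls main() in
            # the (assumed) pyspark_proxy.cli module.
            "pyspark-proxy-server = pyspark_proxy.cli:main",
        ],
    },
)
```

The referenced `main()` would then parse the `start` subcommand (e.g. with argparse) and launch the proxy server process, so the two commands above are all a user needs on the cluster.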