Closed: ifilonenko closed this pull request 7 years ago
Upon merging, this PR closes #506
To run integration tests in a proper R environment, `R_HOME` must be defined in the testing environment, which means everyone would need to install R. Is that something that would be an issue, or can it be assumed that anyone building out a full dev environment for Spark has R installed? @foxish @erikerlandson
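As a minimal sketch of what such a pre-flight check could look like (a hypothetical helper, not part of this PR; the function name and messages are assumptions):

```shell
#!/usr/bin/env bash
# check_r_env.sh -- hypothetical pre-flight check for the R integration tests.
# Verifies that R_HOME is set and points at a directory containing bin/R.

check_r_home() {
  if [ -z "$R_HOME" ]; then
    echo "R_HOME is not set; install R and export R_HOME" >&2
    return 1
  fi
  if [ ! -x "$R_HOME/bin/R" ]; then
    echo "R_HOME=$R_HOME does not contain an executable bin/R" >&2
    return 1
  fi
  echo "R environment OK: $R_HOME"
}
```

A test harness could call `check_r_home` before launching the integration suite and fail fast with a clear message instead of a confusing mid-test error.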
Hmm, interesting. The submitting node has an R dependency? or the driver?
The R dependency exists because we need to mimic the make-distribution environment in `target/docker/R`, so that when we run `ADD R opt/spark/R` the R package is already staged in the Docker build context; this is similar to how we set up PySpark.
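A hedged sketch of what the corresponding R driver Dockerfile fragment might look like, mirroring the PySpark image layout (the base image name, package manager, and `R_HOME` path are assumptions, not taken from this PR):

```dockerfile
# Hypothetical R image fragment, patterned after the PySpark image.
# Assumes make-distribution has already staged the SparkR package
# under target/docker/R in the build context.
FROM spark-base

# Install R inside the container so sparkR can run (assumed Alpine base)
RUN apk add --no-cache R R-dev

# Copy the staged SparkR package into the image
ADD R /opt/spark/R

# Point R_HOME at the system R installation (path is an assumption)
ENV R_HOME /usr/lib/R
```

The key point from the comment above is only the `ADD R` line: the build context must already contain the staged `R/` directory, exactly as make-distribution would lay it out.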
@ssuchter @varunkatta Integration tests will pass once R is installed and `R_HOME` is defined in the Jenkins environment.
PR is otherwise ready for review
rerun integration tests please
rerun integration tests please
rerun integration tests please
PR is ready for review @foxish @erikerlandson
@ifilonenko you also need to update `sbin/build-push-docker-images.sh` to add the new images.
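A hypothetical sketch of what registering the new images in that script could look like, assuming it keeps an image-name-to-Dockerfile map (the image names `spark-driver-r`/`spark-executor-r` and the Dockerfile paths are assumptions, not the script's actual contents):

```shell
#!/usr/bin/env bash
# Hypothetical fragment of sbin/build-push-docker-images.sh showing how
# new R images might be registered alongside the existing ones.

# Map each image name to its Dockerfile (paths are assumptions)
declare -A path=(
  [spark-driver-r]=dockerfiles/driver-r/Dockerfile
  [spark-executor-r]=dockerfiles/executor-r/Dockerfile
)

# Build every registered image under the given repo and tag
build_images() {
  local repo="$1" tag="$2"
  for image in "${!path[@]}"; do
    docker build -t "$repo/$image:$tag" -f "${path[$image]}" .
  done
}
```

With a map like this, adding a new image is a one-line change, and the build/push loops pick it up automatically.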
rerun unit tests please
Ready for merging to branch-2.2 unless any other concerns @erikerlandson @foxish @liyinan926
LGTM. Thanks for the work!
rerun unit tests please
Unless there are any further comments, I think this is ready to merge
All set to merge? @foxish @erikerlandson @liyinan926
I think it's all good.
Hi, what's left for this work? Just this: https://github.com/apache-spark-on-k8s/spark/pull/507#discussion_r143595440?
I left some comments. Also would like to see this tested in a production environment, but maybe we can just merge it and follow up as feedback comes in.
@mccheah per your last comment, is this okay to merge then?
Can merge when CI passes - I just updated the branch.
ready for merging: @foxish
Nvm. Please ignore last comment. That was in the merge commit.
## What changes were proposed in this pull request?

Initial Spark R support

## How was this patch tested?