Closed — tmckayus closed this 7 years ago
By the way, testing note. To test this image with the oshinko tutorials you want to do something like this:
1. Check out the PR and build a local openshift-spark image based on it.
2. Tag the new spark image into the integrated registry for an OpenShift project (modifying the Makefile and using `make push` is an easy way to do that).
3. Create an oshinko cluster configmap that references the image with the full pull spec, for example `oc create configmap clusterconfig --from-literal=sparkimage=172.30.100.159:5000/testspark/openshift-spark:2.2`
4. Run the various `oc new-app` commands from the tutorials but add `-p OSHINKO_NAMED_CONFIG=clusterconfig`
Note, you'll also want modified s2i templates that reference an s2i image built with spark 2.2, see https://github.com/radanalyticsio/oshinko-s2i/pull/123 for notes on that.
This will run the s2i applications but tell oshinko to use the local openshift-spark image built on spark 2.2 when creating clusters.
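Steps 2 and 3 above can be sketched in shell. The registry IP and project name are the examples from the comment; substitute your own values. The actual tag/push and configmap commands are shown commented out since they need a running cluster:

```shell
# Compose the full pull spec for the integrated registry
# (values here are the examples used in this comment)
REGISTRY=172.30.100.159:5000      # integrated registry service IP:port
PROJECT=testspark                 # target OpenShift project
IMAGE=openshift-spark:2.2
PULL_SPEC="${REGISTRY}/${PROJECT}/${IMAGE}"
echo "${PULL_SPEC}"               # -> 172.30.100.159:5000/testspark/openshift-spark:2.2

# Against a live cluster (shown for reference):
# docker tag openshift-spark:2.2 "${PULL_SPEC}" && docker push "${PULL_SPEC}"
# oc create configmap clusterconfig --from-literal=sparkimage="${PULL_SPEC}"
```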
Tested this against:

```
$ oc version
oc v1.5.0+031cbe4
kubernetes v1.5.2+43a9be4
features: Basic-Auth

Server https://shift.opb.studios:8443
openshift v3.6.0-rc.0+98b3d56
kubernetes v1.6.1+5115d708d7
```
by running the ophicleide tutorial. Everything worked as expected.
+1
Fixed the merge conflict but holding off on this for just a wee bit longer to finalize branch/tag strategy
I have used this version of openshift-spark as a cluster image for grafzahl, jgrafzahl, and spring boot sparkpi s2i applications (also based on spark 2.2) and it succeeds.
Note, I ran some of the Spark 2.2 s2i applications against a Spark 2.1 cluster and the applications failed. They also fail with a Spark 2.1 driver and a Spark 2.2 cluster, so the driver and cluster Spark versions need to match.
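The failure pattern above suggests the driver and cluster must agree on the Spark major.minor version. A minimal sketch of a pre-launch guard (the version strings here are hypothetical placeholders, not read from real images):

```shell
# Hypothetical check: compare driver and cluster Spark major.minor before launching
driver_ver="2.1.0"    # e.g. the Spark version baked into the s2i driver image
cluster_ver="2.2.0"   # e.g. the Spark version in the openshift-spark cluster image

# ${var%.*} strips the patch component, leaving major.minor (e.g. 2.1)
if [ "${driver_ver%.*}" != "${cluster_ver%.*}" ]; then
  echo "version mismatch: driver ${driver_ver} vs cluster ${cluster_ver}"
fi
```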
https://radanalytics.io/applications/jgrafzahl https://radanalytics.io/applications/grafzahl https://radanalytics.io/applications/spring_sparkpi
Also used as a base image for oshinko scala s2i and ran https://github.com/pdmack/scala-sbt-s2i-test