I am having trouble building Spark and running all the unit tests within a Docker container. The JVM complains that there isn't enough memory, though I believe I've set the appropriate JAVA_OPTS and granted the Docker container plenty of memory.
Do you folks have some instructions on how to build Spark from source and run all the unit tests within a Docker container? I took a look through the scripts here but couldn't find anything.
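For context on the numbers involved, here is a rough back-of-the-envelope budget check. This is a sketch under assumptions: the sbt launcher JVM and a forked test JVM each peak near heap + permgen + native overhead, and the overhead figure is a guess rather than a measurement.

```shell
# Rough budget: does the JVM demand fit under `docker run -m 4g`?
# Assumption: sbt's launcher JVM plus a forked test JVM both run at once,
# each peaking near heap + permgen + stacks/native overhead.
heap_mb=1024          # -Xmx1024m
permgen_mb=128        # -XX:MaxPermSize=128m
overhead_mb=512       # thread stacks, code cache, native -- an estimate
per_jvm_mb=$((heap_mb + permgen_mb + overhead_mb))
total_mb=$((2 * per_jvm_mb))   # launcher JVM + forked test JVM
echo "estimated peak: ${total_mb} MiB against a 4096 MiB container limit"
```

Note that the container's memory limit counts all processes together; inside the container you can confirm the limit that was actually applied with `cat /sys/fs/cgroup/memory/memory.limit_in_bytes` (cgroup v1 path; v2 hosts use a different layout).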
For the record, I'm trying to build and test Spark as follows:
# start the container like this
# docker run -m 4g -t -i centos bash
export JAVA_OPTS="-Xms512m -Xmx1024m -XX:PermSize=64m -XX:MaxPermSize=128m -Xss512k"
# build
sbt/sbt -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl -Phive -Phive-thriftserver package assembly/assembly
# Scala unit tests
sbt/sbt -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl -Phive -Phive-thriftserver catalyst/test sql/test hive/test mllib/test
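One thing worth ruling out: depending on the version, the sbt/sbt wrapper script sets its own JVM memory flags and may not honor JAVA_OPTS at all. Below is a sketch assuming the wrapper passes SBT_OPTS (or an equivalent variable) through to the launcher JVM; check the top of sbt/sbt to confirm which variable it actually reads. The sizes are illustrative, not tuned.

```shell
# Give the build JVM more headroom than the defaults; MaxPermSize matters
# most for Scala compilation, and 128m is often too small for Spark.
# Assumption: the sbt/sbt wrapper forwards SBT_OPTS -- verify in the script.
export SBT_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=256m"
echo "$SBT_OPTS"
# then rerun the build and test commands above
```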
This is a general question about Spark on Docker. Let me know if there is a better place to ask this. I asked a similar question on the Spark dev list.