REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.
./bin/server_deploy.sh localhost fails while trying to deploy jobserver locally #59
I tried to test the JobServer by deploying it locally.
I changed DEPLOY_HOSTS="127.0.0.1" in config/localhost.sh.
When I ran ./bin/server_deploy.sh localhost, I got the error below:
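The config edit described above can be sketched as a one-line rewrite (a minimal illustration only; "some.remote.host" is an invented placeholder for whatever value config/localhost.sh shipped with):

```python
import re

# Hypothetical stand-in for the DEPLOY_HOSTS line in config/localhost.sh
# ("some.remote.host" is invented for illustration).
original = 'DEPLOY_HOSTS="some.remote.host"\n'

# Rewrite the line to target the local machine, as described in the report.
patched = re.sub(r'^DEPLOY_HOSTS=.*$', 'DEPLOY_HOSTS="127.0.0.1"',
                 original, flags=re.M)
print(patched)
```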
[info] JobSqlDAOSpec:
[info] save and get the jars
[info] - should be able to save one jar and get it back * FAILED *
[info] false did not equal true (JobSqlDAOSpec.scala:101)
[info] - should be able to retrieve the jar file * FAILED *
[info] false did not equal true (JobSqlDAOSpec.scala:114)
[info] saveJobConfig() and getJobConfigs() tests
[info] - should provide an empty map on getJobConfigs() for an empty CONFIGS table * FAILED *
[info] Map() did not equal Map(test-id0 -> Config(SimpleConfigObject({"marco":"pollo"})), test-id1 -> Config(SimpleConfigObject({"merry":"xmas"}))) (JobSqlDAOSpec.scala:121)
[info] - should save and get the same config * FAILED *
[info] org.h2.jdbc.JdbcSQLException: Unique index or primary key violation: "PRIMARY_KEY_6 ON PUBLIC.CONFIGS(JOB_ID)"; SQL statement:
[info] INSERT INTO "CONFIGS" ("JOB_ID","JOB_CONFIG") VALUES (?,?) [23505-170]
[info] at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
[info] at org.h2.message.DbException.get(DbException.java:169)
[info] at org.h2.message.DbException.get(DbException.java:146)
[info] at org.h2.index.BaseIndex.getDuplicateKeyException(BaseIndex.java:81)
[info] at org.h2.index.PageBtree.find(PageBtree.java:121)
[info] at org.h2.index.PageBtreeLeaf.addRow(PageBtreeLeaf.java:147)
[info] at org.h2.index.PageBtreeLeaf.addRowTry(PageBtreeLeaf.java:100)
[info] at org.h2.index.PageBtreeIndex.addRow(PageBtreeIndex.java:102)
[info] at org.h2.index.PageBtreeIndex.add(PageBtreeIndex.java:93)
[info] at org.h2.table.RegularTable.addRow(RegularTable.java:122)
[info] ...
[info] - should be able to get previously saved config * FAILED *
[info] Set(test-id0, test-id1) did not equal Set(test-id0) (JobSqlDAOSpec.scala:143)
[info] - Save a new config, bring down DB, bring up DB, should get configs from DB * FAILED *
[info] org.h2.jdbc.JdbcSQLException: Unique index or primary key violation: "PRIMARY_KEY_6 ON PUBLIC.CONFIGS(JOB_ID)"; SQL statement:
[info] INSERT INTO "CONFIGS" ("JOB_ID","JOB_CONFIG") VALUES (?,?) [23505-170]
[info] at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
[info] at org.h2.message.DbException.get(DbException.java:169)
[info] at org.h2.message.DbException.get(DbException.java:146)
[info] at org.h2.index.BaseIndex.getDuplicateKeyException(BaseIndex.java:81)
[info] at org.h2.index.PageBtree.find(PageBtree.java:121)
[info] at org.h2.index.PageBtreeLeaf.addRow(PageBtreeLeaf.java:147)
[info] at org.h2.index.PageBtreeLeaf.addRowTry(PageBtreeLeaf.java:100)
[info] at org.h2.index.PageBtreeIndex.addRow(PageBtreeIndex.java:102)
[info] at org.h2.index.PageBtreeIndex.add(PageBtreeIndex.java:93)
[info] at org.h2.table.RegularTable.addRow(RegularTable.java:122)
[info] ...
[info] Basic saveJobInfo() and getJobInfos() tests
[info] - should provide an empty map on getJobInfos() for an empty JOBS table * FAILED *
[info] Map() did not equal Map(test-id0 -> JobInfo(test-id0,test-context,JarInfo(test-appName0,2014-09-05T14:27:12.498-07:00),test-classpath,2014-09-05T14:27:12.498-07:00,None,None), test-id1 -> JobInfo(test-id1,test-context,JarInfo(test-appName0,2014-09-05T14:27:12.498-07:00),test-classpath,2014-09-05T14:27:12.498-07:00,None,None)) (JobSqlDAOSpec.scala:171)
[info] - should save a new JobInfo and get the same JobInfo * FAILED *
[info] Set(test-id1, test-id0) did not equal Set(test-id0) (JobSqlDAOSpec.scala:182)
[info] - should be able to get previously saved JobInfo * FAILED *
[info] Set(test-id1, test-id0) did not equal Set(test-id0) (JobSqlDAOSpec.scala:193)
[info] - Save another new jobInfo, bring down DB, bring up DB, should JobInfos from DB * FAILED *
[info] Stream(JobInfo(test-id0,test-context,JarInfo(test-appName0,2014-09-05T14:27:12.498-07:00),test-classpath,2014-09-05T14:27:12.498-07:00,None,None), ?) did not equal List(JobInfo(test-id0,test-context,JarInfo(test-appName0,2014-09-05T15:01:16.807-07:00),test-classpath,2014-09-05T15:01:16.807-07:00,None,None), JobInfo(test-id1,test-context,JarInfo(test-appName0,2014-09-05T15:01:16.807-07:00),test-classpath,2014-09-05T15:01:16.807-07:00,None,None)) (JobSqlDAOSpec.scala:215)
[info] - saving a JobInfo with the same jobId should update the JOBS table * FAILED *
[info] JobInfo(test-id0,test-context,JarInfo(test-appName0,2014-09-05T14:27:12.498-07:00),test-classpath,2014-09-05T14:27:12.498-07:00,None,None) did not equal JobInfo(test-id0,test-context,JarInfo(test-appName0,2014-09-05T15:01:16.807-07:00),test-classpath,2014-09-05T15:01:16.807-07:00,None,None) (JobSqlDAOSpec.scala:232)
[info] SparkJobSpec:
[info] Sample tests for default validation && method
[info] - should return valid
[info] - should return invalid if one of them is invalid
[info] - should return invalid if both of them are invalid with the first message
[info] SparkJobUtilsSpec:
[info] SparkJobUtils.configToSparkConf
[info] - should translate num-cpu-cores and memory-per-node properly
[info] - should add other arbitrary settings
[info] JobStatusActorSpec:
[info] JobStatusActor
[info] - should return empty sequence if there is no job infos
[info] - should return error if non-existing job is unsubscribed
[info] - should not initialize a job more than two times
[info] - should be informed JobStarted until it is unsubscribed
[info] - should be ok to subscribe beofore job init
[info] - should be informed JobValidationFailed once
[info] - should be informed JobFinished until it is unsubscribed
[info] - should be informed JobErroredOut until it is unsubscribed
[info] - should update status correctly
[info] - should update JobValidationFailed status correctly
[info] - should update JobErroredOut status correctly
[info] JobSqlDAOJdbcConfigSpec:
[info] parse MySQL config
[info] - should parse a valid MySQL config
[info] - should fail to parse a MySQL config
[info] parse H2 config
[info] - should parse a valid H2 config
[info] - should fail to parse H2 config
[info] parse default config
[info] - should return a default H2 config
[info] JobInfoActorSpec:
[info] JobInfoActor
[info] - should store a job configuration
[info] - should return a job configuration when the jobId exists
[info] - should return error if jobId does not exist
2014-09-05 15:01:22.898 java[52089:860b] Unable to load realm mapping info from SCDynamicStore
I got results! Map(lazy -> 1, jumped -> 1, dog -> 1, The -> 1, over -> 1, fish -> 1, the -> 1)
[info] JobManagerActorSpec:
[info] error conditions
[info] - should return errors if appName does not match
[info] - should return error message if classPath does not match
[info] - should error out if loading garbage jar
[info] - should error out if job validation fails
[info] starting jobs
[info] - should start job and return result successfully (all events)
[info] - should start job more than one time and return result successfully (all events)
[info] - should start job and return results (sync route)
[info] - should start job and return JobStarted (async)
[info] - should return error if job throws an error
[info] - job should get jobConfig passed in to StartJob message
[info] - should properly serialize case classes and other job jar classes
[info] - should refuse to start a job when too many jobs in the context are running
[info] - should start a job that's an object rather than class
[info] starting jobs
[info] - jobs should be able to cache RDDs and retrieve them through getPersistentRDDs
[info] - jobs should be able to cache and retrieve RDDs by name
[info] JobResultActorSpec:
[info] JobResultActor
[info] - should return error if non-existing jobs are asked
[info] - should get back existing result
[info] - should be informed only once by subscribed result
[info] - should not be informed unsubscribed result
[info] - should not publish if do not subscribe to JobResult events
[info] - should return error if non-existing subscription is unsubscribed
[info] LocalContextSupervisorSpec:
[info] context management
[info] - should list empty contexts at startup
[info] - can add contexts from jobConfig
[info] - should be able to add multiple new contexts
[info] - should be able to stop contexts already running
[info] - should return NoSuchContext if attempt to stop nonexisting context
[info] - should not allow creation of an already existing context
I got results! Map(lazy -> 1, jumped -> 1, dog -> 1, The -> 1, over -> 1, fish -> 1, the -> 1)
[info] JobManagerActorAdHocSpec:
[info] error conditions
[info] - should return errors if appName does not match
[info] - should return error message if classPath does not match
[info] - should error out if loading garbage jar
[info] - should error out if job validation fails
[info] starting jobs
[info] - should start job and return result successfully (all events)
[info] - should start job more than one time and return result successfully (all events)
[info] - should start job and return results (sync route)
[info] - should start job and return JobStarted (async)
[info] - should return error if job throws an error
[info] - job should get jobConfig passed in to StartJob message
[info] - should properly serialize case classes and other job jar classes
[info] - should refuse to start a job when too many jobs in the context are running
[info] - should start a job that's an object rather than class
[info] NamedRddsSpec:
[info] NamedRdds
[info] - get() should return None when RDD does not exist
[info] - get() should return Some(RDD) when it exists
[info] - destroy() should do nothing when RDD with given name doesn't exist
[info] - destroy() should destroy an RDD that exists
[info] - getNames() should return names of all managed RDDs
[info] - getOrElseCreate() should call generator function if RDD does not exist
[info] - getOrElseCreate() should not call generator function, should return existing RDD if one exists
[info] - update() should replace existing RDD
[info] - should include underlying exception when error occurs
[info] WebApiSpec:
[info] jars routes
[info] - should list all jars
[info] - should respond with OK if jar uploaded successfully
[info] - should respond with bad request if jar formatted incorrectly
[info] list jobs
[info] - should list jobs correctly
[info] /jobs routes
[info] - should respond with bad request if jobConfig cannot be parsed
[info] - should merge user passed jobConfig with default jobConfig
[info] - async route should return 202 if job starts successfully
[info] - adhoc job of sync route should return 200 and result
[info] - should be able to take a timeout param
[info] - adhoc job started successfully of async route should return 202
[info] - should be able to query job result from /jobs/ route
[info] - should be able to query job config from /jobs//config route
[info] - should respond with 404 Not Found from /jobs//config route if jobId does not exist
[info] - should respond with 404 Not Found if context does not exist
[info] - should respond with 404 Not Found if app or class not found
[info] - sync route should return Ok with ERROR in JSON response if job failed
[info] serializing complex data types
[info] - should be able to serialize nested Seq's and Map's within Map's to JSON
[info] - should be able to serialize Seq's with different types to JSON
[info] - should be able to serialize base types (eg float, numbers) to JSON
[info] - should convert non-understood types to string
[info] context routes
[info] - should list all contexts
[info] - should respond with 404 Not Found if stopping unknown context
[info] - should return OK if stopping known context
[info] - should respond with bad request if starting an already started context
[info] - should return OK if starting a new context
[error] Failed: Total 109, Failed 11, Errors 0, Passed 98
[error] Failed tests:
[error] spark.jobserver.io.JobSqlDAOSpec
[error] sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 114 s, completed Sep 5, 2014 3:03:01 PM
Assembly failed
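The recurring failure in the log is an H2 unique-index/primary-key violation on CONFIGS(JOB_ID): the suite inserts a row whose JOB_ID already exists, which is consistent with rows left over from a previous run of JobSqlDAOSpec rather than a failure in the insert logic itself. A minimal sketch of that class of error, using Python's built-in sqlite3 purely as a stand-in for H2 (the table shape is copied from the log; the data values are illustrative):

```python
import sqlite3

# In-memory table mirroring the CONFIGS schema seen in the log:
# JOB_ID is the primary key, so a second insert with the same id must fail.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "CONFIGS" ("JOB_ID" TEXT PRIMARY KEY, "JOB_CONFIG" TEXT)')
conn.execute('INSERT INTO "CONFIGS" VALUES (?, ?)', ("test-id0", '{"marco":"pollo"}'))
try:
    # Same JOB_ID again -> unique/primary-key violation, as in the H2 trace.
    conn.execute('INSERT INTO "CONFIGS" VALUES (?, ?)', ("test-id0", '{"merry":"xmas"}'))
except sqlite3.IntegrityError as e:
    print("duplicate key rejected:", e)
```

If the cause here is the same, clearing whatever database file the spec's JobSqlDAO points at before re-running the deploy script would be the first thing to try.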