lightbend / mesos-spark-integration-tests

Mesos Integration Tests on Docker/Ec2

Add optional auth token to use against a mesos cluster #94

Closed. mgummelt closed this 7 years ago.

mgummelt commented 7 years ago

cc @skonto

DC/OS has a "strict" mode, where all Mesos clients must authenticate with an auth token. This PR allows the job to be optionally launched with an auth token to enable state.json requests in that mode.
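A minimal sketch of the idea in Scala: attach the token, when present, as an Authorization header on the state.json request. The helper name and the `token=<value>` header format below are assumptions for illustration, not the PR's actual code.

```scala
import java.net.{HttpURLConnection, URL}
import scala.io.Source

// Hypothetical helper: fetch the Mesos master's state.json, attaching the
// DC/OS auth token as an Authorization header when one is provided.
def fetchState(mesosMasterUrl: String, authToken: Option[String]): String = {
  val conn = new URL(s"$mesosMasterUrl/state.json")
    .openConnection()
    .asInstanceOf[HttpURLConnection]
  // Header format assumed here; a strict-mode cluster may expect a different scheme.
  authToken.foreach(t => conn.setRequestProperty("Authorization", s"token=$t"))
  try Source.fromInputStream(conn.getInputStream).mkString
  finally conn.disconnect()
}
```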

skonto commented 7 years ago

@mgummelt will have a look now.

skonto commented 7 years ago

@mgummelt Since you removed the DC/OS test runner a few months ago, how can I test this with https://github.com/mesosphere/spark-build/blob/master/tests/test.py? We also want to be able to run DC/OS tests now that we are focusing on DC/OS for FDP.

mgummelt commented 7 years ago

I just updated the test docs: https://github.com/mesosphere/spark-build/pull/126

But you won't be able to test this change in particular, because strict mode is an enterprise-only feature.

skonto commented 7 years ago

@mgummelt I get task not serializable errors.

Cause: java.io.NotSerializableException: org.scalatest.Assertions$AssertionsHelper

Serialization stack:

  • object not serializable (class: org.scalatest.Assertions$AssertionsHelper, value: org.scalatest.Assertions$AssertionsHelper@696f0212)
  • field (class: org.scalatest.FunSuite, name: assertionsHelper, type: class org.scalatest.Assertions$AssertionsHelper)
  • object (class com.typesafe.spark.test.mesos.SparkJobSpec, SparkJobSpec)
  • field (class: com.typesafe.spark.test.mesos.DynamicAllocationSpec$$anonfun$2, name: $outer, type: interface com.typesafe.spark.test.mesos.DynamicAllocationSpec)
  • object (class com.typesafe.spark.test.mesos.DynamicAllocationSpec$$anonfun$2, )
  • field (class: com.typesafe.spark.test.mesos.DynamicAllocationSpec$$anonfun$2$$anonfun$4, name: $outer, type: class com.typesafe.spark.test.mesos.DynamicAllocationSpec$$anonfun$2)
  • object (class com.typesafe.spark.test.mesos.DynamicAllocationSpec$$anonfun$2$$anonfun$4, )

  at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugge

To fix it, assign authToken to a local val before you use it in the Spark closure, like in the mesosUrl case (a sketch of the pattern follows the links below).

Needs a fix here:

https://github.com/mesosphere/mesos-spark-integration-tests/blob/ec6672cb605d9e7f0f0ea4a172633bdf892ec33d/test-runner/src/main/scala/com/typesafe/spark/test/mesos/DynamicAllocationSpec.scala#L62

https://github.com/mesosphere/mesos-spark-integration-tests/blob/ec6672cb605d9e7f0f0ea4a172633bdf892ec33d/test-runner/src/main/scala/com/typesafe/spark/test/mesos/DynamicAllocationSpec.scala#L96
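For context, the error happens because referencing a suite-level field inside an RDD closure captures the whole FunSuite, and with it the non-serializable AssertionsHelper. Below is a minimal sketch of the fix pattern, with hypothetical spec and field names; the real specs live in the files linked above.

```scala
import org.apache.spark.SparkContext
import org.scalatest.FunSuite

// Hypothetical spec: `sc` and `authToken` stand in for the fixtures the real
// suites get from their mixins.
class AuthTokenClosureSketch(sc: SparkContext, authToken: Option[String]) extends FunSuite {

  test("the closure captures only a local copy of the token") {
    // Referencing `authToken` directly inside `map` would capture `this`
    // (the FunSuite) and fail with NotSerializableException. Copying it to
    // a local val first, like the existing code does for mesosUrl, means the
    // closure captures only the Option[String].
    val token = authToken
    val counts = sc.parallelize(1 to 3)
      .map(i => if (token.isDefined) i + 1 else i)
      .collect()
    assert(counts.length === 3)
  }
}
```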

mgummelt commented 7 years ago

@skonto It should be fixed now. Can you give it another try?

skonto commented 7 years ago

@mgummelt sure.

skonto commented 7 years ago

LGTM, it works.