Closed: tantra35 closed this issue 5 years ago.
@tantra35: try 0.8.6.1, per https://github.com/hashicorp/nomad-java-sdk/issues/24
@cgbaker Sorry to disappoint you, but we do mean SDK 0.8.6.1. Even on this version (0.8.6.1), Spark on Nomad throws the exception I mentioned earlier. In the script we reference in https://github.com/hashicorp/nomad-spark/issues/24, you can see that we specifically replace 0.8.6.1 with 0.7.0.2 to work around this.
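For context, the downgrade amounts to swapping the SDK dependency in the build, roughly like the sbt fragment below (the coordinates are approximate; the script linked above has the exact change):

```scala
// build.sbt fragment (illustrative sketch; coordinates assumed, not copied from the real script)
libraryDependencies ++= Seq(
  // "com.hashicorp.nomad" %% "nomad-scala-sdk" % "0.8.6.1"  // fails on Nomad's "+03:00" timestamps
  "com.hashicorp.nomad" %% "nomad-scala-sdk" % "0.7.0.2"      // workaround: downgraded SDK
)
```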
The issue likely relates to this: https://github.com/hashicorp/nomad-java-sdk/issues/6#issuecomment-432413422
@cgbaker Setting TZ to UTC looks like a workaround, the same as our solution of rebuilding with nomad-scala-sdk 0.7.0.2, but it is only a workaround, not a complete fix. A more complete solution, IMHO, would be to consider all the possible datetime formats that Nomad can produce and handle them all in a try/fallback manner.
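Something like the following Scala sketch is what we have in mind (the object name and the list of candidate patterns are illustrative, not taken from the SDK):

```scala
import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter
import java.util.Date
import scala.util.{Success, Try}

object NomadDateParser {
  // Candidate formats, tried in order; ISO_OFFSET_DATE_TIME covers the
  // "2019-03-19T23:19:03.629626062+03:00" form that Nomad actually emits.
  private val formatters: Seq[DateTimeFormatter] = Seq(
    DateTimeFormatter.ISO_OFFSET_DATE_TIME,
    DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSZ"),   // "+0300" style offsets
    DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ssXXX")      // no fractional seconds, "+03:00" offsets
  )

  /** Try each format in turn and fall back to the next one on failure. */
  def parse(raw: String): Option[Date] =
    formatters.iterator
      .map(f => Try(OffsetDateTime.parse(raw, f)))
      .collectFirst { case Success(odt) => Date.from(odt.toInstant) }
}

// Example: NomadDateParser.parse("2019-03-19T23:19:03.629626062+03:00")
```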
When we build the latest version of Spark on Nomad and launch a Spark job, we get the following exception:
As you can see, the SDK can't parse a field of type Date, because it tries to apply the format `yyyy-MM-dd'T'HH:mm:ss.SSSZ` to the string `2019-03-19T23:19:03.629626062+03:00`. But as the documentation at https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html shows, the last symbol in that format string, `Z`, only parses offsets like `+0300`, which differs from what we actually get from Nomad (`+03:00`), so the real format string must look like `yyyy-MM-dd'T'HH:mm:ss.SSSX`. So we concluded that nomad-scala-sdk 0.8.6 is broken. As a workaround, we simply downgraded nomad-scala-sdk to version 0.7.0.2.
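A minimal, self-contained demonstration of the mismatch (the object name is ours; nothing here is SDK code):

```scala
import java.text.SimpleDateFormat
import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter
import scala.util.Try

object OffsetFormatDemo {
  def main(args: Array[String]): Unit = {
    val raw = "2019-03-19T23:19:03.629626062+03:00"   // what Nomad returns

    // The format the SDK applies: 'Z' expects an RFC 822 offset such as "+0300",
    // so the "+03:00" suffix makes parsing fail, as in the exception above.
    val sdkFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ")
    println(Try(sdkFormat.parse(raw)))   // Failure(java.text.ParseException: ...)

    // An ISO-8601-aware parser handles both the "+03:00" offset and the nanoseconds.
    val ok = OffsetDateTime.parse(raw, DateTimeFormatter.ISO_OFFSET_DATE_TIME)
    println(ok.toInstant)                // 2019-03-19T20:19:03.629626062Z
  }
}
```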