azavea / osmesa

OSMesa is an OpenStreetMap processing stack based on GeoTrellis and Apache Spark
Apache License 2.0

Compile Exception for Generated File When Saving Geometries #66

Closed · jbouffard closed this issue 6 years ago

jbouffard commented 6 years ago

I was trying to run the ExtractMultiPolygons script against the most recent master, and I got this error whenever I tried to write the geometries:

2018-05-30 08:23:33 ERROR CodeGenerator:91 - failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 91, Column 0: Cannot compare types "java.lang.Object" and "long"
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 91, Column 0: Cannot compare types "java.lang.Object" and "long"
    at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:11821)
    at org.codehaus.janino.UnitCompiler.compileBoolean2(UnitCompiler.java:3913)
    at org.codehaus.janino.UnitCompiler.access$5800(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$10.visitBinaryOperation(UnitCompiler.java:3636)
    at org.codehaus.janino.UnitCompiler$10.visitBinaryOperation(UnitCompiler.java:3614)
    at org.codehaus.janino.Java$BinaryOperation.accept(Java.java:4693)
    at org.codehaus.janino.UnitCompiler.compileBoolean(UnitCompiler.java:3614)
    at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4122)
    at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4679)
    at org.codehaus.janino.UnitCompiler.access$7700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$12.visitBinaryOperation(UnitCompiler.java:4091)
    at org.codehaus.janino.UnitCompiler$12.visitBinaryOperation(UnitCompiler.java:4070)
    at org.codehaus.janino.Java$BinaryOperation.accept(Java.java:4693)
    at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:4070)
    at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:5253)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:3477)
    at org.codehaus.janino.UnitCompiler.access$5300(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3439)
    at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3419)
    at org.codehaus.janino.Java$Assignment.accept(Java.java:4306)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3419)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2339)
    at org.codehaus.janino.UnitCompiler.access$1800(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1473)
    at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$ExpressionStatement.accept(Java.java:2851)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2455)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.fakeCompile(UnitCompiler.java:1508)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2413)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2465)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1571)
    at org.codehaus.janino.UnitCompiler.access$2600(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1481)
    at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$DoStatement.accept(Java.java:3304)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3075)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1336)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1309)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:799)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:958)
    at org.codehaus.janino.UnitCompiler.access$700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:393)
    at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:385)
    at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1286)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:385)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:1285)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:825)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:411)
    at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:390)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:385)
    at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1405)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:385)
    at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:357)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:234)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:446)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:313)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:235)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:204)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1421)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1497)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1494)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1369)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:579)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:578)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.AppendColumnsExec.doExecute(objects.scala:261)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.MapGroupsExec.doExecute(objects.scala:329)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SerializeFromObjectExec.inputRDDs(objects.scala:110)
    at org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:121)
    at org.apache.spark.sql.execution.joins.BroadcastHashJoinExec.inputRDDs(BroadcastHashJoinExec.scala:76)
    at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.UnionExec$$anonfun$doExecute$1.apply(basicPhysicalOperators.scala:557)
    at org.apache.spark.sql.execution.UnionExec$$anonfun$doExecute$1.apply(basicPhysicalOperators.scala:557)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.immutable.List.map(List.scala:285)
    at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:557)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:180)
    at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:154)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
    at org.apache.spark.sql.DataFrameWriter.orc(DataFrameWriter.scala:570)
    at osmesa.ExtractMultiPolygons$$anonfun$$lessinit$greater$1.apply(ExtractMultiPolygons.scala:116)
    at osmesa.ExtractMultiPolygons$$anonfun$$lessinit$greater$1.apply(ExtractMultiPolygons.scala:39)
    at cats.SemigroupalArityFunctions$$anonfun$map3$1.apply(SemigroupalArityFunctions.scala:15)
    at cats.SemigroupalArityFunctions$$anonfun$map3$1.apply(SemigroupalArityFunctions.scala:15)
    at scala.Function1$$anonfun$andThen$1.apply(Function1.scala:52)
    at cats.data.Validated.ap(Validated.scala:172)
    at cats.data.ValidatedApplicative.ap(Validated.scala:509)
    at cats.data.ValidatedApplicative.ap(Validated.scala:502)
    at cats.ComposedApply$$anonfun$ap$1$$anonfun$apply$1.apply(Composed.scala:25)
    at cats.Monad$$anonfun$map$1.apply(Monad.scala:16)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at com.monovore.decline.Parser.com$monovore$decline$Parser$$evalResult(Parser.scala:26)
    at com.monovore.decline.Parser.consumeAll(Parser.scala:91)
    at com.monovore.decline.Parser.apply(Parser.scala:17)
    at com.monovore.decline.Command.parse(opts.scala:17)
    at com.monovore.decline.CommandApp.main(CommandApp.scala:48)
    at osmesa.ExtractMultiPolygons.main(ExtractMultiPolygons.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-05-30 08:23:34 WARN  WholeStageCodegenExec:66 - Whole-stage codegen disabled for plan (id=12):
 *(12) Project [type#529, changeset#314L, id#469L, version#488, minorVersion#511, updated#490, validUntil#503, role#531, geom#318]
+- *(12) Filter (((isnull(memberUpdated#545) && isnull(memberValidUntil#546)) && isnull(geom#318)) || ((memberUpdated#545 <= updated#490) && (updated#490 < coalesce(memberValidUntil#546, 1527682999315000))))
   +- *(12) HashAggregate(keys=[role#531, validUntil#503, ref#530L, version#488, id#469L, updated#490, minorVersion#511, changeset#314L, type#529], functions=[], output=[changeset#314L, id#469L, version#488, minorVersion#511, updated#490, validUntil#503, type#529, role#531, memberUpdated#545, memberValidUntil#546, geom#318])
      +- *(12) HashAggregate(keys=[role#531, validUntil#503, ref#530L, version#488, id#469L, updated#490, minorVersion#511, changeset#314L, type#529], functions=[], output=[role#531, validUntil#503, ref#530L, version#488, id#469L, updated#490, minorVersion#511, changeset#314L, type#529])
         +- *(12) Project [changeset#314L, id#469L, version#488, minorVersion#511, updated#490, validUntil#503, member#521.type AS type#529, member#521.ref AS ref#530L, member#521.role AS role#531]
            +- *(12) Filter member#521.role IN (,outer,inner)
               +- Generate explode(members#49), [id#469L, version#488, changeset#314L, updated#490, validUntil#503, minorVersion#511], true, [member#521]
                  +- *(11) Project [id#469L, version#488, changeset#314L, updated#490, members#49, validUntil#503, (_we0#512 - 1) AS minorVersion#511]
                     +- Window [row_number() windowspecdefinition(id#469L, version#488, updated#490 ASC NULLS FIRST, specifiedwindowframe(RowFrame, unboundedpreceding$(), currentrow$())) AS _we0#512], [id#469L, version#488], [updated#490 ASC NULLS FIRST]
                        +- *(10) Sort [id#469L ASC NULLS FIRST, version#488 ASC NULLS FIRST, updated#490 ASC NULLS FIRST], false, 0
                           +- Window [lead(updated#490, 1, null) windowspecdefinition(id#469L, updated#490 ASC NULLS FIRST, specifiedwindowframe(RowFrame, 1, 1)) AS validUntil#503], [id#469L], [updated#490 ASC NULLS FIRST]
                              +- *(9) Sort [id#469L ASC NULLS FIRST, updated#490 ASC NULLS FIRST], false, 0
                                 +- Exchange hashpartitioning(id#469L, 200)
                                    +- *(8) Project [id#469L, version#488, changeset#314L, updated#490, members#49]
                                       +- *(8) BroadcastHashJoin [id#469L, cast(version#488 as bigint)], [id#0L, version#11L], Inner, BuildLeft
                                          :- BroadcastExchange HashedRelationBroadcastMode(List(input[1, bigint, true], cast(input[2, int, false] as bigint)))
                                          :  +- *(5) Filter isnotnull(version#488)
                                          :     +- *(5) HashAggregate(keys=[changeset#314L, id#469L], functions=[max(version#11L), max(updated#317)], output=[changeset#314L, id#469L, version#488, updated#490])
                                          :        +- Exchange hashpartitioning(changeset#314L, id#469L, 200)
                                          :           +- *(4) HashAggregate(keys=[changeset#314L, id#469L], functions=[partial_max(version#11L), partial_max(updated#317)], output=[changeset#314L, id#469L, max#944L, max#945])
                                          :              +- Union
                                          :                 :- LocalTableScan <empty>, [changeset#314L, id#469L, version#11L, updated#317]
                                          :                 +- *(3) Project [changeset#7L, id#0L, version#11L, timestamp#8 AS updated#474]
                                          :                    +- *(3) Filter UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END)
                                          :                       +- Window [lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we0#51, lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we1#52], [id#0L], [version#11L ASC NULLS FIRST]
                                          :                          +- *(2) Sort [id#0L ASC NULLS FIRST, version#11L ASC NULLS FIRST], false, 0
                                          :                             +- *(2) Project [id#0L, changeset#7L, timestamp#8, version#11L, visible#12, tags#2]
                                          :                                +- Exchange hashpartitioning(id#0L, 200)
                                          :                                   +- *(1) Project [id#0L, tags#2, changeset#7L, timestamp#8, version#11L, visible#12]
                                          :                                      +- *(1) Filter ((isnotnull(type#1) && (type#1 = relation)) && isnotnull(id#0L))
                                          :                                         +- *(1) FileScan orc [id#0L,type#1,tags#2,changeset#7L,timestamp#8,version#11L,visible#12] Batched: false, Format: ORC, Location: InMemoryFileIndex[file:/tmp/rhode-island.orc], PartitionFilters: [], PushedFilters: [IsNotNull(type), EqualTo(type,relation), IsNotNull(id)], ReadSchema: struct<id:bigint,type:string,tags:map<string,string>,changeset:bigint,timestamp:timestamp,version...
                                          +- *(8) Project [id#0L, version#11L, members#49]
                                             +- *(8) Filter (UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END) && isnotnull(version#11L))
                                                +- Window [lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we0#51, lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we1#52], [id#0L], [version#11L ASC NULLS FIRST]
                                                   +- *(7) Sort [id#0L ASC NULLS FIRST, version#11L ASC NULLS FIRST], false, 0
                                                      +- *(7) Project [id#0L, UDF(members#6) AS members#49, version#11L, visible#12, tags#2]
                                                         +- Exchange hashpartitioning(id#0L, 200)
                                                            +- *(6) Project [id#0L, tags#2, members#6, version#11L, visible#12]
                                                               +- *(6) Filter ((isnotnull(type#1) && (type#1 = relation)) && isnotnull(id#0L))
                                                                  +- *(6) FileScan orc [id#0L,type#1,tags#2,members#6,version#11L,visible#12] Batched: false, Format: ORC, Location: InMemoryFileIndex[file:/tmp/rhode-island.orc], PartitionFilters: [], PushedFilters: [IsNotNull(type), EqualTo(type,relation), IsNotNull(id)], ReadSchema: struct<id:bigint,type:string,tags:map<string,string>,members:array<struct<type:string,ref:bigint,...

2018-05-30 08:23:42 ERROR CodeGenerator:91 - failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 83, Column 0: Cannot compare types "java.lang.Object" and "long"
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 83, Column 0: Cannot compare types "java.lang.Object" and "long"
    at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:11821)
    at org.codehaus.janino.UnitCompiler.compileBoolean2(UnitCompiler.java:3913)
    at org.codehaus.janino.UnitCompiler.access$5800(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$10.visitBinaryOperation(UnitCompiler.java:3636)
    at org.codehaus.janino.UnitCompiler$10.visitBinaryOperation(UnitCompiler.java:3614)
    at org.codehaus.janino.Java$BinaryOperation.accept(Java.java:4693)
    at org.codehaus.janino.UnitCompiler.compileBoolean(UnitCompiler.java:3614)
    at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4122)
    at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4679)
    at org.codehaus.janino.UnitCompiler.access$7700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$12.visitBinaryOperation(UnitCompiler.java:4091)
    at org.codehaus.janino.UnitCompiler$12.visitBinaryOperation(UnitCompiler.java:4070)
    at org.codehaus.janino.Java$BinaryOperation.accept(Java.java:4693)
    at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:4070)
    at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:5253)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:3477)
    at org.codehaus.janino.UnitCompiler.access$5300(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3439)
    at org.codehaus.janino.UnitCompiler$9.visitAssignment(UnitCompiler.java:3419)
    at org.codehaus.janino.Java$Assignment.accept(Java.java:4306)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3419)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2339)
    at org.codehaus.janino.UnitCompiler.access$1800(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1473)
    at org.codehaus.janino.UnitCompiler$6.visitExpressionStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$ExpressionStatement.accept(Java.java:2851)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2455)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.fakeCompile(UnitCompiler.java:1508)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2413)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2465)
    at org.codehaus.janino.UnitCompiler.access$1900(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1474)
    at org.codehaus.janino.UnitCompiler$6.visitIfStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$IfStatement.accept(Java.java:2926)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1532)
    at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1472)
    at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$Block.accept(Java.java:2756)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1571)
    at org.codehaus.janino.UnitCompiler.access$2600(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1481)
    at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1466)
    at org.codehaus.janino.Java$DoStatement.accept(Java.java:3304)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1466)
    at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1546)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3075)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1336)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1309)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:799)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:958)
    at org.codehaus.janino.UnitCompiler.access$700(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:393)
    at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:385)
    at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1286)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:385)
    at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:1285)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:825)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:411)
    at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:212)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:390)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:385)
    at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1405)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:385)
    at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:357)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:234)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:446)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:313)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:235)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:204)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1421)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1497)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1494)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1369)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:579)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:578)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.AppendColumnsExec.doExecute(objects.scala:261)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.MapGroupsExec.doExecute(objects.scala:329)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SerializeFromObjectExec.inputRDDs(objects.scala:110)
    at org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:121)
    at org.apache.spark.sql.execution.joins.BroadcastHashJoinExec.inputRDDs(BroadcastHashJoinExec.scala:76)
    at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.UnionExec$$anonfun$doExecute$1.apply(basicPhysicalOperators.scala:557)
    at org.apache.spark.sql.execution.UnionExec$$anonfun$doExecute$1.apply(basicPhysicalOperators.scala:557)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.immutable.List.map(List.scala:285)
    at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:557)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
    at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:180)
    at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:154)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
    at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
    at org.apache.spark.sql.DataFrameWriter.orc(DataFrameWriter.scala:570)
    at osmesa.ExtractMultiPolygons$$anonfun$$lessinit$greater$1.apply(ExtractMultiPolygons.scala:116)
    at osmesa.ExtractMultiPolygons$$anonfun$$lessinit$greater$1.apply(ExtractMultiPolygons.scala:39)
    at cats.SemigroupalArityFunctions$$anonfun$map3$1.apply(SemigroupalArityFunctions.scala:15)
    at cats.SemigroupalArityFunctions$$anonfun$map3$1.apply(SemigroupalArityFunctions.scala:15)
    at scala.Function1$$anonfun$andThen$1.apply(Function1.scala:52)
    at cats.data.Validated.ap(Validated.scala:172)
    at cats.data.ValidatedApplicative.ap(Validated.scala:509)
    at cats.data.ValidatedApplicative.ap(Validated.scala:502)
    at cats.ComposedApply$$anonfun$ap$1$$anonfun$apply$1.apply(Composed.scala:25)
    at cats.Monad$$anonfun$map$1.apply(Monad.scala:16)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at cats.instances.Function0Instances$$anon$1$$anonfun$flatMap$1.apply(function.scala:25)
    at com.monovore.decline.Parser.com$monovore$decline$Parser$$evalResult(Parser.scala:26)
    at com.monovore.decline.Parser.consumeAll(Parser.scala:91)
    at com.monovore.decline.Parser.apply(Parser.scala:17)
    at com.monovore.decline.Command.parse(opts.scala:17)
    at com.monovore.decline.CommandApp.main(CommandApp.scala:48)
    at osmesa.ExtractMultiPolygons.main(ExtractMultiPolygons.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-05-30 08:23:42 WARN  WholeStageCodegenExec:66 - Whole-stage codegen disabled for plan (id=29):
 *(29) Project [type#783, changeset#314L, id#723L, version#742, minorVersion#765, updated#744, validUntil#757, role#785, geom#318]
+- *(29) Filter (((isnull(memberUpdated#798) && isnull(memberValidUntil#799)) && isnull(geom#318)) || ((memberUpdated#798 <= updated#744) && (updated#744 < coalesce(memberValidUntil#799, 1527682999315000))))
   +- *(29) HashAggregate(keys=[role#785, validUntil#757, ref#784L, version#742, id#723L, updated#744, minorVersion#765, changeset#314L, type#783], functions=[], output=[changeset#314L, id#723L, version#742, minorVersion#765, updated#744, validUntil#757, type#783, role#785, memberUpdated#798, memberValidUntil#799, geom#318])
      +- *(29) HashAggregate(keys=[role#785, validUntil#757, ref#784L, version#742, id#723L, updated#744, minorVersion#765, changeset#314L, type#783], functions=[], output=[role#785, validUntil#757, ref#784L, version#742, id#723L, updated#744, minorVersion#765, changeset#314L, type#783])
         +- *(29) Project [changeset#314L, id#723L, version#742, minorVersion#765, updated#744, validUntil#757, member#775.type AS type#783, member#775.ref AS ref#784L, member#775.role AS role#785]
            +- Generate explode(members#49), [id#723L, version#742, changeset#314L, updated#744, validUntil#757, minorVersion#765], true, [member#775]
               +- *(28) Project [id#723L, version#742, changeset#314L, updated#744, members#49, validUntil#757, (_we0#766 - 1) AS minorVersion#765]
                  +- Window [row_number() windowspecdefinition(id#723L, version#742, updated#744 ASC NULLS FIRST, specifiedwindowframe(RowFrame, unboundedpreceding$(), currentrow$())) AS _we0#766], [id#723L, version#742], [updated#744 ASC NULLS FIRST]
                     +- *(27) Sort [id#723L ASC NULLS FIRST, version#742 ASC NULLS FIRST, updated#744 ASC NULLS FIRST], false, 0
                        +- Window [lead(updated#744, 1, null) windowspecdefinition(id#723L, updated#744 ASC NULLS FIRST, specifiedwindowframe(RowFrame, 1, 1)) AS validUntil#757], [id#723L], [updated#744 ASC NULLS FIRST]
                           +- *(26) Sort [id#723L ASC NULLS FIRST, updated#744 ASC NULLS FIRST], false, 0
                              +- Exchange hashpartitioning(id#723L, 200)
                                 +- *(25) Project [id#723L, version#742, changeset#314L, updated#744, members#49]
                                    +- *(25) BroadcastHashJoin [id#723L, cast(version#742 as bigint)], [id#0L, version#11L], Inner, BuildLeft
                                       :- BroadcastExchange HashedRelationBroadcastMode(List(input[1, bigint, true], cast(input[2, int, false] as bigint)))
                                       :  +- *(22) Filter isnotnull(version#742)
                                       :     +- *(22) HashAggregate(keys=[changeset#314L, id#723L], functions=[max(version#11L), max(updated#317)], output=[changeset#314L, id#723L, version#742, updated#744])
                                       :        +- Exchange hashpartitioning(changeset#314L, id#723L, 200)
                                       :           +- *(21) HashAggregate(keys=[changeset#314L, id#723L], functions=[partial_max(version#11L), partial_max(updated#317)], output=[changeset#314L, id#723L, max#948L, max#949])
                                       :              +- Union
                                       :                 :- LocalTableScan <empty>, [changeset#314L, id#723L, version#11L, updated#317]
                                       :                 +- *(20) Project [changeset#7L, id#0L, version#11L, timestamp#8 AS updated#728]
                                       :                    +- *(20) Filter (UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END) && UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END))
                                       :                       +- Window [lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we0#51, lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we1#52], [id#0L], [version#11L ASC NULLS FIRST]
                                       :                          +- *(19) Sort [id#0L ASC NULLS FIRST, version#11L ASC NULLS FIRST], false, 0
                                       :                             +- *(19) Project [id#0L, changeset#7L, timestamp#8, version#11L, visible#12, tags#2]
                                       :                                +- ReusedExchange [id#0L, tags#2, changeset#7L, timestamp#8, version#11L, visible#12], Exchange hashpartitioning(id#0L, 200)
                                       +- *(25) Project [id#0L, version#11L, members#49]
                                          +- *(25) Filter ((UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END) && UDF(CASE WHEN (NOT visible#12 && isnotnull(_we0#51)) THEN _we1#52 ELSE tags#2 END)) && isnotnull(version#11L))
                                             +- Window [lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we0#51, lag(tags#2, 1, null) windowspecdefinition(id#0L, version#11L ASC NULLS FIRST, specifiedwindowframe(RowFrame, -1, -1)) AS _we1#52], [id#0L], [version#11L ASC NULLS FIRST]
                                                +- *(24) Sort [id#0L ASC NULLS FIRST, version#11L ASC NULLS FIRST], false, 0
                                                   +- *(24) Project [id#0L, UDF(members#6) AS members#49, version#11L, visible#12, tags#2]
                                                      +- ReusedExchange [id#0L, tags#2, members#6, version#11L, visible#12], Exchange hashpartitioning(id#0L, 200)

Done.
mojodna commented 6 years ago

Yup, I'm seeing this too. I'm looking through to see what changed since I first created this as a debugging tool.
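
The `WARN WholeStageCodegenExec ... Whole-stage codegen disabled for plan` line near the end of the log shows Spark falling back to the interpreted evaluator after the Janino compile failure, so the job may still proceed for that plan fragment. A possible mitigation while debugging (an assumption based on standard Spark configuration, not something confirmed in this thread) is to disable whole-stage codegen up front so Janino compilation is skipped entirely:

```shell
# Sketch: rerun the extract with whole-stage codegen disabled.
# `spark.sql.codegen.wholeStage` is a standard Spark SQL setting;
# `osmesa.ExtractMultiPolygons` matches the class in the stack trace,
# but the jar name, ORC paths, and CLI flags below are hypothetical
# placeholders, not the project's actual invocation.
spark-submit \
  --class osmesa.ExtractMultiPolygons \
  --conf spark.sql.codegen.wholeStage=false \
  osmesa-assembly.jar \
  --orc /tmp/rhode-island.orc \
  --out /tmp/rhode-island-geoms.orc
```

Note this only sidesteps the symptom (generated Java that compares `java.lang.Object` to `long`); the underlying cause is presumably a type mismatch in the query plan and still needs to be tracked down.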