locationtech / geowave

GeoWave provides geospatial and temporal indexing on top of Accumulo, HBase, BigTable, Cassandra, Kudu, Redis, RocksDB, and DynamoDB.
Apache License 2.0

Question about GeoWave 0.9.8 execution error: java.io.EOFException at org.apache.spark.serializer.KryoDeserializationStream.readObject #1481

Closed hsg77 closed 5 years ago

hsg77 commented 5 years ago

Question about GeoWave 0.9.8 execution error: java.io.EOFException at org.apache.spark.serializer.KryoDeserializationStream.readObject

When the master is local or local[*], execution is OK. When the master is spark://mycluster:7077, execution fails with a KryoDeserializationStream.readObject error.
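For background on what the exception itself signifies: a java.io.EOFException thrown from a deserialization stream means the reader ran out of bytes mid-object, which in Spark usually points to a mismatch between driver and executor jars or serializer configuration rather than corrupt data. A minimal stdlib sketch of the same failure mode (using plain Java serialization in place of Kryo, purely for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.EOFException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Arrays;

public class EofDemo {
    /** Serialize a String, truncate the bytes, and report what deserialization throws. */
    static String readTruncated() throws Exception {
        final ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject("some payload");
        }
        final byte[] full = bos.toByteArray();
        // Drop the last 4 bytes to mimic a short read on the wire.
        final byte[] truncated = Arrays.copyOf(full, full.length - 4);
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(truncated))) {
            ois.readObject();
            return "ok";
        } catch (EOFException e) {
            // The deserializer ran out of bytes mid-object, the same
            // failure mode reported by KryoDeserializationStream.readObject.
            return "EOFException";
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readTruncated());
    }
}
```

When the writer (executor) and reader (driver) disagree about serializer settings or class versions, the stream ends earlier than the reader expects and this exception surfaces.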

(attached screenshot: stack trace showing java.io.EOFException at org.apache.spark.serializer.KryoDeserializationStream.readObject)

hsg77 commented 5 years ago

```java
public class geowaveSpark {

    public String main(final String[] args) throws Exception {
        String rbc = "";
        if (args == null) {
            throw new IOException("args is null");
        }
        if (args.length != 3) {
            throw new IOException("expected three params: storeName fdName sparkMaster");
        }
        // HBase configuration (zookeeper quorum etc. are set inside geowaveUtil)
        final Configuration hbaseConf = geowaveUtil.GetHBaseConfiguration();

        final String storeName = args[0];    // e.g. dltb or cbdk
        final String fdName = args[1];       // e.g. dlbm or DKLB
        final String sparkMaster = args[2];  // local, local[*], spark://node111:7077, or yarn
        final int minSplits = -1;
        final int maxSplits = -1;

        SparkSession sparkSession = null;
        try {
            System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
            System.setProperty("spark.kryo.registrator", "mil.nga.giat.geowave.analytic.spark.GeoWaveRegistrator");

            // Attempt to load the input store.
            final DataStorePluginOptions inputStoreOptions =
                    geowaveUtil.getNewDataStorePluginOptions_hbase(storeName);

            final SparkConf conf = GeoWaveSparkConf.getDefaultConfig()
                    .set("spark.yarn.jars", "hdfs://mycluster:8020/spark/spark-libs.jar")
                    .set("spark.jars", "hdfs://mycluster:8020/spark/spark-libs.jar")
                    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                    .set("spark.kryo.registrator", "mil.nga.giat.geowave.analytic.spark.GeoWaveRegistrator")
                    .set("spark.dynamicAllocation.enabled", "false")
                    .set("spark.driver.extraClassPath", "/usr/cwgis/app/spark/jars/lib/*")
                    .set("spark.executor.extraClassPath", "/usr/cwgis/app/spark/jars/lib/*")
                    .set("spark.driver.memory", "8g")
                    .set("spark.executor.memory", "8g")
                    .set("spark.locality.wait", "0")
                    .set("yarn.resourcemanager.address", "mycluster:8032")
                    .set("yarn.resourcemanager.scheduler.address", "mycluster:8034")
                    .set("spark.sql.broadcastTimeout", "1200")
                    .set("spark.sql.crossJoin.enabled", "true")
                    //.setMaster("local")     // OK
                    //.setMaster("local[*]")  // OK
                    .setMaster(sparkMaster)   // fails when spark://mycluster:7077
                    .setAppName("geowaveSpark");
            sparkSession = GeoWaveSparkConf.createDefaultSession(conf);
            System.out.println("sparkMaster=" + sparkMaster);

            final RDDOptions rddOptions = new RDDOptions();
            rddOptions.setQueryOptions(null);
            rddOptions.setQuery(null);
            rddOptions.setMaxSplits(maxSplits);
            rddOptions.setMinSplits(minSplits);

            final GeoWaveRDD gwRdd = GeoWaveRDDLoader.loadRDD(
                    sparkSession.sparkContext(),
                    inputStoreOptions,
                    rddOptions);
            System.out.println("loaded: GeoWaveRDDLoader.loadRDD");
            final JavaPairRDD<GeoWaveInputKey, SimpleFeature> javaRdd = gwRdd.getRawRDD();
            System.out.println("loaded: getRawRDD");

            // This action triggers the Kryo readObject error.
            System.out.println("RDD feature count: " + javaRdd.count() + " of layerName: " + storeName);

            System.out.println("executing javaRdd.mapToPair");
            // Key each feature by its attribute value; the value is the geometry
            // area (falling back to length for zero-area geometries).
            final JavaPairRDD<String, Double> mapToPair1 = javaRdd.mapToPair(feat -> {
                Object o1 = feat._2.getAttribute(fdName);
                if (o1 == null) {
                    o1 = "null";
                }
                double outValue = 0.0;
                final Geometry geo = (Geometry) feat._2.getDefaultGeometry();
                if (geo != null) {
                    outValue = geo.getArea();
                    if (outValue == 0.0) {
                        outValue = geo.getLength();
                    }
                }
                return new Tuple2<>(o1.toString(), outValue);
            });

            final JavaPairRDD<String, Double> reduceByKey = mapToPair1.reduceByKey((x, y) -> x + y);
            final Map<String, Double> mapToPair2 = reduceByKey.collectAsMap();

            rbc = this.getResult(mapToPair2, fdName);
            System.out.println(rbc);
        } catch (final Exception ex) {
            ex.printStackTrace();
        } finally {
            if (sparkSession != null) {
                sparkSession.stop();
                sparkSession = null;
            }
        }
        return rbc;
    }

    // Build a JSON-style list of {"<fdName>": key, "mj": area} objects, sorted by key.
    private String getResult(final Map<String, Double> mapToPair2, final String fdName) {
        final StringBuilder jsonBuilder = new StringBuilder();
        final SortedMap<String, Double> sortMap = new TreeMap<>(mapToPair2);
        for (final Map.Entry<String, Double> entry : sortMap.entrySet()) {
            if (jsonBuilder.length() > 0) {
                jsonBuilder.append(",");
            }
            jsonBuilder.append("{\"").append(fdName).append("\":\"").append(entry.getKey()).append("\"");
            jsonBuilder.append(",\"mj\":\"").append(entry.getValue()).append("\"}");
        }
        return jsonBuilder.toString();
    }
}
```
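The mapToPair/reduceByKey step in the code above is just a keyed sum. As a sanity check of that logic independent of Spark and GeoWave, here is a plain-Java sketch; the Feature class and sample values are hypothetical stand-ins for SimpleFeature attribute values and geometry areas:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class AreaByAttribute {
    // Hypothetical stand-in for one (attribute value, geometry area) pair
    // extracted from a SimpleFeature.
    static class Feature {
        final String attr;
        final double area;
        Feature(final String attr, final double area) {
            this.attr = attr;
            this.area = area;
        }
    }

    /** Sum areas per attribute value, sorted by key (the reduceByKey((x, y) -> x + y) step). */
    static Map<String, Double> sumByAttr(final List<Feature> feats) {
        final Map<String, Double> out = new TreeMap<>();
        for (final Feature f : feats) {
            out.merge(f.attr, f.area, Double::sum);
        }
        return out;
    }

    public static void main(String[] args) {
        final List<Feature> feats = Arrays.asList(
                new Feature("0101", 2.5),
                new Feature("0101", 1.5),
                new Feature("0203", 3.0));
        System.out.println(sumByAttr(feats)); // {0101=4.0, 0203=3.0}
    }
}
```

Since the local aggregation logic is straightforward, the interesting failure surface is the distributed part: the count() action is the first point where executors must deserialize GeoWave classes.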

hsg77 commented 5 years ago

The error occurs in particular when executing the javaRdd.count() and javaRdd.mapToPair() calls:

`tablename=Layer_point fieldname=dlbm sparkMaster=spark://mycluster:7077 Linux 2019-01-04 17:41:18.406 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Running Spark version 2.3.1 2019-01-04 17:41:18.509 WARN 25482 --- [nio-8888-exec-1] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2019-01-04 17:41:18.591 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Submitted application: geowaveSpark 2019-01-04 17:41:18.639 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SecurityManager : Changing view acls to: root 2019-01-04 17:41:18.639 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SecurityManager : Changing modify acls to: root 2019-01-04 17:41:18.640 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SecurityManager : Changing view acls groups to: 2019-01-04 17:41:18.640 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SecurityManager : Changing modify acls groups to: 2019-01-04 17:41:18.641 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 2019-01-04 17:41:18.810 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 41246. 
2019-01-04 17:41:18.833 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkEnv : Registering MapOutputTracker 2019-01-04 17:41:18.850 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkEnv : Registering BlockManagerMaster 2019-01-04 17:41:18.852 INFO 25482 --- [nio-8888-exec-1] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 2019-01-04 17:41:18.852 INFO 25482 --- [nio-8888-exec-1] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up 2019-01-04 17:41:18.860 INFO 25482 --- [nio-8888-exec-1] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-253f9c6d-b3a2-4eab-96db-5c92ebe928a5 2019-01-04 17:41:18.879 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 4.0 GB 2019-01-04 17:41:18.891 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator 2019-01-04 17:41:18.953 INFO 25482 --- [nio-8888-exec-1] org.spark_project.jetty.util.log : Logging initialized @8347ms 2019-01-04 17:41:19.000 INFO 25482 --- [nio-8888-exec-1] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT 2019-01-04 17:41:19.014 INFO 25482 --- [nio-8888-exec-1] org.spark_project.jetty.server.Server : Started @8410ms 2019-01-04 17:41:19.029 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.AbstractConnector : Started ServerConnector@f5e3910{HTTP/1.1,[http/1.1]}{0.0.0.0:4040} 2019-01-04 17:41:19.030 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040. 
2019-01-04 17:41:19.052 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@27a0ff53{/jobs,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.053 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@22acab63{/jobs/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.053 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1703d9d4{/jobs/job,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.054 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5f5f7769{/jobs/job/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.054 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@186f2852{/stages,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.055 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@11ab9ff2{/stages/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.055 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@165d921b{/stages/stage,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.056 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1ec899b8{/stages/stage/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.056 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@647f0d30{/stages/pool,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.057 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3c663b52{/stages/pool/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.057 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@612b1cc1{/storage,null,AVAILABLE,@Spark} 
2019-01-04 17:41:19.058 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3bf45c05{/storage/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.058 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@27fec0c6{/storage/rdd,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.059 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2acfd3c0{/storage/rdd/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.059 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@13bb1f4c{/environment,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.059 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@a367d4a{/environment/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.060 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5b1f643a{/executors,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.060 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@f9aed4a{/executors/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.061 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4410c28a{/executors/threadDump,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.061 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7df53b6f{/executors/threadDump/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.067 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5d6bbe5f{/static,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.067 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started 
o.s.j.s.ServletContextHandler@4b69c84{/,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.068 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@41546e55{/api,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.068 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7e98be88{/jobs/job/kill,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.069 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@fce99db{/stages/stage/kill,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.070 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://node114:4040 2019-01-04 17:41:19.141 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Added JAR hdfs://mycluster:8020/spark/spark-libs.jar at hdfs://mycluster:8020/spark/spark-libs.jar with timestamp 1546594879140 2019-01-04 17:41:19.225 INFO 25482 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://mycluster:7077... 
2019-01-04 17:41:19.271 INFO 25482 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to mycluster/192.168.30.111:7077 after 26 ms (0 ms spent in bootstraps) 2019-01-04 17:41:19.340 INFO 25482 --- [er-event-loop-3] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20190104174119-0016 2019-01-04 17:41:19.346 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/0 on worker-20190103181611-192.168.30.122-40003 (192.168.30.122:40003) with 2 core(s) 2019-01-04 17:41:19.349 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/0 on hostPort 192.168.30.122:40003 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.350 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/1 on worker-20190103181607-192.168.30.117-42595 (192.168.30.117:42595) with 2 core(s) 2019-01-04 17:41:19.350 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/1 on hostPort 192.168.30.117:42595 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.350 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/2 on worker-20190103181607-192.168.30.113-35756 (192.168.30.113:35756) with 2 core(s) 2019-01-04 17:41:19.351 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/2 on hostPort 192.168.30.113:35756 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.351 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/3 on worker-20190103181607-192.168.30.115-37226 (192.168.30.115:37226) with 2 core(s) 2019-01-04 17:41:19.351 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID 
app-20190104174119-0016/3 on hostPort 192.168.30.115:37226 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.351 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/4 on worker-20190103181611-192.168.30.123-46195 (192.168.30.123:46195) with 2 core(s) 2019-01-04 17:41:19.352 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/4 on hostPort 192.168.30.123:46195 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.352 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/5 on worker-20190103181611-192.168.30.118-32876 (192.168.30.118:32876) with 2 core(s) 2019-01-04 17:41:19.352 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/5 on hostPort 192.168.30.118:32876 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.352 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/6 on worker-20190103181607-192.168.30.116-41508 (192.168.30.116:41508) with 2 core(s) 2019-01-04 17:41:19.353 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/6 on hostPort 192.168.30.116:41508 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.353 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/7 on worker-20190103181610-192.168.30.121-46023 (192.168.30.121:46023) with 2 core(s) 2019-01-04 17:41:19.353 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/7 on hostPort 192.168.30.121:46023 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.353 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/8 on worker-20190103181611-192.168.30.120-33521 
(192.168.30.120:33521) with 2 core(s) 2019-01-04 17:41:19.354 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/8 on hostPort 192.168.30.120:33521 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.354 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20190104174119-0016/9 on worker-20190103181610-192.168.30.119-35068 (192.168.30.119:35068) with 2 core(s) 2019-01-04 17:41:19.354 INFO 25482 --- [er-event-loop-0] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20190104174119-0016/9 on hostPort 192.168.30.119:35068 with 2 core(s), 8.0 GB RAM 2019-01-04 17:41:19.362 INFO 25482 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/2 is now RUNNING 2019-01-04 17:41:19.362 INFO 25482 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/0 is now RUNNING 2019-01-04 17:41:19.363 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36049. 
2019-01-04 17:41:19.363 INFO 25482 --- [nio-8888-exec-1] o.a.s.n.netty.NettyBlockTransferService : Server created on node114:36049 2019-01-04 17:41:19.364 INFO 25482 --- [er-event-loop-3] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/1 is now RUNNING 2019-01-04 17:41:19.365 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 2019-01-04 17:41:19.376 INFO 25482 --- [er-event-loop-0] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/8 is now RUNNING 2019-01-04 17:41:19.377 INFO 25482 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/4 is now RUNNING 2019-01-04 17:41:19.378 INFO 25482 --- [er-event-loop-2] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/7 is now RUNNING 2019-01-04 17:41:19.381 INFO 25482 --- [er-event-loop-3] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/5 is now RUNNING 2019-01-04 17:41:19.381 INFO 25482 --- [er-event-loop-3] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/3 is now RUNNING 2019-01-04 17:41:19.384 INFO 25482 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/6 is now RUNNING 2019-01-04 17:41:19.384 INFO 25482 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20190104174119-0016/9 is now RUNNING 2019-01-04 17:41:19.392 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, node114, 36049, None) 2019-01-04 17:41:19.402 INFO 25482 --- [er-event-loop-3] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager node114:36049 with 4.0 GB RAM, BlockManagerId(driver, node114, 36049, None) 2019-01-04 17:41:19.405 INFO 25482 --- 
[nio-8888-exec-1] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, node114, 36049, None) 2019-01-04 17:41:19.405 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, node114, 36049, None) 2019-01-04 17:41:19.417 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@376c787c{/metrics/json,null,AVAILABLE,@Spark} 2019-01-04 17:41:19.434 INFO 25482 --- [nio-8888-exec-1] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0 sparkMaster=spark://mycluster:7077 2019-01-04 17:41:20.312 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 288.2 KB, free 4.0 GB) 2019-01-04 17:41:20.811 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.0 KB, free 4.0 GB) 2019-01-04 17:41:20.814 INFO 25482 --- [er-event-loop-1] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on node114:36049 (size: 24.0 KB, free: 4.0 GB) 2019-01-04 17:41:20.819 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Created broadcast 0 from newAPIHadoopRDD at GeoWaveRDDLoader.java:163 loaded:GeoWaveRDDLoader.loadRDD loaded:getRawRDD 2019-01-04 17:41:21.188 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.zookeeper.RecoverableZooKeeper : Process identifier=hconnection-0xc699d23 connecting to ZooKeeper ensemble=node111:2181,node112:2181,node113:2181 2019-01-04 17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT 2019-01-04 17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:host.name=node114 2019-01-04 
17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.version=1.8.0_172 2019-01-04 17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.vendor=Oracle Corporation 2019-01-04 17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.home=/usr/cwgis/app/jdk/jre 2019-01-04 17:41:21.202 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.class.path=sproutgis-exec.jar:lib/activation-1.1.1.jar:lib/aircompressor-0.8.jar:lib/Agrona-0.4.13.jar:lib/annotations-3.0.1.jar:lib/antlr-2.7.7.jar:lib/antlr-runtime-3.4.jar:lib/antlr4-runtime-4.7.jar:lib/aopalliance-1.0.jar:lib/aopalliance-repackaged-2.4.0-b34.jar:lib/apache-log4j-extras-1.2.17.jar:lib/apacheds-i18n-2.0.0-M15.jar:lib/apacheds-kerberos-codec-2.0.0-M15.jar:lib/api-asn1-api-1.0.0-M20.jar:lib/api-util-1.0.0-M20.jar:lib/arpack_combined_all-0.1.jar:lib/arrow-format-0.8.0.jar:lib/arrow-memory-0.8.0.jar:lib/arrow-vector-0.8.0.jar:lib/asm-3.1.jar:lib/asm-4.0.jar:lib/asyncretry-0.0.7.jar:lib/avro-1.7.7.jar:lib/avro-ipc-1.7.7-tests.jar:lib/avro-ipc-1.7.7.jar:lib/avro-mapred-1.7.7-hadoop2.jar:lib/aws-java-sdk-core-1.11.105.jar:lib/aws-java-sdk-kms-1.11.105.jar:lib/aws-java-sdk-s3-1.11.105.jar:lib/base64-2.3.8.jar:lib/batik-anim-1.10.jar:lib/batik-awt-util-1.10.jar:lib/batik-bridge-1.10.jar:lib/batik-constants-1.10.jar:lib/batik-css-1.10.jar:lib/batik-dom-1.10.jar:lib/batik-ext-1.10.jar:lib/batik-gvt-1.10.jar:lib/batik-i18n-1.10.jar:lib/batik-parser-1.10.jar:lib/batik-script-1.10.jar:lib/batik-svg-dom-1.10.jar:lib/batik-svggen-1.10.jar:lib/batik-transcoder-1.10.jar:lib/batik-util-1.10.jar:lib/batik-xml-1.10.jar:lib/bcprov-jdk15on-1.52.jar:lib/bonecp-0.8.0.RELEASE.jar:lib/breeze-macros_2.11-0.13.2.jar:lib/breeze_2.11-0.13.2.jar:lib/calcite-avatica-1.2.0-incubating.jar:lib/calcite-core-1.2.0-incubating.jar:lib/calcite-linq4
j-1.2.0-incubating.jar:lib/cglib-nodep-2.2.jar:lib/chill-java-0.8.4.jar:lib/chill_2.11-0.8.4.jar:lib/classmate-1.3.3.jar:lib/classworlds-1.1-alpha-2.jar:lib/commons-beanutils-1.9.2-noclassprop.jar:lib/commons-beanutils-1.9.3.jar:lib/commons-beanutils-core-1.8.0.jar:lib/commons-cli-1.2.jar:lib/commons-codec-1.10.jar:lib/commons-collections-3.2.2.jar:lib/commons-compiler-3.0.8.jar:lib/commons-compress-1.4.1.jar:lib/commons-configuration-1.6.jar:lib/commons-crypto-1.0.0.jar:lib/commons-dbcp-1.4.jar:lib/commons-digester-2.1.jar:lib/commons-fileupload-1.3.3.jar:lib/commons-httpclient-3.1.jar:lib/commons-io-2.4.jar:lib/commons-jxpath-1.3.jar:lib/commons-lang-2.6.jar:lib/commons-lang3-3.5.jar:lib/commons-logging-1.1.3.jar:lib/commons-math-2.1.jar:lib/commons-math3-3.4.1.jar:lib/commons-net-2.2.jar:lib/commons-pool-1.6.jar:lib/commons-pool2-2.4.2.jar:lib/commons-vfs2-2.1.jar:lib/compress-lzf-1.0.3.jar:lib/core-1.1.2.jar:lib/curator-client-2.7.1.jar:lib/curator-framework-2.6.0.jar:lib/curator-recipes-2.6.0.jar:lib/cyclops-react-1.0.0-RC4.jar:lib/datanucleus-api-jdo-3.2.6.jar:lib/datanucleus-core-3.2.10.jar:lib/datanucleus-rdbms-3.2.9.jar:lib/derby-10.12.1.1.jar:lib/ehcache-2.10.3.jar:lib/eigenbase-properties-1.1.5.jar:lib/ejml-core-0.32.jar:lib/ejml-ddense-0.32.jar:lib/ezmorph-1.0.6.jar:lib/fastjson-1.2.15.jar:lib/findbugs-annotations-1.3.9-1.jar:lib/flatbuffers-1.2.0-3f79e055.jar:lib/freemarker-2.3.25-incubating.jar:lib/GeographicLib-Java-1.44.jar:lib/geowave-adapter-auth-0.9.8-SNAPSHOT.jar:lib/geowave-adapter-raster-0.9.8-SNAPSHOT.jar:lib/geowave-adapter-vector-0.9.8-SNAPSHOT.jar:lib/geowave-analytic-api-0.9.8-SNAPSHOT.jar:lib/geowave-analytic-mapreduce-0.9.8-SNAPSHOT.jar:lib/geowave-analytic-spark-0.9.8-SNAPSHOT.jar:lib/geowave-core-cli-0.9.8-SNAPSHOT.jar:lib/geowave-core-geotime-0.9.8-SNAPSHOT.jar:lib/geowave-core-index-0.9.8-SNAPSHOT.jar:lib/geowave-core-ingest-0.9.8-SNAPSHOT.jar:lib/geowave-core-mapreduce-0.9.8-SNAPSHOT.jar:lib/geowave-core-store-0.9.8-SNAPSHOT.jar:lib
/geowave-datastore-hbase-0.9.8-SNAPSHOT.jar:lib/gs-main-2.13.2.jar:lib/gs-ows-2.13.2.jar:lib/gs-platform-2.13.2.jar:lib/gs-wfs-2.13.2.jar:lib/gs-wms-2.13.2.jar:lib/gson-2.7.jar:lib/gt-api-19.2.jar:lib/gt-app-schema-resolver-19.2.jar:lib/gt-complex-19.2.jar:lib/gt-coverage-19.2.jar:lib/gt-cql-19.2.jar:lib/gt-data-19.2.jar:lib/gt-epsg-wkt-19.2.jar:lib/gt-geotiff-19.2.jar:lib/gt-grid-19.2.jar:lib/gt-image-19.2.jar:lib/gt-imageio-ext-gdal-19.2.jar:lib/gt-imagemosaic-19.2.jar:lib/gt-jdbc-19.2.jar:lib/gt-main-19.2.jar:lib/gt-metadata-19.2.jar:lib/gt-opengis-19.2.jar:lib/gt-process-19.2.jar:lib/gt-process-feature-19.2.jar:lib/gt-process-raster-19.2.jar:lib/gt-property-19.2.jar:lib/gt-referencing-19.2.jar:lib/gt-render-19.2.jar:lib/gt-shapefile-19.2.jar:lib/gt-svg-19.2.jar:lib/gt-swing-19.2.jar:lib/gt-tile-client-19.2.jar:lib/gt-transform-19.2.jar:lib/gt-wfs-ng-19.2.jar:lib/gt-wms-19.2.jar:lib/gt-wmts-19.2.jar:lib/gt-wps-19.2.jar:lib/gt-xml-19.2.jar:lib/gt-xsd-core-19.2.jar:lib/gt-xsd-fes-19.2.jar:lib/gt-xsd-filter-19.2.jar:lib/gt-xsd-gml2-19.2.jar:lib/gt-xsd-gml3-19.2.jar:lib/gt-xsd-ows-19.2.jar:lib/gt-xsd-sld-19.2.jar:lib/gt-xsd-wfs-19.2.jar:lib/gt-xsd-wmts-19.2.jar:lib/gt-xsd-wps-19.2.jar:lib/guava-12.0.1.jar:lib/guice-3.0.jar:lib/h2-1.4.193.jar:lib/hadoop-annotations-2.7.4.jar:lib/hadoop-auth-2.7.4.jar:lib/hadoop-client-2.7.4.jar:lib/hadoop-common-2.7.4.jar:lib/hadoop-hdfs-2.7.4.jar:lib/hadoop-mapreduce-client-app-2.7.4.jar:lib/hadoop-mapreduce-client-common-2.7.4.jar:lib/hadoop-mapreduce-client-core-2.7.4.jar:lib/hadoop-mapreduce-client-jobclient-2.7.4.jar:lib/hadoop-mapreduce-client-shuffle-2.7.4.jar:lib/hadoop-yarn-api-2.7.4.jar:lib/hadoop-yarn-client-2.7.4.jar:lib/hadoop-yarn-common-2.7.4.jar:lib/hadoop-yarn-server-common-2.7.4.jar:lib/hamcrest-core-1.3.jar:lib/hbase-shaded-client-1.3.1.jar:lib/hbase-shaded-server-1.3.1.jar:lib/HdrHistogram-2.1.7.jar:lib/hibernate-validator-5.2.4.Final.jar:lib/hive-exec-1.2.1.spark2.jar:lib/hive-metastore-1.2.1.spark2.jar:lib/hk2-ap
i-2.4.0-b34.jar:lib/hk2-locator-2.4.0-b34.jar:lib/hk2-utils-2.4.0-b34.jar:lib/hppc-0.7.2.jar:lib/htrace-core-3.1.0-incubating.jar:lib/httpclient-4.5.2.jar:lib/httpcore-4.4.6.jar:lib/imageio-ext-gdal-bindings-1.9.2.jar:lib/imageio-ext-gdalarcbinarygrid-1.1.24.jar:lib/imageio-ext-gdaldted-1.1.24.jar:lib/imageio-ext-gdalecw-1.1.24.jar:lib/imageio-ext-gdalecwjp2-1.1.24.jar:lib/imageio-ext-gdalehdr-1.1.24.jar:lib/imageio-ext-gdalenvihdr-1.1.24.jar:lib/imageio-ext-gdalerdasimg-1.1.24.jar:lib/jta-1.1.jar:lib/imageio-ext-gdalframework-1.1.24.jar:lib/imageio-ext-gdalgeotiff-1.1.24.jar:lib/imageio-ext-gdalidrisi-1.1.24.jar:lib/imageio-ext-gdalkakadujp2-1.1.24.jar:lib/imageio-ext-gdalmrsid-1.1.24.jar:lib/imageio-ext-gdalmrsidjp2-1.1.24.jar:lib/imageio-ext-gdalnitf-1.1.24.jar:lib/imageio-ext-gdalrpftoc-1.1.24.jar:lib/imageio-ext-gdalsrp-1.1.24.jar:lib/imageio-ext-gdalvrt-1.1.24.jar:lib/imageio-ext-geocore-1.1.24.jar:lib/imageio-ext-imagereadmt-1.1.24.jar:lib/imageio-ext-png-1.1.24.jar:lib/imageio-ext-streams-1.1.24.jar:lib/imageio-ext-tiff-1.1.24.jar:lib/imageio-ext-utilities-1.1.24.jar:lib/ion-java-1.0.2.jar:lib/itext-2.1.5.jar:lib/ivy-2.4.0.jar:lib/jackson-annotations-2.8.6.jar:lib/jackson-core-2.8.6.jar:lib/jackson-core-asl-1.9.13.jar:lib/jackson-databind-2.6.7.1.jar:lib/jackson-dataformat-cbor-2.8.6.jar:lib/jackson-jaxrs-1.8.3.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/jackson-module-paranamer-2.7.9.jar:lib/jackson-module-scala_2.11-2.6.7.1.jar:lib/jackson-xc-1.8.3.jar:lib/jai_codec-1.1.3.jar:lib/jai_core-1.1.3.jar:lib/jai_imageio-1.1.jar:lib/janino-2.7.8.jar:lib/jasypt-1.9.2.jar:lib/jasypt-springsecurity3-1.9.2.jar:lib/java-xmlbuilder-1.1.jar:lib/JavaEWAH-0.3.2.jar:lib/JavaFastPFOR-0.1.12.jar:lib/javaslang-2.0.2.jar:lib/javaslang-match-2.0.2.jar:lib/javassist-3.20.0-GA.jar:lib/javax.annotation-api-1.2.jar:lib/javax.inject-1.jar:lib/javax.inject-2.4.0-b34.jar:lib/javax.servlet-api-3.1.0.jar:lib/javax.ws.rs-api-2.0.1.jar:lib/javolution-5.5.1.jar:lib/jaxb-api-2.2.2.jar:lib/jax
b-impl-2.2.3-1.jar:lib/jboss-logging-3.3.0.Final.jar:lib/jcip-annotations-1.0.jar:lib/jcl-over-slf4j-1.7.22.jar:lib/jcommander-1.48.jar:lib/jdk.tools-1.8.jar:lib/jdo-api-3.0.1.jar:lib/jdom2-2.0.6.jar:lib/jersey-client-1.9.jar:lib/jersey-client-2.22.2.jar:lib/jersey-common-2.22.2.jar:lib/jersey-container-servlet-2.23.2.jar:lib/jersey-container-servlet-core-2.23.2.jar:lib/jersey-core-1.9.jar:lib/jersey-guava-2.22.2.jar:lib/jersey-json-1.9.jar:lib/jersey-media-jaxb-2.23.2.jar:lib/jersey-server-1.9.jar:lib/jersey-server-2.23.2.jar:lib/jets3t-0.9.4.jar:lib/jettison-1.1.jar:lib/jetty-6.1.26.jar:lib/jetty-sslengine-6.1.26.jar:lib/jetty-util-6.1.26.jar:lib/jgridshift-1.0.jar:lib/jmespath-java-1.11.105.jar:lib/joda-time-2.9.7.jar:lib/jodd-core-3.5.2.jar:lib/jool-0.9.11.jar:lib/jopt-simple-3.2.jar:lib/jsch-0.1.54.jar:lib/json-lib-2.4-jdk15.jar:lib/json-simple-1.1.1.jar:lib/json4s-ast_2.11-3.2.11.jar:lib/json4s-core_2.11-3.2.11.jar:lib/json4s-jackson_2.11-3.2.11.jar:lib/jsp-api-2.1.jar:lib/jsr-275-1.0-beta-2.jar:lib/jsr203hadoop-1.0.1.jar:lib/jsr305-1.3.9.jar:lib/jt-affine-1.0.24.jar:lib/jt-algebra-1.0.24.jar:lib/jt-attributeop-1.4.0.jar:lib/jt-bandcombine-1.0.24.jar:lib/jt-bandmerge-1.0.24.jar:lib/jt-bandselect-1.0.24.jar:lib/jt-binarize-1.0.24.jar:lib/jt-border-1.0.24.jar:lib/jt-buffer-1.0.24.jar:lib/jt-classifier-1.0.24.jar:lib/jt-colorconvert-1.0.24.jar:lib/jt-colorindexer-1.0.24.jar:lib/jt-concurrent-tile-cache-1.0.24.jar:lib/jt-contour-1.4.0.jar:lib/jt-crop-1.0.24.jar:lib/jt-errordiffusion-1.0.24.jar:lib/jt-format-1.0.24.jar:lib/jt-imagefunction-1.0.24.jar:lib/jt-iterators-1.0.24.jar:lib/jt-lookup-1.0.24.jar:lib/jt-mosaic-1.0.24.jar:lib/jt-nullop-1.0.24.jar:lib/jt-orderdither-1.0.24.jar:lib/jt-piecewise-1.0.24.jar:lib/jt-rangelookup-1.4.0.jar:lib/jt-rescale-1.0.24.jar:lib/jt-rlookup-1.0.24.jar:lib/jt-scale-1.0.24.jar:lib/jt-scale2-1.0.24.jar:lib/jt-stats-1.0.24.jar:lib/jt-translate-1.0.24.jar:lib/jt-utilities-1.0.24.jar:lib/jt-utils-1.4.0.jar:lib/jt-vectorbin-1.0.24.jar:
lib/jt-vectorbinarize-1.4.0.jar:lib/jt-vectorize-1.4.0.jar:lib/jt-warp-1.0.24.jar:lib/jt-zonal-1.0.24.jar:lib/jt-zonalstats-1.4.0.jar:lib/jtransforms-2.4.0.jar:lib/jts-core-1.14.0.jar:lib/jts-example-1.14.0.jar:lib/jts-io-1.14.0.jar:lib/jul-to-slf4j-1.7.22.jar:lib/junit-4.12.jar:lib/kafka-clients-0.8.2.1.jar:lib/kafka_2.11-0.8.2.1.jar:lib/kryo-2.21.jar:lib/kryo-shaded-3.0.3.jar:lib/leveldbjni-all-1.8.jar:lib/libfb303-0.9.3.jar:lib/libthrift-0.9.3.jar:lib/log4j-1.2.17.jar:lib/log4j-over-slf4j-1.7.22.jar:lib/logback-classic-1.1.9.jar:lib/logback-core-1.1.9.jar:lib/lz4-java-1.4.0.jar:lib/machinist_2.11-0.6.1.jar:lib/macro-compat_2.11-1.1.1.jar:lib/metrics-core-2.2.0.jar:lib/metrics-core-3.1.2.jar:lib/metrics-graphite-3.1.2.jar:lib/metrics-json-3.1.5.jar:lib/metrics-jvm-3.1.5.jar:lib/miglayout-3.7-swing.jar:lib/minlog-1.2.jar:lib/minlog-1.3.0.jar:lib/net.opengis.fes-19.2.jar:lib/net.opengis.ows-19.2.jar:lib/net.opengis.wfs-19.2.jar:lib/net.opengis.wmts-19.2.jar:lib/net.opengis.wps-19.2.jar:lib/netty-3.9.9.Final.jar:lib/netty-all-4.1.17.Final.jar:lib/objenesis-2.1.jar:lib/opencsv-2.3.jar:lib/orc-core-1.4.4-nohive.jar:lib/orc-mapreduce-1.4.4-nohive.jar:lib/org.eclipse.emf.common-2.12.0.jar:lib/org.eclipse.emf.ecore-2.12.0.jar:lib/org.eclipse.emf.ecore.xmi-2.12.0.jar:lib/org.eclipse.xsd-2.12.0.jar:lib/org.w3.xlink-19.2.jar:lib/oro-2.0.8.jar:lib/osgi-resource-locator-1.0.1.jar:lib/paranamer-2.3.jar:lib/parquet-column-1.8.3.jar:lib/parquet-common-1.8.3.jar:lib/parquet-encoding-1.8.3.jar:lib/parquet-format-2.3.1.jar:lib/parquet-hadoop-1.8.3.jar:lib/parquet-hadoop-bundle-1.6.0.jar:lib/parquet-jackson-1.8.3.jar:lib/pcollections-2.1.2.jar:lib/picocontainer-1.2.jar:lib/plexus-archiver-2.2.jar:lib/plexus-container-default-1.0-alpha-9-stable-1.jar:lib/plexus-io-2.0.4.jar:lib/plexus-utils-3.0.7.jar:lib/pngj-2.0.1.jar:lib/protobuf-java-2.5.0.jar:lib/py4j-0.10.7.jar:lib/pyrolite-4.13.jar:lib/reactive-streams-1.0.0.jar:lib/reflectasm-1.07-shaded.jar:lib/RoaringBitmap-0.5.11.jar:lib/s3f
s-1.5.3.jar:lib/scala-compiler-2.11.0.jar:lib/scala-library-2.11.8.jar:lib/scala-parser-combinators_2.11-1.0.4.jar:lib/scala-reflect-2.11.8.jar:lib/scala-xml_2.11-1.0.2.jar:lib/scalap-2.11.0.jar:lib/servlet-api-2.5.jar:lib/shapeless_2.11-2.3.2.jar:lib/slf4j-api-1.7.22.jar:lib/slf4j-log4j12-1.7.22.jar:lib/snakeyaml-1.17.jar:lib/snappy-0.2.jar:lib/xz-1.0.jar:lib/snappy-java-1.1.2.6.jar:lib/spark-catalyst_2.11-2.3.1.jar:lib/spark-core_2.11-2.3.1.jar:lib/spark-graphx_2.11-2.3.1.jar:lib/spark-hive_2.11-2.3.1.jar:lib/spark-kvstore_2.11-2.3.1.jar:lib/spark-launcher_2.11-2.3.1.jar:lib/spark-mllib-local_2.11-2.3.1.jar:lib/spark-mllib_2.11-2.3.1.jar:lib/spark-network-common_2.11-2.3.1.jar:lib/spark-network-shuffle_2.11-2.3.1.jar:lib/spark-sketch_2.11-2.3.1.jar:lib/spark-sql_2.11-2.3.1.jar:lib/spark-streaming_2.11-2.3.1.jar:lib/spark-tags_2.11-2.3.1.jar:lib/spark-unsafe_2.11-2.3.1.jar:lib/spire-macros_2.11-0.13.0.jar:lib/spire_2.11-0.13.0.jar:lib/spring-aop-4.3.6.RELEASE.jar:lib/spring-beans-4.3.6.RELEASE.jar:lib/spring-boot-1.4.4.RELEASE.jar:lib/spring-boot-autoconfigure-1.4.4.RELEASE.jar:lib/spring-boot-devtools-1.4.4.RELEASE.jar:lib/spring-boot-starter-1.4.4.RELEASE.jar:lib/spring-boot-starter-logging-1.4.4.RELEASE.jar:lib/spring-boot-starter-tomcat-1.4.4.RELEASE.jar:lib/spring-boot-starter-web-1.4.4.RELEASE.jar:lib/spring-context-4.3.6.RELEASE.jar:lib/spring-context-support-4.3.6.RELEASE.jar:lib/spring-core-4.3.6.RELEASE.jar:lib/spring-expression-4.3.6.RELEASE.jar:lib/spring-jdbc-4.3.6.RELEASE.jar:lib/spring-security-config-4.1.4.RELEASE.jar:lib/spring-security-core-4.1.4.RELEASE.jar:lib/spring-security-web-4.1.4.RELEASE.jar:lib/spring-tx-4.3.6.RELEASE.jar:lib/spring-web-4.3.6.RELEASE.jar:lib/spring-webmvc-4.3.6.RELEASE.jar:lib/ST4-4.0.4.jar:lib/stax-api-1.0-2.jar:lib/stax-api-1.0.1.jar:lib/stream-2.7.0.jar:lib/stringtemplate-3.2.1.jar:lib/t-digest-3.2.jar:lib/tika-core-1.5.jar:lib/tomcat-embed-core-8.5.11.jar:lib/tomcat-embed-el-8.5.11.jar:lib/tomcat-embed-websocket-8.5.1
1.jar:lib/univocity-parsers-2.5.9.jar:lib/unused-1.0.0.jar:lib/uzaygezen-core-0.2.jar:lib/validation-api-1.1.0.Final.jar:lib/vecmath-1.5.2.jar:lib/xbean-asm5-shaded-4.4.jar:lib/xercesImpl-2.9.1.jar:lib/xml-apis-1.4.01.jar:lib/xml-apis-ext-1.3.04.jar:lib/xml-commons-resolver-1.2.jar:lib/xmlenc-0.52.jar:lib/xmlgraphics-commons-2.2.jar:lib/xmlpull-1.1.3.1.jar:lib/xmlunit-1.3.jar:lib/xpp3-1.1.3.4.O.jar:lib/xpp3_min-1.1.4c.jar:lib/xstream-1.4.10.jar:lib/zip4j-1.3.2.jar:lib/zkclient-0.3.jar:lib/zookeeper-3.4.6.jar:lib/zstd-jni-1.3.2-2.jar:./conf 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.io.tmpdir=/tmp 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:java.compiler= 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:os.name=Linux 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:os.arch=amd64 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:os.version=3.10.0-514.el7.x86_64 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:user.name=root 2019-01-04 17:41:21.203 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:user.home=/root 2019-01-04 17:41:21.204 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Client environment:user.dir=/usr/test 2019-01-04 17:41:21.205 INFO 25482 --- [nio-8888-exec-1] o.a.h.h.s.o.apache.zookeeper.ZooKeeper : Initiating client connection, 
connectString=node111:2181,node112:2181,node113:2181 sessionTimeout=90000 watcher=org.apache.hadoop.hbase.zookeeper.PendingWatcher@261cb578 2019-01-04 17:41:21.235 INFO 25482 --- [d(node113:2181)] o.a.h.h.s.o.apache.zookeeper.ClientCnxn : Opening socket connection to server node113/192.168.30.113:2181. Will not attempt to authenticate using SASL (unknown error) 2019-01-04 17:41:21.236 INFO 25482 --- [d(node113:2181)] o.a.h.h.s.o.apache.zookeeper.ClientCnxn : Socket connection established to node113/192.168.30.113:2181, initiating session 2019-01-04 17:41:21.245 INFO 25482 --- [d(node113:2181)] o.a.h.h.s.o.apache.zookeeper.ClientCnxn : Session establishment complete on server node113/192.168.30.113:2181, sessionid = 0x367a6debbd30247, negotiated timeout = 40000 2019-01-04 17:41:22.059 INFO 25482 --- [er-event-loop-0] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.117:57652) with ID 1 2019-01-04 17:41:22.149 INFO 25482 --- [er-event-loop-3] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.115:50146) with ID 3 2019-01-04 17:41:22.152 INFO 25482 --- [er-event-loop-1] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.116:44936) with ID 6 2019-01-04 17:41:22.167 INFO 25482 --- [er-event-loop-0] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.117:40016 with 4.1 GB RAM, BlockManagerId(1, 192.168.30.117, 40016, None) 2019-01-04 17:41:22.194 INFO 25482 --- [er-event-loop-1] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.121:57464) with ID 7 2019-01-04 17:41:22.235 INFO 25482 --- [er-event-loop-1] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.120:59326) with ID 8 2019-01-04 17:41:22.244 INFO 
25482 --- [er-event-loop-0] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.119:37412) with ID 9 2019-01-04 17:41:22.258 INFO 25482 --- [er-event-loop-3] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.116:44381 with 4.1 GB RAM, BlockManagerId(6, 192.168.30.116, 44381, None) 2019-01-04 17:41:22.264 INFO 25482 --- [er-event-loop-0] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.115:46162 with 4.1 GB RAM, BlockManagerId(3, 192.168.30.115, 46162, None) 2019-01-04 17:41:22.298 INFO 25482 --- [er-event-loop-1] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.121:43572 with 4.1 GB RAM, BlockManagerId(7, 192.168.30.121, 43572, None) 2019-01-04 17:41:22.306 INFO 25482 --- [er-event-loop-3] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.113:36772) with ID 2 2019-01-04 17:41:22.318 INFO 25482 --- [er-event-loop-3] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.118:48086) with ID 5 2019-01-04 17:41:22.350 INFO 25482 --- [er-event-loop-1] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.120:45243 with 4.1 GB RAM, BlockManagerId(8, 192.168.30.120, 45243, None) 2019-01-04 17:41:22.363 INFO 25482 --- [er-event-loop-3] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.119:36771 with 4.1 GB RAM, BlockManagerId(9, 192.168.30.119, 36771, None) 2019-01-04 17:41:22.364 INFO 25482 --- [er-event-loop-3] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.123:43962) with ID 4 2019-01-04 17:41:22.411 INFO 25482 --- [er-event-loop-0] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.113:39473 with 4.1 GB RAM, BlockManagerId(2, 192.168.30.113, 39473, None) 
2019-01-04 17:41:22.417 INFO 25482 --- [er-event-loop-2] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.118:45642 with 4.1 GB RAM, BlockManagerId(5, 192.168.30.118, 45642, None) 2019-01-04 17:41:22.466 INFO 25482 --- [er-event-loop-1] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.123:34315 with 4.1 GB RAM, BlockManagerId(4, 192.168.30.123, 34315, None) 2019-01-04 17:41:22.470 INFO 25482 --- [er-event-loop-3] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.30.122:41486) with ID 0 2019-01-04 17:41:22.564 INFO 25482 --- [er-event-loop-1] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 192.168.30.122:33349 with 4.1 GB RAM, BlockManagerId(0, 192.168.30.122, 33349, None) 2019-01-04 17:41:22.659 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'GeoServerResourceLoader', but ApplicationContext is unset. 2019-01-04 17:41:22.659 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'GeoServerResourceLoader', but ApplicationContext is unset. 2019-01-04 17:41:22.660 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionFilter', but ApplicationContext is unset. 2019-01-04 17:41:22.664 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionProvider', but ApplicationContext is unset. 2019-01-04 17:41:22.664 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionFilter', but ApplicationContext is unset. 2019-01-04 17:41:22.676 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'GeoServerResourceLoader', but ApplicationContext is unset. 2019-01-04 17:41:22.676 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'GeoServerResourceLoader', but ApplicationContext is unset. 
2019-01-04 17:41:22.676 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionFilter', but ApplicationContext is unset. 2019-01-04 17:41:22.676 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionProvider', but ApplicationContext is unset. 2019-01-04 17:41:22.676 WARN 25482 --- [nio-8888-exec-1] org.geoserver.platform : Extension lookup 'ExtensionFilter', but ApplicationContext is unset. 2019-01-04 17:41:22.882 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Starting job: count at geowaveSpark.java:120 2019-01-04 17:41:22.895 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Got job 0 (count at geowaveSpark.java:120) with 1 output partitions 2019-01-04 17:41:22.896 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Final stage: ResultStage 0 (count at geowaveSpark.java:120) 2019-01-04 17:41:22.896 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Parents of final stage: List() 2019-01-04 17:41:22.897 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Missing parents: List() 2019-01-04 17:41:22.906 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting ResultStage 0 (NewHadoopRDD[0] at newAPIHadoopRDD at GeoWaveRDDLoader.java:163), which has no missing parents 2019-01-04 17:41:22.936 INFO 25482 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1 stored as values in memory (estimated size 2.0 KB, free 4.0 GB) 2019-01-04 17:41:22.951 INFO 25482 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1_piece0 stored as bytes in memory (estimated size 1288.0 B, free 4.0 GB) 2019-01-04 17:41:22.952 INFO 25482 --- [er-event-loop-0] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on node114:36049 (size: 1288.0 B, free: 4.0 GB) 2019-01-04 17:41:22.953 INFO 25482 --- [uler-event-loop] 
org.apache.spark.SparkContext : Created broadcast 1 from broadcast at DAGScheduler.scala:1039
2019-01-04 17:41:22.968 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting 1 missing tasks from ResultStage 0 (NewHadoopRDD[0] at newAPIHadoopRDD at GeoWaveRDDLoader.java:163) (first 15 tasks are for partitions Vector(0))
2019-01-04 17:41:22.969 INFO 25482 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl : Adding task set 0.0 with 1 tasks
2019-01-04 17:41:23.003 INFO 25482 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager : Starting task 0.0 in stage 0.0 (TID 0, 192.168.30.119, executor 9, partition 0, PROCESS_LOCAL, 9269 bytes)
2019-01-04 17:41:28.888 INFO 25482 --- [er-event-loop-1] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 192.168.30.119:36771 (size: 1288.0 B, free: 4.1 GB)
2019-01-04 17:41:29.204 WARN 25482 --- [result-getter-0] o.apache.spark.scheduler.TaskSetManager : Lost task 0.0 in stage 0.0 (TID 0, 192.168.30.119, executor 9): java.io.EOFException
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:283)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$8.apply(TorrentBroadcast.scala:308)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1380)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:309)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:235)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1346)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:81)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

2019-01-04 17:41:29.206 INFO 25482 --- [er-event-loop-0] o.apache.spark.scheduler.TaskSetManager : Starting task 0.1 in stage 0.0 (TID 1, 192.168.30.117, executor 1, partition 0, PROCESS_LOCAL, 9269 bytes) 2019-01-04 17:41:32.547 INFO 25482 --- [er-event-loop-0] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 192.168.30.117:40016 (size: 1288.0 B, free: 4.1 GB) 2019-01-04 17:41:32.857 INFO 25482 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager : Lost task 0.1 in stage 0.0 (TID 1) on 192.168.30.117, executor 1: java.io.EOFException (null) [duplicate 1] 2019-01-04 17:41:32.859 INFO 25482 --- [er-event-loop-3] o.apache.spark.scheduler.TaskSetManager : Starting task 0.2 in stage 0.0 (TID 2, 192.168.30.116, executor 6, partition 0, PROCESS_LOCAL, 9269 bytes) 2019-01-04 17:41:36.406 INFO 25482 --- [er-event-loop-2] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 192.168.30.116:44381 (size: 1288.0 B, free: 4.1 GB) 2019-01-04 17:41:36.744 INFO 25482 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager : Lost task 0.2 in stage 0.0 (TID 2) on 192.168.30.116, executor 6: java.io.EOFException (null) [duplicate 2] 2019-01-04 17:41:36.746 INFO 25482 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager : Starting task 0.3 in stage 0.0 (TID 3, 192.168.30.116, executor 6, partition 0, PROCESS_LOCAL, 9269 bytes) 2019-01-04 17:41:36.772 INFO 25482 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager : Lost task 0.3 in stage 0.0 (TID 3) on 192.168.30.116, executor 6: java.io.EOFException (null) [duplicate 3] 2019-01-04 17:41:36.773 ERROR 25482 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager : Task 0 in stage 0.0 failed 4 times; aborting job 2019-01-04 17:41:36.774 INFO 25482 --- [result-getter-3] o.a.spark.scheduler.TaskSchedulerImpl : Removed TaskSet 0.0, whose tasks have all completed, from pool 2019-01-04 17:41:36.776 INFO 25482 --- [uler-event-loop] 
o.a.spark.scheduler.TaskSchedulerImpl : Cancelling stage 0
2019-01-04 17:41:36.778 INFO 25482 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : ResultStage 0 (count at geowaveSpark.java:120) failed in 13.854 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.30.116, executor 6): java.io.EOFException
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:283)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$8.apply(TorrentBroadcast.scala:308)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1380)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:309)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:235)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1346)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:81)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Driver stacktrace: 2019-01-04 17:41:36.781 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.scheduler.DAGScheduler : Job 0 failed: count at geowaveSpark.java:120, took 13.898116 s
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.30.116, executor 6): java.io.EOFException
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:283)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$8.apply(TorrentBroadcast.scala:308)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1380)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:309)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:235)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1346)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:81)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1602)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1590)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1589)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1589)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1823)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1772)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1761)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2034)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2055)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2074)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1162)
    at org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:455)
    at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:45)
    at com.sapsoft.spark.geowaveSpark.main(geowaveSpark.java:120)
    at com.sapsoft.controller.geowaveController.geowaveSpark(geowaveController.java:703)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:116)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:208)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:105)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:81)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:474)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
    at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:783)
    at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
    at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:798)
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1434)
    at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:283)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$8.apply(TorrentBroadcast.scala:308)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1380)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:309)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:235)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1346)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:81)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 more
2019-01-04 17:41:36.789 INFO 25482 --- [nio-8888-exec-1] o.s.jetty.server.AbstractConnector : Stopped Spark@f5e3910{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-01-04 17:41:36.791 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.ui.SparkUI : Stopped Spark web UI at http://node114:4040
2019-01-04 17:41:36.796 INFO 25482 --- [nio-8888-exec-1] o.a.s.s.c.StandaloneSchedulerBackend : Shutting down all executors
2019-01-04 17:41:36.796 INFO 25482 --- [er-event-loop-0] seGrainedSchedulerBackend$DriverEndpoint : Asking each executor to shut down
2019-01-04 17:41:36.807 INFO 25482 --- [er-event-loop-2] o.a.s.MapOutputTrackerMasterEndpoint : MapOutputTrackerMasterEndpoint stopped!
2019-01-04 17:41:36.827 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.memory.MemoryStore : MemoryStore cleared 2019-01-04 17:41:36.828 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.storage.BlockManager : BlockManager stopped 2019-01-04 17:41:36.829 INFO 25482 --- [nio-8888-exec-1] o.a.spark.storage.BlockManagerMaster : BlockManagerMaster stopped 2019-01-04 17:41:36.832 INFO 25482 --- [er-event-loop-0] rdinator$OutputCommitCoordinatorEndpoint : OutputCommitCoordinator stopped! 2019-01-04 17:41:36.835 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : Successfully stopped SparkContext 2019-01-04 17:41:36.835 INFO 25482 --- [nio-8888-exec-1] org.apache.spark.SparkContext : SparkContext already stopped. `

hsg77 commented 5 years ago

Memory usage information: (see attached screenshot)

rfecher commented 5 years ago

Does it work without Kryo serialization when it's run on the cluster?

We include Spark and Kryo libraries in our command-line tools jar, and my first instinct is a version conflict between the tools jar and your environment. We'd have to try to recreate it to be sure, but any additional info you have will be helpful. Thanks!
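For reference, testing without Kryo as asked above could look like the following, mirroring the `System.setProperty` style used in the code from the original post. `spark.serializer` and `spark.kryo.registrator` are standard Spark configuration keys; whether GeoWave's broadcast objects serialize cleanly without the registrator is not guaranteed, so this is only a diagnostic sketch.

```java
// Diagnostic sketch: fall back to Spark's default Java serialization
// so the job can be re-run without Kryo to isolate the failure.
public class SerializerSwap {
    public static void useJavaSerializer() {
        // Replace Kryo with Spark's built-in Java serializer.
        System.setProperty("spark.serializer",
                "org.apache.spark.serializer.JavaSerializer");
        // The GeoWave Kryo registrator is only meaningful with Kryo.
        System.clearProperty("spark.kryo.registrator");
    }

    public static void main(String[] args) {
        useJavaSerializer();
        System.out.println(System.getProperty("spark.serializer"));
    }
}
```

If the job then succeeds on `spark://mycluster:7077`, that points squarely at the Kryo classpath rather than the GeoWave query itself.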

hsg77 commented 5 years ago

The dependency libraries bundled with my project include the following jar versions:

```
lib/spark-catalyst_2.11-2.3.1.jar
lib/spark-core_2.11-2.3.1.jar
lib/spark-graphx_2.11-2.3.1.jar
lib/spark-hive_2.11-2.3.1.jar
lib/spark-kvstore_2.11-2.3.1.jar
lib/spark-launcher_2.11-2.3.1.jar
lib/spark-mllib-local_2.11-2.3.1.jar
lib/spark-mllib_2.11-2.3.1.jar
lib/spark-network-common_2.11-2.3.1.jar
lib/spark-network-shuffle_2.11-2.3.1.jar
lib/spark-sketch_2.11-2.3.1.jar
lib/spark-sql_2.11-2.3.1.jar
lib/spark-streaming_2.11-2.3.1.jar
lib/spark-tags_2.11-2.3.1.jar
lib/spark-unsafe_2.11-2.3.1.jar

lib/kryo-2.21.jar
lib/kryo-shaded-3.0.3.jar

lib/guava-12.0.1.jar
lib/guice-3.0.jar
lib/h2-1.4.193.jar
lib/hadoop-annotations-2.7.4.jar
lib/hadoop-auth-2.7.4.jar
lib/hadoop-client-2.7.4.jar
lib/hadoop-common-2.7.4.jar
lib/hadoop-hdfs-2.7.4.jar
lib/hadoop-mapreduce-client-app-2.7.4.jar
lib/hadoop-mapreduce-client-common-2.7.4.jar
lib/hadoop-mapreduce-client-core-2.7.4.jar
lib/hadoop-mapreduce-client-jobclient-2.7.4.jar
lib/hadoop-mapreduce-client-shuffle-2.7.4.jar
lib/hadoop-yarn-api-2.7.4.jar
lib/hadoop-yarn-client-2.7.4.jar
lib/hadoop-yarn-common-2.7.4.jar
lib/hadoop-yarn-server-common-2.7.4.jar
lib/hamcrest-core-1.3.jar
lib/hbase-shaded-client-1.3.1.jar
lib/hbase-shaded-server-1.3.1.jar
```
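A quick way to double-check which versions actually ended up in `lib/` is to read each jar's manifest. A minimal JDK-only sketch (the jar path in `main` is illustrative, and not every jar populates `Implementation-Version`):

```java
import java.io.IOException;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Sketch: report the Implementation-Version recorded in a jar's
// MANIFEST.MF, to confirm which spark/kryo builds are on the classpath.
public class JarVersion {
    public static String versionOf(String jarPath) throws IOException {
        try (JarFile jar = new JarFile(jarPath)) {
            Manifest mf = jar.getManifest();
            if (mf == null) {
                return null; // jar has no manifest at all
            }
            return mf.getMainAttributes().getValue("Implementation-Version");
        }
    }

    public static void main(String[] args) throws IOException {
        // Illustrative usage: pass jar paths on the command line.
        for (String path : args) {
            System.out.println(path + " -> " + versionOf(path));
        }
    }
}
```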

rfecher commented 5 years ago

I can't say for sure that it's the issue, but the kryo-shaded v3.0.3 and kryo v2.21 version mismatch stands out to me as a potential concern.
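One way to check for this kind of overlap is to ask the classloader how many copies of a given class it can see; more than one URL for the same resource usually means two jars ship the same class. A JDK-only sketch (the kryo class name below is just an example of what to probe):

```java
import java.net.URL;
import java.util.Collections;
import java.util.List;

// Sketch: list every classpath entry that provides a given class.
// Two or more URLs for the same class is the classic duplicate-jar
// symptom suspected in this issue.
public class DuplicateClassCheck {
    public static List<URL> copiesOf(String className) throws Exception {
        String resource = className.replace('.', '/') + ".class";
        return Collections.list(
                DuplicateClassCheck.class.getClassLoader().getResources(resource));
    }

    public static void main(String[] args) throws Exception {
        // Example probe; prints nothing if kryo isn't on the classpath.
        for (URL url : copiesOf("com.esotericsoftware.kryo.io.UnsafeOutput")) {
            System.out.println(url);
        }
    }
}
```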

hsg77 commented 5 years ago

@rfecher Thank you very much. Your first instinct was very accurate: the problem was a kryo version conflict. I deleted the kryo-2.21.jar file and kept the kryo-shaded-3.0.3.jar file in the execution environment, and execution is now OK. After checking, there is a class present in both kryo-shaded-3.0.3.jar and geowave-tools-0.9.8-apache.jar: com.esotericsoftware.kryo.io.UnsafeOutput (7066 bytes). kryo-2.21.jar does not have this class. In addition, I added one line to the code above: `conf.setJars(JavaSparkContext.jarOfClass(this.getClass()));` With that, it runs completely OK.
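For anyone hitting the same thing: `JavaSparkContext.jarOfClass` essentially resolves the jar a class was loaded from. The same lookup can be done with plain JDK calls, which is also handy for confirming that `UnsafeOutput` now resolves to the shaded jar after removing kryo-2.21.jar (a sketch; classes loaded by the boot loader may have no code source and return null):

```java
import java.security.CodeSource;

// Sketch: report where a class was actually loaded from, the same
// question JavaSparkContext.jarOfClass answers for spark-submit.
public class JarOfClass {
    public static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // Prints the directory or jar this class came from.
        System.out.println(locationOf(JarOfClass.class));
    }
}
```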

rfecher commented 5 years ago

Glad to hear its resolved and thanks for the feedback!