
[druid-hdfs-storage] fully qualified deep storage path throws exceptions #6820

Closed. swd543 closed this issue 5 years ago.

swd543 commented 5 years ago

A fully qualified deep storage path like `druid.storage.storageDirectory=hdfs://vm-hadoop:9000/druid/segments` causes all ingestion tasks to fail. How can we persist data to a remote HDFS cluster? The task logs are attached below.
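For context, here is a minimal sketch of the deep storage section of `common.runtime.properties` that reproduces the failure. The `druid.storage.*` values and the extension load list are taken from the startup log below; the commented-out workaround is only an assumption based on Druid's HDFS deep storage documentation (unqualified path plus the Hadoop XMLs on the Druid classpath), not something verified in this report:

```properties
# Extensions and deep storage settings (values echoed in the peon log below)
druid.extensions.loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "druid-hdfs-storage"]
druid.storage.type=hdfs

# Fully qualified path that triggers the failure
druid.storage.storageDirectory=hdfs://vm-hadoop:9000/druid/segments

# Assumed workaround per the Druid HDFS deep storage docs (not verified here):
# use an unqualified path and let the namenode be resolved from core-site.xml /
# hdfs-site.xml copied into conf/druid/_common/ on each Druid node.
#druid.storage.storageDirectory=/druid/segments
```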

2019-01-08T13:15:16,785 INFO [main] org.apache.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2019-01-08T13:15:16,789 INFO [main] org.apache.druid.guice.PropertiesModule - Loading properties from runtime.properties
2019-01-08T13:15:16,822 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.1.3.Final
2019-01-08T13:15:17,305 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.common.config.NullValueHandlingConfig] from props[druid.generic.] as [org.apache.druid.common.config.NullValueHandlingConfig@2d6764b2]
2019-01-08T13:15:17,366 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='extensions', useExtensionClassloaderFirst=false, hadoopDependenciesDir='hadoop-dependencies', hadoopContainerDruidClasspath='null', addExtensionsToHadoopContainer=false, loadList=[druid-histogram, druid-datasketches, druid-lookups-cached-global, druid-hdfs-storage]}]
2019-01-08T13:15:17,369 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-histogram] for class [interface org.apache.druid.cli.CliCommandCreator]
2019-01-08T13:15:17,381 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-histogram/druid-histogram-0.13.0-incubating.jar] for extension[druid-histogram]
2019-01-08T13:15:17,382 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [interface org.apache.druid.cli.CliCommandCreator]
2019-01-08T13:15:17,382 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-datasketches/sketches-core-0.10.3.jar] for extension[druid-datasketches]
2019-01-08T13:15:17,383 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-datasketches/druid-datasketches-0.13.0-incubating.jar] for extension[druid-datasketches]
2019-01-08T13:15:17,383 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-datasketches/memory-0.10.3.jar] for extension[druid-datasketches]
2019-01-08T13:15:17,383 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-datasketches/commons-math3-3.6.1.jar] for extension[druid-datasketches]
2019-01-08T13:15:17,385 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-lookups-cached-global] for class [interface org.apache.druid.cli.CliCommandCreator]
2019-01-08T13:15:17,385 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-lookups-cached-global/druid-lookups-cached-global-0.13.0-incubating.jar] for extension[druid-lookups-cached-global]
2019-01-08T13:15:17,386 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage] for class [interface org.apache.druid.cli.CliCommandCreator]
2019-01-08T13:15:17,387 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/json-smart-1.1.1.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,387 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/apacheds-i18n-2.0.0-M15.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,387 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jaxb-api-2.2.2.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,388 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-digester-1.8.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,388 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/objenesis-2.6.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,388 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-mapreduce-client-shuffle-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,389 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/xmlenc-0.52.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,389 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-hdfs-client-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,389 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/stax-api-1.0-2.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,389 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jackson-xc-1.9.13.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,390 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/nimbus-jose-jwt-3.9.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,390 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jcip-annotations-1.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,390 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/api-util-1.0.0-M20.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,390 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/gson-2.2.4.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,391 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-net-3.1.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,391 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/druid-hdfs-storage-0.13.0-incubating.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,391 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-common-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,391 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/curator-recipes-4.0.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,391 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/apacheds-kerberos-codec-2.0.0-M15.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,392 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/curator-framework-4.0.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,392 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-mapreduce-client-app-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,392 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-beanutils-core-1.8.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,393 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/leveldbjni-all-1.8.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,393 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-auth-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,393 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-client-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,393 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jetty-sslengine-6.1.26.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,397 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-configuration-1.6.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,397 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/okio-1.4.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,397 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-compress-1.16.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,398 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jackson-jaxrs-1.9.13.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,398 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/servlet-api-2.5.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,398 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-mapreduce-client-core-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,398 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-yarn-server-common-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,399 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-collections-3.2.2.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,399 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-yarn-common-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,399 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jsp-api-2.1.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,399 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-yarn-api-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,400 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/jersey-client-1.9.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,400 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-mapreduce-client-jobclient-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,400 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/okhttp-2.4.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,400 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/htrace-core4-4.0.1-incubating.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,401 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-yarn-client-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,401 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/api-asn1-api-1.0.0-M20.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,401 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/commons-beanutils-1.7.0.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,401 INFO [main] org.apache.druid.initialization.Initialization - added URL[file:/opt/druid-0.13.0/extensions/druid-hdfs-storage/hadoop-mapreduce-client-common-2.8.3.jar] for extension[druid-hdfs-storage]
2019-01-08T13:15:17,500 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.guice.ModulesConfig] from props[druid.modules.] as [ModulesConfig{excludeList=[]}]
2019-01-08T13:15:17,702 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-histogram] for class [interface org.apache.druid.initialization.DruidModule]
2019-01-08T13:15:17,704 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.histogram.ApproximateHistogramDruidModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,705 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [interface org.apache.druid.initialization.DruidModule]
2019-01-08T13:15:17,707 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.datasketches.theta.SketchModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,708 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.datasketches.theta.oldapi.OldApiSketchModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,710 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.datasketches.quantiles.DoublesSketchModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,712 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.datasketches.tuple.ArrayOfDoublesSketchModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,713 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.query.aggregation.datasketches.hll.HllSketchModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,714 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-lookups-cached-global] for class [interface org.apache.druid.initialization.DruidModule]
2019-01-08T13:15:17,716 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.server.lookup.namespace.NamespaceExtractionModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:17,716 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage] for class [interface org.apache.druid.initialization.DruidModule]
2019-01-08T13:15:17,718 INFO [main] org.apache.druid.initialization.Initialization - Adding implementation [org.apache.druid.storage.hdfs.HdfsStorageDruidModule] for class [interface org.apache.druid.initialization.DruidModule] from local file system extension
2019-01-08T13:15:18,830 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-01-08T13:15:19,962 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.java.util.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='org.apache.druid.java.util.emitter.core.LoggingEmitter', logLevel='info'}]
2019-01-08T13:15:19,991 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[interface org.apache.druid.server.security.Escalator] from props[druid.escalator.] as [NoopEscalator{}]
2019-01-08T13:15:20,021 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.curator.CuratorConfig] from props[druid.zk.service.] as [org.apache.druid.curator.CuratorConfig@709ed6f3]
2019-01-08T13:15:20,029 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.curator.ExhibitorConfig] from props[druid.exhibitor.service.] as [org.apache.druid.curator.ExhibitorConfig@1687eb01]
2019-01-08T13:15:20,085 INFO [main] org.apache.curator.utils.Compatibility - Running in ZooKeeper 3.4.x compatibility mode
2019-01-08T13:15:20,087 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2019-01-08T13:15:20,136 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='druid/middleManager', host='vm-druid', port=-1, plaintextPort=8100, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}]
2019-01-08T13:15:20,141 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [org.apache.druid.server.initialization.ZkPathsConfig@22e2266d]
2019-01-08T13:15:20,149 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.security.AuthConfig] from props[druid.auth.] as [AuthConfig{authenticatorChain=null, authorizers=null, unsecuredPaths=[], allowUnauthenticatedHttpOptions=false}]
2019-01-08T13:15:20,194 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [org.apache.druid.server.metrics.DruidMonitorSchedulerConfig@3e5fd2b1]
2019-01-08T13:15:20,199 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[class org.apache.druid.java.util.metrics.JvmMonitor]}]
2019-01-08T13:15:20,200 INFO [main] org.apache.druid.server.emitter.EmitterModule - Underlying emitter for ServiceEmitter: LoggingEmitter{log=Logger{name=[org.apache.druid.java.util.emitter.core.LoggingEmitter], class[class org.apache.logging.slf4j.Log4jLogger]}, level=INFO}
2019-01-08T13:15:20,200 INFO [main] org.apache.druid.server.emitter.EmitterModule - Extra service dimensions: {version=0.13.0-incubating}
2019-01-08T13:15:20,487 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [org.apache.druid.server.initialization.CuratorDiscoveryConfig@4eb9f2af]
2019-01-08T13:15:20,508 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [ServerConfig{numThreads=25, queueSize=2147483647, enableRequestLimit=false, maxIdleTime=PT5M, defaultQueryTimeout=300000, maxScatterGatherBytes=9223372036854775807, maxQueryTimeout=9223372036854775807, maxRequestHeaderSize=8192, gracefulShutdownTimeout=PT0S, unannouncePropagationDelay=PT0S, inflateBufferSize=4096, compressionLevel=-1}]
2019-01-08T13:15:20,552 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.java.util.metrics.JvmMonitor@b73433]
2019-01-08T13:15:20,553 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.query.ExecutorServiceMonitor@40247d48]
2019-01-08T13:15:20,554 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.server.initialization.jetty.JettyServerModule$JettyMonitor@62a68bcb]
2019-01-08T13:15:20,565 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.log.StartupLoggingConfig] from props[druid.startup.logging.] as [org.apache.druid.server.log.StartupLoggingConfig@2262d6d5]
2019-01-08T13:15:20,566 INFO [main] org.apache.druid.cli.CliPeon - Starting up with processors[2], memory[348,651,520], maxMemory[1,908,932,608].
2019-01-08T13:15:20,567 INFO [main] org.apache.druid.cli.CliPeon - * awt.toolkit: sun.awt.X11.XToolkit
2019-01-08T13:15:20,567 INFO [main] org.apache.druid.cli.CliPeon - * druid.emitter: logging
2019-01-08T13:15:20,567 INFO [main] org.apache.druid.cli.CliPeon - * druid.emitter.logging.logLevel: info
2019-01-08T13:15:20,568 INFO [main] org.apache.druid.cli.CliPeon - * druid.extensions.loadList: ["druid-histogram", "druid-datasketches", "druid-lookups-cached-global","druid-hdfs-storage"]
2019-01-08T13:15:20,568 INFO [main] org.apache.druid.cli.CliPeon - * druid.host: vm-druid
2019-01-08T13:15:20,568 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.fork.property.druid.processing.buffer.sizeBytes: 256000000
2019-01-08T13:15:20,568 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.fork.property.druid.processing.numThreads: 2
2019-01-08T13:15:20,568 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.logs.directory: var/druid/indexing-logs
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.logs.type: file
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.runner.javaOpts: -server -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.task.baseTaskDir: var/druid/task
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexer.task.hadoopWorkingPath: var/druid/hadoop-tmp
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.indexing.doubleStorage: double
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metadata.storage.connector.connectURI: jdbc:derby://localhost:1527/var/druid/metadata.db;create=true
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metadata.storage.connector.host: localhost
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metadata.storage.connector.port: 1527
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metadata.storage.type: derby
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metrics.emitter.dimension.dataSource: wikipedia
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metrics.emitter.dimension.taskId: index_wikipedia_2019-01-08T13:15:15.682Z
2019-01-08T13:15:20,569 INFO [main] org.apache.druid.cli.CliPeon - * druid.metrics.emitter.dimension.taskType: index
2019-01-08T13:15:20,570 INFO [main] org.apache.druid.cli.CliPeon - * druid.monitoring.monitors: ["org.apache.druid.java.util.metrics.JvmMonitor"]
2019-01-08T13:15:20,570 INFO [main] org.apache.druid.cli.CliPeon - * druid.plaintextPort: 8100
2019-01-08T13:15:20,570 INFO [main] org.apache.druid.cli.CliPeon - * druid.processing.buffer.sizeBytes: 256000000
2019-01-08T13:15:20,570 INFO [main] org.apache.druid.cli.CliPeon - * druid.processing.numThreads: 2
2019-01-08T13:15:20,570 INFO [main] org.apache.druid.cli.CliPeon - * druid.selectors.coordinator.serviceName: druid/coordinator
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.selectors.indexing.serviceName: druid/overlord
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.server.http.numThreads: 25
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.service: druid/middleManager
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.startup.logging.logProperties: true
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.storage.storageDirectory: hdfs://vm-hadoop:9000/druid/segments
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.storage.type: hdfs
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.tlsPort: -1
2019-01-08T13:15:20,571 INFO [main] org.apache.druid.cli.CliPeon - * druid.worker.capacity: 3
2019-01-08T13:15:20,572 INFO [main] org.apache.druid.cli.CliPeon - * druid.zk.paths.base: /druid
2019-01-08T13:15:20,573 INFO [main] org.apache.druid.cli.CliPeon - * druid.zk.service.host: vm-kafka
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * file.encoding: UTF-8
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * file.encoding.pkg: sun.io
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * file.separator: /
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.awt.graphicsenv: sun.awt.X11GraphicsEnvironment
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.awt.printerjob: sun.print.PSPrinterJob
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.class.path: conf/druid/_common:conf/druid/middleManager:lib/druid-indexing-hadoop-0.13.0-incubating.jar:lib/reactive-streams-1.0.2.jar:lib/async-http-client-netty-utils-2.5.3.jar:lib/disruptor-3.3.6.jar:lib/commons-lang-2.6.jar:lib/commons-compiler-2.7.6.jar:lib/guice-multibindings-4.1.0.jar:lib/javax.inject-1.jar:lib/guice-servlet-4.1.0.jar:lib/jackson-datatype-guava-2.6.7.jar:lib/log4j-jul-2.5.jar:lib/aether-api-0.9.0.M2.jar:lib/aws-java-sdk-s3-1.11.199.jar:lib/config-magic-0.9.jar:lib/commons-lang3-3.2.jar:lib/jackson-annotations-2.6.7.jar:lib/jetty-server-9.4.10.v20180503.jar:lib/jmespath-java-1.11.199.jar:lib/avatica-server-1.10.0.jar:lib/avatica-core-1.10.0.jar:lib/jackson-core-2.6.7.jar:lib/objenesis-2.6.jar:lib/netty-3.10.6.Final.jar:lib/javax.servlet-api-3.1.0.jar:lib/aggdesigner-algorithm-6.0.jar:lib/jackson-dataformat-cbor-2.6.7.jar:lib/curator-x-discovery-4.0.0.jar:lib/aws-java-sdk-core-1.11.199.jar:lib/netty-handler-4.1.29.Final.jar:lib/asm-commons-5.2.jar:lib/netty-all-4.1.30.Final.jar:lib/plexus-utils-3.0.15.jar:lib/calcite-core-1.17.0.jar:lib/druid-sql-0.13.0-incubating.jar:lib/log4j-slf4j-impl-2.5.jar:lib/caffeine-2.5.5.jar:lib/javax.el-3.0.0.jar:lib/jsr305-2.0.1.jar:lib/joda-time-2.9.9.jar:lib/jdbi-2.63.1.jar:lib/classmate-1.0.0.jar:lib/druid-hll-0.13.0-incubating.jar:lib/mapdb-1.0.8.jar:lib/jackson-dataformat-smile-2.6.7.jar:lib/commons-dbcp2-2.0.1.jar:lib/aws-java-sdk-kms-1.11.199.jar:lib/jetty-servlets-9.4.10.v20180503.jar:lib/validation-api-1.1.0.Final.jar:lib/jline-0.9.94.jar:lib/commons-logging-1.1.1.jar:lib/metrics-core-4.0.0.jar:lib/error_prone_annotations-2.2.0.jar:lib/plexus-interpolation-1.19.jar:lib/druid-api-0.13.0-incubating.jar:lib/netty-resolver-4.1.29.Final.jar:lib/jetty-io-9.4.10.v20180503.jar:lib/aether-util-0.9.0.M2.jar:lib/okhttp-1.0.2.jar:lib/hibernate-validator-5.1.3.Final.jar:lib/druid-console-0.0.4.jar:lib/commons-pool2-2.2.jar:lib/aether-connector-file-0.9.0.M2.jar:lib/maven-model-3.1.1.jar:lib/netty-codec-http-4.1.29.Final.jar:lib/rhino-1.7R5.jar:lib/sigar-1.6.5.132.jar:lib/netty-buffer-4.1.29.Final.jar:lib/wagon-provider-api-2.4.jar:lib/derbynet-10.11.1.1.jar:lib/jetty-security-9.4.10.v20180503.jar:lib/aether-connector-okhttp-0.0.9.jar:lib/commons-codec-1.7.jar:lib/ion-java-1.0.2.jar:lib/curator-recipes-4.0.0.jar:lib/guava-16.0.1.jar:lib/derby-10.11.1.1.jar:lib/netty-transport-4.1.29.Final.jar:lib/async-http-client-2.5.3.jar:lib/extendedset-0.13.0-incubating.jar:lib/aether-spi-0.9.0.M2.jar:lib/curator-framework-4.0.0.jar:lib/aws-java-sdk-ec2-1.11.199.jar:lib/jsr311-api-1.1.1.jar:lib/druid-server-0.13.0-incubating.jar:lib/jersey-server-1.19.3.jar:lib/druid-aws-common-0.13.0-incubating.jar:lib/aopalliance-1.0.jar:lib/calcite-linq4j-1.17.0.jar:lib/maven-repository-metadata-3.1.1.jar:lib/netty-codec-dns-4.1.29.Final.jar:lib/maven-settings-builder-3.1.1.jar:lib/httpcore-4.4.4.jar:lib/janino-2.7.6.jar:lib/commons-io-2.5.jar:lib/jetty-client-9.4.10.v20180503.jar:lib/jackson-module-jaxb-annotations-2.6.7.jar:lib/jetty-continuation-9.4.10.v20180503.jar:lib/jetty-http-9.4.10.v20180503.jar:lib/commons-compress-1.16.jar:lib/avatica-metrics-1.10.0.jar:lib/aether-impl-0.9.0.M2.jar:lib/jackson-jaxrs-base-2.6.7.jar:lib/commons-pool-1.6.jar:lib/java-util-0.13.0-incubating.jar:lib/druid-common-0.13.0-incubating.jar:lib/derbyclient-10.11.1.1.jar:lib/RoaringBitmap-0.5.18.jar:lib/asm-tree-5.2.jar:lib/jackson-jaxrs-json-provider-2.6.7.jar:lib/jna-4.5.1.jar:lib/fastutil-8.1.0.jar:lib/json-smart-2.3.jar:lib/javax.el-api-3.0.0.jar:lib/slf4j-api-1.6.4.jar:lib/netty-transport-native-unix-common-4.1.29.Final.jar:lib/xz-1.8.jar:lib/tesla-aether-0.0.5.jar:lib/json-path-2.3.0.jar:lib/jackson-jaxrs-smile-provider-2.6.7.jar:lib/netty-codec-4.1.29.Final.jar:lib/commons-collections-3.2.2.jar:lib/compress-lzf-1.0.4.jar:lib/netty-codec-socks-4.1.29.Final.jar:lib/jcl-over-slf4j-1.7.12.jar:lib/guice-4.1.0.jar:lib/commons-text-1.3.jar:lib/javax.activation-1.2.0.jar:lib/log4j-api-2.5.jar:lib/commons-math3-3.6.1.jar:lib/jackson-jq-0.0.7.jar:lib/joni-2.1.11.jar:lib/jetty-util-9.4.10.v20180503.jar:lib/netty-reactive-streams-2.0.0.jar:lib/druid-processing-0.13.0-incubating.jar:lib/jboss-logging-3.1.3.GA.jar:lib/jersey-core-1.19.3.jar:lib/curator-client-4.0.0.jar:lib/maven-model-builder-3.1.1.jar:lib/maven-settings-3.1.1.jar:lib/jackson-databind-2.6.7.jar:lib/esri-geometry-api-2.0.0.jar:lib/druid-indexing-service-0.13.0-incubating.jar:lib/netty-transport-native-epoll-4.1.29.Final-linux-x86_64.jar:lib/commons-cli-1.2.jar:lib/antlr4-runtime-4.5.1.jar:lib/zookeeper-3.4.11.jar:lib/jackson-core-asl-1.9.13.jar:lib/audience-annotations-0.5.0.jar:lib/zstd-jni-1.3.3-1.jar:lib/maven-aether-provider-3.1.1.jar:lib/airline-0.7.jar:lib/jcodings-1.0.13.jar:lib/jetty-servlet-9.4.10.v20180503.jar:lib/jersey-guice-1.19.3.jar:lib/netty-common-4.1.29.Final.jar:lib/spymemcached-2.12.3.jar:lib/commons-beanutils-1.9.3.jar:lib/asm-5.2.jar:lib/lz4-java-1.4.0.jar:lib/jackson-datatype-joda-2.6.7.jar:lib/accessors-smart-1.2.jar:lib/icu4j-54.1.1.jar:lib/jersey-servlet-1.19.3.jar:lib/jetty-proxy-9.4.10.v20180503.jar:lib/netty-handler-proxy-4.1.29.Final.jar:lib/netty-resolver-dns-4.1.29.Final.jar:lib/log4j-core-2.5.jar:lib/jvm-attach-api-1.2.jar:lib/druid-services-0.13.0-incubating.jar:lib/log4j-1.2-api-2.5.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/protobuf-java-3.1.0.jar:lib/httpclient-4.5.3.jar:lib/opencsv-4.2.jar:lib/commons-collections4-4.1.jar:
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.class.version: 52.0
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.endorsed.dirs: /usr/lib/jvm/java-8-oracle/jre/lib/endorsed
2019-01-08T13:15:20,574 INFO [main] org.apache.druid.cli.CliPeon - * java.ext.dirs: /usr/lib/jvm/java-8-oracle/jre/lib/ext:/usr/java/packages/lib/ext
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.home: /usr/lib/jvm/java-8-oracle/jre
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.io.tmpdir: var/tmp
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.runtime.name: Java(TM) SE Runtime Environment
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.runtime.version: 1.8.0_191-b12
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.specification.name: Java Platform API Specification
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.specification.vendor: Oracle Corporation
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.specification.version: 1.8
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.util.logging.manager: org.apache.logging.log4j.jul.LogManager
2019-01-08T13:15:20,575 INFO [main] org.apache.druid.cli.CliPeon - * java.vendor: Oracle Corporation
2019-01-08T13:15:20,576 INFO [main] org.apache.druid.cli.CliPeon - * java.vendor.url: http://java.oracle.com/
2019-01-08T13:15:20,576 INFO [main] org.apache.druid.cli.CliPeon - * java.vendor.url.bug: http://bugreport.sun.com/bugreport/
2019-01-08T13:15:20,576 INFO [main] org.apache.druid.cli.CliPeon - * java.version: 1.8.0_191
2019-01-08T13:15:20,576 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.info: mixed mode
2019-01-08T13:15:20,578 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.name: Java HotSpot(TM) 64-Bit Server VM
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.specification.name: Java Virtual Machine Specification
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.specification.vendor: Oracle Corporation
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.specification.version: 1.8
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.vendor: Oracle Corporation
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * java.vm.version: 25.191-b12
2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * line.separator: 

2019-01-08T13:15:20,581 INFO [main] org.apache.druid.cli.CliPeon - * log4j.shutdownCallbackRegistry: org.apache.druid.common.config.Log4jShutdown
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * log4j.shutdownHookEnabled: true
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * os.arch: amd64
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * os.name: Linux
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * os.version: 4.15.0-1036-azure
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * path.separator: :
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * sun.arch.data.model: 64
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * sun.boot.class.path: /usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/classes
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * sun.boot.library.path: /usr/lib/jvm/java-8-oracle/jre/lib/amd64
2019-01-08T13:15:20,582 INFO [main] org.apache.druid.cli.CliPeon - * sun.cpu.endian: little
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.cpu.isalist: 
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.io.unicode.encoding: UnicodeLittle
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.java.command: org.apache.druid.cli.Main internal peon var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/task.json var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/21c7f3b8-afbd-4c6f-a17c-c3d2c546e833/status.json var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/21c7f3b8-afbd-4c6f-a17c-c3d2c546e833/report.json
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.java.launcher: SUN_STANDARD
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.jnu.encoding: UTF-8
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.management.compiler: HotSpot 64-Bit Tiered Compilers
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * sun.os.patch.level: unknown
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.country: US
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.dir: /opt/druid-0.13.0
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.home: /root
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.language: en
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.name: root
2019-01-08T13:15:20,583 INFO [main] org.apache.druid.cli.CliPeon - * user.timezone: UTC
2019-01-08T13:15:20,591 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.storage.hdfs.HdfsKerberosConfig] from props[druid.hadoop.security.kerberos.] as [org.apache.druid.storage.hdfs.HdfsKerberosConfig@0]
2019-01-08T13:15:20,597 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [org.apache.druid.indexing.common.config.TaskConfig@64942607]
2019-01-08T13:15:20,614 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [org.apache.druid.guice.http.DruidHttpClientConfig@68dd39d2]
2019-01-08T13:15:20,686 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [org.apache.druid.client.indexing.IndexingServiceSelectorConfig@47c7a9e5]
2019-01-08T13:15:20,728 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [org.apache.druid.indexing.common.RetryPolicyConfig@2e62ead7]
2019-01-08T13:15:20,735 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.storage.hdfs.HdfsDataSegmentPusherConfig] from props[druid.storage.] as [org.apache.druid.storage.hdfs.HdfsDataSegmentPusherConfig@11b32a14]
2019-01-08T13:15:20,740 INFO [main] org.apache.druid.storage.hdfs.HdfsDataSegmentPusher - Configured HDFS as deep storage
2019-01-08T13:15:20,751 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [org.apache.druid.segment.loading.LocalDataSegmentPusherConfig@164642a4]
2019-01-08T13:15:20,757 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.client.DruidServerConfig] from props[druid.server.] as [org.apache.druid.client.DruidServerConfig@750a04ec]
2019-01-08T13:15:20,764 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [org.apache.druid.server.initialization.BatchDataSegmentAnnouncerConfig@b9d018b]
2019-01-08T13:15:20,772 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[interface org.apache.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [org.apache.druid.server.coordination.BatchDataSegmentAnnouncerProvider@7bc58891]
2019-01-08T13:15:20,774 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.client.coordinator.CoordinatorSelectorConfig] from props[druid.selectors.coordinator.] as [org.apache.druid.client.coordinator.CoordinatorSelectorConfig@33e434c8]
2019-01-08T13:15:20,777 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.segment.realtime.plumber.CoordinatorBasedSegmentHandoffNotifierConfig] from props[druid.segment.handoff.] as [org.apache.druid.segment.realtime.plumber.CoordinatorBasedSegmentHandoffNotifierConfig@1084ac45]
2019-01-08T13:15:20,780 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [org.apache.druid.query.DruidProcessingConfig#getNumThreadsConfigured()]
2019-01-08T13:15:20,781 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.tmpDir] on [org.apache.druid.query.DruidProcessingConfig#getTmpDir()]
2019-01-08T13:15:20,782 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.numMergeBuffers] on [org.apache.druid.query.DruidProcessingConfig#getNumMergeBuffersConfigured()]
2019-01-08T13:15:20,783 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [org.apache.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2019-01-08T13:15:20,783 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [org.apache.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2019-01-08T13:15:20,783 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.computation.buffer.poolCacheMaxCount, ${base_path}.buffer.poolCacheMaxCount] on [org.apache.druid.query.DruidProcessingConfig#poolCacheMaxCount()]
2019-01-08T13:15:20,784 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.fifo] on [org.apache.druid.query.DruidProcessingConfig#isFifo()]
2019-01-08T13:15:20,785 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [org.apache.druid.java.util.common.concurrent.ExecutorServiceConfig#getFormatString()]
2019-01-08T13:15:20,886 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[interface org.apache.druid.client.cache.CacheProvider] from props[druid.cache.] as [org.apache.druid.client.cache.CaffeineCacheProvider@1cc8416a]
2019-01-08T13:15:21,105 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.client.cache.CacheConfig] from props[druid.realtime.cache.] as [org.apache.druid.client.cache.CacheConfig@5d124d29]
2019-01-08T13:15:21,107 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[interface org.apache.druid.segment.writeout.SegmentWriteOutMediumFactory] from props[druid.peon.defaultSegmentWriteOutMediumFactory.] as [org.apache.druid.segment.writeout.TmpFileSegmentWriteOutMediumFactory@5e5ddfbc]
2019-01-08T13:15:21,112 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.lookup.LookupListeningAnnouncerConfig] from props[druid.lookup.] as [ListeningAnnouncerConfig{listenersPath='/druid/listeners'}]
2019-01-08T13:15:21,119 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.server.initialization.TLSServerConfig] from props[druid.server.https.] as [TLSServerConfig{keyStorePath='null', keyStoreType='null', certAlias='null', keyManagerFactoryAlgorithm='null', includeCipherSuites=null, excludeCipherSuites=null, includeProtocols=null, excludeProtocols=null, requireClientCertificate=false, trustStoreType='null', trustStorePath='null', trustStoreAlgorithm='null', validateHostnames='true', crlPath='null'}]
2019-01-08T13:15:21,134 INFO [main] org.eclipse.jetty.util.log - Logging initialized @5388ms to org.eclipse.jetty.util.log.Slf4jLog
2019-01-08T13:15:21,154 INFO [main] org.apache.druid.server.initialization.jetty.JettyServerModule - Creating http connector with port [8100]
2019-01-08T13:15:21,322 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.search.SearchQueryConfig] from props[druid.query.search.] as [org.apache.druid.query.search.SearchQueryConfig@62a6674f]
2019-01-08T13:15:21,326 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.metadata.SegmentMetadataQueryConfig] from props[druid.query.segmentMetadata.] as [org.apache.druid.query.metadata.SegmentMetadataQueryConfig@4eeab3e]
2019-01-08T13:15:21,332 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [GroupByQueryConfig{defaultStrategy='v2', singleThreaded=false, maxIntermediateRows=50000, maxResults=500000, bufferGrouperMaxSize=2147483647, bufferGrouperMaxLoadFactor=0.0, bufferGrouperInitialBuckets=0, maxMergingDictionarySize=100000000, maxOnDiskStorage=0, forcePushDownLimit=false, forceHashAggregation=false, intermediateCombineDegree=8, numParallelCombineThreads=1}]
2019-01-08T13:15:21,348 INFO [main] org.apache.druid.offheap.OffheapBufferGenerator - Allocating new intermediate processing buffer[0] of size[256,000,000]
2019-01-08T13:15:21,435 INFO [main] org.apache.druid.offheap.OffheapBufferGenerator - Allocating new intermediate processing buffer[1] of size[256,000,000]
2019-01-08T13:15:21,520 INFO [main] org.apache.druid.offheap.OffheapBufferGenerator - Allocating new result merging buffer[0] of size[256,000,000]
2019-01-08T13:15:21,595 INFO [main] org.apache.druid.offheap.OffheapBufferGenerator - Allocating new result merging buffer[1] of size[256,000,000]
2019-01-08T13:15:21,676 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.scan.ScanQueryConfig] from props[druid.query.scan.] as [ScanQueryConfig{legacy=false}]
2019-01-08T13:15:21,682 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.select.SelectQueryConfig] from props[druid.query.select.] as [org.apache.druid.query.select.SelectQueryConfig@681e913c]
2019-01-08T13:15:21,684 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [org.apache.druid.query.topn.TopNQueryConfig@743c3520]
2019-01-08T13:15:21,687 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[interface org.apache.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [org.apache.druid.server.log.NoopRequestLoggerProvider@194012a2]
2019-01-08T13:15:21,697 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.query.lookup.LookupConfig] from props[druid.lookup.] as [LookupConfig{snapshotWorkingDir='', enableLookupSyncOnStartup=true, numLookupLoadingThreads=1, coordinatorFetchRetries=3, lookupStartRetries=3, coordinatorRetryDelay=60000}]
2019-01-08T13:15:21,700 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.java.util.emitter.core.LoggingEmitter.start()] on object[LoggingEmitter{log=Logger{name=[org.apache.druid.java.util.emitter.core.LoggingEmitter], class[class org.apache.logging.slf4j.Log4jLogger]}, level=INFO}].
2019-01-08T13:15:21,700 INFO [main] org.apache.druid.java.util.emitter.core.LoggingEmitter - Start: started [true]
2019-01-08T13:15:21,700 INFO [main] org.apache.druid.curator.CuratorModule - Starting Curator
2019-01-08T13:15:21,700 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2019-01-08T13:15:21,713 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.11-37e277162d567b55a07d1755f0b31c32e93c01a0, built on 11/01/2017 18:06 GMT
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=vm-druid
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_191
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-8-oracle/jre
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=conf/druid/_common:conf/druid/middleManager:lib/druid-indexing-hadoop-0.13.0-incubating.jar:lib/reactive-streams-1.0.2.jar:lib/async-http-client-netty-utils-2.5.3.jar:lib/disruptor-3.3.6.jar:lib/commons-lang-2.6.jar:lib/commons-compiler-2.7.6.jar:lib/guice-multibindings-4.1.0.jar:lib/javax.inject-1.jar:lib/guice-servlet-4.1.0.jar:lib/jackson-datatype-guava-2.6.7.jar:lib/log4j-jul-2.5.jar:lib/aether-api-0.9.0.M2.jar:lib/aws-java-sdk-s3-1.11.199.jar:lib/config-magic-0.9.jar:lib/commons-lang3-3.2.jar:lib/jackson-annotations-2.6.7.jar:lib/jetty-server-9.4.10.v20180503.jar:lib/jmespath-java-1.11.199.jar:lib/avatica-server-1.10.0.jar:lib/avatica-core-1.10.0.jar:lib/jackson-core-2.6.7.jar:lib/objenesis-2.6.jar:lib/netty-3.10.6.Final.jar:lib/javax.servlet-api-3.1.0.jar:lib/aggdesigner-algorithm-6.0.jar:lib/jackson-dataformat-cbor-2.6.7.jar:lib/curator-x-discovery-4.0.0.jar:lib/aws-java-sdk-core-1.11.199.jar:lib/netty-handler-4.1.29.Final.jar:lib/asm-commons-5.2.jar:lib/netty-all-4.1.30.Final.jar:lib/plexus-utils-3.0.15.jar:lib/calcite-core-1.17.0.jar:lib/druid-sql-0.13.0-incubating.jar:lib/log4j-slf4j-impl-2.5.jar:lib/caffeine-2.5.5.jar:lib/javax.el-3.0.0.jar:lib/jsr305-2.0.1.jar:lib/joda-time-2.9.9.jar:lib/jdbi-2.63.1.jar:lib/classmate-1.0.0.jar:lib/druid-hll-0.13.0-incubating.jar:lib/mapdb-1.0.8.jar:lib/jackson-dataformat-smile-2.6.7.jar:lib/commons-dbcp2-2.0.1.jar:lib/aws-java-sdk-kms-1.11.199.jar:lib/jetty-servlets-9.4.10.v20180503.jar:lib/validation-api-1.1.0.Final.jar:lib/jline-0.9.94.jar:lib/commons-logging-1.1.1.jar:lib/metrics-core-4.0.0.jar:lib/error_prone_annotations-2.2.0.jar:lib/plexus-interpolation-1.19.jar:lib/druid-api-0.13.0-incubating.jar:lib/netty-resolver-4.1.29.Final.jar:lib/jetty-io-9.4.10.v20180503.jar:lib/aether-util-0.9.0.M2.jar:lib/okhttp-1.0.2.jar:lib/hibernate-validator-5.1.3.Final.jar:lib/druid-console-0.0.4.jar:lib/commons-pool2-2.2.jar:lib/aether-connector-file-0.9.0.M2.jar:lib/maven-model-3.1.1.jar:lib/netty-codec-http-4.1.29.Final.jar:lib/rhino-1.7R5.jar:lib/sigar-1.6.5.132.jar:lib/netty-buffer-4.1.29.Final.jar:lib/wagon-provider-api-2.4.jar:lib/derbynet-10.11.1.1.jar:lib/jetty-security-9.4.10.v20180503.jar:lib/aether-connector-okhttp-0.0.9.jar:lib/commons-codec-1.7.jar:lib/ion-java-1.0.2.jar:lib/curator-recipes-4.0.0.jar:lib/guava-16.0.1.jar:lib/derby-10.11.1.1.jar:lib/netty-transport-4.1.29.Final.jar:lib/async-http-client-2.5.3.jar:lib/extendedset-0.13.0-incubating.jar:lib/aether-spi-0.9.0.M2.jar:lib/curator-framework-4.0.0.jar:lib/aws-java-sdk-ec2-1.11.199.jar:lib/jsr311-api-1.1.1.jar:lib/druid-server-0.13.0-incubating.jar:lib/jersey-server-1.19.3.jar:lib/druid-aws-common-0.13.0-incubating.jar:lib/aopalliance-1.0.jar:lib/calcite-linq4j-1.17.0.jar:lib/maven-repository-metadata-3.1.1.jar:lib/netty-codec-dns-4.1.29.Final.jar:lib/maven-settings-builder-3.1.1.jar:lib/httpcore-4.4.4.jar:lib/janino-2.7.6.jar:lib/commons-io-2.5.jar:lib/jetty-client-9.4.10.v20180503.jar:lib/jackson-module-jaxb-annotations-2.6.7.jar:lib/jetty-continuation-9.4.10.v20180503.jar:lib/jetty-http-9.4.10.v20180503.jar:lib/commons-compress-1.16.jar:lib/avatica-metrics-1.10.0.jar:lib/aether-impl-0.9.0.M2.jar:lib/jackson-jaxrs-base-2.6.7.jar:lib/commons-pool-1.6.jar:lib/java-util-0.13.0-incubating.jar:lib/druid-common-0.13.0-incubating.jar:lib/derbyclient-10.11.1.1.jar:lib/RoaringBitmap-0.5.18.jar:lib/asm-tree-5.2.jar:lib/jackson-jaxrs-json-provider-2.6.7.jar:lib/jna-4.5.1.jar:lib/fastutil-8.1.0.jar:lib/json-smart-2.3.jar:lib/javax.el-api-3.0.0.jar:lib/slf4j-api-1.6.4.jar:lib/netty-transport-native-unix-common-4.1.29.Final.jar:lib/xz-1.8.jar:lib/tesla-aether-0.0.5.jar:lib/json-path-2.3.0.jar:lib/jackson-jaxrs-smile-provider-2.6.7.jar:lib/netty-codec-4.1.29.Final.jar:lib/commons-collections-3.2.2.jar:lib/compress-lzf-1.0.4.jar:lib/netty-codec-socks-4.1.29.Final.jar:lib/jcl-over-slf4j-1.7.12.jar:lib/guice-4.1.0.jar:lib/commons-text-1.3.jar:lib/javax.activation-1.2.0.jar:lib/log4j-api-2.5.jar:lib/commons-math3-3.6.1.jar:lib/jackson-jq-0.0.7.jar:lib/joni-2.1.11.jar:lib/jetty-util-9.4.10.v20180503.jar:lib/netty-reactive-streams-2.0.0.jar:lib/druid-processing-0.13.0-incubating.jar:lib/jboss-logging-3.1.3.GA.jar:lib/jersey-core-1.19.3.jar:lib/curator-client-4.0.0.jar:lib/maven-model-builder-3.1.1.jar:lib/maven-settings-3.1.1.jar:lib/jackson-databind-2.6.7.jar:lib/esri-geometry-api-2.0.0.jar:lib/druid-indexing-service-0.13.0-incubating.jar:lib/netty-transport-native-epoll-4.1.29.Final-linux-x86_64.jar:lib/commons-cli-1.2.jar:lib/antlr4-runtime-4.5.1.jar:lib/zookeeper-3.4.11.jar:lib/jackson-core-asl-1.9.13.jar:lib/audience-annotations-0.5.0.jar:lib/zstd-jni-1.3.3-1.jar:lib/maven-aether-provider-3.1.1.jar:lib/airline-0.7.jar:lib/jcodings-1.0.13.jar:lib/jetty-servlet-9.4.10.v20180503.jar:lib/jersey-guice-1.19.3.jar:lib/netty-common-4.1.29.Final.jar:lib/spymemcached-2.12.3.jar:lib/commons-beanutils-1.9.3.jar:lib/asm-5.2.jar:lib/lz4-java-1.4.0.jar:lib/jackson-datatype-joda-2.6.7.jar:lib/accessors-smart-1.2.jar:lib/icu4j-54.1.1.jar:lib/jersey-servlet-1.19.3.jar:lib/jetty-proxy-9.4.10.v20180503.jar:lib/netty-handler-proxy-4.1.29.Final.jar:lib/netty-resolver-dns-4.1.29.Final.jar:lib/log4j-core-2.5.jar:lib/jvm-attach-api-1.2.jar:lib/druid-services-0.13.0-incubating.jar:lib/log4j-1.2-api-2.5.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/protobuf-java-3.1.0.jar:lib/httpclient-4.5.3.jar:lib/opencsv-4.2.jar:lib/commons-collections4-4.1.jar:
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=var/tmp
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=4.15.0-1036-azure
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
2019-01-08T13:15:21,714 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/opt/druid-0.13.0
2019-01-08T13:15:21,715 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=vm-kafka sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@1c6a0103
2019-01-08T13:15:21,745 INFO [main-SendThread(vm-kafka:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server vm-kafka/172.16.0.4:2181. Will not attempt to authenticate using SASL (unknown error)
2019-01-08T13:15:21,746 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Default schema
2019-01-08T13:15:21,746 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner.start()] on object[org.apache.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner@226d5af0].
2019-01-08T13:15:21,747 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.java.util.emitter.service.ServiceEmitter.start()] on object[ServiceEmitter{serviceDimensions={service=druid/middleManager, host=vm-druid:8100, version=0.13.0-incubating}, emitter=LoggingEmitter{log=Logger{name=[org.apache.druid.java.util.emitter.core.LoggingEmitter], class[class org.apache.logging.slf4j.Log4jLogger]}, level=INFO}}].
2019-01-08T13:15:21,747 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.java.util.metrics.MonitorScheduler.start()] on object[org.apache.druid.java.util.metrics.MonitorScheduler@24d8f87a].
2019-01-08T13:15:21,751 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.storage.hdfs.HdfsStorageAuthentication.authenticate()] on object[org.apache.druid.storage.hdfs.HdfsStorageAuthentication@e3899fd].
2019-01-08T13:15:21,753 INFO [main-SendThread(vm-kafka:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to vm-kafka/172.16.0.4:2181, initiating session
2019-01-08T13:15:21,754 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.java.util.http.client.NettyHttpClient.start()] on object[org.apache.druid.java.util.http.client.NettyHttpClient@78116659].
2019-01-08T13:15:21,754 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider.start()] on object[org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider@53ea380b].
2019-01-08T13:15:21,754 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - starting
2019-01-08T13:15:21,755 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - started
2019-01-08T13:15:21,758 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[org.apache.druid.curator.discovery.ServerDiscoverySelector@245cb8df].
2019-01-08T13:15:21,777 INFO [main-SendThread(vm-kafka:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server vm-kafka/172.16.0.4:2181, sessionid = 0x100002780320106, negotiated timeout = 30000
2019-01-08T13:15:21,783 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2019-01-08T13:15:21,899 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.discovery.DruidLeaderClient.start()] on object[org.apache.druid.discovery.DruidLeaderClient@1693ff90].
2019-01-08T13:15:21,900 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - Creating NodeTypeWatcher for nodeType [overlord].
2019-01-08T13:15:21,909 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - Created NodeTypeWatcher for nodeType [overlord].
2019-01-08T13:15:21,909 INFO [main] org.apache.druid.discovery.DruidLeaderClient - Started.
2019-01-08T13:15:21,909 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.announcement.Announcer.start()] on object[org.apache.druid.curator.announcement.Announcer@19fbc594].
2019-01-08T13:15:21,909 INFO [main] org.apache.druid.curator.announcement.Announcer - Starting announcer
2019-01-08T13:15:21,910 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[org.apache.druid.curator.discovery.ServerDiscoverySelector@2f4d32bf].
2019-01-08T13:15:21,925 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.discovery.DruidLeaderClient.start()] on object[org.apache.druid.discovery.DruidLeaderClient@76a9a009].
2019-01-08T13:15:21,925 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - Creating NodeTypeWatcher for nodeType [coordinator].
2019-01-08T13:15:21,929 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - Created NodeTypeWatcher for nodeType [coordinator].
2019-01-08T13:15:21,929 INFO [NodeTypeWatcher[overlord]] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider$NodeTypeWatcher - Node[vm-druid:8090:DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/overlord', host='vm-druid', port=-1, plaintextPort=8090, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='overlord', services={}}] appeared.
2019-01-08T13:15:21,929 INFO [main] org.apache.druid.discovery.DruidLeaderClient - Started.
2019-01-08T13:15:21,929 INFO [NodeTypeWatcher[overlord]] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider$NodeTypeWatcher - Received INITIALIZED in node watcher.
2019-01-08T13:15:21,932 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner.start()] on object[org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner@7f0d8eff].
2019-01-08T13:15:21,932 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.indexing.worker.executor.ExecutorLifecycle.start() throws java.lang.InterruptedException] on object[org.apache.druid.indexing.worker.executor.ExecutorLifecycle@149aa7b2].
2019-01-08T13:15:21,948 INFO [NodeTypeWatcher[coordinator]] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider$NodeTypeWatcher - Node[vm-druid:8081:DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/coordinator', host='vm-druid', port=-1, plaintextPort=8081, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='coordinator', services={}}] appeared.
2019-01-08T13:15:21,948 INFO [NodeTypeWatcher[coordinator]] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider$NodeTypeWatcher - Received INITIALIZED in node watcher.
2019-01-08T13:15:22,009 INFO [main] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index",
  "id" : "index_wikipedia_2019-01-08T13:15:15.682Z",
  "resource" : {
    "availabilityGroup" : "index_wikipedia_2019-01-08T13:15:15.682Z",
    "requiredCapacity" : 1
  },
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [ "channel", "cityName", "comment", "countryIsoCode", "countryName", "isAnonymous", "isMinor", "isNew", "isRobot", "isUnpatrolled", "metroCode", "namespace", "page", "regionIsoCode", "regionName", "user", {
              "name" : "added",
              "type" : "long"
            }, {
              "name" : "deleted",
              "type" : "long"
            }, {
              "name" : "delta",
              "type" : "long"
            } ]
          },
          "timestampSpec" : {
            "column" : "time",
            "format" : "iso"
          }
        }
      },
      "metricsSpec" : [ ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : {
          "type" : "none"
        },
        "rollup" : false,
        "intervals" : [ "2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z" ]
      },
      "transformSpec" : {
        "filter" : null,
        "transforms" : [ ]
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "/opt/druid-0.13.0/quickstart",
        "filter" : "wikiticker-2015-09-12-sampled.json.gz",
        "parser" : null
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 5000000,
      "maxRowsInMemory" : 25000,
      "maxBytesInMemory" : 0,
      "maxTotalRows" : null,
      "numShards" : null,
      "partitionDimensions" : [ ],
      "indexSpec" : {
        "bitmap" : {
          "type" : "concise"
        },
        "dimensionCompression" : "lz4",
        "metricCompression" : "lz4",
        "longEncoding" : "longs"
      },
      "maxPendingPersists" : 0,
      "buildV9Directly" : true,
      "forceExtendableShardSpecs" : true,
      "forceGuaranteedRollup" : false,
      "reportParseExceptions" : false,
      "pushTimeout" : 0,
      "segmentWriteOutMediumFactory" : null,
      "logParseExceptions" : false,
      "maxParseExceptions" : 2147483647,
      "maxSavedParseExceptions" : 0
    }
  },
  "context" : { },
  "groupId" : "index_wikipedia_2019-01-08T13:15:15.682Z",
  "dataSource" : "wikipedia"
}
2019-01-08T13:15:22,010 INFO [main] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Attempting to lock file[var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/lock].
2019-01-08T13:15:22,012 INFO [main] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Acquired lock file[var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/lock] in 2ms.
2019-01-08T13:15:22,016 INFO [main] org.apache.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_wikipedia_2019-01-08T13:15:15.682Z]: LockListAction{}
2019-01-08T13:15:22,028 INFO [main] org.apache.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2019-01-08T13:15:15.682Z] to overlord: [LockListAction{}].
2019-01-08T13:15:22,192 INFO [task-runner-0-priority-0] org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner - Running task: index_wikipedia_2019-01-08T13:15:15.682Z
2019-01-08T13:15:22,195 INFO [main] org.apache.druid.server.initialization.jetty.JettyServerModule - Starting Jetty Server...
2019-01-08T13:15:22,196 INFO [task-runner-0-priority-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-01-08T13:15:15.682Z] location changed to [TaskLocation{host='vm-druid', port=8100, tlsPort=-1}].
2019-01-08T13:15:22,197 INFO [task-runner-0-priority-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-01-08T13:15:15.682Z] status changed to [RUNNING].
2019-01-08T13:15:22,197 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Found chat handler of class[org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
2019-01-08T13:15:22,197 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[index_wikipedia_2019-01-08T13:15:15.682Z]
2019-01-08T13:15:22,202 INFO [main] org.eclipse.jetty.server.Server - jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_191-b12
2019-01-08T13:15:22,216 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Skipping determine partition scan
2019-01-08T13:15:22,216 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_wikipedia_2019-01-08T13:15:15.682Z]: LockListAction{}
2019-01-08T13:15:22,225 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2019-01-08T13:15:15.682Z] to overlord: [LockListAction{}].
2019-01-08T13:15:22,294 INFO [main] org.eclipse.jetty.server.session - DefaultSessionIdManager workerName=node0
2019-01-08T13:15:22,294 INFO [main] org.eclipse.jetty.server.session - No SessionScavenger set, using defaults
2019-01-08T13:15:22,296 INFO [main] org.eclipse.jetty.server.session - node0 Scavenging every 600000ms
2019-01-08T13:15:22,304 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Created Appenderator for dataSource[wikipedia].
2019-01-08T13:15:22,472 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering org.apache.druid.server.http.SegmentListerResource as a root resource class
2019-01-08T13:15:22,474 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider as a provider class
2019-01-08T13:15:22,474 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
2019-01-08T13:15:22,476 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering org.apache.druid.server.initialization.jetty.CustomExceptionMapper as a provider class
2019-01-08T13:15:22,476 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering org.apache.druid.server.initialization.jetty.ForbiddenExceptionMapper as a provider class
2019-01-08T13:15:22,476 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering org.apache.druid.server.StatusResource as a root resource class
2019-01-08T13:15:22,478 INFO [main] com.sun.jersey.server.impl.application.WebApplicationImpl - Initiating Jersey application, version 'Jersey: 1.19.3 10/24/2016 03:43 PM'
2019-01-08T13:15:22,670 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-01-08T13:15:22,675 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.ForbiddenExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-01-08T13:15:22,676 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-01-08T13:15:22,685 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.firehose.LocalFirehoseFactory - Initialized with [/opt/druid-0.13.0/quickstart/tutorial/wikiticker-2015-09-12-sampled.json.gz] files
2019-01-08T13:15:22,709 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Loading sinks from[var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist]: []
2019-01-08T13:15:22,719 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-01-08T13:15:22,788 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - New segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z] for row[MapBasedInputRow{timestamp=2015-09-12T00:46:58.771Z, event={time=2015-09-12T00:46:58.771Z, channel=#en.wikipedia, comment=added project, isAnonymous=false, isMinor=false, isNew=false, isRobot=false, isUnpatrolled=false, namespace=Talk, page=Talk:Oswald Tilghman, user=GELongstreet, delta=36, added=36, deleted=0}, dimensions=[channel, cityName, comment, countryIsoCode, countryName, isAnonymous, isMinor, isNew, isRobot, isUnpatrolled, metroCode, namespace, page, regionIsoCode, regionName, user, added, deleted, delta]}] sequenceName[index_wikipedia_2019-01-08T13:15:15.682Z].
2019-01-08T13:15:23,360 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.security.StateResourceFilter to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,421 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.SegmentListerResource to GuiceManagedComponentProvider with the scope "PerRequest"
2019-01-08T13:15:23,451 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.QueryResource to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,458 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,469 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.security.ConfigResourceFilter to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,474 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.query.lookup.LookupListeningResource to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,476 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.query.lookup.LookupIntrospectionResource to GuiceInstantiatedComponentProvider
2019-01-08T13:15:23,477 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2019-01-08T13:15:23,524 WARN [main] com.sun.jersey.spi.inject.Errors - The following warnings have been detected with resource and/or provider classes:
  WARNING: A HTTP GET method, public void org.apache.druid.server.http.SegmentListerResource.getSegments(long,long,long,javax.servlet.http.HttpServletRequest) throws java.io.IOException, MUST return a non-void type.
2019-01-08T13:15:23,549 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@2e1add6f{/,null,AVAILABLE}
2019-01-08T13:15:23,589 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@52b30054{HTTP/1.1,[http/1.1]}{0.0.0.0:8100}
2019-01-08T13:15:23,589 INFO [main] org.eclipse.jetty.server.Server - Started @7845ms
2019-01-08T13:15:23,590 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.query.lookup.LookupReferencesManager.start()] on object[org.apache.druid.query.lookup.LookupReferencesManager@18c820d2].
2019-01-08T13:15:23,590 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - LookupReferencesManager is starting.
2019-01-08T13:15:23,636 WARN [main] org.apache.druid.query.lookup.LookupReferencesManager - No lookups found for tier [__default], response [org.apache.druid.java.util.http.client.response.FullResponseHolder@4ecd8ab1]
2019-01-08T13:15:23,636 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - Coordinator is unavailable. Loading saved snapshot instead
2019-01-08T13:15:23,636 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - No lookups to be loaded at this point
2019-01-08T13:15:23,636 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - LookupReferencesManager is started.
2019-01-08T13:15:23,637 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.server.listener.announcer.ListenerResourceAnnouncer.start()] on object[org.apache.druid.query.lookup.LookupResourceListenerAnnouncer@52909a97].
2019-01-08T13:15:23,673 INFO [main] org.apache.druid.server.listener.announcer.ListenerResourceAnnouncer - Announcing start time on [/druid/listeners/lookups/__default/http:vm-druid:8100]
2019-01-08T13:15:25,467 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Persisting rows in memory due to: [No more rows can be appended to sink,rowsCurrentlyInMemory[25000] is greater than maxRowsInMemory[25000]]
2019-01-08T13:15:25,469 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Submitting persist runnable for dataSource[wikipedia]
2019-01-08T13:15:25,496 INFO [wikipedia-incremental-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z], persisting Hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=0}]
2019-01-08T13:15:25,507 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Starting persist for interval[2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z], rows[25,000]
2019-01-08T13:15:25,790 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Using SegmentWriteOutMediumFactory[TmpFileSegmentWriteOutMediumFactory]
2019-01-08T13:15:25,803 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed version.bin in 3 millis.
2019-01-08T13:15:25,804 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed factory.json in 1 millis
2019-01-08T13:15:25,914 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] conversions with cardinality[50] in 66 millis.
2019-01-08T13:15:26,020 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] conversions with cardinality[672] in 83 millis.
2019-01-08T13:15:26,382 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] conversions with cardinality[12,440] in 361 millis.
2019-01-08T13:15:26,385 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] conversions with cardinality[102] in 3 millis.
2019-01-08T13:15:26,388 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] conversions with cardinality[102] in 2 millis.
2019-01-08T13:15:26,389 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:26,389 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:26,390 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:26,391 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:26,392 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:26,394 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] conversions with cardinality[70] in 2 millis.
2019-01-08T13:15:26,398 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] conversions with cardinality[231] in 4 millis.
2019-01-08T13:15:26,430 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Pushing segments in background: [wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z]
2019-01-08T13:15:26,434 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=0}] hasn't persisted yet, persisting. Segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z]
2019-01-08T13:15:26,434 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=1}] hasn't persisted yet, persisting. Segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z]
2019-01-08T13:15:26,434 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Submitting persist runnable for dataSource[wikipedia]
2019-01-08T13:15:26,777 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] conversions with cardinality[22,824] in 378 millis.
2019-01-08T13:15:26,783 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] conversions with cardinality[328] in 5 millis.
2019-01-08T13:15:26,790 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] conversions with cardinality[408] in 6 millis.
2019-01-08T13:15:26,881 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] conversions with cardinality[7,262] in 90 millis.
2019-01-08T13:15:26,882 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed dim conversions in 1,038 millis.
2019-01-08T13:15:27,269 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - completed walk through of 25,000 rows in 294 millis.
2019-01-08T13:15:27,292 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed time column in 22 millis.
2019-01-08T13:15:27,292 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed metric columns in 0 millis.
2019-01-08T13:15:27,322 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] inverted with cardinality[50] in 30 millis.
2019-01-08T13:15:27,363 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] inverted with cardinality[672] in 33 millis.
2019-01-08T13:15:27,546 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] inverted with cardinality[12,440] in 179 millis.
2019-01-08T13:15:27,565 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] inverted with cardinality[102] in 11 millis.
2019-01-08T13:15:27,579 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] inverted with cardinality[102] in 11 millis.
2019-01-08T13:15:27,591 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] inverted with cardinality[2] in 9 millis.
2019-01-08T13:15:27,607 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] inverted with cardinality[2] in 9 millis.
2019-01-08T13:15:27,630 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] inverted with cardinality[2] in 9 millis.
2019-01-08T13:15:27,646 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] inverted with cardinality[2] in 10 millis.
2019-01-08T13:15:27,668 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] inverted with cardinality[2] in 9 millis.
2019-01-08T13:15:27,698 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] inverted with cardinality[70] in 21 millis.
2019-01-08T13:15:27,705 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] inverted with cardinality[231] in 5 millis.
2019-01-08T13:15:27,978 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] inverted with cardinality[22,824] in 264 millis.
2019-01-08T13:15:27,990 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] inverted with cardinality[328] in 7 millis.
2019-01-08T13:15:28,003 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] inverted with cardinality[408] in 9 millis.
2019-01-08T13:15:28,098 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] inverted with cardinality[7,262] in 89 millis.
2019-01-08T13:15:28,107 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed index.drd in 3 millis.
2019-01-08T13:15:28,119 INFO [wikipedia-incremental-persist] org.apache.druid.java.util.common.io.smoosh.FileSmoosher - Created smoosh file [/opt/druid-0.13.0/var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z/0/00000.smoosh] of size [3104336] bytes.
2019-01-08T13:15:28,184 INFO [wikipedia-incremental-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z], Hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=0}] already swapped. Ignoring request to persist.
2019-01-08T13:15:28,184 WARN [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Ingestion was throttled for [1,749] millis because persists were pending.
2019-01-08T13:15:28,184 INFO [wikipedia-incremental-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z], persisting Hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=1}]
2019-01-08T13:15:28,186 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Starting persist for interval[2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z], rows[14,244]
2019-01-08T13:15:28,263 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Using SegmentWriteOutMediumFactory[TmpFileSegmentWriteOutMediumFactory]
2019-01-08T13:15:28,264 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed version.bin in 0 millis.
2019-01-08T13:15:28,264 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed factory.json in 0 millis
2019-01-08T13:15:28,267 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] conversions with cardinality[50] in 2 millis.
2019-01-08T13:15:28,277 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] conversions with cardinality[491] in 7 millis.
2019-01-08T13:15:28,379 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] conversions with cardinality[7,653] in 98 millis.
2019-01-08T13:15:28,381 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] conversions with cardinality[88] in 1 millis.
2019-01-08T13:15:28,383 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] conversions with cardinality[88] in 2 millis.
2019-01-08T13:15:28,384 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:28,384 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:28,385 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:28,386 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:28,386 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:28,388 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] conversions with cardinality[69] in 1 millis.
2019-01-08T13:15:28,396 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] conversions with cardinality[197] in 8 millis.
2019-01-08T13:15:28,513 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] conversions with cardinality[12,879] in 116 millis.
2019-01-08T13:15:28,516 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] conversions with cardinality[262] in 2 millis.
2019-01-08T13:15:28,519 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] conversions with cardinality[319] in 3 millis.
2019-01-08T13:15:28,558 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] conversions with cardinality[4,682] in 39 millis.
2019-01-08T13:15:28,558 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed dim conversions in 293 millis.
2019-01-08T13:15:28,702 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - completed walk through of 14,244 rows in 136 millis.
2019-01-08T13:15:28,704 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed time column in 2 millis.
2019-01-08T13:15:28,705 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed metric columns in 0 millis.
2019-01-08T13:15:28,707 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] inverted with cardinality[50] in 2 millis.
2019-01-08T13:15:28,713 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] inverted with cardinality[491] in 5 millis.
2019-01-08T13:15:28,781 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] inverted with cardinality[7,653] in 66 millis.
2019-01-08T13:15:28,786 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] inverted with cardinality[88] in 2 millis.
2019-01-08T13:15:28,790 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] inverted with cardinality[88] in 2 millis.
2019-01-08T13:15:28,793 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] inverted with cardinality[2] in 1 millis.
2019-01-08T13:15:28,798 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] inverted with cardinality[2] in 1 millis.
2019-01-08T13:15:28,805 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] inverted with cardinality[2] in 1 millis.
2019-01-08T13:15:28,809 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] inverted with cardinality[2] in 1 millis.
2019-01-08T13:15:28,819 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] inverted with cardinality[2] in 0 millis.
2019-01-08T13:15:28,825 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] inverted with cardinality[69] in 2 millis.
2019-01-08T13:15:28,830 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] inverted with cardinality[197] in 4 millis.
2019-01-08T13:15:28,907 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] inverted with cardinality[12,879] in 74 millis.
2019-01-08T13:15:28,911 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] inverted with cardinality[262] in 3 millis.
2019-01-08T13:15:28,915 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] inverted with cardinality[319] in 3 millis.
2019-01-08T13:15:28,945 INFO [wikipedia-incremental-persist] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] inverted with cardinality[4,682] in 28 millis.
2019-01-08T13:15:28,955 INFO [wikipedia-incremental-persist] org.apache.druid.segment.IndexMergerV9 - Completed index.drd in 1 millis.
2019-01-08T13:15:28,955 INFO [wikipedia-incremental-persist] org.apache.druid.java.util.common.io.smoosh.FileSmoosher - Created smoosh file [/opt/druid-0.13.0/var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z/1/00000.smoosh] of size [1859797] bytes.
2019-01-08T13:15:28,978 INFO [appenderator_merge_0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Pushing merged index for segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z].
2019-01-08T13:15:28,979 INFO [appenderator_merge_0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Adding hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=0}]
2019-01-08T13:15:28,979 INFO [appenderator_merge_0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Adding hydrant[FireHydrant{, queryable=wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z, count=1}]
2019-01-08T13:15:28,984 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Using SegmentWriteOutMediumFactory[TmpFileSegmentWriteOutMediumFactory]
2019-01-08T13:15:28,985 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed version.bin in 0 millis.
2019-01-08T13:15:28,985 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed factory.json in 0 millis
2019-01-08T13:15:28,991 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[200]
2019-01-08T13:15:28,992 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[200]
2019-01-08T13:15:28,995 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] conversions with cardinality[51] in 9 millis.
2019-01-08T13:15:28,996 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[2,688]
2019-01-08T13:15:28,996 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[1,964]
2019-01-08T13:15:29,016 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] conversions with cardinality[1,006] in 21 millis.
2019-01-08T13:15:29,017 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[49,760]
2019-01-08T13:15:29,017 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[30,612]
2019-01-08T13:15:29,221 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] conversions with cardinality[19,314] in 204 millis.
2019-01-08T13:15:29,222 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[408]
2019-01-08T13:15:29,222 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[352]
2019-01-08T13:15:29,224 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] conversions with cardinality[114] in 2 millis.
2019-01-08T13:15:29,225 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[408]
2019-01-08T13:15:29,225 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[352]
2019-01-08T13:15:29,226 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] conversions with cardinality[114] in 2 millis.
2019-01-08T13:15:29,227 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,227 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,227 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:29,228 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,228 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,228 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] conversions with cardinality[2] in 0 millis.
2019-01-08T13:15:29,229 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,229 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,229 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:29,230 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,230 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,230 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:29,231 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,231 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[8]
2019-01-08T13:15:29,231 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] conversions with cardinality[2] in 1 millis.
2019-01-08T13:15:29,231 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[280]
2019-01-08T13:15:29,231 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[276]
2019-01-08T13:15:29,233 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] conversions with cardinality[93] in 1 millis.
2019-01-08T13:15:29,233 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[924]
2019-01-08T13:15:29,233 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[788]
2019-01-08T13:15:29,236 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] conversions with cardinality[265] in 3 millis.
2019-01-08T13:15:29,237 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[91,296]
2019-01-08T13:15:29,237 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[51,516]
2019-01-08T13:15:29,488 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] conversions with cardinality[35,143] in 252 millis.
2019-01-08T13:15:29,489 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[1,312]
2019-01-08T13:15:29,489 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[1,048]
2019-01-08T13:15:29,491 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] conversions with cardinality[408] in 3 millis.
2019-01-08T13:15:29,491 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[1,632]
2019-01-08T13:15:29,492 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[1,276]
2019-01-08T13:15:29,496 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] conversions with cardinality[539] in 5 millis.
2019-01-08T13:15:29,497 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[29,048]
2019-01-08T13:15:29,497 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Allocating dictionary merging direct buffer with size[18,728]
2019-01-08T13:15:29,548 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] conversions with cardinality[10,531] in 52 millis.
2019-01-08T13:15:29,549 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed dim conversions in 563 millis.
2019-01-08T13:15:29,568 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[1]
2019-01-08T13:15:29,569 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[2]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[3]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[4]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[5]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[6]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[7]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[8]
2019-01-08T13:15:29,570 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[9]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[10]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[11]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[12]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[13]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[14]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[15]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[16]
2019-01-08T13:15:29,571 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[17]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[18]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[19]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[20]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[21]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[22]
2019-01-08T13:15:29,572 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[23]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[24]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[25]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[26]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[27]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[28]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[29]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[30]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[31]
2019-01-08T13:15:29,573 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[32]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[33]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[34]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[35]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[36]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[37]
2019-01-08T13:15:29,574 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[38]
2019-01-08T13:15:29,575 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[39]
2019-01-08T13:15:29,575 INFO [appenderator_merge_0] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[40]
2019-01-08T13:15:30,038 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - completed walk through of 39,244 rows in 463 millis.
2019-01-08T13:15:30,041 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed time column in 3 millis.
2019-01-08T13:15:30,041 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed metric columns in 0 millis.
2019-01-08T13:15:30,093 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[channel] inverted with cardinality[51] in 52 millis.
2019-01-08T13:15:30,093 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[200]
2019-01-08T13:15:30,093 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[200]
2019-01-08T13:15:30,153 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[cityName] inverted with cardinality[1,006] in 56 millis.
2019-01-08T13:15:30,153 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[2,688]
2019-01-08T13:15:30,153 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[1,964]
2019-01-08T13:15:30,475 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[comment] inverted with cardinality[19,314] in 321 millis.
2019-01-08T13:15:30,475 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[49,760]
2019-01-08T13:15:30,475 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[30,612]
2019-01-08T13:15:30,506 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryIsoCode] inverted with cardinality[114] in 26 millis.
2019-01-08T13:15:30,507 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[408]
2019-01-08T13:15:30,507 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[352]
2019-01-08T13:15:30,532 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[countryName] inverted with cardinality[114] in 19 millis.
2019-01-08T13:15:30,532 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[408]
2019-01-08T13:15:30,533 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[352]
2019-01-08T13:15:30,545 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isAnonymous] inverted with cardinality[2] in 7 millis.
2019-01-08T13:15:30,545 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,545 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,563 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isMinor] inverted with cardinality[2] in 6 millis.
2019-01-08T13:15:30,563 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,563 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,591 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isNew] inverted with cardinality[2] in 7 millis.
2019-01-08T13:15:30,591 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,591 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,607 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isRobot] inverted with cardinality[2] in 6 millis.
2019-01-08T13:15:30,607 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,607 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,632 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[isUnpatrolled] inverted with cardinality[2] in 6 millis.
2019-01-08T13:15:30,632 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,632 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[8]
2019-01-08T13:15:30,652 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[metroCode] inverted with cardinality[93] in 6 millis.
2019-01-08T13:15:30,652 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[280]
2019-01-08T13:15:30,652 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[276]
2019-01-08T13:15:30,662 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[namespace] inverted with cardinality[265] in 8 millis.
2019-01-08T13:15:30,662 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[924]
2019-01-08T13:15:30,663 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[788]
2019-01-08T13:15:30,986 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[page] inverted with cardinality[35,143] in 320 millis.
2019-01-08T13:15:30,986 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[91,296]
2019-01-08T13:15:30,986 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[51,516]
2019-01-08T13:15:30,999 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionIsoCode] inverted with cardinality[408] in 8 millis.
2019-01-08T13:15:30,999 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[1,312]
2019-01-08T13:15:30,999 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[1,048]
2019-01-08T13:15:31,008 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[regionName] inverted with cardinality[539] in 8 millis.
2019-01-08T13:15:31,008 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[1,632]
2019-01-08T13:15:31,008 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[1,276]
2019-01-08T13:15:31,072 INFO [appenderator_merge_0] org.apache.druid.segment.StringDimensionMergerV9 - Completed dim[user] inverted with cardinality[10,531] in 62 millis.
2019-01-08T13:15:31,072 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[29,048]
2019-01-08T13:15:31,072 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMerger - Freeing dictionary merging direct buffer with size[18,728]
2019-01-08T13:15:31,083 INFO [appenderator_merge_0] org.apache.druid.segment.IndexMergerV9 - Completed index.drd in 1 millis.
2019-01-08T13:15:31,083 INFO [appenderator_merge_0] org.apache.druid.java.util.common.io.smoosh.FileSmoosher - Created smoosh file [/opt/druid-0.13.0/var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z/merged/00000.smoosh] of size [4826065] bytes.
2019-01-08T13:15:31,775 INFO [appenderator_merge_0] org.apache.druid.storage.hdfs.HdfsDataSegmentPusher - Copying segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z] to HDFS at location[hdfs://vm-hadoop:9000/druid/segments/wikipedia/20150912T000000.000Z_20150913T000000.000Z/2019-01-08T13_15_15.696Z]
2019-01-08T13:15:31,860 INFO [appenderator_merge_0] org.apache.druid.storage.hdfs.HdfsDataSegmentPusher - Compressing files from[var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z/merged] to [hdfs://vm-hadoop:9000/druid/segments/wikipedia/9421766ec5774aa9a5d91fb69c270027/0_index.zip]
2019-01-08T13:15:31,913 INFO [appenderator_merge_0] org.apache.druid.java.util.common.CompressionUtils - Adding file[var/druid/task/index_wikipedia_2019-01-08T13:15:15.682Z/work/persist/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z/merged/00000.smoosh] with size[4,826,065].  Total size so far[0]
2019-01-08T13:15:31,971 WARN [Thread-40] org.apache.hadoop.hdfs.DataStreamer - DataStreamer Exception
org.apache.hadoop.ipc.RemoteException: File /druid/segments/wikipedia/9421766ec5774aa9a5d91fb69c270027/0_index.zip could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-2.8.3.jar:?]
    at com.sun.proxy.$Proxy135.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) ~[hadoop-hdfs-client-2.8.3.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_191]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_191]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_191]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_191]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-2.8.3.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[hadoop-common-2.8.3.jar:?]
    at com.sun.proxy.$Proxy136.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) ~[hadoop-hdfs-client-2.8.3.jar:?]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) ~[hadoop-hdfs-client-2.8.3.jar:?]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) [hadoop-hdfs-client-2.8.3.jar:?]
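
The RemoteException above is returned by the NameNode at vm-hadoop:9000 over RPC, so the fully qualified URI is being resolved and the request does reach the remote cluster; the write then fails because the NameNode reports 0 live datanodes to place the block on. A minimal standalone probe that exercises the same write path using only the public Hadoop client API (a hypothetical helper, not part of Druid; the URI and `/druid/_probe` path are just this setup's values):

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical write probe: run from the Druid host to check whether
// the remote cluster's datanodes are reachable, not just the NameNode.
public class HdfsWriteProbe {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same fully qualified authority as druid.storage.storageDirectory.
        FileSystem fs = FileSystem.get(new URI("hdfs://vm-hadoop:9000"), conf);
        Path probe = new Path("/druid/_probe");
        try (FSDataOutputStream out = fs.create(probe)) {
            // Fails with the same "could only be replicated to 0 nodes"
            // error if no datanode is live/registered or reachable.
            out.writeBytes("probe");
        }
        fs.delete(probe, false);
        System.out.println("write OK");
    }
}
```

If this probe fails the same way from the Druid host, the problem is datanode availability or reachability rather than how the storageDirectory path is parsed.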
2019-01-08T13:15:32,002 WARN [appenderator_merge_0] org.apache.druid.java.util.common.RetryUtils - Retrying (1 of 4) in 1,224ms.
java.lang.IllegalArgumentException: Self-suppression not permitted
    at java.lang.Throwable.addSuppressed(Throwable.java:1043) ~[?:1.8.0_191]
    at org.apache.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:131) ~[?:?]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$mergeAndPush$4(AppenderatorImpl.java:740) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:86) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:114) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:104) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.mergeAndPush(AppenderatorImpl.java:736) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$push$1(AppenderatorImpl.java:623) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at com.google.common.util.concurrent.Futures$1.apply(Futures.java:713) [guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:861) [guava-16.0.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
Caused by: org.apache.hadoop.ipc.RemoteException: File /druid/segments/wikipedia/9421766ec5774aa9a5d91fb69c270027/0_index.zip could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[?:?]
    at com.sun.proxy.$Proxy135.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_191]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_191]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_191]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_191]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[?:?]
    at com.sun.proxy.$Proxy136.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) ~[?:?]
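
Note that every retry is then reported as java.lang.IllegalArgumentException: Self-suppression not permitted, which masks the underlying IOException: that message is thrown by Throwable.addSuppressed when an exception is added as its own suppressed exception, typically via try-with-resources when close() rethrows the same instance that already aborted the body (here presumably inside HdfsDataSegmentPusher.push at line 131). A minimal sketch of the mechanism, unrelated to Druid's code:

```java
// Demonstrates "Self-suppression not permitted": the generated
// try-with-resources code calls primary.addSuppressed(fromClose),
// and Throwable rejects the call when both are the same instance.
public class SelfSuppressionDemo {
    static final RuntimeException SHARED = new RuntimeException("boom");

    static class Resource implements AutoCloseable {
        @Override
        public void close() {
            throw SHARED; // close() rethrows the body's exception instance
        }
    }

    public static void main(String[] args) {
        try (Resource r = new Resource()) {
            throw SHARED; // primary exception == close()'s exception
        }
        // Terminates with:
        // java.lang.IllegalArgumentException: Self-suppression not permitted
    }
}
```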
[Attempts 2 through 5 repeat the same cycle verbatim: copy segment, compress to a fresh 0_index.zip (hashes faaef6c30c7e49459b85559b7d16d64e, d2113bde14fb4ec48d6340ea1f5ec974, ebed6400f1364afd859067792add0a90, 50582a7270764dcc8b9b95b6b1c253e8), then the identical DataStreamer RemoteException and Self-suppression warning, with RetryUtils backing off at 1,649 ms, 3,256 ms, and 6,373 ms. Duplicate traces trimmed for brevity.]
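
The backoff intervals in these warnings (1,224 ms, then 1,649 ms, 3,256 ms, 6,373 ms) roughly double per attempt, i.e. exponential backoff with jitter across the 4 retries. A minimal sketch of that pattern as I read it from the log, not Druid's actual RetryUtils implementation:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ThreadLocalRandom;

// Illustrative retry helper: base delay doubles per attempt with
// random jitter, matching the shape of the RetryUtils log lines.
public class RetrySketch {
    static <T> T retry(Callable<T> task, int maxTries, long baseMillis) throws Exception {
        for (int attempt = 1; ; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                if (attempt >= maxTries) {
                    throw e; // out of retries: surface the last failure
                }
                long sleepMillis = (long) (baseMillis * Math.pow(2, attempt - 1)
                        * (1.0 + ThreadLocalRandom.current().nextDouble(0.5)));
                System.out.printf("Retrying (%d of %d) in %,dms.%n",
                        attempt, maxTries - 1, sleepMillis);
                Thread.sleep(sleepMillis);
            }
        }
    }
}
```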
2019-01-08T13:15:44,789 WARN [appenderator_merge_0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Failed to push merged index for segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z].
java.lang.IllegalArgumentException: Self-suppression not permitted
    at java.lang.Throwable.addSuppressed(Throwable.java:1043) ~[?:1.8.0_191]
    at org.apache.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:131) ~[?:?]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$mergeAndPush$4(AppenderatorImpl.java:740) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:86) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:114) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:104) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.mergeAndPush(AppenderatorImpl.java:736) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$push$1(AppenderatorImpl.java:623) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at com.google.common.util.concurrent.Futures$1.apply(Futures.java:713) [guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:861) [guava-16.0.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
Caused by: org.apache.hadoop.ipc.RemoteException: File /druid/segments/wikipedia/50582a7270764dcc8b9b95b6b1c253e8/0_index.zip could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[?:?]
    at com.sun.proxy.$Proxy135.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_191]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_191]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_191]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_191]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[?:?]
    at com.sun.proxy.$Proxy136.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) ~[?:?]
2019-01-08T13:15:44,806 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Shutting down...
2019-01-08T13:15:44,824 INFO [appenderator_persist_0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Removing sink for segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-01-08T13:15:15.696Z].
2019-01-08T13:15:44,827 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Encountered exception in BUILD_SEGMENTS.
java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: Self-suppression not permitted
    at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
    at org.apache.druid.indexing.common.task.IndexTask.generateAndPublishSegments(IndexTask.java:1098) ~[druid-indexing-service-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.run(IndexTask.java:466) [druid-indexing-service-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:421) [druid-indexing-service-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:393) [druid-indexing-service-0.13.0-incubating.jar:0.13.0-incubating]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: Self-suppression not permitted
    at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299) ~[guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286) ~[guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116) ~[guava-16.0.1.jar:?]
    at org.apache.druid.segment.realtime.appenderator.BatchAppenderatorDriver.pushAndClear(BatchAppenderatorDriver.java:141) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.BatchAppenderatorDriver.pushAllAndClear(BatchAppenderatorDriver.java:124) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.generateAndPublishSegments(IndexTask.java:1060) ~[druid-indexing-service-0.13.0-incubating.jar:0.13.0-incubating]
    ... 7 more
Caused by: java.lang.IllegalArgumentException: Self-suppression not permitted
    at java.lang.Throwable.addSuppressed(Throwable.java:1043) ~[?:1.8.0_191]
    at org.apache.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:131) ~[?:?]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$mergeAndPush$4(AppenderatorImpl.java:740) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:86) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:114) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.java.util.common.RetryUtils.retry(RetryUtils.java:104) ~[java-util-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.mergeAndPush(AppenderatorImpl.java:736) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$push$1(AppenderatorImpl.java:623) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
    at com.google.common.util.concurrent.Futures$1.apply(Futures.java:713) ~[guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:861) ~[guava-16.0.1.jar:?]
    ... 3 more
Caused by: org.apache.hadoop.ipc.RemoteException: File /druid/segments/wikipedia/50582a7270764dcc8b9b95b6b1c253e8/0_index.zip could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[?:?]
    at com.sun.proxy.$Proxy135.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_191]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_191]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_191]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_191]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[?:?]
    at com.sun.proxy.$Proxy136.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) ~[?:?]
2019-01-08T13:15:44,841 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Unregistering chat handler[index_wikipedia_2019-01-08T13:15:15.682Z]
2019-01-08T13:15:44,841 INFO [task-runner-0-priority-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-01-08T13:15:15.682Z] status changed to [FAILED].
2019-01-08T13:15:44,843 INFO [task-runner-0-priority-0] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_wikipedia_2019-01-08T13:15:15.682Z",
  "status" : "FAILED",
  "duration" : 22650,
  "errorMsg" : "java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentExcept..."
}
2019-01-08T13:15:44,847 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.server.listener.announcer.ListenerResourceAnnouncer.stop()] on object[org.apache.druid.query.lookup.LookupResourceListenerAnnouncer@52909a97].
2019-01-08T13:15:44,847 INFO [main] org.apache.druid.curator.announcement.Announcer - unannouncing [/druid/listeners/lookups/__default/http:vm-druid:8100]
2019-01-08T13:15:44,873 INFO [main] org.apache.druid.server.listener.announcer.ListenerResourceAnnouncer - Unannouncing start time on [/druid/listeners/lookups/__default/http:vm-druid:8100]
2019-01-08T13:15:44,873 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.query.lookup.LookupReferencesManager.stop()] on object[org.apache.druid.query.lookup.LookupReferencesManager@18c820d2].
2019-01-08T13:15:44,873 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - LookupReferencesManager is stopping.
2019-01-08T13:15:44,873 INFO [LookupReferencesManager-MainThread] org.apache.druid.query.lookup.LookupReferencesManager - Lookup Management loop exited, Lookup notices are not handled anymore.
2019-01-08T13:15:44,873 INFO [main] org.apache.druid.query.lookup.LookupReferencesManager - LookupReferencesManager is stopped.
2019-01-08T13:15:44,873 INFO [main] org.apache.druid.server.initialization.jetty.JettyServerModule - Stopping Jetty Server...
2019-01-08T13:15:44,876 INFO [main] org.eclipse.jetty.server.AbstractConnector - Stopped ServerConnector@52b30054{HTTP/1.1,[http/1.1]}{0.0.0.0:8100}
2019-01-08T13:15:44,876 INFO [main] org.eclipse.jetty.server.session - node0 Stopped scavenging
2019-01-08T13:15:44,877 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@2e1add6f{/,null,UNAVAILABLE}
2019-01-08T13:15:44,880 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.indexing.worker.executor.ExecutorLifecycle.stop() throws java.lang.Exception] on object[org.apache.druid.indexing.worker.executor.ExecutorLifecycle@149aa7b2].
2019-01-08T13:15:44,880 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner.stop()] on object[org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner@7f0d8eff].
2019-01-08T13:15:44,880 INFO [main] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-01-08T13:15:15.682Z] status changed to [FAILED].
2019-01-08T13:15:44,888 INFO [main] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2019-01-08T13:15:44.881Z","service":"druid/middleManager","host":"vm-druid:8100","version":"0.13.0-incubating","metric":"task/interrupt/count","value":1,"dataSource":"wikipedia","error":"false","graceful":"false","task":"index_wikipedia_2019-01-08T13:15:15.682Z"}]
2019-01-08T13:15:44,888 INFO [main] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2019-01-08T13:15:44.888Z","service":"druid/middleManager","host":"vm-druid:8100","version":"0.13.0-incubating","metric":"task/interrupt/elapsed","value":0,"dataSource":"wikipedia","error":"false","graceful":"false","task":"index_wikipedia_2019-01-08T13:15:15.682Z"}]
2019-01-08T13:15:44,889 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.discovery.DruidLeaderClient.stop()] on object[org.apache.druid.discovery.DruidLeaderClient@76a9a009].
2019-01-08T13:15:44,889 INFO [main] org.apache.druid.discovery.DruidLeaderClient - Stopped.
2019-01-08T13:15:44,889 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[org.apache.druid.curator.discovery.ServerDiscoverySelector@2f4d32bf].
2019-01-08T13:15:44,890 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.curator.announcement.Announcer.stop()] on object[org.apache.druid.curator.announcement.Announcer@19fbc594].
2019-01-08T13:15:44,890 INFO [main] org.apache.druid.curator.announcement.Announcer - Stopping announcer
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.discovery.DruidLeaderClient.stop()] on object[org.apache.druid.discovery.DruidLeaderClient@1693ff90].
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.discovery.DruidLeaderClient - Stopped.
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[org.apache.druid.curator.discovery.ServerDiscoverySelector@245cb8df].
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider.stop()] on object[org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider@53ea380b].
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - stopping
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeDiscoveryProvider - stopped
2019-01-08T13:15:44,891 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.java.util.http.client.NettyHttpClient.stop()] on object[org.apache.druid.java.util.http.client.NettyHttpClient@78116659].
2019-01-08T13:15:44,929 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.storage.hdfs.HdfsStorageAuthentication.stop()] on object[org.apache.druid.storage.hdfs.HdfsStorageAuthentication@e3899fd].
2019-01-08T13:15:44,930 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.java.util.metrics.MonitorScheduler.stop()] on object[org.apache.druid.java.util.metrics.MonitorScheduler@24d8f87a].
2019-01-08T13:15:44,933 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.java.util.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[ServiceEmitter{serviceDimensions={service=druid/middleManager, host=vm-druid:8100, version=0.13.0-incubating}, emitter=LoggingEmitter{log=Logger{name=[org.apache.druid.java.util.emitter.core.LoggingEmitter], class[class org.apache.logging.slf4j.Log4jLogger]}, level=INFO}}].
2019-01-08T13:15:44,933 INFO [main] org.apache.druid.java.util.emitter.core.LoggingEmitter - Close: started [false]
2019-01-08T13:15:44,933 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void org.apache.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner.stop()] on object[org.apache.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner@226d5af0].
swd543 commented 5 years ago

I fixed this issue by copying the Hadoop configuration XMLs from $HADOOP_HOME/etc/hadoop/ into $DRUID_HOME/conf/druid/_common/, which puts them on the classpath of every Druid process. Without them, Druid's HDFS client was running on default Hadoop settings, and segment pushes failed with the "could only be replicated to 0 nodes" error above even though the NameNode itself was reachable. Just a reminder to myself to read the docs before making a fool of myself.
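
For anyone hitting the same error, a minimal sketch of the fix, assuming the stock Hadoop config layout and the $DRUID_HOME paths seen in this log (adjust both to your install):

# Put the Hadoop client configuration (core-site.xml, hdfs-site.xml, etc.)
# on the Druid classpath so the hdfs:// scheme resolves against the real cluster
cp $HADOOP_HOME/etc/hadoop/*.xml $DRUID_HOME/conf/druid/_common/

# common.runtime.properties -- the fully qualified deep storage path itself is fine
druid.storage.type=hdfs
druid.storage.storageDirectory=hdfs://vm-hadoop:9000/druid/segments

Restart the Druid services afterwards; the _common directory is read at JVM startup, so the new XMLs are only picked up on restart.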