apache / druid

Apache Druid: a high performance real-time analytics database.
https://druid.apache.org/
Apache License 2.0

IOException: No FileSystem for scheme: hdfs #1813

Closed: mark1900 closed this issue 8 years ago

mark1900 commented 9 years ago

I merged the changes from #1721 (https://github.com/druid-io/druid/pull/1721/files) into the Druid release tagged "druid-0.8.1", and the issue that change is supposed to address still occurs. See my previous comments on #1022, where the fix in #1721, when merged against the release tagged "druid-0.8.0", seems to work. Other related issues: #1721, #1714.

2015-10-08T17:39:01,106 INFO [druid_datasource_01-2015-10-08T17:30:00.000Z-persist-n-merge] io.druid.storage.hdfs.HdfsDataSegmentPusher - Copying segment[druid_datasource_01_2015-10-08T17:30:00.000Z_2015-10-08T17:31:00.000Z_2015-10-08T17:32:20.971Z] to HDFS at location[hdfs://server1:9000/druid/druid-hdfs-storage/druid_datasource_01/20151008T173000.000Z_20151008T173100.000Z/2015-10-08T17_32_20.971Z/0]
2015-10-08T17:39:01,109 ERROR [druid_datasource_01-2015-10-08T17:30:00.000Z-persist-n-merge] io.druid.segment.realtime.plumber.RealtimePlumber - Failed to persist merged index[druid_datasource_01]: {class=io.druid.segment.realtime.plumber.RealtimePlumber, exceptionType=class java.io.IOException, exceptionMessage=No FileSystem for scheme: hdfs, interval=2015-10-08T17:30:00.000Z/2015-10-08T17:31:00.000Z}
java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2304) ~[?:?]
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311) ~[?:?]
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90) ~[?:?]
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350) ~[?:?]
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332) ~[?:?]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369) ~[?:?]
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296) ~[?:?]
        at io.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:83) ~[?:?]
        at io.druid.segment.realtime.plumber.RealtimePlumber$4.doRun(RealtimePlumber.java:454) [druid-server-0.8.1.jar:0.8.1]
        at io.druid.common.guava.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:40) [druid-common-0.8.1.jar:0.8.1]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_60]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_60]
        at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2015-10-08T17:39:01,133 INFO [druid_datasource_01-2015-10-08T17:30:00.000Z-persist-n-merge] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2015-10-08T17:39:01.131Z","service":"overlord","host":"server1:8100","severity":"component-failure","description":"Failed to persist merged index[druid_datasource_01]","data":{"class":"io.druid.segment.realtime.plumber.RealtimePlumber","exceptionType":"java.io.IOException","exceptionMessage":"No FileSystem for scheme: hdfs","exceptionStackTrace":"java.io.IOException: No FileSystem for scheme: hdfs\n\tat org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2304)\n\tat org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311)\n\tat org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)\n\tat org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)\n\tat org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)\n\tat org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)\n\tat org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)\n\tat io.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:83)\n\tat io.druid.segment.realtime.plumber.RealtimePlumber$4.doRun(RealtimePlumber.java:454)\n\tat io.druid.common.guava.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:40)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat java.lang.Thread.run(Thread.java:745)\n","interval":"2015-10-08T17:30:00.000Z/2015-10-08T17:31:00.000Z"}}]
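
For context on this failure mode: in Hadoop 2.x, FileSystem.getFileSystemClass resolves a URI scheme either from an fs.<scheme>.impl entry in the Configuration or from implementations registered through META-INF/services/org.apache.hadoop.fs.FileSystem on the classpath, so the exception usually means the hdfs scheme was not visible to whichever classloader first triggered FileSystem's service discovery. Below is a minimal standalone sketch of the common workaround for this Hadoop error (plain Hadoop client code, not Druid's fix; the NameNode address is taken from the log above):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsSchemeCheck
{
  public static void main(String[] args) throws Exception
  {
    Configuration conf = new Configuration();

    // Bind the scheme to the HDFS implementation explicitly, so resolution does
    // not depend on the ServiceLoader entries in hadoop-hdfs.jar being visible
    // to the classloader that ends up calling FileSystem.get().
    conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
    conf.set("fs.defaultFS", "hdfs://server1:9000");

    Path deepStorage = new Path("hdfs://server1:9000/druid/druid-hdfs-storage");

    // With the explicit binding this resolves to DistributedFileSystem; without
    // it, the same "No FileSystem for scheme: hdfs" IOException is thrown here.
    FileSystem fs = deepStorage.getFileSystem(conf);
    System.out.println(fs.getClass().getName());
  }
}
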
himanshug commented 9 years ago

Is it working with 0.8.0 and not working with 0.8.1?

mark1900 commented 9 years ago

Yes, that patch seems to work with 0.8.0, though it seems to fail with 0.8.1. The above stack trace is from a Druid 0.8.1 instance with the patch from #1721 applied.

himanshug commented 9 years ago

Hmmm, I don't think that's related to that change... let me try something else by doing the setup earlier, at class load time... can you apply the https://github.com/druid-io/druid/compare/master...himanshug:hdfs_no_scheme patch and see if that fixes things?
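
For readers following along, the branch name suggests forcing the Hadoop FileSystem setup to run while the extension's own classloader is the thread context classloader, which is the usual pattern when the extension jars (here hadoop-hdfs) live in an isolated classloader. A rough, illustrative sketch of that pattern follows; it is not the actual patch (that lives in the compare link above) and the class name is made up:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsSetupAtClassLoad
{
  // Runs once, when the extension classloader loads this class.
  static {
    ClassLoader previous = Thread.currentThread().getContextClassLoader();
    try {
      // Make Hadoop's ServiceLoader-based FileSystem discovery use the
      // classloader that can actually see hadoop-hdfs and its
      // META-INF/services/org.apache.hadoop.fs.FileSystem entry.
      Thread.currentThread().setContextClassLoader(HdfsSetupAtClassLoad.class.getClassLoader());

      // Touching FileSystem here populates its internal service cache, so a
      // later lookup of the "hdfs" scheme from a worker thread can succeed.
      FileSystem.getLocal(new Configuration());
    }
    catch (Exception e) {
      throw new RuntimeException("HDFS setup at class load time failed", e);
    }
    finally {
      Thread.currentThread().setContextClassLoader(previous);
    }
  }
}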

mark1900 commented 9 years ago

I applied the patch given here: https://github.com/druid-io/druid/compare/master...himanshug:hdfs_no_scheme to the HdfsStorageDruidModule (tag "druid-0.8.1"), and I still get the exception given above.

himanshug commented 9 years ago

Can you attach your log? Also, did you see this log line:

LOG.info("Setting up Hdfs Storage Module with classloader used for loading the class");

mark1900 commented 9 years ago

Yes, I see the log entry you mentioned.

2015-10-09T18:44:11,957 INFO [main] io.druid.storage.hdfs.HdfsStorageDruidModule - Setting up Hdfs Storage Module with classloader used for loading the class

More detailed logs are given below:

2015-10-09T18:44:07,217 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-09T18:44:07,220 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
Oct 09, 2015 6:44:07 PM org.hibernate.validator.internal.util.Version <clinit> 
INFO: HV000001: Hibernate Validator 5.1.3.Final
2015-10-09T18:44:08,064 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[io.druid.extensions:postgresql-metadata-storage, io.druid.extensions:druid-kafka-eight, io.druid.extensions:druid-hdfs-storage, io.druid.extensions:druid-histogram], defaultVersion='0.8.1', localRepository='extensions-repo', remoteRepositories=[https://repo1.maven.org/maven2/, https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local]}]
2015-10-09T18:44:08,273 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:postgresql-metadata-storage] for class[io.druid.cli.CliCommandCreator]
2015-10-09T18:44:09,509 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/druid/extensions/postgresql-metadata-storage/0.8.1/postgresql-metadata-storage-0.8.1.jar]
2015-10-09T18:44:09,525 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/postgresql/postgresql/9.3-1102-jdbc41/postgresql-9.3-1102-jdbc41.jar]
2015-10-09T18:44:09,525 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/jdbi/jdbi/2.32/jdbi-2.32.jar]
2015-10-09T18:44:09,526 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-kafka-eight] for class[io.druid.cli.CliCommandCreator]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/druid/extensions/druid-kafka-eight/0.8.1/druid-kafka-eight-0.8.1.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/kafka/kafka_2.10/0.8.2.1/kafka_2.10-0.8.2.1.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/yammer/metrics/metrics-core/2.2.0/metrics-core-2.2.0.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/kafka/kafka-clients/0.8.2.1/kafka-clients-0.8.2.1.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/net/jpountz/lz4/lz4/1.3.0/lz4-1.3.0.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/xerial/snappy/snappy-java/1.1.1.6/snappy-java-1.1.1.6.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar]
2015-10-09T18:44:09,813 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar]
2015-10-09T18:44:09,814 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/log4j/log4j/1.2.16/log4j-1.2.16.jar]
2015-10-09T18:44:09,814 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/jline/jline/0.9.94/jline-0.9.94.jar]
2015-10-09T18:44:09,814 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/netty/netty/3.7.0.Final/netty-3.7.0.Final.jar]
2015-10-09T18:44:09,814 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/net/sf/jopt-simple/jopt-simple/3.2/jopt-simple-3.2.jar]
2015-10-09T18:44:09,814 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/101tec/zkclient/0.3/zkclient-0.3.jar]
2015-10-09T18:44:09,818 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-hdfs-storage] for class[io.druid.cli.CliCommandCreator]
2015-10-09T18:44:10,860 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/druid/extensions/druid-hdfs-storage/0.8.1/druid-hdfs-storage-0.8.1.jar]
2015-10-09T18:44:10,860 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/net/java/dev/jets3t/jets3t/0.9.3/jets3t-0.9.3.jar]
2015-10-09T18:44:10,860 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/codehaus/jackson/jackson-core-asl/1.9.12/jackson-core-asl-1.9.12.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/codehaus/jackson/jackson-mapper-asl/1.9.12/jackson-mapper-asl-1.9.12.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/activation/activation/1.1.1/activation-1.1.1.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/mx4j/mx4j/3.0.2/mx4j-3.0.2.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/mail/mail/1.4.7/mail-1.4.7.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/bouncycastle/bcprov-jdk15on/1.51/bcprov-jdk15on-1.51.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/httpcomponents/httpclient/4.2/httpclient-4.2.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/httpcomponents/httpcore/4.2/httpcore-4.2.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-client/2.3.0/hadoop-client-2.3.0.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-common/2.3.0/hadoop-common-2.3.0.jar]
2015-10-09T18:44:10,861 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/xmlenc/xmlenc/0.52/xmlenc-0.52.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-net/commons-net/3.1/commons-net-3.1.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/log4j/log4j/1.2.17/log4j-1.2.17.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-lang/commons-lang/2.6/commons-lang-2.6.jar]
2015-10-09T18:44:10,862 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar]
2015-10-09T18:44:10,863 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-digester/commons-digester/1.8/commons-digester-1.8.jar]
2015-10-09T18:44:10,863 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar]
2015-10-09T18:44:10,863 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar]
2015-10-09T18:44:10,863 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar]
2015-10-09T18:44:10,864 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/avro/avro/1.7.4/avro-1.7.4.jar]
2015-10-09T18:44:10,864 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar]
2015-10-09T18:44:10,864 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar]
2015-10-09T18:44:10,864 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar]
2015-10-09T18:44:10,864 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-auth/2.3.0/hadoop-auth-2.3.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/tukaani/xz/1.0/xz-1.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-hdfs/2.3.0/hadoop-hdfs-2.3.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-app/2.3.0/hadoop-mapreduce-client-app-2.3.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-common/2.3.0/hadoop-mapreduce-client-common-2.3.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-yarn-client/2.3.0/hadoop-yarn-client-2.3.0.jar]
2015-10-09T18:44:10,865 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-yarn-server-common/2.3.0/hadoop-yarn-server-common-2.3.0.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.3.0/hadoop-mapreduce-client-shuffle-2.3.0.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-yarn-api/2.3.0/hadoop-yarn-api-2.3.0.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-core/2.3.0/hadoop-mapreduce-client-core-2.3.0.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-yarn-common/2.3.0/hadoop-yarn-common-2.3.0.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar]
2015-10-09T18:44:10,866 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/sun/jersey/jersey-core/1.19/jersey-core-1.19.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/javax/ws/rs/jsr311-api/1.1.1/jsr311-api-1.1.1.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.3.0/hadoop-mapreduce-client-jobclient-2.3.0.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/org/apache/hadoop/hadoop-annotations/2.3.0/hadoop-annotations-2.3.0.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/metamx/emitter/0.3.1/emitter-0.3.1.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/com/metamx/http-client/1.0.2/http-client-1.0.2.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/netty/netty/3.9.5.Final/netty-3.9.5.Final.jar]
2015-10-09T18:44:10,867 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/commons-io/commons-io/2.0.1/commons-io-2.0.1.jar]
2015-10-09T18:44:10,932 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-histogram] for class[io.druid.cli.CliCommandCreator]
2015-10-09T18:44:11,397 INFO [main] io.druid.initialization.Initialization - Added URL[file:/opt/druid/druid-0.8.1/extensions-repo/io/druid/extensions/druid-histogram/0.8.1/druid-histogram-0.8.1.jar]
2015-10-09T18:44:11,935 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:postgresql-metadata-storage] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:11,936 INFO [main] io.druid.initialization.Initialization - Adding remote extension module[io.druid.metadata.storage.postgresql.PostgreSQLMetadataStorageModule] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:11,936 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-kafka-eight] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:11,955 INFO [main] io.druid.initialization.Initialization - Adding remote extension module[io.druid.firehose.kafka.KafkaEightDruidModule] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:11,955 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-hdfs-storage] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:11,957 INFO [main] io.druid.storage.hdfs.HdfsStorageDruidModule - Setting up Hdfs Storage Module with classloader used for loading the class
2015-10-09T18:44:12,780 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-10-09T18:44:13,358 INFO [main] io.druid.initialization.Initialization - Adding remote extension module[io.druid.storage.hdfs.HdfsStorageDruidModule] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:13,359 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-histogram] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:13,360 INFO [main] io.druid.initialization.Initialization - Adding remote extension module[io.druid.query.aggregation.histogram.ApproximateHistogramDruidModule] for class[io.druid.initialization.DruidModule]
2015-10-09T18:44:15,177 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2015-10-09T18:44:15,265 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@4acb7ecc]
2015-10-09T18:44:15,273 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[]}]
2015-10-09T18:44:15,310 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='overlord', host='server1', port=8100}]
2015-10-09T18:44:15,338 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [ServerConfig{numThreads=40, maxIdleTime=PT5M}]
2015-10-09T18:44:15,350 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@4993febc]
2015-10-09T18:44:15,354 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.http.DruidHttpClientConfig@177515d1]
2015-10-09T18:44:15,430 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@79b2852b]
2015-10-09T18:44:15,434 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig@57f847af]
2015-10-09T18:44:15,451 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2015-10-09T18:44:15,517 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@6ed922e1]
2015-10-09T18:44:15,861 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@4f2d995e]
2015-10-09T18:44:15,863 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.hdfs.HdfsDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.storage.hdfs.HdfsDataSegmentPusherConfig@40729f01]
2015-10-09T18:44:15,864 INFO [main] io.druid.storage.hdfs.HdfsDataSegmentPusher - Configured HDFS as deep storage
2015-10-09T18:44:15,917 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@21a9a705]
2015-10-09T18:44:15,920 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@712c5463]
2015-10-09T18:44:15,938 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@58d3f4be]
2015-10-09T18:44:15,943 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.BatchDataSegmentAnnouncerProvider@3a6045c6]
2015-10-09T18:44:15,971 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.FilteredServerViewProvider] from props[druid.announcer.] as [io.druid.client.FilteredBatchServerViewProvider@58af5076]
2015-10-09T18:44:15,992 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.computation.buffer.size] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-10-09T18:44:15,993 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [1] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-10-09T18:44:15,994 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-10-09T18:44:15,995 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-10-09T18:44:16,144 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@630bf683]
2015-10-09T18:44:16,148 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@4dac121d]
2015-10-09T18:44:16,170 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@f438904]
2015-10-09T18:44:16,233 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@1b01a0d]
2015-10-09T18:44:16,255 INFO [main] org.eclipse.jetty.util.log - Logging initialized @11012ms
2015-10-09T18:44:16,420 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@54336976].
2015-10-09T18:44:16,420 INFO [main] com.metamx.emitter.core.LoggingEmitter - Start: started [true]
2015-10-09T18:44:16,421 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@3b2e5c0d].
2015-10-09T18:44:16,421 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@7499eac7].
2015-10-09T18:44:16,434 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.NettyHttpClient.start()] on object[com.metamx.http.client.NettyHttpClient@8bd076a].
2015-10-09T18:44:16,434 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2015-10-09T18:44:16,434 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2015-10-09T18:44:16,455 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=server1
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_60
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.60-2.b27.el7_1.x86_64/jre
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=config/_common:config/overlord:lib/druid-services-0.8.1.jar:lib/druid-common-0.8.1.jar:lib/java-util-0.27.0.jar:lib/slf4j-api-1.6.4.jar:lib/config-magic-0.9.jar:lib/guava-16.0.1.jar:lib/jackson-annotations-2.4.4.jar:lib/jackson-core-2.4.4.jar:lib/jackson-databind-2.4.4.jar:lib/opencsv-2.3.jar:lib/joda-time-2.6.jar:lib/druid-api-0.3.9.jar:lib/guice-4.0-beta.jar:lib/javax.inject-1.jar:lib/aopalliance-1.0.jar:lib/guice-multibindings-4.0-beta.jar:lib/airline-0.6.jar:lib/jackson-dataformat-smile-2.4.4.jar:lib/hibernate-validator-5.1.3.Final.jar:lib/validation-api-1.1.0.Final.jar:lib/jboss-logging-3.1.3.GA.jar:lib/classmate-1.0.0.jar:lib/commons-io-2.0.1.jar:lib/commons-codec-1.7.jar:lib/commons-dbcp2-2.0.1.jar:lib/commons-pool2-2.2.jar:lib/commons-logging-1.1.1.jar:lib/commons-pool-1.6.jar:lib/javax.el-api-3.0.0.jar:lib/jackson-datatype-guava-2.4.4.jar:lib/jackson-datatype-joda-2.4.4.jar:lib/jdbi-2.32.jar:lib/jsr305-2.0.1.jar:lib/log4j-api-2.2.jar:lib/log4j-core-2.2.jar:lib/log4j-slf4j-impl-2.2.jar:lib/log4j-jul-2.2.jar:lib/log4j-1.2-api-2.2.jar:lib/jcl-over-slf4j-1.7.10.jar:lib/disruptor-3.3.0.jar:lib/druid-server-0.8.1.jar:lib/druid-processing-0.8.1.jar:lib/bytebuffer-collections-0.1.6.jar:lib/extendedset-1.3.9.jar:lib/RoaringBitmap-0.4.5.jar:lib/emitter-0.3.1.jar:lib/http-client-1.0.2.jar:lib/netty-3.9.5.Final.jar:lib/compress-lzf-1.0.3.jar:lib/protobuf-java-2.5.0.jar:lib/icu4j-4.8.1.jar:lib/rhino-1.7R5.jar:lib/lz4-1.3.0.jar:lib/mapdb-1.0.7.jar:lib/druid-aws-common-0.8.1.jar:lib/aws-java-sdk-1.8.11.jar:lib/aws-java-sdk-core-1.8.11.jar:lib/httpclient-4.2.jar:lib/httpcore-4.2.jar:lib/druid-console-0.0.2.jar:lib/server-metrics-0.2.0.jar:lib/commons-cli-1.2.jar:lib/commons-lang-2.6.jar:lib/javax.el-3.0.0.jar:lib/curator-framework-2.8.0.jar:lib/curator-client-2.8.0.jar:lib/zookeeper-3.4.6.jar:lib/jline-0.9.94.jar:lib/curator-x-discovery-2.8.0.jar:lib/curator-recipes-2.8.0.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/jackson-core-asl-1.9.12.jar:lib/jackson-jaxrs-json-provider-2.4.4.jar:lib/jackson-jaxrs-base-2.4.4.jar:lib/jackson-module-jaxb-annotations-2.4.4.jar:lib/jackson-jaxrs-smile-provider-2.4.4.jar:lib/jersey-server-1.19.jar:lib/jersey-core-1.19.jar:lib/jsr311-api-1.1.1.jar:lib/guice-servlet-4.0-beta.jar:lib/jersey-guice-1.19.jar:lib/jersey-servlet-1.19.jar:lib/jetty-server-9.2.5.v20141112.jar:lib/javax.servlet-api-3.1.0.jar:lib/jetty-http-9.2.5.v20141112.jar:lib/jetty-util-9.2.5.v20141112.jar:lib/jetty-io-9.2.5.v20141112.jar:lib/jetty-proxy-9.2.5.v20141112.jar:lib/jetty-client-9.2.5.v20141112.jar:lib/tesla-aether-0.0.5.jar:lib/aether-api-0.9.0.M2.jar:lib/aether-spi-0.9.0.M2.jar:lib/aether-util-0.9.0.M2.jar:lib/aether-impl-0.9.0.M2.jar:lib/aether-connector-file-0.9.0.M2.jar:lib/aether-connector-okhttp-0.0.9.jar:lib/okhttp-1.0.2.jar:lib/wagon-provider-api-2.4.jar:lib/plexus-utils-3.0.15.jar:lib/maven-aether-provider-3.1.1.jar:lib/maven-model-3.1.1.jar:lib/maven-model-builder-3.1.1.jar:lib/plexus-interpolation-1.19.jar:lib/maven-repository-metadata-3.1.1.jar:lib/maven-settings-builder-3.1.1.jar:lib/irc-api-1.0-0011.jar:lib/maven-settings-3.1.1.jar:lib/antlr4-runtime-4.0.jar:lib/org.abego.treelayout.core-1.0.1.jar:lib/spymemcached-2.11.7.jar:lib/jetty-servlet-9.2.5.v20141112.jar:lib/jetty-security-9.2.5.v20141112.jar:lib/jetty-servlets-9.2.5.v20141112.jar:lib/jetty-continuation-9.2.5.v20141112.jar:lib/geoip2-0.4.0.jar:lib/maxminddb-0.2.0.jar:lib/google-http-client-1.15.0-rc.jar:lib/xpp3-1.1.4c.j
ar:lib/google-http-client-jackson2-1.15.0-rc.jar:lib/derby-10.11.1.1.jar:lib/derbynet-10.11.1.1.jar:lib/derbyclient-10.11.1.1.jar:lib/druid-indexing-hadoop-0.8.1.jar:lib/jets3t-0.9.3.jar:lib/activation-1.1.1.jar:lib/mx4j-3.0.2.jar:lib/mail-1.4.7.jar:lib/bcprov-jdk15on-1.51.jar:lib/java-xmlbuilder-0.4.jar:lib/druid-indexing-service-0.8.1.jar:/opt/druid/hadoop-2.7.0/etc
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.10.0-229.14.1.el7.x86_64
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=druid
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/home/druid
2015-10-09T18:44:16,456 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/opt/druid/druid-0.8.1
2015-10-09T18:44:16,457 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=server1 sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@5e230fc6
2015-10-09T18:44:16,542 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@2349f14d].
2015-10-09T18:44:16,573 INFO [main-SendThread(server1:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server server1/10.10.20.20:2181. Will not attempt to authenticate using SASL (unknown error)
2015-10-09T18:44:16,583 INFO [main-SendThread(server1:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to server1/10.10.20.20:2181, initiating session
2015-10-09T18:44:16,597 INFO [main-SendThread(server1:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server server1/10.10.20.20:2181, sessionid = 0x150482f7f89001b, negotiated timeout = 30000
2015-10-09T18:44:16,601 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2015-10-09T18:44:17,015 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@1cc8416a].
2015-10-09T18:44:17,016 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.client.ServerInventoryView.start() throws java.lang.Exception] on object[io.druid.client.BatchServerInventoryView@68ab6ab0].
2015-10-09T18:44:17,058 INFO [main] org.eclipse.jetty.server.Server - jetty-9.2.5.v20141112
2015-10-09T18:44:18,120 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/server1:8083
2015-10-09T18:44:18,120 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='server1:8083', host='server1:8083', maxSize=10000000000, tier='_default_tier', type='historical', priority='0'}]
2015-10-09T18:44:18,151 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - Inventory Initialized
Oct 09, 2015 6:44:18 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
Oct 09, 2015 6:44:18 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.StatusResource as a root resource class
Oct 09, 2015 6:44:18 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
Oct 09, 2015 6:44:18 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
Oct 09, 2015 6:44:19 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
Oct 09, 2015 6:44:19 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
Oct 09, 2015 6:44:19 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2015-10-09T18:44:19,350 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@42c0a16e{/,null,AVAILABLE}
2015-10-09T18:44:19,357 INFO [main] org.eclipse.jetty.server.ServerConnector - Started ServerConnector@5c3d4f05{HTTP/1.1}{0.0.0.0:8100}
2015-10-09T18:44:19,357 INFO [main] org.eclipse.jetty.server.Server - Started @14131ms
2015-10-09T18:44:19,357 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@119290b9].
2015-10-09T18:44:19,358 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='server1:8100', host='server1:8100', maxSize=0, tier='_default_tier', type='realtime', priority='0'}] at [/druid/announcements/server1:8100]
2015-10-09T18:44:19,376 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/server1:8101
2015-10-09T18:44:19,377 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='server1:8101', host='server1:8101', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2015-10-09T18:44:19,470 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@2dbfa972].
2015-10-09T18:44:19,630 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/server1:8100
2015-10-09T18:44:19,631 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='server1:8100', host='server1:8100', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2015-10-09T18:44:19,785 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-09T18:44:19,786 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-09T18:44:19,798 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.computation.buffer.size] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-10-09T18:44:19,798 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [1] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-10-09T18:44:19,798 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-10-09T18:44:19,798 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-10-09T18:44:19,815 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.segment.data.BitmapSerdeFactory] from props[druid.processing.bitmap.] as [ConciseBitmapSerdeFactory{}]
2015-10-09T18:44:19,896 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index_realtime",
  "id" : "index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0",
  "resource" : {
    "availabilityGroup" : "druid_datasource_01-41-0000",
    "requiredCapacity" : 1
  },
  "spec" : {
    "dataSchema" : {
      "dataSource" : "druid_datasource_01",
      "parser" : {
        "type" : "map",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "dimE",
            "format" : "auto",
            "missingValue" : null
          },
          "dimensionsSpec" : {
            "dimensions" : [ "dimA", "dimB", "dimC", "dimD", "dimE", "dimF", "dimG", "dimH", "dimI", "dimJ", "dimK", "dimL" ],
            "dimensionExclusions" : [ "numeric1", "numeric2" ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "count",
        "name" : "count"
      }, {
        "type" : "longSum",
        "name" : "numeric1",
        "fieldName" : "numeric1"
      }, {
        "type" : "longSum",
        "name" : "numeric2",
        "fieldName" : "numeric2"
      }, {
        "type" : "approxHistogram",
        "name" : "dwellHistogram",
        "fieldName" : "numeric2",
        "resolution" : 50,
        "numBuckets" : 7,
        "lowerLimit" : "-Infinity",
        "upperLimit" : "Infinity"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "MINUTE",
        "queryGranularity" : {
          "type" : "duration",
          "duration" : 60000,
          "origin" : "1970-01-01T00:00:00.000Z"
        },
        "intervals" : null
      }
    },
    "ioConfig" : {
      "type" : "realtime",
      "firehose" : {
        "type" : "clipped",
        "delegate" : {
          "type" : "timed",
          "delegate" : {
            "type" : "receiver",
            "serviceName" : "druid:firehose:druid_datasource_01-41-0000-0000",
            "bufferSize" : 100000
          },
          "shutoffTime" : "2015-10-09T14:50:00.000-04:00"
        },
        "interval" : "2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z"
      }
    },
    "tuningConfig" : {
      "type" : "realtime",
      "maxRowsInMemory" : 75000,
      "intermediatePersistPeriod" : "PT10M",
      "windowPeriod" : "PT3M",
      "basePersistDirectory" : "/opt/druid/druid-0.8.1/C:\\Users\\user\\AppData\\Local\\Temp\\1444416243986-0",
      "versioningPolicy" : {
        "type" : "intervalStart"
      },
      "rejectionPolicy" : {
        "type" : "none"
      },
      "maxPendingPersists" : 0,
      "shardSpec" : {
        "type" : "linear",
        "partitionNum" : 0
      },
      "indexSpec" : {
        "bitmap" : {
          "type" : "concise"
        },
        "dimensionCompression" : null,
        "metricCompression" : null
      },
      "persistInHeap" : false,
      "ingestOffheap" : false,
      "aggregationBufferRatio" : 0.5,
      "bufferSize" : 134217728
    }
  },
  "groupId" : "index_realtime_druid_datasource_01",
  "dataSource" : "druid_datasource_01"
}
2015-10-09T18:44:19,927 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0
2015-10-09T18:44:19,928 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0]: LockListAction{}
2015-10-09T18:44:19,975 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0] to overlord[http://server1:8090/druid/indexer/v1/action]: LockListAction{}
2015-10-09T18:44:19,988 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,098 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,098 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,098 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,099 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,099 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,117 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,118 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,118 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,118 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,118 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,119 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,119 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,119 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,119 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,120 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,120 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,120 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,120 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,121 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:44:20,230 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Creating plumber using rejectionPolicy[io.druid.segment.realtime.plumber.NoopRejectionPolicyFactory$1@7749daa1]
2015-10-09T18:44:20,234 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Expect to run at [2015-10-09T18:48:00.000Z]
2015-10-09T18:44:20,238 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2015-10-09T18:44:20,239 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] segments. Attempting to hand off segments that start before [1970-01-01T00:00:00.000Z].
2015-10-09T18:44:20,239 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2015-10-09T18:44:20,242 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Connecting firehose: druid:firehose:druid_datasource_01-41-0000-0000
2015-10-09T18:44:20,242 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Found chathandler of class[io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
2015-10-09T18:44:20,243 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[druid:firehose:druid_datasource_01-41-0000-0000]
2015-10-09T18:44:20,247 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='druid:firehose:druid_datasource_01-41-0000-0000', host='server1', port=8100}]
2015-10-09T18:44:20,284 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[druid_datasource_01-41-0000-0000]
2015-10-09T18:44:20,284 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='druid_datasource_01-41-0000-0000', host='server1', port=8100}]
2015-10-09T18:44:20,348 INFO [task-runner-0] io.druid.data.input.FirehoseFactory - Firehose created, will shut down at: 2015-10-09T14:50:00.000-04:00
2015-10-09T18:44:23,509 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0]: LockAcquireAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:44:23,512 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0] to overlord[http://server1:8090/druid/indexer/v1/action]: LockAcquireAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:44:23,548 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0]: LockAcquireAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:44:23,550 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0] to overlord[http://server1:8090/druid/indexer/v1/action]: LockAcquireAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:44:23,604 INFO [task-runner-0] io.druid.server.coordination.BatchDataSegmentAnnouncer - Announcing segment[druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z] at path[/druid/segments/server1:8100/server1:8100_realtime__default_tier_2015-10-09T18:44:23.600Z_aac86b9a7fd24c65a0692efabd4d2b120]
2015-10-09T18:45:09,211 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/server1:8102
2015-10-09T18:45:09,211 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='server1:8102', host='server1:8102', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2015-10-09T18:45:20,252 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/events/thrownAway","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,252 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.252Z","service":"overlord","host":"server1:8100","metric":"ingest/events/unparseable","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,253 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.252Z","service":"overlord","host":"server1:8100","metric":"ingest/events/processed","value":2400,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,253 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.253Z","service":"overlord","host":"server1:8100","metric":"ingest/rows/output","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,253 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.253Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/count","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,254 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.253Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/time","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,254 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.254Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/backPressure","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,255 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.254Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:45:20,255 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:45:20.255Z","service":"overlord","host":"server1:8100","metric":"ingest/handoff/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.240Z","service":"overlord","host":"server1:8100","metric":"ingest/events/thrownAway","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/unparseable","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/processed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/rows/output","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/count","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/time","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/backPressure","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:46:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:46:20.244Z","service":"overlord","host":"server1:8100","metric":"ingest/handoff/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.240Z","service":"overlord","host":"server1:8100","metric":"ingest/events/thrownAway","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/unparseable","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/processed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/rows/output","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/count","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/time","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/backPressure","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:47:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:47:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/handoff/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:00,002 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2015-10-09T18:48:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [1] segments. Attempting to hand off segments that start before [1970-01-01T00:00:00.000Z].
2015-10-09T18:48:00,004 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Skipping persist and merge for entry [1444416060000=Sink{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z, schema=io.druid.segment.indexing.DataSchema@d3a4560}] : Start time [2015-10-09T18:41:00.000Z] >= [1970-01-01T00:00:00.000Z] min timestamp required in this run. Segment will be picked up in a future run.
2015-10-09T18:48:00,004 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2015-10-09T18:48:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.240Z","service":"overlord","host":"server1:8100","metric":"ingest/events/thrownAway","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/unparseable","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/events/processed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/rows/output","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/count","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/time","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/backPressure","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:48:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:48:20.244Z","service":"overlord","host":"server1:8100","metric":"ingest/handoff/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:00,002 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2015-10-09T18:49:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [1] segments. Attempting to hand off segments that start before [1970-01-01T00:00:00.000Z].
2015-10-09T18:49:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Skipping persist and merge for entry [1444416060000=Sink{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z, schema=io.druid.segment.indexing.DataSchema@d3a4560}] : Start time [2015-10-09T18:41:00.000Z] >= [1970-01-01T00:00:00.000Z] min timestamp required in this run. Segment will be picked up in a future run.
2015-10-09T18:49:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2015-10-09T18:49:20,241 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/thrownAway","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.241Z","service":"overlord","host":"server1:8100","metric":"ingest/events/unparseable","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/events/processed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,242 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/rows/output","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.242Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/count","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/time","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,243 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/backPressure","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.243Z","service":"overlord","host":"server1:8100","metric":"ingest/persists/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:49:20,244 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2015-10-09T18:49:20.244Z","service":"overlord","host":"server1:8100","metric":"ingest/handoff/failed","value":0,"dataSource":"druid_datasource_01"}]
2015-10-09T18:50:00,001 INFO [timed-shutoff-firehose-0] io.druid.data.input.FirehoseFactory - Closing delegate firehose.
2015-10-09T18:50:00,001 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Firehose closing.
2015-10-09T18:50:00,001 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Unregistering chat handler[druid:firehose:druid_datasource_01-41-0000-0000]
2015-10-09T18:50:00,002 INFO [timed-shutoff-firehose-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Unannouncing service[DruidNode{serviceName='druid:firehose:druid_datasource_01-41-0000-0000', host='server1', port=8100}]
2015-10-09T18:50:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2015-10-09T18:50:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [1] segments. Attempting to hand off segments that start before [1970-01-01T00:00:00.000Z].
2015-10-09T18:50:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Skipping persist and merge for entry [1444416060000=Sink{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z, schema=io.druid.segment.indexing.DataSchema@d3a4560}] : Start time [2015-10-09T18:41:00.000Z] >= [1970-01-01T00:00:00.000Z] min timestamp required in this run. Segment will be picked up in a future run.
2015-10-09T18:50:00,003 INFO [druid_datasource_01-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2015-10-09T18:50:00,468 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Submitting persist runnable for dataSource[druid_datasource_01]
2015-10-09T18:50:00,470 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Shutting down...
2015-10-09T18:50:00,472 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Cannot shut down yet! Sinks remaining: druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z
2015-10-09T18:50:00,473 INFO [druid_datasource_01-incremental-persist] io.druid.segment.realtime.plumber.RealtimePlumber - DataSource[druid_datasource_01], Interval[2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z], persisting Hydrant[FireHydrant{index=io.druid.segment.incremental.OnheapIncrementalIndex@1327aac4, queryable=io.druid.segment.ReferenceCountingSegment@1de046d0, count=0}]
2015-10-09T18:50:00,480 INFO [druid_datasource_01-incremental-persist] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-09T18:50:00,481 INFO [druid_datasource_01-incremental-persist] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-09T18:50:00,491 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting persist for interval[2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z], rows[2,400]
2015-10-09T18:50:00,561 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0/v8-tmp] completed index.drd in 6 millis.
2015-10-09T18:50:00,649 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0/v8-tmp] completed dim conversions in 87 millis.
2015-10-09T18:50:00,899 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0/v8-tmp] completed walk through of 2,400 rows in 250 millis.
2015-10-09T18:50:00,903 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimA] with cardinality[3]
2015-10-09T18:50:00,933 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimA] in 33 millis.
2015-10-09T18:50:00,933 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimB] with cardinality[3]
2015-10-09T18:50:00,939 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimB] in 6 millis.
2015-10-09T18:50:00,939 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimC] with cardinality[1]
2015-10-09T18:50:00,944 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimC] in 5 millis.
2015-10-09T18:50:00,944 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimD] with cardinality[1]
2015-10-09T18:50:00,949 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimD] in 5 millis.
2015-10-09T18:50:00,949 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimE] with cardinality[8]
2015-10-09T18:50:00,955 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimE] in 6 millis.
2015-10-09T18:50:00,955 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimF] with cardinality[2]
2015-10-09T18:50:00,960 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimF] in 5 millis.
2015-10-09T18:50:00,960 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimG] with cardinality[2]
2015-10-09T18:50:00,964 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimG] in 4 millis.
2015-10-09T18:50:00,964 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimH] with cardinality[3]
2015-10-09T18:50:00,969 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimH] in 5 millis.
2015-10-09T18:50:00,969 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimI] with cardinality[2]
2015-10-09T18:50:00,973 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimI] in 4 millis.
2015-10-09T18:50:00,973 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimJ] with cardinality[3]
2015-10-09T18:50:00,977 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimJ] in 4 millis.
2015-10-09T18:50:00,977 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimK] with cardinality[1]
2015-10-09T18:50:00,981 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimK] in 4 millis.
2015-10-09T18:50:00,981 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Starting dimension[dimL] with cardinality[2,400]
2015-10-09T18:50:01,205 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - Completed dimension[dimL] in 224 millis.
2015-10-09T18:50:01,205 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0/v8-tmp] completed inverted.drd in 306 millis.
2015-10-09T18:50:01,234 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Converting v8[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0/v8-tmp] to v9[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/0]
2015-10-09T18:50:01,238 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimA.drd]
2015-10-09T18:50:01,247 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimA] is single value, converting...
2015-10-09T18:50:01,300 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimB.drd]
2015-10-09T18:50:01,302 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimB] is single value, converting...
2015-10-09T18:50:01,310 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimC.drd]
2015-10-09T18:50:01,313 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimC] is single value, converting...
2015-10-09T18:50:01,320 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimD.drd]
2015-10-09T18:50:01,322 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimD] is single value, converting...
2015-10-09T18:50:01,329 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimE.drd]
2015-10-09T18:50:01,331 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimE] is single value, converting...
2015-10-09T18:50:01,344 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimF.drd]
2015-10-09T18:50:01,346 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimF] is single value, converting...
2015-10-09T18:50:01,352 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimG.drd]
2015-10-09T18:50:01,354 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimG] is single value, converting...
2015-10-09T18:50:01,365 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimH.drd]
2015-10-09T18:50:01,366 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimH] is single value, converting...
2015-10-09T18:50:01,366 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimH] has null rows.
2015-10-09T18:50:01,387 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimI.drd]
2015-10-09T18:50:01,388 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimI] is single value, converting...
2015-10-09T18:50:01,397 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimJ.drd]
2015-10-09T18:50:01,399 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimJ] is single value, converting...
2015-10-09T18:50:01,399 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimJ] has null rows.
2015-10-09T18:50:01,406 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimK.drd]
2015-10-09T18:50:01,407 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimK] is single value, converting...
2015-10-09T18:50:01,412 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimL.drd]
2015-10-09T18:50:01,414 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimL] is single value, converting...
2015-10-09T18:50:01,419 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[index.drd]
2015-10-09T18:50:01,419 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[inverted.drd]
2015-10-09T18:50:01,420 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_count_LITTLE_ENDIAN.drd]
2015-10-09T18:50:01,426 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_dwellHistogram_LITTLE_ENDIAN.drd]
2015-10-09T18:50:01,428 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_numeric2_LITTLE_ENDIAN.drd]
2015-10-09T18:50:01,429 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_numeric1_LITTLE_ENDIAN.drd]
2015-10-09T18:50:01,430 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[spatial.drd]
2015-10-09T18:50:01,431 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[time_LITTLE_ENDIAN.drd]
2015-10-09T18:50:01,434 INFO [druid_datasource_01-incremental-persist] io.druid.segment.IndexIO$DefaultIndexIOHandler - Skipped files[[index.drd, inverted.drd, spatial.drd]]
2015-10-09T18:50:01,492 INFO [druid_datasource_01-incremental-persist] io.druid.segment.ReferenceCountingSegment - Closing druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z
2015-10-09T18:50:01,492 INFO [druid_datasource_01-incremental-persist] io.druid.segment.ReferenceCountingSegment - Closing druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z, numReferences: 0
2015-10-09T18:50:01,493 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.realtime.plumber.RealtimePlumber - Adding hydrant[FireHydrant{index=null, queryable=io.druid.segment.ReferenceCountingSegment@3e8c6add, count=0}]
2015-10-09T18:50:01,513 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged/v8-tmp] completed index.drd in 3 millis.
2015-10-09T18:50:01,599 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged/v8-tmp] completed dim conversions in 86 millis.
2015-10-09T18:50:01,637 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[1]
2015-10-09T18:50:01,639 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[2]
2015-10-09T18:50:01,639 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[3]
2015-10-09T18:50:01,639 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[4]
2015-10-09T18:50:01,639 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[5]
2015-10-09T18:50:01,640 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[6]
2015-10-09T18:50:01,640 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[7]
2015-10-09T18:50:01,640 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[8]
2015-10-09T18:50:01,640 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[9]
2015-10-09T18:50:01,641 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[10]
2015-10-09T18:50:01,641 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[11]
2015-10-09T18:50:01,641 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[12]
2015-10-09T18:50:01,643 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[13]
2015-10-09T18:50:01,643 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[14]
2015-10-09T18:50:01,643 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[15]
2015-10-09T18:50:01,644 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[16]
2015-10-09T18:50:01,805 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged/v8-tmp] completed walk through of 2,400 rows in 206 millis.
2015-10-09T18:50:01,806 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimA] with cardinality[3]
2015-10-09T18:50:01,821 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimA] in 15 millis.
2015-10-09T18:50:01,821 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimB] with cardinality[3]
2015-10-09T18:50:01,829 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimB] in 8 millis.
2015-10-09T18:50:01,829 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimC] with cardinality[1]
2015-10-09T18:50:01,836 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimC] in 7 millis.
2015-10-09T18:50:01,837 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimD] with cardinality[1]
2015-10-09T18:50:01,844 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimD] in 8 millis.
2015-10-09T18:50:01,845 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimE] with cardinality[8]
2015-10-09T18:50:01,852 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimE] in 7 millis.
2015-10-09T18:50:01,852 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimF] with cardinality[2]
2015-10-09T18:50:01,859 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimF] in 7 millis.
2015-10-09T18:50:01,859 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimG] with cardinality[2]
2015-10-09T18:50:01,866 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimG] in 7 millis.
2015-10-09T18:50:01,866 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimH] with cardinality[3]
2015-10-09T18:50:01,873 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimH] in 7 millis.
2015-10-09T18:50:01,873 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimI] with cardinality[2]
2015-10-09T18:50:01,880 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimI] in 6 millis.
2015-10-09T18:50:01,880 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimJ] with cardinality[3]
2015-10-09T18:50:01,886 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimJ] in 6 millis.
2015-10-09T18:50:01,887 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimK] with cardinality[1]
2015-10-09T18:50:01,899 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimK] in 12 millis.
2015-10-09T18:50:01,899 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Starting dimension[dimL] with cardinality[2,400]
2015-10-09T18:50:02,051 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - Completed dimension[dimL] in 152 millis.
2015-10-09T18:50:02,051 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexMerger - outDir[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged/v8-tmp] completed inverted.drd in 246 millis.
2015-10-09T18:50:02,056 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Converting v8[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged/v8-tmp] to v9[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z/merged]
2015-10-09T18:50:02,057 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimA.drd]
2015-10-09T18:50:02,057 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimA] is single value, converting...
2015-10-09T18:50:02,063 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimB.drd]
2015-10-09T18:50:02,064 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimB] is single value, converting...
2015-10-09T18:50:02,066 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimC.drd]
2015-10-09T18:50:02,066 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimC] is single value, converting...
2015-10-09T18:50:02,068 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimD.drd]
2015-10-09T18:50:02,069 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimD] is single value, converting...
2015-10-09T18:50:02,071 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimE.drd]
2015-10-09T18:50:02,072 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimE] is single value, converting...
2015-10-09T18:50:02,079 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimF.drd]
2015-10-09T18:50:02,079 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimF] is single value, converting...
2015-10-09T18:50:02,081 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimG.drd]
2015-10-09T18:50:02,082 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimG] is single value, converting...
2015-10-09T18:50:02,086 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimH.drd]
2015-10-09T18:50:02,086 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimH] is single value, converting...
2015-10-09T18:50:02,092 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimI.drd]
2015-10-09T18:50:02,093 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimI] is single value, converting...
2015-10-09T18:50:02,095 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimJ.drd]
2015-10-09T18:50:02,096 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimJ] is single value, converting...
2015-10-09T18:50:02,098 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimK.drd]
2015-10-09T18:50:02,098 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimK] is single value, converting...
2015-10-09T18:50:02,099 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_dimL.drd]
2015-10-09T18:50:02,100 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[dimL] is single value, converting...
2015-10-09T18:50:02,104 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[index.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[inverted.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_count_LITTLE_ENDIAN.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_dwellHistogram_LITTLE_ENDIAN.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_numeric2_LITTLE_ENDIAN.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_numeric1_LITTLE_ENDIAN.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[spatial.drd]
2015-10-09T18:50:02,105 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[time_LITTLE_ENDIAN.drd]
2015-10-09T18:50:02,108 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.IndexIO$DefaultIndexIOHandler - Skipped files[[index.drd, inverted.drd, spatial.drd]]
2015-10-09T18:50:02,115 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.storage.hdfs.HdfsDataSegmentPusher - Copying segment[druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z] to HDFS at location[hdfs://server1:9000/druid/druid-hdfs-storage/druid_datasource_01/20151009T184100.000Z_20151009T184200.000Z/2015-10-09T18_44_23.522Z/0]
2015-10-09T18:50:02,119 ERROR [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.realtime.plumber.RealtimePlumber - Failed to persist merged index[druid_datasource_01]: {class=io.druid.segment.realtime.plumber.RealtimePlumber, exceptionType=class java.io.IOException, exceptionMessage=No FileSystem for scheme: hdfs, interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2304) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90) ~[?:?]
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350) ~[?:?]
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369) ~[?:?]
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296) ~[?:?]
    at io.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:83) ~[?:?]
    at io.druid.segment.realtime.plumber.RealtimePlumber$4.doRun(RealtimePlumber.java:454) [druid-server-0.8.1.jar:0.8.1]
    at io.druid.common.guava.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:40) [druid-common-0.8.1.jar:0.8.1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_60]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_60]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2015-10-09T18:50:02,136 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2015-10-09T18:50:02.133Z","service":"overlord","host":"server1:8100","severity":"component-failure","description":"Failed to persist merged index[druid_datasource_01]","data":{"class":"io.druid.segment.realtime.plumber.RealtimePlumber","exceptionType":"java.io.IOException","exceptionMessage":"No FileSystem for scheme: hdfs","exceptionStackTrace":"java.io.IOException: No FileSystem for scheme: hdfs\n\tat org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2304)\n\tat org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311)\n\tat org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)\n\tat org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)\n\tat org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)\n\tat org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)\n\tat org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)\n\tat io.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:83)\n\tat io.druid.segment.realtime.plumber.RealtimePlumber$4.doRun(RealtimePlumber.java:454)\n\tat io.druid.common.guava.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:40)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat java.lang.Thread.run(Thread.java:745)\n","interval":"2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z"}}]
2015-10-09T18:50:02,137 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.server.coordination.BatchDataSegmentAnnouncer - Unannouncing segment[druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z] at path[/druid/segments/server1:8100/server1:8100_realtime__default_tier_2015-10-09T18:44:23.600Z_aac86b9a7fd24c65a0692efabd4d2b120]
2015-10-09T18:50:02,137 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.curator.announcement.Announcer - unannouncing [/druid/segments/server1:8100/server1:8100_realtime__default_tier_2015-10-09T18:44:23.600Z_aac86b9a7fd24c65a0692efabd4d2b120]
2015-10-09T18:50:02,166 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0]: LockReleaseAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:50:02,169 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0] to overlord[http://server1:8090/druid/indexer/v1/action]: LockReleaseAction{interval=2015-10-09T18:41:00.000Z/2015-10-09T18:42:00.000Z}
2015-10-09T18:50:02,170 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://server1:8090
2015-10-09T18:50:02,200 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.realtime.plumber.RealtimePlumber - Deleting Index File[/opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work/persist/druid_datasource_01/2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z]
2015-10-09T18:50:02,201 INFO [druid_datasource_01-2015-10-09T18:41:00.000Z-persist-n-merge] io.druid.segment.realtime.plumber.RealtimePlumber - Removing sinkKey 1444416060000 for segment druid_datasource_01_2015-10-09T18:41:00.000Z_2015-10-09T18:42:00.000Z_2015-10-09T18:44:23.522Z
2015-10-09T18:50:02,205 ERROR [task-runner-0] io.druid.indexing.common.task.RealtimeIndexTask - Failed to finish realtime task: {class=io.druid.indexing.common.task.RealtimeIndexTask, exceptionType=class com.metamx.common.ISE, exceptionMessage=Exception occurred during persist and merge.}
com.metamx.common.ISE: Exception occurred during persist and merge.
    at io.druid.segment.realtime.plumber.RealtimePlumber.finishJob(RealtimePlumber.java:532) ~[druid-server-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.RealtimeIndexTask.run(RealtimeIndexTask.java:342) [druid-indexing-service-0.8.1.jar:0.8.1]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235) [druid-indexing-service-0.8.1.jar:0.8.1]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214) [druid-indexing-service-0.8.1.jar:0.8.1]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_60]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_60]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_60]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2015-10-09T18:50:02,209 INFO [task-runner-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2015-10-09T18:50:02.205Z","service":"overlord","host":"server1:8100","severity":"component-failure","description":"Failed to finish realtime task","data":{"class":"io.druid.indexing.common.task.RealtimeIndexTask","exceptionType":"com.metamx.common.ISE","exceptionMessage":"Exception occurred during persist and merge.","exceptionStackTrace":"com.metamx.common.ISE: Exception occurred during persist and merge.\n\tat io.druid.segment.realtime.plumber.RealtimePlumber.finishJob(RealtimePlumber.java:532)\n\tat io.druid.indexing.common.task.RealtimeIndexTask.run(RealtimeIndexTask.java:342)\n\tat io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)\n\tat io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:266)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat java.lang.Thread.run(Thread.java:745)\n"}}]
2015-10-09T18:50:02,210 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Removing task directory: /opt/druid/druid-data/base/tasks/index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0/work
2015-10-09T18:50:02,225 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_realtime_druid_datasource_01_2015-10-09T14:41:00.000-04:00_0_0",
  "status" : "SUCCESS",
  "duration" : 342283
}
2015-10-09T18:50:02,234 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@119290b9].
2015-10-09T18:50:02,234 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.BatchDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig@58d3f4be]
2015-10-09T18:50:02,234 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/server1:8100]
2015-10-09T18:50:02,257 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.stop()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@2dbfa972].
2015-10-09T18:50:02,264 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - Server Disappeared[DruidServerMetadata{name='server1:8100', host='server1:8100', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2015-10-09T18:50:02,280 INFO [main] org.eclipse.jetty.server.ServerConnector - Stopped ServerConnector@5c3d4f05{HTTP/1.1}{0.0.0.0:8100}
2015-10-09T18:50:02,282 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@42c0a16e{/,null,UNAVAILABLE}
2015-10-09T18:50:02,292 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.overlord.ThreadPoolTaskRunner.stop()] on object[io.druid.indexing.overlord.ThreadPoolTaskRunner@25ce435].
2015-10-09T18:50:02,292 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.client.ServerInventoryView.stop() throws java.io.IOException] on object[io.druid.client.BatchServerInventoryView@68ab6ab0].
2015-10-09T18:50:02,292 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.announcement.Announcer.stop()] on object[io.druid.curator.announcement.Announcer@1cc8416a].
2015-10-09T18:50:02,293 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@2349f14d].
2015-10-09T18:50:02,297 INFO [main] io.druid.curator.CuratorModule - Stopping Curator
2015-10-09T18:50:02,302 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x150482f7f89001b closed
2015-10-09T18:50:02,302 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.http.client.NettyHttpClient.stop()] on object[com.metamx.http.client.NettyHttpClient@8bd076a].
2015-10-09T18:50:02,305 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down
2015-10-09T18:50:02,315 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.metrics.MonitorScheduler.stop()] on object[com.metamx.metrics.MonitorScheduler@7499eac7].
2015-10-09T18:50:02,315 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.service.ServiceEmitter@3b2e5c0d].
2015-10-09T18:50:02,315 INFO [main] com.metamx.emitter.core.LoggingEmitter - Close: started [false]
2015-10-09T18:50:02,315 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.core.LoggingEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.core.LoggingEmitter@54336976].
2015-10-09T18:50:02,315 INFO [main] io.druid.cli.CliPeon - Finished peon task
mark1900 commented 9 years ago

Quick question for @himanshug: do you have a Druid deployment that uses HDFS for Druid deep storage, and if so, are you seeing similar issues with Druid 0.8.1?

himanshug commented 9 years ago

@mark1900 We do not load the Hadoop libs via an extension; they are always on the classpath of the main Druid process (mainly because of the custom Hadoop versions we use, which are not available in Maven, so it was simpler to just put everything on the classpath), so we would never see this issue. I'm not sure what change between 0.8.0 and 0.8.1 would cause this (I haven't had time to reproduce it, so I'm not sure whether this is really a regression or something specific to your setup). I am on the Druid IRC, ping me if you are there.
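
For context, a rough standalone sketch (not Druid code) of the lookup that is failing in the stack trace above: Hadoop maps a URI scheme to a FileSystem implementation either through the ServiceLoader entries shipped inside the hadoop-hdfs jar or through an explicit fs.<scheme>.impl property, and if neither is visible to the classloader performing the lookup (for example an isolated extension classloader), you get exactly "No FileSystem for scheme: hdfs". The hdfs://server1:9000 address below is just the one from the logs above.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsSchemeCheck
{
  public static void main(String[] args) throws Exception
  {
    Configuration conf = new Configuration();
    // Common workaround when the META-INF/services entries are not picked up:
    // pin the implementation class explicitly. This only helps if hadoop-hdfs is
    // actually loadable from the classloader that runs this lookup.
    conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
    FileSystem fs = new Path("hdfs://server1:9000/druid").getFileSystem(conf);
    System.out.println("Resolved hdfs scheme to: " + fs.getClass().getName());
  }
}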

mark1900 commented 9 years ago

FYI: I used Ambari 2.1.2 to install and configure my Hadoop instance on CentOS 7.

mark1900 commented 9 years ago

I tried a bunch of different options to address this "IOException: No FileSystem for scheme: hdfs" issue in Druid 0.8.1. The option that worked for me:

My Druid Launching Script Snippets:


# Build the Hadoop client classpath (HDP layout here) and append it to each Druid node's
# classpath, so the main Druid processes can see the hadoop-hdfs classes directly.
HADOOP_CLASSPATH="`/usr/hdp/current/hadoop-client/bin/hadoop classpath`"

# Overlord
nohup java -Xms2g -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:config/overlord:lib/*:${HADOOP_CLASSPATH} io.druid.cli.Main server overlord >> druid-overlord-node-console.log 2>&1 &

# MiddleManager
nohup java -Xms64m -Xmx64m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:config/middlemanager:lib/*:${HADOOP_CLASSPATH} io.druid.cli.Main server middleManager >> druid-middlemanager-node-console.log 2>&1 &

# Coordinator
nohup java -Xmx256m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:config/coordinator:lib/*:${HADOOP_CLASSPATH} io.druid.cli.Main server coordinator >> druid-coordinator-node-console.log 2>&1 &

# Historical
nohup java -Xms1g -Xmx1g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:config/historical:lib/*:${HADOOP_CLASSPATH} io.druid.cli.Main server historical >> druid-historical-node-console.log 2>&1 &

# Broker
nohup java -Xms2g -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:config/broker:lib/*:${HADOOP_CLASSPATH} io.druid.cli.Main server broker >> druid-broker-node-console.log 2>&1 &

Assumptions:

vim /opt/project_name/druid-0.8.1/config/_common/common.runtime.properties

# Extensions
druid.extensions.coordinates=["io.druid.extensions:postgresql-metadata-storage","io.druid.extensions:druid-kafka-eight","io.druid.extensions:druid-hdfs-storage","io.druid.extensions:druid-histogram"]
druid.extensions.localRepository=extensions-repo

# Zookeeper
druid.zk.service.host=server-hostname

# Metadata Storage
druid.metadata.storage.type=postgresql
druid.metadata.storage.connector.connectURI=jdbc\:postgresql\://server-hostname\:5432/druid
druid.metadata.storage.connector.user=druid
druid.metadata.storage.connector.password=diurd

# Deep storage
druid.storage.type=hdfs
druid.storage.storageDirectory=hdfs://server-hostname:8020/project_name/druid-hdfs-storage

# Query Cache (we use a simple 10mb heap-based local cache on the broker)
druid.cache.type=local
druid.cache.sizeInBytes=10000000

# Indexing service discovery
druid.selectors.indexing.serviceName=overlord

# Monitoring (disabled for examples, if you enable SysMonitor, make sure to include sigar jar in your cp)
# druid.monitoring.monitors=["com.metamx.metrics.SysMonitor","com.metamx.metrics.JvmMonitor"]

# Metrics logging (disabled for examples - change this to logging or http in production)
#druid.emitter=noop
druid.emitter=logging

druid.worker.capacity=8

druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.7.1"]
#druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.7.1","org.apache.hadoop:hadoop-hdfs:2.7.1"]

druid.indexer.task.baseDir=/opt/project_name/druid-data/base
druid.indexer.task.baseTaskDir=/opt/project_name/druid-data/base/tasks
druid.indexer.task.hadoopWorkingPath=/opt/project_name/druid-data/hadoop-tasks
druid.realtime.chathandler.type=announce

druid.host=server-hostname
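
A quick sanity check worth doing (paths taken from druid.extensions.localRepository and the coordinates above; adjust to your layout) is to confirm that the druid-hdfs-storage extension and the hadoop-hdfs jar it needs were actually pulled into the local repository:

# list the hdfs-storage extension jars pulled via druid.extensions.coordinates
find extensions-repo -name '*druid-hdfs-storage*.jar'
# list the hadoop-hdfs jars the extension depends on (this jar carries the
# META-INF/services registration for the hdfs:// scheme)
find extensions-repo -name '*hadoop-hdfs*.jar'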
mark1900 commented 9 years ago

My environment setup:

mark1900 commented 8 years ago

The issue seems to be resolved in the latest Druid 0.8.2 release: http://static.druid.io/artifacts/releases/druid-0.8.2-bin.tar.gz

drcrallen commented 8 years ago

Note that 0.9.0 also revamps how extensions are loaded. Since @mark1900 reported this as resolved, and extension loading is reworked in 0.9.0, I'm marking this as closed.

Please re-open if new evidence arises.
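
For anyone landing here on 0.9.x: extensions are no longer resolved from Maven coordinates at runtime; they are loaded from a local directory via a load list, roughly along these lines (property names from the 0.9 extension docs; adapt the list and directories to the extensions you actually ship):

druid.extensions.directory=extensions
druid.extensions.loadList=["druid-hdfs-storage","druid-kafka-eight","postgresql-metadata-storage","druid-histogram"]
druid.extensions.hadoopDependenciesDir=hadoop-dependencies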