apache / druid

Apache Druid: a high performance real-time analytics database.
https://druid.apache.org/
Apache License 2.0

Indexing task fails when values have backslashes #1852

Closed milimetric closed 8 years ago

milimetric commented 8 years ago

We POSTed an index task using an input file with this row:

{"project":"ar.wikibooks","article":"ﻕﺎﺌﻣﺓ_ﺎﻠﻣﺮﻜﺑﺎﺗ_ﺎﻠﻜﻴﻤﻳﺎﺌﻳﺓ_ﺎﻠﻌﺿﻮﻳﺓ\ﻢﺳﺎﻋﺩﺓ","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"}

The POST command came back with FAILED, and the overlord log said "went bye bye":

2015-10-23T21:57:16,072 INFO [Curator-PathChildrenCache-0] io.druid.indexing.overlord.MetadataTaskStorage - Deleting TaskLock with id[24]: TaskLock{groupId=index_hadoop_pvtest_2015-10-23T21:57:01.427Z, dataSource=pvtest, interval=2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z, version=2015-10-23T21:57:01.462Z}
2015-10-23T21:57:16,079 INFO [Curator-PathChildrenCache-0] io.druid.indexing.overlord.TaskQueue - Task done: HadoopIndexTask{id=index_hadoop_pvtest_2015-10-23T21:57:01.427Z, type=index_hadoop, dataSource=pvtest}
2015-10-23T21:57:16,079 INFO [Curator-PathChildrenCache-0] io.druid.indexing.overlord.TaskQueue - Task FAILED: HadoopIndexTask{id=index_hadoop_pvtest_2015-10-23T21:57:01.427Z, type=index_hadoop, dataSource=pvtest} (0 run duration)
2015-10-23T21:57:16,079 INFO [Curator-PathChildrenCache-0] io.druid.indexing.overlord.RemoteTaskRunner - Task[index_hadoop_pvtest_2015-10-23T21:57:01.427Z] went bye bye.

Removing the backslash from the article title fixed the problem. This is happening with the Druid version packaged in Imply 1.0. It's pretty easy to replicate; I did nothing besides the basic quickstart tutorial.
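For context: per the JSON spec, a backslash inside a string must introduce one of the defined escapes, so a lone `\` followed by an ordinary character (here, an Arabic letter) makes the row invalid JSON, and a strict parser like Jackson will reject it. A minimal sketch of the same behavior with Python's strict `json` parser, using a stand-in character `X` rather than the original Arabic text:

```python
import json

# Invalid: a raw backslash followed by a character that is not a
# defined JSON escape, mirroring the failing row.
bad_row = '{"article": "title\\X"}'      # JSON text: {"article": "title\X"}
try:
    json.loads(bad_row)
except json.JSONDecodeError:
    print("rejected: invalid escape")    # strict parsers refuse the row

# Escaping the backslash as \\ yields valid JSON whose decoded value
# still contains a single backslash.
good_row = '{"article": "title\\\\X"}'   # JSON text: {"article": "title\\X"}
print(json.loads(good_row)["article"])   # title\X
```

So escaping the backslash in the input file (rather than dropping it) would also be a workaround, assuming the ingestion path parses the rows as strict JSON.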

gianm commented 8 years ago

@milimetric, do you have logs from the task that failed? (you should be able to click "logs (all)" for "index_hadoop_pvtest_2015-10-23T21:57:01.427Z" in the overlord web console)

I'm wondering if this is a Jackson parsing error or if it is something else.

milimetric commented 8 years ago

I didn't see a Jackson parsing error. I'll update the issue with the Druid version, but I don't have the web console; it's a headless VM instance that doesn't have ports open.

gianm commented 8 years ago

Ah okay. Fwiw, if there were a data parsing error then it would be in the task logs.

milimetric commented 8 years ago

Hm, in var/sv/ I just see these logs:

bard.log           coordinator.log    historical.log     middleManager.log  zk.log             
broker.log         .ctrl              .lock              overlord.log

milimetric commented 8 years ago

Tried to attach a file but I don't have write permissions to the repo. Here's the log:

milimetric@druid1:~/imply-1.0.0/var/druid/indexing-logs$ cat index_hadoop_pvtest_2015-10-23T22:02:23.984Z.log
2015-10-23T22:02:25,666 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:25,669 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:26,824 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[], defaultVersion='0.8.1-iap2', localRepository='dist/druid/extensions-repo', remoteRepositories=[]}]
2015-10-23T22:02:29,026 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='debug'}]
2015-10-23T22:02:29,149 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@baf1bb3]
2015-10-23T22:02:29,183 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[class com.metamx.metrics.JvmMonitor]}]
2015-10-23T22:02:29,241 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='middleManager', host='druid1.analytics.eqiad.wmflabs', port=8100}]
2015-10-23T22:02:29,255 INFO [main] io.druid.server.metrics.MetricsModule - Adding monitor[com.metamx.metrics.JvmMonitor@703feacd]
2015-10-23T22:02:29,329 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [ServerConfig{numThreads=8, maxIdleTime=PT5M}]
2015-10-23T22:02:29,352 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@7578e06a]
2015-10-23T22:02:29,365 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.http.DruidHttpClientConfig@4443ef6f]
2015-10-23T22:02:29,519 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@26ae880a]
2015-10-23T22:02:29,530 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig@1dbb650b]
2015-10-23T22:02:29,541 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2015-10-23T22:02:29,591 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@3db64bd4]
2015-10-23T22:02:29,907 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@23b3aa8c]
2015-10-23T22:02:29,915 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@30669dac]
2015-10-23T22:02:29,916 INFO [main] io.druid.segment.loading.LocalDataSegmentPusher - Configured local filesystem as deep storage
2015-10-23T22:02:29,949 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@1816e24a]
2015-10-23T22:02:29,955 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@e48bf9a]
2015-10-23T22:02:29,990 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@22e2266d]
2015-10-23T22:02:30,007 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.BatchDataSegmentAnnouncerProvider@440eaa07]
2015-10-23T22:02:30,020 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.FilteredServerViewProvider] from props[druid.announcer.] as [io.druid.client.FilteredBatchServerViewProvider@117bcfdc]
2015-10-23T22:02:30,036 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-10-23T22:02:30,039 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-10-23T22:02:30,039 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-10-23T22:02:30,047 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-10-23T22:02:30,230 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@6794ac0b]
2015-10-23T22:02:30,240 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.metadata.SegmentMetadataQueryConfig] from props[druid.query.segmentMetadata.] as [io.druid.query.metadata.SegmentMetadataQueryConfig@cf67838]
2015-10-23T22:02:30,251 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@34f392be]
2015-10-23T22:02:30,268 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@74dbb1ee]
2015-10-23T22:02:30,287 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@6fd12c5]
2015-10-23T22:02:30,315 INFO [main] org.eclipse.jetty.util.log - Logging initialized @6221ms
2015-10-23T22:02:30,494 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@6a937336].
2015-10-23T22:02:30,495 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@6b52dd31].
2015-10-23T22:02:30,496 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@1a2909ae].
2015-10-23T22:02:30,505 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.NettyHttpClient.start()] on object[com.metamx.http.client.NettyHttpClient@3d37203b].
2015-10-23T22:02:30,506 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2015-10-23T22:02:30,506 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2015-10-23T22:02:30,540 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2015-10-23T22:02:30,541 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=druid1.analytics.eqiad.wmflabs
2015-10-23T22:02:30,541 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_66-internal
2015-10-23T22:02:30,541 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-8-openjdk-amd64/jre
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/home/milimetric/imply-1.0.0/conf-quickstart/druid/_common:/home/milimetric/imply-1.0.0/conf-quickstart/druid/middleManager:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/activation-1.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-api-0.9.0.M2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-connector-file-0.9.0.M2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-connector-okhttp-0.0.9.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-impl-0.9.0.M2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-spi-0.9.0.M2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aether-util-0.9.0.M2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/airline-0.6.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/antlr4-runtime-4.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aopalliance-1.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aws-java-sdk-1.8.11.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/aws-java-sdk-core-1.8.11.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/bcprov-jdk15on-1.51.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/bytebuffer-collections-0.1.6.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/classmate-1.0.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-cli-1.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-codec-1.7.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-dbcp2-2.0.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-io-2.0.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-lang-2.6.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-logging-1.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-pool-1.6.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/commons-pool2-2.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/drui
d/lib/compress-lzf-1.0.3.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/config-magic-0.9.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/curator-client-2.8.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/curator-framework-2.8.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/curator-recipes-2.8.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/curator-x-discovery-2.8.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/derby-10.11.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/derbyclient-10.11.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/derbynet-10.11.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/derbytools-10.11.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/disruptor-3.3.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-api-0.3.9.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-aws-common-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-common-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-console-0.0.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-indexing-hadoop-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-indexing-service-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-processing-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-server-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/druid-services-0.8.1-iap2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/emitter-0.3.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/extendedset-1.3.9.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/geoip2-0.4.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/google-http-client-1.15.0-rc.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/google-http-client-jackson2-1.15.0-rc.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/guava-16.0.1.jar:/home/milimetric/imply-1.0.0/bin/.
./dist/druid/lib/guice-4.0-beta.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/guice-multibindings-4.0-beta.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/guice-servlet-4.0-beta.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/hibernate-validator-5.1.3.Final.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/http-client-1.0.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/httpclient-4.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/httpcore-4.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/icu4j-4.8.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/irc-api-1.0-0011.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-annotations-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-core-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-core-asl-1.9.12.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-databind-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-dataformat-smile-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-datatype-guava-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-datatype-joda-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-jaxrs-base-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-jaxrs-json-provider-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-jaxrs-smile-provider-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-mapper-asl-1.9.13.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jackson-module-jaxb-annotations-2.4.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/java-util-0.27.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/javax.el-3.0.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/javax.el-api-3.0.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/javax.inject-1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/java-xmlbuilder-0.4.jar:/home/
milimetric/imply-1.0.0/bin/../dist/druid/lib/javax.servlet-api-3.1.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jboss-logging-3.1.3.GA.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jcl-over-slf4j-1.7.10.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jdbi-2.32.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jersey-core-1.19.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jersey-guice-1.19.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jersey-server-1.19.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jersey-servlet-1.19.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jets3t-0.9.3.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-client-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-continuation-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-http-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-io-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-proxy-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-security-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-server-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-servlet-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-servlets-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jetty-util-9.2.5.v20141112.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jline-0.9.94.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/joda-time-2.6.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jsr305-2.0.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/jsr311-api-1.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/log4j-1.2-api-2.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/log4j-api-2.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/log4j-core-2.2.jar:/home/milimetr
ic/imply-1.0.0/bin/../dist/druid/lib/log4j-jul-2.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/log4j-slf4j-impl-2.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/lz4-1.3.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/mail-1.4.7.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/mapdb-1.0.7.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-aether-provider-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-model-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-model-builder-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-repository-metadata-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-settings-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maven-settings-builder-3.1.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/maxminddb-0.2.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/mx4j-3.0.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/netty-3.9.5.Final.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/okhttp-1.0.2.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/opencsv-2.3.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/org.abego.treelayout.core-1.0.1.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/plexus-interpolation-1.19.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/plexus-utils-3.0.15.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/protobuf-java-2.5.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/rhino-1.7R5.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/RoaringBitmap-0.4.5.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/server-metrics-0.2.0.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/slf4j-api-1.6.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/spymemcached-2.11.7.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/tesla-aether-0.0.5.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/validation-api-1.1.0.Final.jar:/hom
e/milimetric/imply-1.0.0/bin/../dist/druid/lib/wagon-provider-api-2.4.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/xpp3-1.1.4c.jar:/home/milimetric/imply-1.0.0/bin/../dist/druid/lib/zookeeper-3.4.6.jar
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=var/tmp
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.16.0-4-amd64
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=milimetric
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/home/milimetric
2015-10-23T22:02:30,542 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/home/milimetric/imply-1.0.0
2015-10-23T22:02:30,546 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=localhost sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@56afdf9a
2015-10-23T22:02:30,578 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@5cbe2654].
2015-10-23T22:02:30,590 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2015-10-23T22:02:30,613 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
2015-10-23T22:02:30,631 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x15096a2ebe0001e, negotiated timeout = 30000
2015-10-23T22:02:30,637 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2015-10-23T22:02:31,863 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@262816a8].
2015-10-23T22:02:31,870 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.client.ServerInventoryView.start() throws java.lang.Exception] on object[io.druid.client.BatchServerInventoryView@558756be].
2015-10-23T22:02:31,882 INFO [main] org.eclipse.jetty.server.Server - jetty-9.2.5.v20141112
2015-10-23T22:02:32,028 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/druid1.analytics.eqiad.wmflabs:8083
2015-10-23T22:02:32,029 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='druid1.analytics.eqiad.wmflabs:8083', host='druid1.analytics.eqiad.wmflabs:8083', maxSize=300000000000, tier='_default_tier', type='historical', priority='0'}]
2015-10-23T22:02:32,139 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - Inventory Initialized
2015-10-23T22:02:32,175 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
2015-10-23T22:02:32,182 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering io.druid.server.StatusResource as a root resource class
2015-10-23T22:02:32,185 INFO [main] com.sun.jersey.server.impl.application.WebApplicationImpl - Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
2015-10-23T22:02:32,384 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2015-10-23T22:02:33,312 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
2015-10-23T22:02:33,326 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
2015-10-23T22:02:33,327 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2015-10-23T22:02:33,373 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@314b9e4b{/,null,AVAILABLE}
2015-10-23T22:02:33,391 INFO [main] org.eclipse.jetty.server.ServerConnector - Started ServerConnector@abbe000{HTTP/1.1}{0.0.0.0:8100}
2015-10-23T22:02:33,392 INFO [main] org.eclipse.jetty.server.Server - Started @9309ms
2015-10-23T22:02:33,394 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@3f81621c].
2015-10-23T22:02:33,394 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='druid1.analytics.eqiad.wmflabs:8100', host='druid1.analytics.eqiad.wmflabs:8100', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}] at [/druid/announcements/druid1.analytics.eqiad.wmflabs:8100]
2015-10-23T22:02:33,490 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@7bd96822].
2015-10-23T22:02:33,506 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/druid1.analytics.eqiad.wmflabs:8100
2015-10-23T22:02:33,506 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='druid1.analytics.eqiad.wmflabs:8100', host='druid1.analytics.eqiad.wmflabs:8100', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}]
2015-10-23T22:02:33,838 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:33,838 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:33,850 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-10-23T22:02:33,851 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-10-23T22:02:33,851 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-10-23T22:02:33,851 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-10-23T22:02:33,864 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.segment.data.BitmapSerdeFactory] from props[druid.processing.bitmap.] as [ConciseBitmapSerdeFactory{}]
2015-10-23T22:02:33,867 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:33,867 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:33,899 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[], defaultVersion='0.8.1-iap2', localRepository='dist/druid/extensions-repo', remoteRepositories=[]}]
2015-10-23T22:02:33,983 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index_hadoop",
  "id" : "index_hadoop_pvtest_2015-10-23T22:02:23.984Z",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "pvtest",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "time",
            "format" : "auto",
            "missingValue" : null
          },
          "dimensionsSpec" : {
            "dimensions" : [ "access", "agent", "article", "project" ],
            "dimensionExclusions" : [ "time", "view_count" ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "doubleSum",
        "name" : "view_count",
        "fieldName" : "view_count"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : {
          "type" : "none"
        },
        "intervals" : [ "2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z" ]
      }
    },
    "ioConfig" : {
      "type" : "hadoop",
      "inputSpec" : {
        "type" : "static",
        "paths" : "quickstart/testdata.10k.json"
      },
      "metadataUpdateSpec" : null,
      "segmentOutputPath" : null
    },
    "tuningConfig" : {
      "type" : "hadoop",
      "workingPath" : null,
      "version" : "2015-10-23T22:02:23.984Z",
      "partitionsSpec" : {
        "type" : "hashed",
        "targetPartitionSize" : 5000000,
        "maxPartitionSize" : 7500000,
        "assumeGrouped" : false,
        "numShards" : -1
      },
      "shardSpecs" : { },
      "indexSpec" : {
        "bitmap" : {
          "type" : "concise"
        },
        "dimensionCompression" : null,
        "metricCompression" : null
      },
      "leaveIntermediate" : false,
      "cleanupOnFailure" : true,
      "overwriteFiles" : false,
      "ignoreInvalidRows" : false,
      "jobProperties" : { },
      "combineText" : false,
      "persistInHeap" : false,
      "ingestOffheap" : false,
      "bufferSize" : 134217728,
      "aggregationBufferRatio" : 0.5,
      "useCombiner" : false,
      "rowFlushBoundary" : 80000
    }
  },
  "hadoopDependencyCoordinates" : null,
  "classpathPrefix" : null,
  "groupId" : "index_hadoop_pvtest_2015-10-23T22:02:23.984Z",
  "dataSource" : "pvtest",
  "resource" : {
    "availabilityGroup" : "index_hadoop_pvtest_2015-10-23T22:02:23.984Z",
    "requiredCapacity" : 1
  }
}
2015-10-23T22:02:33,995 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_hadoop_pvtest_2015-10-23T22:02:23.984Z]: LockTryAcquireAction{interval=2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z}
2015-10-23T22:02:34,011 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_hadoop_pvtest_2015-10-23T22:02:23.984Z] to overlord[http://druid1.analytics.eqiad.wmflabs:8084/druid/indexer/v1/action]: LockTryAcquireAction{interval=2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z}
2015-10-23T22:02:34,046 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid1.analytics.eqiad.wmflabs:8084
2015-10-23T22:02:34,287 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_hadoop_pvtest_2015-10-23T22:02:23.984Z
2015-10-23T22:02:35,305 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-client/2.3.0/hadoop-client-2.3.0.jar]
2015-10-23T22:02:35,305 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-common/2.3.0/hadoop-common-2.3.0.jar]
2015-10-23T22:02:35,305 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/com/google/guava/guava/11.0.2/guava-11.0.2.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/xmlenc/xmlenc/0.52/xmlenc-0.52.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-io/commons-io/2.4/commons-io-2.4.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-net/commons-net/3.1/commons-net-3.1.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/log4j/log4j/1.2.17/log4j-1.2.17.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-lang/commons-lang/2.6/commons-lang-2.6.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar]
2015-10-23T22:02:35,306 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-digester/commons-digester/1.8/commons-digester-1.8.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/avro/avro/1.7.4/avro-1.7.4.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-auth/2.3.0/hadoop-auth-2.3.0.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/httpcomponents/httpcore/4.2.5/httpcore-4.2.5.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar]
2015-10-23T22:02:35,307 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/tukaani/xz/1.0/xz-1.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-hdfs/2.3.0/hadoop-hdfs-2.3.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-app/2.3.0/hadoop-mapreduce-client-app-2.3.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-common/2.3.0/hadoop-mapreduce-client-common-2.3.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-yarn-client/2.3.0/hadoop-yarn-client-2.3.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-yarn-server-common/2.3.0/hadoop-yarn-server-common-2.3.0.jar]
2015-10-23T22:02:35,308 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.3.0/hadoop-mapreduce-client-shuffle-2.3.0.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-yarn-api/2.3.0/hadoop-yarn-api-2.3.0.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-core/2.3.0/hadoop-mapreduce-client-core-2.3.0.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-yarn-common/2.3.0/hadoop-yarn-common-2.3.0.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/javax/activation/activation/1.1/activation-1.1.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.3.0/hadoop-mapreduce-client-jobclient-2.3.0.jar]
2015-10-23T22:02:35,311 INFO [task-runner-0] io.druid.initialization.Initialization - Added URL[file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/hadoop/hadoop-annotations/2.3.0/hadoop-annotations-2.3.0.jar]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/apache/logging/log4j/log4j-slf4j-impl/2.2/log4j-slf4j-impl-2.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/milimetric/imply-1.0.0/dist/druid/extensions-repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2015-10-23 22:02:35,847 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:92)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:55)
    at org.apache.logging.log4j.core.LoggerContext.<init>(LoggerContext.java:70)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:142)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23 22:02:35,851 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:103)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:55)
    at org.apache.logging.log4j.core.LoggerContext.<init>(LoggerContext.java:70)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:142)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23 22:02:35,876 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:92)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.xml.XmlConfiguration.<init>(XmlConfiguration.java:127)
    at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44)
    at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:472)
    at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:442)
    at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:254)
    at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:419)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23 22:02:35,879 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:103)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.xml.XmlConfiguration.<init>(XmlConfiguration.java:127)
    at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44)
    at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:472)
    at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:442)
    at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:254)
    at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:419)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23 22:02:35,894 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:92)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:55)
    at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:374)
    at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:279)
    at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.<init>(ConsoleAppender.java:118)
    at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:113)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:158)
    at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:118)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:766)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:706)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:698)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:358)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:161)
    at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:359)
    at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:420)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23 22:02:35,899 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
    at java.lang.Class.cast(Class.java:3369)
    at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:163)
    at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:311)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:103)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:104)
    at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:55)
    at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:374)
    at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:279)
    at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.<init>(ConsoleAppender.java:118)
    at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:113)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:158)
    at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:118)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:766)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:706)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:698)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:358)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:161)
    at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:359)
    at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:420)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
    at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
    at java.util.logging.LogManager.demandLogger(LogManager.java:551)
    at java.util.logging.Logger.demandLogger(Logger.java:455)
    at java.util.logging.Logger.getLogger(Logger.java:502)
    at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
    at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
    at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:95)
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:275)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129)
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2015-10-23T22:02:36,154 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:36,158 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:36,260 INFO [task-runner-0] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.1.3.Final
2015-10-23T22:02:37,329 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[], defaultVersion='0.8.1-iap2', localRepository='dist/druid/extensions-repo', remoteRepositories=[]}]
2015-10-23T22:02:38,083 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='debug'}]
2015-10-23T22:02:38,170 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@1165489a]
2015-10-23T22:02:38,187 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[class com.metamx.metrics.JvmMonitor]}]
2015-10-23T22:02:38,198 INFO [task-runner-0] io.druid.server.metrics.MetricsModule - Adding monitor[com.metamx.metrics.JvmMonitor@75014e52]
2015-10-23T22:02:38,476 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:38,476 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:38,494 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-10-23T22:02:38,496 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-10-23T22:02:38,496 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-10-23T22:02:38,497 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-10-23T22:02:38,634 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.segment.data.BitmapSerdeFactory] from props[druid.processing.bitmap.] as [ConciseBitmapSerdeFactory{}]
2015-10-23T22:02:38,664 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2015-10-23T22:02:38,665 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-10-23T22:02:38,699 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[], defaultVersion='0.8.1-iap2', localRepository='dist/druid/extensions-repo', remoteRepositories=[]}]
2015-10-23T22:02:38,700 INFO [task-runner-0] io.druid.indexing.common.task.HadoopIndexTask - Starting a hadoop determine configuration job...
2015-10-23T22:02:39,183 WARN [task-runner-0] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-10-23T22:02:39,239 INFO [task-runner-0] io.druid.indexer.path.StaticPathSpec - Adding paths[quickstart/testdata.10k.json]
2015-10-23T22:02:39,458 INFO [task-runner-0] io.druid.indexer.path.StaticPathSpec - Adding paths[quickstart/testdata.10k.json]
2015-10-23T22:02:39,622 INFO [task-runner-0] org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
2015-10-23T22:02:39,624 INFO [task-runner-0] org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2015-10-23T22:02:39,981 WARN [task-runner-0] org.apache.hadoop.mapreduce.JobSubmitter - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2015-10-23T22:02:40,006 WARN [task-runner-0] org.apache.hadoop.mapreduce.JobSubmitter - No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
2015-10-23T22:02:40,027 INFO [task-runner-0] org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2015-10-23T22:02:40,135 INFO [task-runner-0] org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2015-10-23T22:02:40,340 INFO [task-runner-0] org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_local352279840_0001
2015-10-23T22:02:40,379 WARN [task-runner-0] org.apache.hadoop.conf.Configuration - file:/tmp/hadoop-milimetric/mapred/staging/milimetric352279840/.staging/job_local352279840_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2015-10-23T22:02:40,379 WARN [task-runner-0] org.apache.hadoop.conf.Configuration - file:/tmp/hadoop-milimetric/mapred/staging/milimetric352279840/.staging/job_local352279840_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2015-10-23T22:02:40,549 WARN [task-runner-0] org.apache.hadoop.conf.Configuration - file:/tmp/hadoop-milimetric/mapred/local/localRunner/milimetric/job_local352279840_0001/job_local352279840_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2015-10-23T22:02:40,549 WARN [task-runner-0] org.apache.hadoop.conf.Configuration - file:/tmp/hadoop-milimetric/mapred/local/localRunner/milimetric/job_local352279840_0001/job_local352279840_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2015-10-23T22:02:40,563 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job - The url to track the job: http://localhost:8080/
2015-10-23T22:02:40,570 INFO [task-runner-0] io.druid.indexer.DetermineHashedPartitionsJob - Job pvtest-determine_partitions_hashed-Optional.of([2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z]) submitted, status available at: http://localhost:8080/
2015-10-23T22:02:40,582 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job - Running job: job_local352279840_0001
2015-10-23T22:02:40,596 INFO [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter set in config null
2015-10-23T22:02:40,611 INFO [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2015-10-23T22:02:40,720 INFO [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - Waiting for map tasks
2015-10-23T22:02:40,725 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local352279840_0001_m_000000_0
2015-10-23T22:02:40,809 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task -  Using ResourceCalculatorProcessTree : [ ]
2015-10-23T22:02:40,815 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Processing split: file:/home/milimetric/imply-1.0.0/quickstart/testdata.10k.json:0+1475175
2015-10-23T22:02:40,838 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2015-10-23T22:02:40,867 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - (EQUATOR) 0 kvi 26214396(104857584)
2015-10-23T22:02:40,867 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - mapreduce.task.io.sort.mb: 100
2015-10-23T22:02:40,867 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - soft limit at 83886080
2015-10-23T22:02:40,867 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - bufstart = 0; bufvoid = 104857600
2015-10-23T22:02:40,867 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - kvstart = 26214396; length = 6553600
2015-10-23T22:02:40,910 INFO [LocalJobRunner Map Task Executor #0] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
  "spec" : {
    "dataSchema" : {
      "dataSource" : "pvtest",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "time",
            "format" : "auto",
            "missingValue" : null
          },
          "dimensionsSpec" : {
            "dimensions" : [ "access", "agent", "article", "project" ],
            "dimensionExclusions" : [ "time", "view_count" ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "doubleSum",
        "name" : "view_count",
        "fieldName" : "view_count"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : {
          "type" : "none"
        },
        "intervals" : [ "2015-10-10T00:00:00.000Z/2015-10-20T00:00:00.000Z" ]
      }
    },
    "ioConfig" : {
      "type" : "hadoop",
      "inputSpec" : {
        "type" : "static",
        "paths" : "quickstart/testdata.10k.json"
      },
      "metadataUpdateSpec" : null,
      "segmentOutputPath" : "file:/home/milimetric/imply-1.0.0/var/druid/segments/pvtest/"
    },
    "tuningConfig" : {
      "type" : "hadoop",
      "workingPath" : "var/druid/hadoop-tmp",
      "version" : "2015-10-23T22:02:23.984Z",
      "partitionsSpec" : {
        "type" : "hashed",
        "targetPartitionSize" : 5000000,
        "maxPartitionSize" : 7500000,
        "assumeGrouped" : false,
        "numShards" : -1
      },
      "shardSpecs" : { },
      "indexSpec" : {
        "bitmap" : {
          "type" : "concise"
        },
        "dimensionCompression" : null,
        "metricCompression" : null
      },
      "leaveIntermediate" : false,
      "cleanupOnFailure" : true,
      "overwriteFiles" : false,
      "ignoreInvalidRows" : false,
      "jobProperties" : { },
      "combineText" : false,
      "persistInHeap" : false,
      "ingestOffheap" : false,
      "bufferSize" : 134217728,
      "aggregationBufferRatio" : 0.5,
      "useCombiner" : false,
      "rowFlushBoundary" : 80000
    }
  }
}
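One knob in the spec above is directly relevant to the failure later in this log: with `"ignoreInvalidRows" : false`, a single unparseable row fails the entire job. If silently skipping bad rows is acceptable, the flag can be flipped in the tuningConfig (a sketch, assuming this Druid version applies the flag to parse errors in the Hadoop mapper; all other fields unchanged):

```json
"tuningConfig" : {
  "type" : "hadoop",
  "ignoreInvalidRows" : true
}
```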
2015-10-23T22:02:41,596 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job - Job job_local352279840_0001 running in uber mode : false
2015-10-23T22:02:41,602 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job -  map 0% reduce 0%
2015-10-23T22:02:42,219 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Starting flush of map output
2015-10-23T22:02:42,239 INFO [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2015-10-23T22:02:42,240 WARN [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - job_local352279840_0001
java.lang.Exception: com.metamx.common.RE: Failure on row[{"project":"ar.wikibooks","article":"قائمة_المركبات_الكيميائية_العضوية\مساعدة","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"}]
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522) [hadoop-mapreduce-client-common-2.3.0.jar:?]
Caused by: com.metamx.common.RE: Failure on row[{"project":"ar.wikibooks","article":"قائمة_المركبات_الكيميائية_العضوية\مساعدة","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"}]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:98) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:277) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_66-internal]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_66-internal]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_66-internal]
Caused by: com.metamx.common.parsers.ParseException: Unable to parse row [{"project":"ar.wikibooks","article":"قائمة_المركبات_الكيميائية_العضوية\مساعدة","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"}]
    at com.metamx.common.parsers.JSONParser.parse(JSONParser.java:147) ~[java-util-0.27.0.jar:?]
    at io.druid.data.input.impl.StringInputRowParser.parseString(StringInputRowParser.java:86) ~[druid-api-0.3.9.jar:0.3.9]
    at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:91) ~[druid-api-0.3.9.jar:0.3.9]
    at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:106) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:79) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:277) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_66-internal]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_66-internal]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_66-internal]
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized character escape 'م' (code 1605 / 0x645)
 at [Source: {"project":"ar.wikibooks","article":"قائمة_المركبات_الكيميائية_العضوية\مساعدة","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"}; line: 1, column: 73]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1419) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:508) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.base.ParserMinimalBase._handleUnrecognizedCharacterEscape(ParserMinimalBase.java:485) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._decodeEscaped(ReaderBasedJsonParser.java:2046) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString2(ReaderBasedJsonParser.java:1611) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString(ReaderBasedJsonParser.java:1585) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.getText(ReaderBasedJsonParser.java:233) ~[jackson-core-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.databind.deser.std.BaseNodeDeserializer.deserializeObject(JsonNodeDeserializer.java:224) ~[jackson-databind-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.databind.deser.std.JsonNodeDeserializer.deserialize(JsonNodeDeserializer.java:62) ~[jackson-databind-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.databind.deser.std.JsonNodeDeserializer.deserialize(JsonNodeDeserializer.java:14) ~[jackson-databind-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3066) ~[jackson-databind-2.4.4.jar:2.4.4]
    at com.fasterxml.jackson.databind.ObjectMapper.readTree(ObjectMapper.java:1833) ~[jackson-databind-2.4.4.jar:2.4.4]
    at com.metamx.common.parsers.JSONParser.parse(JSONParser.java:115) ~[java-util-0.27.0.jar:?]
    at io.druid.data.input.impl.StringInputRowParser.parseString(StringInputRowParser.java:86) ~[druid-api-0.3.9.jar:0.3.9]
    at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:91) ~[druid-api-0.3.9.jar:0.3.9]
    at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:106) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:79) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:277) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_66-internal]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_66-internal]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_66-internal]
2015-10-23T22:02:42,608 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job - Job job_local352279840_0001 failed with state FAILED due to: NA
2015-10-23T22:02:42,619 INFO [task-runner-0] org.apache.hadoop.mapreduce.Job - Counters: 0
2015-10-23T22:02:42,620 ERROR [task-runner-0] io.druid.indexer.DetermineHashedPartitionsJob - Job failed: job_local352279840_0001
2015-10-23T22:02:42,621 INFO [task-runner-0] io.druid.indexer.JobHelper - Deleting path[var/druid/hadoop-tmp/pvtest/2015-10-23T220223.984Z]
2015-10-23T22:02:42,655 ERROR [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_pvtest_2015-10-23T22:02:23.984Z, type=index_hadoop, dataSource=pvtest}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.google.api.client.repackaged.com.google.common.base.Throwables.propagate(Throwables.java:160) ~[google-http-client-1.15.0-rc.jar:?]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:132) ~[druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173) ~[druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235) [druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214) [druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_66-internal]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_66-internal]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_66-internal]
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66-internal]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66-internal]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66-internal]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66-internal]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    ... 7 more
Caused by: com.metamx.common.ISE: Job[class io.druid.indexer.DetermineHashedPartitionsJob] failed!
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:202) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:84) ~[druid-indexing-hadoop-0.8.1-iap2.jar:0.8.1-iap2]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:289) ~[druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66-internal]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66-internal]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66-internal]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66-internal]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1-iap2.jar:0.8.1-iap2]
    ... 7 more
2015-10-23T22:02:42,685 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_pvtest_2015-10-23T22:02:23.984Z",
  "status" : "FAILED",
  "duration" : 8386
}
2015-10-23T22:02:42,690 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@3f81621c].
2015-10-23T22:02:42,691 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.BatchDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig@22e2266d]
2015-10-23T22:02:42,691 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/druid1.analytics.eqiad.wmflabs:8100]
2015-10-23T22:02:42,710 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid1.analytics.eqiad.wmflabs:8100', host='druid1.analytics.eqiad.wmflabs:8100', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}]
2015-10-23T22:02:42,715 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.stop()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@7bd96822].
2015-10-23T22:02:42,727 INFO [main] org.eclipse.jetty.server.ServerConnector - Stopped ServerConnector@abbe000{HTTP/1.1}{0.0.0.0:8100}
2015-10-23T22:02:42,731 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@314b9e4b{/,null,UNAVAILABLE}
2015-10-23T22:02:42,734 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.overlord.ThreadPoolTaskRunner.stop()] on object[io.druid.indexing.overlord.ThreadPoolTaskRunner@481558ce].
2015-10-23T22:02:42,736 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.client.ServerInventoryView.stop() throws java.io.IOException] on object[io.druid.client.BatchServerInventoryView@558756be].
2015-10-23T22:02:42,742 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.announcement.Announcer.stop()] on object[io.druid.curator.announcement.Announcer@262816a8].
2015-10-23T22:02:42,742 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@5cbe2654].
2015-10-23T22:02:42,745 INFO [main] io.druid.curator.CuratorModule - Stopping Curator
2015-10-23T22:02:42,755 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x15096a2ebe0001e closed
2015-10-23T22:02:42,755 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down
2015-10-23T22:02:42,755 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.http.client.NettyHttpClient.stop()] on object[com.metamx.http.client.NettyHttpClient@3d37203b].
2015-10-23T22:02:42,815 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.metrics.MonitorScheduler.stop()] on object[com.metamx.metrics.MonitorScheduler@1a2909ae].
2015-10-23T22:02:42,815 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.service.ServiceEmitter@6b52dd31].
2015-10-23T22:02:42,815 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.core.LoggingEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.core.LoggingEmitter@6a937336].
2015-10-23T22:02:42,815 INFO [main] io.druid.cli.CliPeon - Finished peon task
2015-10-23T22:02:42,828 INFO [Thread-10] io.druid.cli.CliPeon - Running shutdown hook
drcrallen commented 8 years ago

{"project":"ar.wikibooks","article":"قائمة_المركبات_الكيميائية_العضوية\مساعدة","access":"desktop","agent":"user","view_count":"1","time":"2015-10-14T20:00:00.000Z"} is not valid JSON because the \ is not escaped, unless I'm missing something.

Related https://github.com/druid-io/druid/issues/1840 and http://rfc7159.net/rfc7159#rfc.section.7
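A quick way to see the parsing behavior drcrallen describes (a sketch using Python's strict `json` module rather than Jackson, but both reject invalid escape sequences per the JSON spec; the string below is an illustrative stand-in, not the original Arabic title):

```python
import json

# A lone backslash inside a JSON string starts an escape sequence;
# "\m" (or "\d" here) is not a legal escape, so strict parsers
# reject the entire line.
bad_line = r'{"article":"abc\def"}'
try:
    json.loads(bad_line)
except json.JSONDecodeError as e:
    print("rejected:", e.msg)

# Doubling the backslash ("\\") makes the line valid JSON; the
# decoded value contains a single backslash.
good_line = r'{"article":"abc\\def"}'
print(json.loads(good_line)["article"])  # prints: abc\def
```

Jackson's `ReaderBasedJsonParser` in the stack trace above is applying the same rule, which is why the single bad row aborts the whole mapper.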

milimetric commented 8 years ago

That's true. I should have mentioned: I tried the same value with TSV and CSV and it still didn't work, though I may not have been clear on what format TSVs and CSVs need to have; I couldn't find docs or examples. Also, I realized I was using OpenJDK, which caused other problems, so I'll try with Oracle's Java in a bit.

milimetric commented 8 years ago

Ok, tried with different JREs and no luck. The main point, I think, is that a single bad line in a huge file shouldn't kill the whole job. The index job could have a setting that tolerates failures and just reports them instead of killing everything; that seems useful at least for testing.

drcrallen commented 8 years ago

General FYI: ignoreInvalidRows is a boolean in the tuning config for hadoop jobs.
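For reference, that flag goes in the task spec's tuningConfig; a minimal fragment might look like the sketch below (only `ignoreInvalidRows` is taken from this thread; the surrounding field names follow the Hadoop batch ingestion spec of that era and may differ by Druid version):

```json
"tuningConfig" : {
  "type" : "hadoop",
  "ignoreInvalidRows" : true
}
```

With this set, unparseable rows are skipped and counted rather than failing the whole indexing job.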

milimetric commented 8 years ago

That is incredibly useful :) I'll close this issue, as it looks like it was just my ignorance of the features and documentation.

drcrallen commented 8 years ago

@milimetric Insufficient documentation is a very real possibility.