apache / druid

Apache Druid: a high performance real-time analytics database.
https://druid.apache.org/
Apache License 2.0

IngestSegmentFirehose zeros out entire segments #732

Closed · pdeva closed this issue 10 years ago

pdeva commented 10 years ago

The timestamp column is treated like a 'dimension' by the ingestSegment firehose. This means that if it is used in an indexing task, the values for that interval will essentially be zeroed out, since Druid will have no idea about the timestamp of any row.

pdeva commented 10 years ago

Looking at segment metadata, the 'timestamp' column is stored as a dimension column in segments generated by tasks that use IngestSegmentFirehose.
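
For reference, the column layout of the regenerated segment can be confirmed with a segmentMetadata query along these lines (the dataSource and interval are taken from the task further down; this is a sketch of the check, not necessarily the exact query that was run). On the affected segments, 'timestamp' shows up as an ordinary string dimension column rather than backing the segment's time column:

{
    "queryType": "segmentMetadata",
    "dataSource": "dripstat",
    "intervals": [ "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z" ]
}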

pdeva commented 10 years ago

Just to verify that this is indeed a bug: submitting the index task below results in the interval being zeroed out. It actually takes a few minutes after the indexing task is fully complete to see the updated (zero) values. Is there a reason why it is not instant (or within seconds)?

The task below is used to take data at minute-level granularity and aggregate it up to hour-level granularity.

{
    "type": "index",
    "dataSource": "dripstat",
    "granularitySpec": {
        "type": "uniform",
        "gran": "hour",
        "intervals": [ "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z" ]
    },
    "indexGranularity": "hour",
    "aggregators": [
        {
            "type": "count",
            "name": "count"
        },
        {
            "type": "doubleSum",
            "name": "min",
            "fieldName": "min"
        },
        {
            "type": "doubleSum",
            "name": "max",
            "fieldName": "max"
        },
        {
            "type": "longSum",
            "name": "callCount",
            "fieldName": "callCount"
        },
        {
            "type": "doubleSum",
            "name": "totalTime",
            "fieldName": "totalTime"
        },
        {
            "type": "doubleSum",
            "name": "value",
            "fieldName": "value"
        }
    ],
    "firehose": {
        "type": "ingestSegment",
        "dataSource": "dripstat",
        "interval": "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z"
    }
}
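
A possible workaround, assuming the ingestSegment firehose honors an explicit column list via its "dimensions" and "metrics" fields (the task log further down shows it passing "dimensions" : null and "metrics" : null through), would be to enumerate the columns and leave the stored 'timestamp' dimension out. The names below are copied from the segment's dimension and metric lists in the SegmentInsertAction log line; this is a sketch, not a verified fix:

    "firehose": {
        "type": "ingestSegment",
        "dataSource": "dripstat",
        "interval": "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z",
        "dimensions": [
            "accountid", "appid", "extraparentmetricid", "jvmid",
            "lowercategory", "metricid", "metrictype", "middlecategory",
            "minpresent", "name", "parentmetricid", "parentname", "uppercategory"
        ],
        "metrics": [ "count", "min", "max", "callCount", "totalTime", "value" ]
    }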
pdeva commented 10 years ago

This is how the segment looks after the above task:

Using a select query:

{
   "queryType": "select",
   "dataSource": "dripstat",
   "dimensions":["jvmid", "name"],
   "metrics":["value"],
   "granularity": "all",
   "intervals": [
     "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z"
   ],
   "pagingSpec":{"pagingIdentifiers": {}, "threshold":5},
  "filter": {
    "type": "and",
    "fields": [
        {
            "type": "selector",
            "dimension": "name",
            "value": "JVM/ClassCount/loaded"
        },
        {
            "type": "selector",
            "dimension": "jvmid",
            "value": "52c896791e7ae170ffcce02d"
        }
    ]
 }
}

This should return only one row, since we aggregated all the values to a single hour, but instead it returns many rows with the same timestamp. Note the zeroed-out values.

[
  {
    "timestamp": "2014-09-07T09:00:00.000Z",
    "result": {
      "pagingIdentifiers": {
        "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z": 4
      },
      "events": [
        {
          "segmentId": "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z",
          "offset": 0,
          "event": {
            "timestamp": "2014-09-07T09:00:00.000Z",
            "jvmid": "52c896791e7ae170ffcce02d",
            "name": "JVM/ClassCount/loaded",
            "value": 0.0
          }
        },
        {
          "segmentId": "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z",
          "offset": 1,
          "event": {
            "timestamp": "2014-09-07T09:00:00.000Z",
            "jvmid": "52c896791e7ae170ffcce02d",
            "name": "JVM/ClassCount/loaded",
            "value": 0.0
          }
        },
        {
          "segmentId": "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z",
          "offset": 2,
          "event": {
            "timestamp": "2014-09-07T09:00:00.000Z",
            "jvmid": "52c896791e7ae170ffcce02d",
            "name": "JVM/ClassCount/loaded",
            "value": 0.0
          }
        },
        {
          "segmentId": "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z",
          "offset": 3,
          "event": {
            "timestamp": "2014-09-07T09:00:00.000Z",
            "jvmid": "52c896791e7ae170ffcce02d",
            "name": "JVM/ClassCount/loaded",
            "value": 0.0
          }
        },
        {
          "segmentId": "dripstat_2014-09-07T09:00:00.000Z_2014-09-07T10:00:00.000Z_2014-09-15T03:00:38.309Z",
          "offset": 4,
          "event": {
            "timestamp": "2014-09-07T09:00:00.000Z",
            "jvmid": "52c896791e7ae170ffcce02d",
            "name": "JVM/ClassCount/loaded",
            "value": 0.0
          }
        }
      ]
    }
  }
]
nishantmonu51 commented 10 years ago

Hi Prashant, can you share the index task logs as well?

pdeva commented 10 years ago

Index task log. The interval here may be different because I tested it on multiple different intervals, but they were all tried over an hour range.

2014-09-15 02:52:38,931 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2014-09-15 02:52:38,973 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.0.1.Final
2014-09-15 02:52:39,595 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[io.druid.extensions:druid-s3-extensions:0.6.146], localRepository='/root/.m2/repository', remoteRepositories=[http://repo1.maven.org/maven2/, https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local]}]
2014-09-15 02:52:39,740 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-s3-extensions:0.6.146] for class[io.druid.cli.CliCommandCreator]
2014-09-15 02:52:40,106 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/io/druid/extensions/druid-s3-extensions/0.6.146/druid-s3-extensions-0.6.146.jar]
2014-09-15 02:52:40,106 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/net/java/dev/jets3t/jets3t/0.9.1/jets3t-0.9.1.jar]
2014-09-15 02:52:40,106 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-codec/commons-codec/1.7/commons-codec-1.7.jar]
2014-09-15 02:52:40,106 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar]
2014-09-15 02:52:40,106 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/mx4j/mx4j/3.0.2/mx4j-3.0.2.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/javax/mail/mail/1.4.7/mail-1.4.7.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/bouncycastle/bcprov-jdk15/1.46/bcprov-jdk15-1.46.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/amazonaws/aws-java-sdk/1.6.0.1/aws-java-sdk-1.6.0.1.jar]
2014-09-15 02:52:40,107 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.2.3/jackson-core-2.2.3.jar]
2014-09-15 02:52:40,108 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.2.3/jackson-annotations-2.2.3.jar]
2014-09-15 02:52:40,108 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/apache/httpcomponents/httpclient/4.2/httpclient-4.2.jar]
2014-09-15 02:52:40,108 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/apache/httpcomponents/httpcore/4.2/httpcore-4.2.jar]
2014-09-15 02:52:40,108 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/metamx/emitter/0.2.11/emitter-0.2.11.jar]
2014-09-15 02:52:40,108 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/metamx/http-client/0.9.6/http-client-0.9.6.jar]
2014-09-15 02:52:40,117 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/io/netty/netty/3.9.0.Final/netty-3.9.0.Final.jar]
2014-09-15 02:52:40,117 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/log4j/log4j/1.2.16/log4j-1.2.16.jar]
2014-09-15 02:52:40,117 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar]
2014-09-15 02:52:40,117 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-io/commons-io/2.0.1/commons-io-2.0.1.jar]
2014-09-15 02:52:40,312 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-s3-extensions:0.6.146] for class[io.druid.initialization.DruidModule]
2014-09-15 02:52:40,395 INFO [main] io.druid.initialization.Initialization - Adding extension module[class io.druid.storage.s3.S3StorageDruidModule] for class[io.druid.initialization.DruidModule]
2014-09-15 02:52:40,396 INFO [main] io.druid.initialization.Initialization - Adding extension module[class io.druid.firehose.s3.S3FirehoseDruidModule] for class[io.druid.initialization.DruidModule]
2014-09-15 02:52:40,788 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2014-09-15 02:52:40,946 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@1556f2dd]
2014-09-15 02:52:40,962 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[]}]
2014-09-15 02:52:40,982 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='overlord', host='184.172.97.75:8088', port=8088}]
2014-09-15 02:52:41,068 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [io.druid.server.initialization.ServerConfig@5aa6202e]
2014-09-15 02:52:41,076 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@4aa3d36]
2014-09-15 02:52:41,081 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.http.DruidHttpClientConfig@49d98dc5]
2014-09-15 02:52:41,181 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@4f449e8f]
2014-09-15 02:52:41,184 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [duzk.dripstat.com] for [druid.zk.service.host] on [io.druid.curator.CuratorConfig#getZkHosts()]
2014-09-15 02:52:41,186 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [30000] for [druid.zk.service.sessionTimeoutMs] on [io.druid.curator.CuratorConfig#getZkSessionTimeoutMs()]
2014-09-15 02:52:41,187 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [true] for [druid.curator.compress] on [io.druid.curator.CuratorConfig#enableCompression()]
2014-09-15 02:52:41,280 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2014-09-15 02:52:41,315 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@4bdc8b5d]
2014-09-15 02:52:41,494 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@7f829c76]
2014-09-15 02:52:41,498 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.AWSCredentialsConfig] from props[druid.s3.] as [io.druid.storage.s3.AWSCredentialsConfig@4e904fd5]
2014-09-15 02:52:41,594 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.S3DataSegmentPusherConfig] from props[druid.storage.] as [io.druid.storage.s3.S3DataSegmentPusherConfig@15ac59c2]
2014-09-15 02:52:41,598 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.S3DataSegmentArchiverConfig] from props[druid.storage.] as [io.druid.storage.s3.S3DataSegmentArchiverConfig@3cec79d3]
2014-09-15 02:52:41,616 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@1fcf9739]
2014-09-15 02:52:41,620 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@13f9ad9]
2014-09-15 02:52:41,620 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.coordinatorPath] on [io.druid.server.initialization.ZkPathsConfig#getCoordinatorPath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.propertiesPath] on [io.druid.server.initialization.ZkPathsConfig#getPropertiesPath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.loadQueuePath] on [io.druid.server.initialization.ZkPathsConfig#getLoadQueuePath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerAnnouncementPath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.tasksPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerTaskPath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.statusPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerStatusPath()]
2014-09-15 02:52:41,621 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.leaderLatchPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerLeaderLatchPath()]
2014-09-15 02:52:41,622 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.liveSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getLiveSegmentsPath()]
2014-09-15 02:52:41,622 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getAnnouncementsPath()]
2014-09-15 02:52:41,622 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.base] on [io.druid.server.initialization.ZkPathsConfig#getZkBasePath()]
2014-09-15 02:52:41,622 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.servedSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getServedSegmentsPath()]
2014-09-15 02:52:41,622 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.connectorPath] on [io.druid.server.initialization.ZkPathsConfig#getConnectorPath()]
2014-09-15 02:52:41,650 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.BatchDataSegmentAnnouncerProvider@3ac8cf9b]
2014-09-15 02:52:41,654 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.FilteredServerViewProvider] from props[druid.announcer.] as [io.druid.client.FilteredBatchServerViewProvider@c1fa7d4]
2014-09-15 02:52:41,664 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig@537c8c7e]
2014-09-15 02:52:41,675 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@15e0fe05]
2014-09-15 02:52:41,681 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@18ac53e8]
2014-09-15 02:52:41,681 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [1003960960] for [druid.computation.buffer.size] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2014-09-15 02:52:41,681 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2014-09-15 02:52:41,682 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2014-09-15 02:52:41,682 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2014-09-15 02:52:41,691 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@4d6ee47]
2014-09-15 02:52:41,702 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@2c768ada]
2014-09-15 02:52:41,711 INFO [main] org.eclipse.jetty.util.log - Logging initialized @3385ms
2014-09-15 02:52:41,814 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@3681037].
2014-09-15 02:52:41,815 INFO [main] com.metamx.emitter.core.LoggingEmitter - Start: started [true]
2014-09-15 02:52:41,815 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@1dc76fa1].
2014-09-15 02:52:41,815 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@5eed2d86].
2014-09-15 02:52:41,818 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.HttpClient.start()] on object[com.metamx.http.client.HttpClient@3db663d0].
2014-09-15 02:52:41,818 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2014-09-15 02:52:41,818 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=du-indexer.dripstat.com
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_20
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-8-oracle/jre
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=lib/org.abego.treelayout.core-1.0.1.jar:lib/rhino-1.7R4.jar:lib/wagon-provider-api-2.4.jar:lib/mysql-connector-java-5.1.18.jar:lib/javax.inject-1.jar:lib/jsr305-2.0.1.jar:lib/jackson-databind-2.2.3.jar:lib/lz4-1.1.2.jar:lib/commons-logging-1.1.1.jar:lib/mx4j-3.0.2.jar:lib/guava-16.0.1.jar:lib/maxminddb-0.2.0.jar:lib/aether-util-0.9.0.M2.jar:lib/jersey-server-1.17.1.jar:lib/commons-pool-1.6.jar:lib/maven-aether-provider-3.1.1.jar:lib/java-xmlbuilder-0.4.jar:lib/maven-model-builder-3.1.1.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/jersey-core-1.17.1.jar:lib/okhttp-1.0.2.jar:lib/jetty-servlets-9.2.2.v20140723.jar:lib/asm-3.1.jar:lib/hibernate-validator-5.0.1.Final.jar:lib/commons-cli-1.2.jar:lib/curator-client-2.6.0.jar:lib/maven-repository-metadata-3.1.1.jar:lib/jline-0.9.94.jar:lib/google-http-client-jackson2-1.15.0-rc.jar:lib/log4j-1.2.16.jar:lib/curator-recipes-2.6.0.jar:lib/zookeeper-3.4.6.jar:lib/aether-connector-file-0.9.0.M2.jar:lib/irc-api-1.0-0011.jar:lib/jackson-jaxrs-json-provider-2.2.3.jar:lib/emitter-0.2.11.jar:lib/druid-indexing-service-0.6.146.jar:lib/http-client-0.9.6.jar:lib/druid-indexing-hadoop-0.6.146.jar:lib/druid-common-0.6.146.jar:lib/jackson-datatype-guava-2.2.3.jar:lib/commons-lang-2.6.jar:lib/activation-1.1.1.jar:lib/joda-time-2.1.jar:lib/jackson-core-2.2.3.jar:lib/sigar-1.6.5.132.jar:lib/jetty-http-9.2.2.v20140723.jar:lib/icu4j-4.8.1.jar:lib/slf4j-api-1.6.4.jar:lib/slf4j-log4j12-1.6.2.jar:lib/google-http-client-1.15.0-rc.jar:lib/curator-framework-2.6.0.jar:lib/druid-server-0.6.146.jar:lib/commons-codec-1.7.jar:lib/compress-lzf-0.8.4.jar:lib/aether-impl-0.9.0.M2.jar:lib/guice-4.0-beta.jar:lib/aopalliance-1.0.jar:lib/maven-settings-builder-3.1.1.jar:lib/druid-api-0.2.7.jar:lib/jackson-core-asl-1.9.13.jar:lib/jetty-server-9.2.2.v20140723.jar:lib/xpp3-1.1.4c.jar:lib/geoip2-0.4.0.jar:lib/netty-3.9.0.Final.jar:lib/guice-servlet-4.0-beta.jar:lib/jetty-io-9.2.2.v20140723.jar:lib/jackson-jaxrs-base-2.2.3.jar:lib/config-magic-0.9.jar:lib/maven-settings-3.1.1.jar:lib/bytebuffer-collections-0.0.2.jar:lib/aether-api-0.9.0.M2.jar:lib/protobuf-java-2.5.0.jar:lib/druid-processing-0.6.146.jar:lib/javax.el-3.0.0.jar:lib/jboss-logging-3.1.1.GA.jar:lib/jersey-guice-1.17.1.jar:lib/aether-connector-okhttp-0.0.9.jar:lib/tesla-aether-0.0.5.jar:lib/jetty-servlet-9.2.2.v20140723.jar:lib/plexus-interpolation-1.19.jar:lib/bcprov-jdk15-1.46.jar:lib/commons-dbcp-1.4.jar:lib/plexus-utils-3.0.15.jar:lib/spymemcached-2.8.4.jar:lib/httpclient-4.2.jar:lib/jackson-dataformat-smile-2.2.3.jar:lib/airline-0.6.jar:lib/curator-x-discovery-2.6.0.jar:lib/httpcore-4.2.jar:lib/jetty-proxy-9.2.2.v20140723.jar:lib/antlr4-runtime-4.0.jar:lib/classmate-0.8.0.jar:lib/server-metrics-0.0.9.jar:lib/javax.servlet-api-3.1.0.jar:lib/maven-model-3.1.1.jar:lib/jdbi-2.32.jar:lib/jetty-security-9.2.2.v20140723.jar:lib/alphanum-1.0.3.jar:lib/extendedset-1.3.4.jar:lib/druid-services-0.6.146.jar:lib/jetty-continuation-9.2.2.v20140723.jar:lib/jackson-module-jaxb-annotations-2.2.3.jar:lib/java-util-0.26.6.jar:lib/validation-api-1.1.0.Final.jar:lib/opencsv-2.3.jar:lib/guice-multibindings-4.0-beta.jar:lib/jersey-servlet-1.17.1.jar:lib/jackson-datatype-joda-2.2.3.jar:lib/jetty-util-9.2.2.v20140723.jar:lib/jets3t-0.9.1.jar:lib/mail-1.4.7.jar:lib/commons-io-2.0.1.jar:lib/jackson-annotations-2.2.3.jar:lib/aws-java-sdk-1.6.0.1.jar:lib/jetty-client-9.2.2.v20140723.jar:lib/aether-spi-0.9.0.M2.jar:config/overlord:/opt/dripstat/dripstat.jar
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2014-09-15 02:52:41,829 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.13.0-32-generic
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
2014-09-15 02:52:41,830 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/opt/druid-services-0.6.146
2014-09-15 02:52:41,831 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=duzk.dripstat.com sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@40e60ece
2014-09-15 02:52:41,851 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@715d6168].
2014-09-15 02:52:41,968 INFO [main-SendThread(104.130.2.143:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server 104.130.2.143/104.130.2.143:2181. Will not attempt to authenticate using SASL (unknown error)
2014-09-15 02:52:42,060 INFO [main-SendThread(104.130.2.143:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to 104.130.2.143/104.130.2.143:2181, initiating session
2014-09-15 02:52:42,099 INFO [main-SendThread(104.130.2.143:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server 104.130.2.143/104.130.2.143:2181, sessionid = 0x14815a952eb006a, negotiated timeout = 30000
2014-09-15 02:52:42,104 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2014-09-15 02:52:43,373 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@2b8bd798].
2014-09-15 02:52:43,375 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.client.ServerInventoryView.start() throws java.lang.Exception] on object[io.druid.client.BatchServerInventoryView@43e1692f].
2014-09-15 02:52:43,380 INFO [main] org.eclipse.jetty.server.Server - jetty-9.2.2.v20140723
Sep 15, 2014 2:52:43 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
Sep 15, 2014 2:52:43 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.StatusResource as a root resource class
Sep 15, 2014 2:52:43 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.17.1 02/28/2013 12:47 PM'
2014-09-15 02:52:43,561 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/104.130.16.70:8080
2014-09-15 02:52:43,561 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for 104.130.16.70:8080, inventoryPath /druid/segments/104.130.16.70:8080
2014-09-15 02:52:43,562 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='104.130.16.70:8080', host='104.130.16.70:8080', maxSize=3000000000, tier='_default_tier', type='historical', priority='0'}]
2014-09-15 02:52:43,562 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/23.253.158.204:8080
2014-09-15 02:52:43,562 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for 23.253.158.204:8080, inventoryPath /druid/segments/23.253.158.204:8080
2014-09-15 02:52:43,562 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='23.253.158.204:8080', host='23.253.158.204:8080', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-09-15 02:52:43,563 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/23.253.158.210:8080
2014-09-15 02:52:43,563 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for 23.253.158.210:8080, inventoryPath /druid/segments/23.253.158.210:8080
2014-09-15 02:52:43,563 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='23.253.158.210:8080', host='23.253.158.210:8080', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
Sep 15, 2014 2:52:43 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
Sep 15, 2014 2:52:44 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
Sep 15, 2014 2:52:44 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
Sep 15, 2014 2:52:44 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2014-09-15 02:52:44,097 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@7af3874e{/,null,AVAILABLE}
2014-09-15 02:52:44,106 INFO [main] org.eclipse.jetty.server.ServerConnector - Started ServerConnector@3b705be7{HTTP/1.1}{0.0.0.0:8088}
2014-09-15 02:52:44,106 INFO [main] org.eclipse.jetty.server.Server - Started @5782ms
2014-09-15 02:52:44,107 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@3a790e40].
2014-09-15 02:52:44,107 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='184.172.97.75:8088', host='184.172.97.75:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}] at [/druid/announcements/184.172.97.75:8088]
2014-09-15 02:52:44,252 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@2b5c4f17].
2014-09-15 02:52:44,353 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/segments/184.172.97.75:8088
2014-09-15 02:52:44,353 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for 184.172.97.75:8088, inventoryPath /druid/segments/184.172.97.75:8088
2014-09-15 02:52:44,353 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - New Server[DruidServerMetadata{name='184.172.97.75:8088', host='184.172.97.75:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}]
2014-09-15 02:52:44,377 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index",
  "id" : "index_dripstat_2014-09-15T02:52:38.170Z",
  "schema" : {
    "dataSchema" : {
      "dataSource" : "dripstat",
      "parser" : null,
      "metricsSpec" : [ {
        "type" : "count",
        "name" : "count"
      }, {
        "type" : "doubleSum",
        "name" : "min",
        "fieldName" : "min"
      }, {
        "type" : "doubleSum",
        "name" : "max",
        "fieldName" : "max"
      }, {
        "type" : "longSum",
        "name" : "callCount",
        "fieldName" : "callCount"
      }, {
        "type" : "doubleSum",
        "name" : "totalTime",
        "fieldName" : "totalTime"
      }, {
        "type" : "doubleSum",
        "name" : "value",
        "fieldName" : "value"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "HOUR",
        "queryGranularity" : {
          "type" : "duration",
          "duration" : 3600000,
          "origin" : "1970-01-01T00:00:00.000Z"
        },
        "intervals" : [ "2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z" ]
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "ingestSegment",
        "dataSource" : "dripstat",
        "interval" : "2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z",
        "filter" : null,
        "dimensions" : null,
        "metrics" : null
      }
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 5000000,
      "rowFlushBoundary" : 500000,
      "numShards" : -1
    }
  },
  "dataSource" : "dripstat",
  "groupId" : "index_dripstat_2014-09-15T02:52:38.170Z",
  "interval" : "2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z",
  "resource" : {
    "availabilityGroup" : "index_dripstat_2014-09-15T02:52:38.170Z",
    "requiredCapacity" : 1
  }
}
2014-09-15 02:52:44,380 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_dripstat_2014-09-15T02:52:38.170Z]: LockTryAcquireAction{interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:44,387 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_dripstat_2014-09-15T02:52:38.170Z] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: LockTryAcquireAction{interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:44,402 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://184.172.97.75:8080
2014-09-15 02:52:44,440 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://184.172.97.75:8080
2014-09-15 02:52:44,441 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://184.172.97.75:8080
2014-09-15 02:52:44,441 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://184.172.97.75:8080
2014-09-15 02:52:44,452 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://184.172.97.75:8080
2014-09-15 02:52:44,501 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_dripstat_2014-09-15T02:52:38.170Z
2014-09-15 02:52:44,502 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_dripstat_2014-09-15T02:52:38.170Z]: LockListAction{}
2014-09-15 02:52:44,510 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_dripstat_2014-09-15T02:52:38.170Z] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: LockListAction{}
2014-09-15 02:52:44,515 INFO [task-runner-0] io.druid.indexing.firehose.IngestSegmentFirehoseFactory - Connecting firehose: dataSource[dripstat], interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z]
2014-09-15 02:52:44,520 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[reingest]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:44,524 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[reingest] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:44,721 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPuller - Pulling index at path[s3://dripstat/dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-07T08:00:00.000Z/0/index.zip] to outDir[/tmp/persistent/task/reingest/work/dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-07T08:00:00.000Z/0]
2014-09-15 02:52:45,877 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPuller - Pull of file[dripstat/dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-07T08:00:00.000Z/0/index.zip] completed in 380 millis
2014-09-15 02:52:45,896 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2014-09-15 02:52:45,903 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning value [1003960960] for [druid.computation.buffer.size] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2014-09-15 02:52:45,903 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2014-09-15 02:52:45,903 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2014-09-15 02:52:45,903 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2014-09-15 02:52:45,966 INFO [task-runner-0] io.druid.segment.CompressedPools - Allocating new littleEndByteBuf[1]
2014-09-15 02:52:45,966 INFO [task-runner-0] io.druid.segment.CompressedPools - Allocating new outputBytesPool[1]
2014-09-15 02:52:47,021 INFO [task-runner-0] io.druid.indexing.common.task.IndexTask - Determining partitions for interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z] with targetPartitionSize[5000000]
2014-09-15 02:52:47,021 INFO [task-runner-0] io.druid.indexing.firehose.IngestSegmentFirehoseFactory - Connecting firehose: dataSource[dripstat], interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z]
2014-09-15 02:52:47,021 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[reingest]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:47,022 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[reingest] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:48,044 INFO [task-runner-0] io.druid.indexing.common.task.IndexTask - Partitioning on dimension[metricid] with cardinality[1332] over rows[103589]
2014-09-15 02:52:48,048 INFO [task-runner-0] io.druid.indexing.common.task.IndexTask - Adding shard: NoneShardSpec
2014-09-15 02:52:48,049 INFO [task-runner-0] io.druid.indexing.firehose.IngestSegmentFirehoseFactory - Connecting firehose: dataSource[dripstat], interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z]
2014-09-15 02:52:48,049 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[reingest]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:48,050 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[reingest] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: SegmentListUsedAction{dataSource='dripstat', interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z}
2014-09-15 02:52:51,344 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Spilling index[0] with rows[103589] to: /tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0
2014-09-15 02:52:51,348 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting persist for interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z], rows[103,589]
2014-09-15 02:52:52,474 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/v8-tmp] completed index.drd in 2 millis.
2014-09-15 02:52:52,517 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/v8-tmp] completed dim conversions in 43 millis.
2014-09-15 02:52:52,691 INFO [task-runner-0] io.druid.segment.CompressedPools - Allocating new chunkEncoder[1]
2014-09-15 02:52:53,390 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/v8-tmp] completed walk through of 103,589 rows in 872 millis.
2014-09-15 02:52:53,390 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[accountid] with cardinality[44]
2014-09-15 02:52:53,428 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[accountid] in 37 millis.
2014-09-15 02:52:53,428 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[appid] with cardinality[133]
2014-09-15 02:52:53,444 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[appid] in 16 millis.
2014-09-15 02:52:53,445 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[extraparentmetricid] with cardinality[13]
2014-09-15 02:52:53,456 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[extraparentmetricid] in 12 millis.
2014-09-15 02:52:53,456 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[jvmid] with cardinality[182]
2014-09-15 02:52:53,470 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[jvmid] in 14 millis.
2014-09-15 02:52:53,471 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[lowercategory] with cardinality[12]
2014-09-15 02:52:53,482 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[lowercategory] in 11 millis.
2014-09-15 02:52:53,482 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[metricid] with cardinality[1,332]
2014-09-15 02:52:53,512 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[metricid] in 30 millis.
2014-09-15 02:52:53,512 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[metrictype] with cardinality[4]
2014-09-15 02:52:53,527 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[metrictype] in 15 millis.
2014-09-15 02:52:53,527 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[middlecategory] with cardinality[17]
2014-09-15 02:52:53,542 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[middlecategory] in 15 millis.
2014-09-15 02:52:53,542 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[minpresent] with cardinality[2]
2014-09-15 02:52:53,557 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[minpresent] in 15 millis.
2014-09-15 02:52:53,557 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[name] with cardinality[76]
2014-09-15 02:52:53,573 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[name] in 16 millis.
2014-09-15 02:52:53,573 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[parentmetricid] with cardinality[47]
2014-09-15 02:52:53,589 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[parentmetricid] in 16 millis.
2014-09-15 02:52:53,589 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[parentname] with cardinality[23]
2014-09-15 02:52:53,591 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[parentname] in 2 millis.
2014-09-15 02:52:53,591 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[timestamp] with cardinality[60]
2014-09-15 02:52:53,619 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[timestamp] in 28 millis.
2014-09-15 02:52:53,620 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[uppercategory] with cardinality[6]
2014-09-15 02:52:53,635 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[uppercategory] in 16 millis.
2014-09-15 02:52:53,635 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/v8-tmp] completed inverted.drd in 245 millis.
2014-09-15 02:52:53,683 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Converting v8[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/v8-tmp] to v9[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0]
2014-09-15 02:52:53,684 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_accountid.drd]
2014-09-15 02:52:53,721 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[accountid] is single value, converting...
2014-09-15 02:52:53,781 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_appid.drd]
2014-09-15 02:52:53,814 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[appid] is single value, converting...
2014-09-15 02:52:53,838 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_extraparentmetricid.drd]
2014-09-15 02:52:53,871 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[extraparentmetricid] is single value, converting...
2014-09-15 02:52:53,893 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_jvmid.drd]
2014-09-15 02:52:53,926 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[jvmid] is single value, converting...
2014-09-15 02:52:53,950 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_lowercategory.drd]
2014-09-15 02:52:53,984 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[lowercategory] is single value, converting...
2014-09-15 02:52:53,984 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[lowercategory] has null rows.
2014-09-15 02:52:53,984 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[lowercategory] has no null value in the dictionary, expanding...
2014-09-15 02:52:54,007 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_metricid.drd]
2014-09-15 02:52:54,064 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[metricid] is single value, converting...
2014-09-15 02:52:54,072 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_metrictype.drd]
2014-09-15 02:52:54,077 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[metrictype] is single value, converting...
2014-09-15 02:52:54,083 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_middlecategory.drd]
2014-09-15 02:52:54,088 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[middlecategory] is single value, converting...
2014-09-15 02:52:54,088 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[middlecategory] has null rows.
2014-09-15 02:52:54,088 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[middlecategory] has no null value in the dictionary, expanding...
2014-09-15 02:52:54,095 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_minpresent.drd]
2014-09-15 02:52:54,099 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[minpresent] is single value, converting...
2014-09-15 02:52:54,106 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_name.drd]
2014-09-15 02:52:54,110 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[name] is single value, converting...
2014-09-15 02:52:54,117 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_parentmetricid.drd]
2014-09-15 02:52:54,121 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[parentmetricid] is single value, converting...
2014-09-15 02:52:54,128 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_parentname.drd]
2014-09-15 02:52:54,134 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[parentname] is single value, converting...
2014-09-15 02:52:54,134 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[parentname] has null rows.
2014-09-15 02:52:54,134 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[parentname] has no null value in the dictionary, expanding...
2014-09-15 02:52:54,140 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_timestamp.drd]
2014-09-15 02:52:54,145 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[timestamp] is single value, converting...
2014-09-15 02:52:54,153 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_uppercategory.drd]
2014-09-15 02:52:54,157 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[uppercategory] is single value, converting...
2014-09-15 02:52:54,164 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[index.drd]
2014-09-15 02:52:54,164 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[inverted.drd]
2014-09-15 02:52:54,164 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_callcount_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,167 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_count_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,168 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_max_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,169 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_min_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,169 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_totaltime_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,169 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_value_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,170 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[spatial.drd]
2014-09-15 02:52:54,170 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[time_LITTLE_ENDIAN.drd]
2014-09-15 02:52:54,173 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Skipped files[[index.drd, inverted.drd, spatial.drd]]
2014-09-15 02:52:54,181 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPusher - Uploading [/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0] to S3
2014-09-15 02:52:54,182 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/00000.smoosh] with size[2,148,267].  Total size so far[0]
2014-09-15 02:52:54,225 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/meta.smoosh] with size[606].  Total size so far[2,148,267]
2014-09-15 02:52:54,225 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0/version.bin] with size[4].  Total size so far[2,148,873]
2014-09-15 02:52:54,240 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPusher - Pushing S3Object [key=dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-15T02:52:38.243Z/0/index.zip, bucket=dripstat, lastModified=null, dataInputStream=null, Metadata={x-amz-acl=bucket-owner-full-control, md5-hash=963f21312285badda19cd2a8b996b831, Content-Length=81006, Content-Type=application/zip, Content-MD5=lj8hMSKFut2hnNKouZa4MQ==}].
2014-09-15 02:52:54,642 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPusher - Pushing S3Object [key=dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-15T02:52:38.243Z/0/descriptor.json, bucket=dripstat, lastModified=null, dataInputStream=null, Metadata={x-amz-acl=bucket-owner-full-control, md5-hash=64cef00baf498c6e65c5b69f3ae7fb87, Content-Length=517, Content-Type=application/octet-stream, Content-MD5=ZM7wC69JjG5lxbafOuf7hw==}]
2014-09-15 02:52:54,730 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPusher - Deleting zipped index File[/tmp/druid5426002647486280466index.zip]
2014-09-15 02:52:54,731 INFO [task-runner-0] io.druid.storage.s3.S3DataSegmentPusher - Deleting descriptor file[/tmp/druid4339743250545191195descriptor.json]
2014-09-15 02:52:54,731 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Uploaded segment[dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z]
2014-09-15 02:52:54,731 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Deleting Index File[/tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z_0/dripstat_2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z_2014-09-15T02:52:38.243Z/spill0]
2014-09-15 02:52:54,732 INFO [task-runner-0] io.druid.indexing.common.task.IndexTask - Task[index_dripstat_2014-09-15T02:52:38.170Z] interval[2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z] partition[0] took in 103,589 rows (103,589 processed, 0 unparseable, 0 thrown away) and output 103,589 rows
2014-09-15 02:52:54,740 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_dripstat_2014-09-15T02:52:38.170Z]: SegmentInsertAction{segments=[DataSegment{size=2148877, shardSpec=NoneShardSpec, metrics=[count, min, max, callCount, totalTime, value], dimensions=[accountid, appid, extraparentmetricid, jvmid, lowercategory, metricid, metrictype, middlecategory, minpresent, name, parentmetricid, parentname, timestamp, uppercategory], version='2014-09-15T02:52:38.243Z', loadSpec={type=s3_zip, bucket=dripstat, key=dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-15T02:52:38.243Z/0/index.zip}, interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z, dataSource='dripstat', binaryVersion='9'}]}
2014-09-15 02:52:54,745 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_dripstat_2014-09-15T02:52:38.170Z] to overlord[http://184.172.97.75:8080/druid/indexer/v1/action]: SegmentInsertAction{segments=[DataSegment{size=2148877, shardSpec=NoneShardSpec, metrics=[count, min, max, callCount, totalTime, value], dimensions=[accountid, appid, extraparentmetricid, jvmid, lowercategory, metricid, metrictype, middlecategory, minpresent, name, parentmetricid, parentname, timestamp, uppercategory], version='2014-09-15T02:52:38.243Z', loadSpec={type=s3_zip, bucket=dripstat, key=dripstat/2014-09-07T08:00:00.000Z_2014-09-07T09:00:00.000Z/2014-09-15T02:52:38.243Z/0/index.zip}, interval=2014-09-07T08:00:00.000Z/2014-09-07T09:00:00.000Z, dataSource='dripstat', binaryVersion='9'}]}
2014-09-15 02:52:55,023 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Removing task directory: /tmp/persistent/task/index_dripstat_2014-09-15T02:52:38.170Z/work
2014-09-15 02:52:55,026 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_dripstat_2014-09-15T02:52:38.170Z",
  "status" : "SUCCESS",
  "duration" : 10523
}
2014-09-15 02:52:55,030 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@3a790e40].
2014-09-15 02:52:55,030 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.BatchDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig$$EnhancerByCGLIB$$84a9c10c@6690b9fa]
2014-09-15 02:52:55,030 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/184.172.97.75:8088]
2014-09-15 02:52:55,067 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for 184.172.97.75:8088. Also removing listeners.
2014-09-15 02:52:55,067 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.stop()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@2b5c4f17].
2014-09-15 02:52:55,069 INFO [ServerInventoryView-0] io.druid.client.BatchServerInventoryView - Server Disappeared[DruidServerMetadata{name='184.172.97.75:8088', host='184.172.97.75:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}]
2014-09-15 02:52:55,070 INFO [main] org.eclipse.jetty.server.ServerConnector - Stopped ServerConnector@3b705be7{HTTP/1.1}{0.0.0.0:8088}
2014-09-15 02:52:55,072 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@7af3874e{/,null,UNAVAILABLE}
2014-09-15 02:52:55,075 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.overlord.ThreadPoolTaskRunner.stop()] on object[io.druid.indexing.overlord.ThreadPoolTaskRunner@10c47c79].
2014-09-15 02:52:55,075 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.client.ServerInventoryView.stop() throws java.io.IOException] on object[io.druid.client.BatchServerInventoryView@43e1692f].
2014-09-15 02:52:55,076 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.announcement.Announcer.stop()] on object[io.druid.curator.announcement.Announcer@2b8bd798].
2014-09-15 02:52:55,076 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@715d6168].
2014-09-15 02:52:55,078 INFO [main] io.druid.curator.CuratorModule - Stopping Curator
2014-09-15 02:52:55,112 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x14815a952eb006a closed
2014-09-15 02:52:55,112 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down
2014-09-15 02:52:55,113 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.http.client.HttpClient.stop()] on object[com.metamx.http.client.HttpClient@3db663d0].
2014-09-15 02:52:55,124 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.metrics.MonitorScheduler.stop()] on object[com.metamx.metrics.MonitorScheduler@5eed2d86].
2014-09-15 02:52:55,124 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.service.ServiceEmitter@1dc76fa1].
2014-09-15 02:52:55,124 INFO [main] com.metamx.emitter.core.LoggingEmitter - Close: started [false]
2014-09-15 02:52:55,124 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.core.LoggingEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.core.LoggingEmitter@3681037].
nishantmonu51 commented 10 years ago

Hi Prashant, your data has 'timestamp' as a dimension, which is why the rows are not rolling up: rows whose 'timestamp' dimension values differ can never be collapsed into one, regardless of the index granularity. Check your initial index job to see why it is ingesting 'timestamp' as a dimension as well.

2014-09-15 02:52:53,591 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[timestamp] with cardinality[60]
2014-09-15 02:52:53,619 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[timestamp] in 28 millis.
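
A quick way to confirm which columns a segment actually stores as dimensions is a segmentMetadata query over the affected interval. This is only a sketch; the dataSource and interval below are copied from this thread:

{
  "queryType": "segmentMetadata",
  "dataSource": "dripstat",
  "intervals": [ "2014-09-07T09:00:00.000Z/2014-09-07T10:00:00.000Z" ]
}

The response lists every column with its type, size, and cardinality, so a 'timestamp' entry showing up alongside the real time column makes the problem easy to spot.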

pdeva commented 10 years ago

It seems that any data ingested through the realtime nodes has the 'timestamp' column marked as a dimension. However, all queries against the data seem to work fine (that's why this wasn't even noticed until now).

Here is my spec.json file for my realtime nodes. Am I doing something wrong?

[
  {
    "schema": {
      "dataSource": "dripstat",
      "aggregators": [
        {
          "type": "count",
          "name": "count"
        },
        {
          "type": "doubleSum",
          "name": "min",
          "fieldName": "min"
        },
        {
          "type": "doubleSum",
          "name": "max",
          "fieldName": "max"
        },
        {
          "type": "longSum",
          "name": "callCount",
          "fieldName": "callCount"
        },
        {
          "type": "doubleSum",
          "name": "totalTime",
          "fieldName": "totalTime"
        },
        {
          "type": "doubleSum",
          "name": "value",
          "fieldName": "value"
        }
      ],
      "indexGranularity": "none",
      "shardSpec": {
        "type": "linear",
        "partitionNum":0
      }
    },
    "config": {
      "maxRowsInMemory": 50000,
      "intermediatePersistPeriod": "PT10m"
    },
    "firehose": {
      "type": "kafka-0.8",
      "consumerProps": {
        "zookeeper.connect": "xx:2181",
        "zookeeper.connection.timeout.ms": "15000",
        "zookeeper.session.timeout.ms": "15000",
        "zookeeper.synctime.ms": "5000",
        "group.id": "druid-dripstat",
        "fetch.size": "1048586",
        "auto.offset.reset": "largest",
        "auto.commit.enable": "false"
      },
      "feed": "dripstat",
      "parser": {
        "timestampSpec": {
          "column": "timestamp"
        },
        "data": {
          "format": "json"
        }
      }
    },
    "plumber": {
      "type": "realtime",
      "windowPeriod": "PT10m",
      "segmentGranularity": "hour",
      "basePersistDirectory": "/tmp/realtime/basePersist"
    }
  }
]
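
One possible adjustment, assuming this Druid version's JSON data spec accepts an explicit 'dimensions' list, is to enumerate the dimensions in the parser so that the 'timestamp' column is never auto-discovered as one. The names below are copied from the segment metadata in the task log above; treat this as a sketch rather than a verified spec:

"parser": {
  "timestampSpec": {
    "column": "timestamp"
  },
  "data": {
    "format": "json",
    "dimensions": [
      "accountid", "appid", "extraparentmetricid", "jvmid",
      "lowercategory", "metricid", "metrictype", "middlecategory",
      "minpresent", "name", "parentmetricid", "parentname",
      "uppercategory"
    ]
  }
}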
pdeva commented 10 years ago

Why would queries from the broker work in this case? Shouldn't they be broken too?

pdeva commented 10 years ago

Created bug report #737 to track the realtime indexing issue separately.

nishantmonu51 commented 10 years ago

Hi pdeva, the reindex task looks fine. I wonder if the segments you reindexed already had metric values set to 0 for some rows. What happens when you run the same select query after re-ingesting your data through the ingestSegment firehose without rolling it up? Do you still see zero values for some rows?

pdeva commented 10 years ago

The values are not 0 before the indexing. In fact, I spent an hour with Eric Tschetter and he verified that this is indeed an issue with Druid itself.

I will update after ingesting without rolling up.

pdeva commented 10 years ago

OK, so using 'none' index granularity still results in the values being 0.

Here is the index task:

{
    "type": "index",
    "dataSource": "dripstat",
    "granularitySpec": {
        "type": "uniform",
        "gran": "hour",
        "intervals": [ "2014-09-16T03:00:00.000Z/2014-09-16T04:00:00.000Z" ]
    },
    "indexGranularity": "none",
    "aggregators": [
        {
            "type": "count",
            "name": "count"
        },
        {
            "type": "doubleSum",
            "name": "min",
            "fieldName": "min"
        },
        {
            "type": "doubleSum",
            "name": "max",
            "fieldName": "max"
        },
        {
            "type": "longSum",
            "name": "callCount",
            "fieldName": "callCount"
        },
        {
            "type": "doubleSum",
            "name": "totalTime",
            "fieldName": "totalTime"
        },
        {
            "type": "doubleSum",
            "name": "value",
            "fieldName": "value"
        }
    ],
    "firehose": {
        "type": "ingestSegment",
        "dataSource": "dripstat",
        "interval": "2014-09-16T03:00:00.000Z/2014-09-16T04:00:00.000Z"
    }
}

Here is a select query to examine the data:

 {
   "queryType": "select",
   "dataSource": "dripstat",
   "dimensions" : ["jvmid", "name"],
   "metrics":["value"],
   "granularity": "hour",
   "filter": {
    "type": "and",
    "fields": [
      { "type": "selector", "dimension": "name", "value": "JVM/ClassCount/loaded" },
      { "type": "selector", "dimension": "jvmid", "value": "53c45c0d683c6d35edcf7e44" }

    ]
  },
   "intervals": [
     "2014-09-16T03:00:00.000Z/2014-09-16T04:00:00.000Z"
   ],
   "pagingSpec":{"pagingIdentifiers": {}, "threshold":5}
 }

And this is the result:

[
    {
        "timestamp": "2014-09-16T03:00:00.000Z",
        "result": {
            "pagingIdentifiers": {
                "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z": 4
            },
            "events": [
                {
                    "segmentId": "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z",
                    "offset": 0,
                    "event": {
                        "timestamp": "2014-09-16T03:00:00.000Z",
                        "jvmid": "53c45c0d683c6d35edcf7e44",
                        "name": "JVM/ClassCount/loaded",
                        "value": 0
                    }
                },
                {
                    "segmentId": "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z",
                    "offset": 1,
                    "event": {
                        "timestamp": "2014-09-16T03:01:00.000Z",
                        "jvmid": "53c45c0d683c6d35edcf7e44",
                        "name": "JVM/ClassCount/loaded",
                        "value": 0
                    }
                },
                {
                    "segmentId": "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z",
                    "offset": 2,
                    "event": {
                        "timestamp": "2014-09-16T03:02:00.000Z",
                        "jvmid": "53c45c0d683c6d35edcf7e44",
                        "name": "JVM/ClassCount/loaded",
                        "value": 0
                    }
                },
                {
                    "segmentId": "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z",
                    "offset": 3,
                    "event": {
                        "timestamp": "2014-09-16T03:03:00.000Z",
                        "jvmid": "53c45c0d683c6d35edcf7e44",
                        "name": "JVM/ClassCount/loaded",
                        "value": 0
                    }
                },
                {
                    "segmentId": "dripstat_2014-09-16T03:00:00.000Z_2014-09-16T04:00:00.000Z_2014-09-16T07:42:03.133Z",
                    "offset": 4,
                    "event": {
                        "timestamp": "2014-09-16T03:04:00.000Z",
                        "jvmid": "53c45c0d683c6d35edcf7e44",
                        "name": "JVM/ClassCount/loaded",
                        "value": 0
                    }
                }
            ]
        }
    }
]

I verified before ingesting that the select statement returned events with value > 0 in each event. After doing the ingestion, all values were zeroed out. This doesn't seem like an aggregator issue, since we specified 'none' index granularity and the select statement correctly returns minute-level rows.

Something is definitely wrong with the IngestSegment firehose.

xvrl commented 10 years ago

@pdeva do you mind sharing your conversation with @cheddar so we can better understand what the problem is? Thanks.