grafana / loki

Like Prometheus, but for logs.
https://grafana.com/loki
GNU Affero General Public License v3.0

Time range queries do not work when boltdb-shipper is used #2816

Closed. kloyan closed this issue 3 years ago.

kloyan commented 3 years ago

Bug Description

Hello! Since upgrading from 1.6.1 to 2.0.0, any time range query that I perform from Grafana (or directly against the Loki API) returns all logs up to the configured max lines for the data source (e.g. 1000); the specified time range is not taken into account at all. The problem occurs for both relative and absolute time ranges. I'm running a single monolithic Loki 2.0.0 instance with GCS as the chunk store and boltdb-shipper as the index store. This setup worked fine in 1.6.1. If I switch from boltdb-shipper to vanilla boltdb, everything works as expected, even on 2.0.0. During my testing, I deployed a clean Loki (empty bucket and persistent volumes) every time I changed the chunk/index store type.

To Reproduce

Steps to reproduce the behavior:

  1. Start a single instance of Loki 2.0.0 with GCS and boltdb-shipper
  2. Start Promtail 2.0.0 to tail Kubernetes logs
  3. Perform any time range query from Grafana and observe that logs outside the specified range are returned (see the sketch below)
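
For reference, the same range query can be issued directly against the Loki HTTP API. Below is a minimal sketch (assuming Loki listens on localhost:3100 and uses the label selector from my setup) that flags any returned entry whose timestamp falls outside the requested window:

import json
import urllib.parse
import urllib.request

# Requested window in Unix nanoseconds (the values used in the logs below).
START = 1603788732000000000
END = 1603788853000000000

params = urllib.parse.urlencode({
    "query": '{app="nginx-ingress", component="controller"}',
    "start": START,
    "end": END,
    "limit": 1000,
    "direction": "BACKWARD",
})
url = "http://localhost:3100/loki/api/v1/query_range?" + params

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

# Every returned timestamp should satisfy START <= ts <= END.
for stream in data["data"]["result"]:
    for ts, _line in stream["values"]:
        ts = int(ts)
        print(ts, "ok" if START <= ts <= END else "OUT OF RANGE")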

Expected behavior

Time range queries should work properly: only logs that fall within the specified time range should be returned.

Environment:

My Promtail config is the same as in the official Helm chart. This is my Loki config:

target: all
auth_enabled: false

server:
  http_listen_port: 3100
  grpc_listen_port: 9095
  log_level: info

ingester:
  max_transfer_retries: 0
  lifecycler:
    final_sleep: 0s
    join_after: 0s
    address: 127.0.0.1
    ring:
      replication_factor: 1
      kvstore:
        store: inmemory

distributor:
  ring:
    kvstore:
      store: inmemory

limits_config:
  enforce_metric_name: false
  reject_old_samples: true
  reject_old_samples_max_age: 168h

schema_config:
  configs:
  - from: 2020-10-20
    object_store: gcs
    store: boltdb-shipper
    schema: v11
    index:
      period: 24h
      prefix: index_

storage_config:
  gcs:
    bucket_name: my-bucket

  boltdb_shipper:
    shared_store: gcs
    active_index_directory: /data/loki/index
    cache_location: /data/loki/cache

compactor:
  shared_store: gcs
  working_directory: /data/loki/compactor

I executed a range query twice: once while using boltdb-shipper and once while using vanilla boltdb. Below are the Loki logs and the request/response from the Grafana query inspector for each run:

Logs when using GCS + boltdb-shipper (problem occurs)

Loki logs:

level=debug ts=2020-10-27T08:54:12.458376477Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=95.497µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:54:12.458770054Z caller=logging.go:66 traceID=28af8d870b8f5334 msg="POST /loki/api/v1/push (204) 1.209598ms"
ts=2020-10-27T08:54:12.572323985Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.GetChunkRefs level=debug metric=logs
ts=2020-10-27T08:54:12.572384964Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug metricName=logs matchers=2
ts=2020-10-27T08:54:12.572447797Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="component=\"controller\""
ts=2020-10-27T08:54:12.572510121Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" queries=16
ts=2020-10-27T08:54:12.572528384Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" filteredQueries=16
ts=2020-10-27T08:54:12.572594054Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:54:12.572783356Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.572828775Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Query level=debug table-name=index_18562
level=info ts=2020-10-27T08:54:12.572844525Z caller=table_manager.go:155 msg="downloading all files for table index_18562"
ts=2020-10-27T08:54:12.572923228Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="app=\"nginx-ingress\""
ts=2020-10-27T08:54:12.572970699Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" queries=16
ts=2020-10-27T08:54:12.572988426Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" filteredQueries=16
ts=2020-10-27T08:54:12.573042919Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:54:12.573302621Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.573348964Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Query level=debug table-name=index_18562
ts=2020-10-27T08:54:12.573776369Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.GetChunkRefs level=debug metric=logs
ts=2020-10-27T08:54:12.573827524Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug metricName=logs matchers=2
ts=2020-10-27T08:54:12.573887404Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="component=\"controller\""
ts=2020-10-27T08:54:12.57393626Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" queries=16
ts=2020-10-27T08:54:12.573955448Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" filteredQueries=16
ts=2020-10-27T08:54:12.574006373Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:54:12.574127534Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.574153858Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Query level=debug table-name=index_18562
ts=2020-10-27T08:54:12.574210811Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="app=\"nginx-ingress\""
ts=2020-10-27T08:54:12.574249877Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" queries=16
ts=2020-10-27T08:54:12.574265595Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" filteredQueries=16
ts=2020-10-27T08:54:12.574315519Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:54:12.574523871Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.574551697Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Query level=debug table-name=index_18562
level=debug ts=2020-10-27T08:54:12.71392845Z caller=table.go:134 msg="list of files to download for period index_18562: [{index_18562/loki-0-1603787965757502117-1603788752.gz 2020-10-27 08:52:33.05 +0000 UTC}]"
level=info ts=2020-10-27T08:54:12.734972843Z caller=util.go:51 msg="downloaded file index_18562/loki-0-1603787965757502117-1603788752.gz"
ts=2020-10-27T08:54:12.739744223Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.DownloadTable level=debug total-files-downloaded=1
ts=2020-10-27T08:54:12.739943907Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.DownloadTable level=debug total-files-size=286720
ts=2020-10-27T08:54:12.740034697Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=16
ts=2020-10-27T08:54:12.740068663Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=16
ts=2020-10-27T08:54:12.740160735Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.740192745Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.740196123Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=16
ts=2020-10-27T08:54:12.740357999Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" entries=2
ts=2020-10-27T08:54:12.740395634Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" ids=1
ts=2020-10-27T08:54:12.740224183Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=16
ts=2020-10-27T08:54:12.74017206Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.740669015Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.740741078Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.740481583Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.740867818Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.740910117Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" entries=4
ts=2020-10-27T08:54:12.740948657Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" ids=2
ts=2020-10-27T08:54:12.740970682Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug msg="post intersection" ids=1
ts=2020-10-27T08:54:12.740986482Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.GetChunkRefs level=debug series-ids=1
ts=2020-10-27T08:54:12.741010165Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.lookupChunksBySeries level=debug seriesIDs=1
ts=2020-10-27T08:54:12.741031126Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.lookupChunksBySeries level=debug queries=1
ts=2020-10-27T08:54:12.741060708Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=cachingIndexClient.cacheFetch level=debug hits=0 misses=1
ts=2020-10-27T08:54:12.74106009Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" entries=4
ts=2020-10-27T08:54:12.741093342Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" ids=2
ts=2020-10-27T08:54:12.740638121Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.741120953Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.741139288Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Query level=debug table-name=index_18562
ts=2020-10-27T08:54:12.741165732Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=1
ts=2020-10-27T08:54:12.741188269Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.741208486Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.7412659Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" entries=2
ts=2020-10-27T08:54:12.741293739Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" ids=1
ts=2020-10-27T08:54:12.741322983Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.lookupChunksBySeries level=debug entries=2
ts=2020-10-27T08:54:12.741362794Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.GetChunkRefs level=debug chunk-ids=1
ts=2020-10-27T08:54:12.741392098Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=SeriesStore.GetChunkRefs level=debug chunks-post-filtering=1
ts=2020-10-27T08:54:12.741331281Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug msg="post intersection" ids=1
ts=2020-10-27T08:54:12.741484729Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.GetChunkRefs level=debug series-ids=1
ts=2020-10-27T08:54:12.741509431Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.lookupChunksBySeries level=debug seriesIDs=1
ts=2020-10-27T08:54:12.741584073Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.lookupChunksBySeries level=debug queries=1
ts=2020-10-27T08:54:12.741522461Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=LokiStore.fetchLazyChunks level=debug msg="loading lazy chunks" chunks=1
ts=2020-10-27T08:54:12.741640161Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=1
ts=2020-10-27T08:54:12.741700751Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=uploads-manager
ts=2020-10-27T08:54:12.741727641Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Query level=debug table-name=index_18562
ts=2020-10-27T08:54:12.741756862Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug table-name=index_18562 query-count=1
ts=2020-10-27T08:54:12.741784152Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Shipper.Downloads.Table.MultiQueries level=debug queried-db=loki-0-1603787965757502117-1603788752.gz
ts=2020-10-27T08:54:12.741700521Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=Fetcher.processCacheResponse level=debug chunks=1 decodeRequests=0 missing=1
ts=2020-10-27T08:54:12.74180457Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 level=debug queried=downloads-manager
ts=2020-10-27T08:54:12.741897571Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.lookupChunksBySeries level=debug entries=2
ts=2020-10-27T08:54:12.741921516Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.GetChunkRefs level=debug chunk-ids=1
ts=2020-10-27T08:54:12.741935434Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=SeriesStore.GetChunkRefs level=debug chunks-post-filtering=1
ts=2020-10-27T08:54:12.742030244Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=LokiStore.fetchLazyChunks level=debug msg="loading lazy chunks" chunks=1
ts=2020-10-27T08:54:12.742080906Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=Fetcher.processCacheResponse level=debug chunks=1 decodeRequests=0 missing=1
level=debug ts=2020-10-27T08:54:12.829870932Z caller=grpc_logging.go:58 method=/logproto.Querier/Query duration=257.902288ms msg="gRPC (success)"
ts=2020-10-27T08:54:12.830275181Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=query.Exec level=debug Ingester.TotalReached=1 Ingester.TotalChunksMatched=1 Ingester.TotalBatches=0 Ingester.TotalLinesSent=0 Ingester.HeadChunkBytes="335 B" Ingester.HeadChunkLines=1 Ingester.DecompressedBytes="4.7 kB" Ingester.DecompressedLines=9 Ingester.CompressedBytes="960 B" Ingester.TotalDuplicates=0 Store.TotalChunksRef=0 Store.TotalChunksDownloaded=0 Store.ChunksDownloadTime=0s Store.HeadChunkBytes="0 B" Store.HeadChunkLines=0 Store.DecompressedBytes="0 B" Store.DecompressedLines=0 Store.CompressedBytes="0 B" Store.TotalDuplicates=0
ts=2020-10-27T08:54:12.830351776Z caller=spanlogger.go:53 org_id=fake traceID=27c11789c3dee9bb method=query.Exec level=debug Summary.BytesProcessedPerSecond="20 kB" Summary.LinesProcessedPerSecond=38 Summary.TotalBytesProcessed="5.1 kB" Summary.TotalLinesProcessed=10 Summary.ExecTime=259.08139ms
level=info ts=2020-10-27T08:54:12.83041121Z caller=metrics.go:81 org_id=fake traceID=27c11789c3dee9bb latency=fast query="{app=\"nginx-ingress\", component=\"controller\"}" query_type=limited range_type=instant length=0s step=0s duration=259.08139ms status=200 throughput_mb=0.019526 total_bytes_mb=0.005059
level=debug ts=2020-10-27T08:54:12.831753549Z caller=logging.go:66 traceID=27c11789c3dee9bb msg="GET /loki/api/v1/query?query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&time=1603788853000000000&limit=1000 (200) 260.89503ms"
level=debug ts=2020-10-27T08:54:12.878213272Z caller=grpc_logging.go:58 method=/logproto.Querier/Query duration=305.124916ms msg="gRPC (success)"
ts=2020-10-27T08:54:12.878547081Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=query.Exec level=debug Ingester.TotalReached=1 Ingester.TotalChunksMatched=1 Ingester.TotalBatches=0 Ingester.TotalLinesSent=0 Ingester.HeadChunkBytes="335 B" Ingester.HeadChunkLines=1 Ingester.DecompressedBytes="4.7 kB" Ingester.DecompressedLines=9 Ingester.CompressedBytes="960 B" Ingester.TotalDuplicates=0 Store.TotalChunksRef=0 Store.TotalChunksDownloaded=0 Store.ChunksDownloadTime=0s Store.HeadChunkBytes="0 B" Store.HeadChunkLines=0 Store.DecompressedBytes="0 B" Store.DecompressedLines=0 Store.CompressedBytes="0 B" Store.TotalDuplicates=0
ts=2020-10-27T08:54:12.878633266Z caller=spanlogger.go:53 org_id=fake traceID=7353fd1b5059a8d0 method=query.Exec level=debug Summary.BytesProcessedPerSecond="16 kB" Summary.LinesProcessedPerSecond=32 Summary.TotalBytesProcessed="5.1 kB" Summary.TotalLinesProcessed=10 Summary.ExecTime=305.858637ms
level=info ts=2020-10-27T08:54:12.878703986Z caller=metrics.go:81 org_id=fake traceID=7353fd1b5059a8d0 latency=fast query="{app=\"nginx-ingress\", component=\"controller\"}" query_type=limited range_type=range length=2m1s step=1s duration=305.858637ms status=200 throughput_mb=0.01654 total_bytes_mb=0.005059
level=debug ts=2020-10-27T08:54:12.87916396Z caller=logging.go:66 traceID=7353fd1b5059a8d0 msg="GET /loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&start=1603788732000000000&end=1603788853000000000&step=1 (200) 306.651854ms"
level=debug ts=2020-10-27T08:54:13.561382707Z caller=mock.go:149 msg="Get - deadline exceeded" key=collectors/ring
level=debug ts=2020-10-27T08:54:13.561453275Z caller=mock.go:113 msg=Get key=collectors/ring wait_index=21
level=debug ts=2020-10-27T08:54:13.674891646Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=158.417µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:54:13.67530747Z caller=logging.go:66 traceID=787c4fc3fe91613 msg="POST /loki/api/v1/push (204) 1.396333ms"
level=debug ts=2020-10-27T08:54:13.759172706Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=103.23µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:54:13.759546626Z caller=logging.go:66 traceID=439a88404117b5cb msg="POST /loki/api/v1/push (204) 1.961206ms"
level=debug ts=2020-10-27T08:54:13.89548753Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=115.334µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:54:13.895890324Z caller=logging.go:66 traceID=68ac7db83da6bc6e msg="POST /loki/api/v1/push (204) 1.314264ms"
level=debug ts=2020-10-27T08:54:14.041950004Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=111.668µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:54:14.042368739Z caller=logging.go:66 traceID=28333aff40119762 msg="POST /loki/api/v1/push (204) 1.20154ms"

Query Inspector (notice that only the first log message falls within the specified range):

{
  "request": {
    "url": "api/datasources/proxy/2/loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&start=1603788732000000000&end=1603788853000000000&step=1",
    "hideFromInspector": false
  },
  "response": {
    "status": "success",
    "data": {
      "resultType": "streams",
      "result": [
        {
          "stream": {
            "app": "nginx-ingress",
            "component": "controller",
            "job": "kube-system/nginx-ingress",
            "release": "addons",
            "container": "nginx-ingress-controller",
            "filename": "/var/log/pods/kube-system_addons-nginx-ingress-controller-6cd497b99d-w4vvx_fbd9e8f0-00c4-4a37-ac8b-4ab7c53c2292/nginx-ingress-controller/0.log",
            "namespace": "kube-system",
            "pod": "addons-nginx-ingress-controller-6cd497b99d-w4vvx",
            "pod_template_hash": "6cd497b99d",
            "stream": "stdout"
          },
          "values": [
            [
              "1603788791271089821",
              "<DUMMY-MSG>"
            ],
            [
              "1603788530957786822",
              "<DUMMY-MSG>"
            ],
            [
              "1603788530956178590",
              "<DUMMY-MSG>"
            ],
            [
              "1603788530894837165",
              "<DUMMY-MSG>"
            ],
            [
              "1603788505743058008",
              "<DUMMY-MSG>"
            ],
            [
              "1603788505736669609",
              "<DUMMY-MSG>"
            ],
            [
              "1603788505669379905",
              "<DUMMY-MSG>"
            ],
            [
              "1603788505576922403",
              "<DUMMY-MSG>"
            ],
            [
              "1603788505024797792",
              "<DUMMY-MSG>"
            ],
            [
              "1603788191284954798",
              "<DUMMY-MSG>"
            ]
          ]
        }
      ],
      "stats": {
        "summary": {
          "bytesProcessedPerSecond": 16540,
          "linesProcessedPerSecond": 32,
          "totalBytesProcessed": 5059,
          "totalLinesProcessed": 10,
          "execTime": 0.305858637
        },
        "store": {
          "totalChunksRef": 0,
          "totalChunksDownloaded": 0,
          "chunksDownloadTime": 0,
          "headChunkBytes": 0,
          "headChunkLines": 0,
          "decompressedBytes": 0,
          "decompressedLines": 0,
          "compressedBytes": 0,
          "totalDuplicates": 0
        },
        "ingester": {
          "totalReached": 1,
          "totalChunksMatched": 1,
          "totalBatches": 0,
          "totalLinesSent": 0,
          "headChunkBytes": 335,
          "headChunkLines": 1,
          "decompressedBytes": 4724,
          "decompressedLines": 9,
          "compressedBytes": 960,
          "totalDuplicates": 0
        }
      }
    }
  }
}

Logs when using vanilla boltdb (problem does not occur)

Loki logs:

level=debug ts=2020-10-27T08:36:34.258401105Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=126.975µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:34.258828998Z caller=logging.go:66 traceID=341be86ed66b57c0 msg="POST /loki/api/v1/push (204) 1.280476ms"
level=debug ts=2020-10-27T08:36:34.758939968Z caller=mock.go:113 msg=Get key=collectors/ring wait_index=0
level=debug ts=2020-10-27T08:36:34.759000907Z caller=mock.go:159 msg=Get key=collectors/ring modify_index=25 value="\"\\x84\\x14\\xf4\\x03\\n\\n\\x81\\x14\\n\\x06loki-0\\x12\\xf6\\x13\\n\\x0e127.0.0.1:9095\\x10\\x8d\\xb8\\xdf\\xfc\""
level=debug ts=2020-10-27T08:36:34.759067866Z caller=mock.go:86 msg=CAS key=collectors/ring modify_index=25 value="\"\\x84\\x14\\xf4\\x03\\n\\n\\x81\\x14\\n\\x06loki-0\\x12\\xf6\\x13\\n\\x0e127.0.0.1:9095\\x10\\x92\\xb8\\xdf\\xfc\""
level=debug ts=2020-10-27T08:36:34.759129919Z caller=mock.go:149 msg="Get - deadline exceeded" key=collectors/ring
level=debug ts=2020-10-27T08:36:34.75915394Z caller=mock.go:113 msg=Get key=collectors/ring wait_index=25
level=debug ts=2020-10-27T08:36:34.759172333Z caller=mock.go:159 msg=Get key=collectors/ring modify_index=26 value="\"\\x84\\x14\\xf4\\x03\\n\\n\\x81\\x14\\n\\x06loki-0\\x12\\xf6\\x13\\n\\x0e127.0.0.1:9095\\x10\\x92\\xb8\\xdf\\xfc\""
level=debug ts=2020-10-27T08:36:34.759208118Z caller=mock.go:113 msg=Get key=collectors/ring wait_index=26
ts=2020-10-27T08:36:35.027113769Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.GetChunkRefs level=debug metric=logs
ts=2020-10-27T08:36:35.027181947Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug metricName=logs matchers=2
ts=2020-10-27T08:36:35.027241626Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="component=\"controller\""
ts=2020-10-27T08:36:35.027275622Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="app=\"nginx-ingress\""
ts=2020-10-27T08:36:35.027301141Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" queries=16
ts=2020-10-27T08:36:35.027319272Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" filteredQueries=16
ts=2020-10-27T08:36:35.0273593Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" queries=16
ts=2020-10-27T08:36:35.027376257Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:36:35.027408778Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" filteredQueries=16
ts=2020-10-27T08:36:35.027479883Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
level=debug ts=2020-10-27T08:36:35.027356985Z caller=grpc_logging.go:58 method=/logproto.Querier/Query duration=189.689µs msg="gRPC (success)"
ts=2020-10-27T08:36:35.027672018Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" entries=2
ts=2020-10-27T08:36:35.0277075Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" ids=2
ts=2020-10-27T08:36:35.027853674Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" entries=3
ts=2020-10-27T08:36:35.027892507Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" ids=3
ts=2020-10-27T08:36:35.027928731Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug msg="post intersection" ids=2
ts=2020-10-27T08:36:35.02795131Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.GetChunkRefs level=debug series-ids=2
ts=2020-10-27T08:36:35.027981089Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.lookupChunksBySeries level=debug seriesIDs=2
ts=2020-10-27T08:36:35.028005017Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.lookupChunksBySeries level=debug queries=2
ts=2020-10-27T08:36:35.028039514Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=2
ts=2020-10-27T08:36:35.02820191Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.lookupChunksBySeries level=debug entries=0
ts=2020-10-27T08:36:35.028228477Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.GetChunkRefs level=debug chunk-ids=0
ts=2020-10-27T08:36:35.028241323Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=SeriesStore.GetChunkRefs level=debug chunks-post-filtering=0
ts=2020-10-27T08:36:35.028395169Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=query.Exec level=debug Ingester.TotalReached=1 Ingester.TotalChunksMatched=0 Ingester.TotalBatches=0 Ingester.TotalLinesSent=0 Ingester.HeadChunkBytes="0 B" Ingester.HeadChunkLines=0 Ingester.DecompressedBytes="0 B" Ingester.DecompressedLines=0 Ingester.CompressedBytes="0 B" Ingester.TotalDuplicates=0 Store.TotalChunksRef=0 Store.TotalChunksDownloaded=0 Store.ChunksDownloadTime=0s Store.HeadChunkBytes="0 B" Store.HeadChunkLines=0 Store.DecompressedBytes="0 B" Store.DecompressedLines=0 Store.CompressedBytes="0 B" Store.TotalDuplicates=0
ts=2020-10-27T08:36:35.028436178Z caller=spanlogger.go:53 org_id=fake traceID=1333259083644359 method=query.Exec level=debug Summary.BytesProcessedPerSecond="0 B" Summary.LinesProcessedPerSecond=0 Summary.TotalBytesProcessed="0 B" Summary.TotalLinesProcessed=0 Summary.ExecTime=1.723871ms
level=info ts=2020-10-27T08:36:35.028475173Z caller=metrics.go:81 org_id=fake traceID=1333259083644359 latency=fast query="{app=\"nginx-ingress\", component=\"controller\"}" query_type=limited range_type=instant length=0s step=0s duration=1.723871ms status=200 throughput_mb=0 total_bytes_mb=0
level=debug ts=2020-10-27T08:36:35.028606834Z caller=logging.go:66 traceID=1333259083644359 msg="GET /loki/api/v1/query?query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&time=1603787795000000000&limit=1000 (200) 2.127477ms"
ts=2020-10-27T08:36:35.032818301Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.GetChunkRefs level=debug metric=logs
ts=2020-10-27T08:36:35.032868569Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug metricName=logs matchers=2
ts=2020-10-27T08:36:35.032919386Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="component=\"controller\""
ts=2020-10-27T08:36:35.032958511Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" queries=16
ts=2020-10-27T08:36:35.03297346Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" filteredQueries=16
ts=2020-10-27T08:36:35.033022356Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
level=debug ts=2020-10-27T08:36:35.033029268Z caller=grpc_logging.go:58 method=/logproto.Querier/Query duration=169.176µs msg="gRPC (success)"
ts=2020-10-27T08:36:35.033085218Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug metricName=logs matcher="app=\"nginx-ingress\""
ts=2020-10-27T08:36:35.03312421Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" queries=16
ts=2020-10-27T08:36:35.033168308Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" filteredQueries=16
ts=2020-10-27T08:36:35.033218498Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=16
ts=2020-10-27T08:36:35.033462192Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" entries=2
ts=2020-10-27T08:36:35.033490224Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="component=\"controller\"" ids=2
ts=2020-10-27T08:36:35.033678215Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" entries=3
ts=2020-10-27T08:36:35.033704128Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Store.lookupIdsByMetricNameMatcher level=debug matcher="app=\"nginx-ingress\"" ids=3
ts=2020-10-27T08:36:35.033722945Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.lookupSeriesByMetricNameMatchers level=debug msg="post intersection" ids=2
ts=2020-10-27T08:36:35.033736479Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.GetChunkRefs level=debug series-ids=2
ts=2020-10-27T08:36:35.033756062Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.lookupChunksBySeries level=debug seriesIDs=2
ts=2020-10-27T08:36:35.033774331Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.lookupChunksBySeries level=debug queries=2
ts=2020-10-27T08:36:35.033803152Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=cachingIndexClient.cacheFetch level=debug hits=0 misses=2
ts=2020-10-27T08:36:35.033875432Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.lookupChunksBySeries level=debug entries=2
ts=2020-10-27T08:36:35.033891187Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.GetChunkRefs level=debug chunk-ids=2
ts=2020-10-27T08:36:35.033908555Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=SeriesStore.GetChunkRefs level=debug chunks-post-filtering=2
ts=2020-10-27T08:36:35.034036913Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=LokiStore.fetchLazyChunks level=debug msg="loading lazy chunks" chunks=2
ts=2020-10-27T08:36:35.034097031Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=Fetcher.processCacheResponse level=debug chunks=2 decodeRequests=0 missing=2
ts=2020-10-27T08:36:35.034968251Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=query.Exec level=debug Ingester.TotalReached=1 Ingester.TotalChunksMatched=0 Ingester.TotalBatches=0 Ingester.TotalLinesSent=0 Ingester.HeadChunkBytes="0 B" Ingester.HeadChunkLines=0 Ingester.DecompressedBytes="0 B" Ingester.DecompressedLines=0 Ingester.CompressedBytes="0 B" Ingester.TotalDuplicates=0 Store.TotalChunksRef=2 Store.TotalChunksDownloaded=2 Store.ChunksDownloadTime=396.449µs Store.HeadChunkBytes="0 B" Store.HeadChunkLines=0 Store.DecompressedBytes="4.9 kB" Store.DecompressedLines=21 Store.CompressedBytes="1.6 kB" Store.TotalDuplicates=0
ts=2020-10-27T08:36:35.035018303Z caller=spanlogger.go:53 org_id=fake traceID=74f197da8b71dae5 method=query.Exec level=debug Summary.BytesProcessedPerSecond="2.0 MB" Summary.LinesProcessedPerSecond=8523 Summary.TotalBytesProcessed="4.9 kB" Summary.TotalLinesProcessed=21 Summary.ExecTime=2.463731ms
level=info ts=2020-10-27T08:36:35.035066147Z caller=metrics.go:81 org_id=fake traceID=74f197da8b71dae5 latency=fast query="{app=\"nginx-ingress\", component=\"controller\"}" query_type=limited range_type=range length=5m1s step=1s duration=2.463731ms status=200 throughput_mb=1.9852 total_bytes_mb=0.004891
level=debug ts=2020-10-27T08:36:35.035381422Z caller=logging.go:66 traceID=74f197da8b71dae5 msg="GET /loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&start=1603787494000000000&end=1603787795000000000&step=1 (200) 2.997906ms"
level=debug ts=2020-10-27T08:36:35.195553447Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=128.394µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:35.195958119Z caller=logging.go:66 traceID=2e10c85cfb32a0c9 msg="POST /loki/api/v1/push (204) 1.288492ms"
level=debug ts=2020-10-27T08:36:35.459285874Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=132.082µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:35.460139896Z caller=logging.go:66 traceID=5851d9df84d33b9b msg="POST /loki/api/v1/push (204) 2.483064ms"
level=debug ts=2020-10-27T08:36:36.273355316Z caller=grpc_logging.go:41 duration=123.367µs method=/logproto.Pusher/Push msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:36.273797549Z caller=logging.go:66 traceID=e007bfe88a980c1 msg="POST /loki/api/v1/push (204) 1.327534ms"
level=debug ts=2020-10-27T08:36:36.642042874Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=124.135µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:36.642441982Z caller=logging.go:66 traceID=53de3e9c9506a6b9 msg="POST /loki/api/v1/push (204) 1.177007ms"
level=debug ts=2020-10-27T08:36:36.756650297Z caller=mock.go:149 msg="Get - deadline exceeded" key=collectors/ring
level=debug ts=2020-10-27T08:36:36.756724734Z caller=mock.go:113 msg=Get key=collectors/ring wait_index=26
level=debug ts=2020-10-27T08:36:36.758780161Z caller=grpc_logging.go:41 method=/logproto.Pusher/Push duration=238.26µs msg="gRPC (success)"
level=debug ts=2020-10-27T08:36:36.759253462Z caller=logging.go:66 traceID=627cd21478979c90 msg="POST /loki/api/v1/push (204) 1.551156ms"

Query Inspector (all messages are in the specified range):

{
  "request": {
    "url": "api/datasources/proxy/2/loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&start=1603787494000000000&end=1603787795000000000&step=1",
    "hideFromInspector": false
  },
  "response": {
    "status": "success",
    "data": {
      "resultType": "streams",
      "result": [
        {
          "stream": {
            "app": "nginx-ingress",
            "filename": "/var/log/pods/kube-system_addons-nginx-ingress-controller-6cd497b99d-w4vvx_fbd9e8f0-00c4-4a37-ac8b-4ab7c53c2292/nginx-ingress-controller/0.log",
            "namespace": "kube-system",
            "pod_template_hash": "6cd497b99d",
            "release": "addons",
            "stream": "stderr",
            "component": "controller",
            "container": "nginx-ingress-controller",
            "job": "kube-system/nginx-ingress",
            "pod": "addons-nginx-ingress-controller-6cd497b99d-w4vvx"
          },
          "values": [
            [
              "1603787597963073895",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963062193",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963054901",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963044189",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963037410",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963024350",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963018314",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597963009377",
              "<DUMMY-MSG>"
            ],
            [
              "1603787597962941672",
              "<DUMMY-MSG>"
            ]
          ]
        },
        {
          "stream": {
            "app": "nginx-ingress",
            "component": "controller",
            "container": "nginx-ingress-controller",
            "job": "kube-system/nginx-ingress",
            "namespace": "kube-system",
            "filename": "/var/log/pods/kube-system_addons-nginx-ingress-controller-6cd497b99d-w4vvx_fbd9e8f0-00c4-4a37-ac8b-4ab7c53c2292/nginx-ingress-controller/0.log",
            "pod": "addons-nginx-ingress-controller-6cd497b99d-w4vvx",
            "pod_template_hash": "6cd497b99d",
            "release": "addons",
            "stream": "stdout"
          },
          "values": [
            [
              "1603787605711365359",
              "<DUMMY-MSG>"
            ],
            [
              "1603787605673744689",
              "<DUMMY-MSG>"
            ],
            [
              "1603787605457930549",
              "<DUMMY-MSG>"
            ],
            [
              "1603787605409936081",
              "<DUMMY-MSG>"
            ],
            [
              "1603787604742430606",
              "<DUMMY-MSG>"
            ],
            [
              "1603787591261260623",
              "<DUMMY-MSG>"
            ]
          ]
        }
      ],
      "stats": {
        "summary": {
          "bytesProcessedPerSecond": 1985200,
          "linesProcessedPerSecond": 8523,
          "totalBytesProcessed": 4891,
          "totalLinesProcessed": 21,
          "execTime": 0.002463731
        },
        "store": {
          "totalChunksRef": 2,
          "totalChunksDownloaded": 2,
          "chunksDownloadTime": 0.000396449,
          "headChunkBytes": 0,
          "headChunkLines": 0,
          "decompressedBytes": 4891,
          "decompressedLines": 21,
          "compressedBytes": 1555,
          "totalDuplicates": 0
        },
        "ingester": {
          "totalReached": 1,
          "totalChunksMatched": 0,
          "totalBatches": 0,
          "totalLinesSent": 0,
          "headChunkBytes": 0,
          "headChunkLines": 0,
          "decompressedBytes": 0,
          "decompressedLines": 0,
          "compressedBytes": 0,
          "totalDuplicates": 0
        }
      }
    }
  }
}

Please let me know if you need more information.

Best regards, Kaloyan

kloyan commented 3 years ago

I investigated the issue a little further and can provide additional information. The "to" part of the time range (the end query param) is actually taken into account; however, the "from" part (the start query param) seems to be ignored. As you can see in the following query, all log messages are before end=1603805151000000000, but some of them are also before start=1603805090000000000, which is wrong (see the verification sketch after the JSON below).

{
  "request": {
    "url": "api/datasources/proxy/2/loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bapp%3D%22nginx-ingress%22%2C%20component%3D%22controller%22%7D&start=1603805090000000000&end=1603805151000000000&step=1",
    "hideFromInspector": false
  },
  "response": {
    "status": "success",
    "data": {
      "resultType": "streams",
      "result": [
        {
          "stream": {
            "component": "controller",
            "filename": "/var/log/pods/kube-system_addons-nginx-ingress-controller-6cd497b99d-w4vvx_fbd9e8f0-00c4-4a37-ac8b-4ab7c53c2292/nginx-ingress-controller/0.log",
            "job": "kube-system/nginx-ingress",
            "pod_template_hash": "6cd497b99d",
            "release": "addons",
            "stream": "stdout",
            "app": "nginx-ingress",
            "container": "nginx-ingress-controller",
            "namespace": "kube-system",
            "pod": "addons-nginx-ingress-controller-6cd497b99d-w4vvx"
          },
          "values": [
            [
              "1603805150976103863",
              "<DUMMY-MSG>"
            ],
            [
              "1603805147751147222",
              "<DUMMY-MSG>"
            ],
            [
              "1603805143635594359",
              "<DUMMY-MSG>"
            ],
            [
              "1603805140386420384",
              "<DUMMY-MSG>"
            ],
            [
              "1603805137226062567",
              "<DUMMY-MSG>"
            ],
            [
              "1603805137145685791",
              "<DUMMY-MSG>"
            ],
            [
              "1603805137129129713",
              "<DUMMY-MSG>"
            ],
            [
              "1603805136664735621",
              "<DUMMY-MSG>"
            ],
            [
              "1603805135861734257",
              "<DUMMY-MSG>"
            ],
            [
              "1603805135034548449",
              "<DUMMY-MSG>"
            ],
            [
              "1603804991247843157",
              "<DUMMY-MSG>"
            ],
            [
              "1603804982235876264",
              "<DUMMY-MSG>"
            ],
            [
              "1603804982208278159",
              "<DUMMY-MSG>"
            ],
            [
              "1603804982188696773",
              "<DUMMY-MSG>"
            ],
            [
              "1603804975262965586",
              "<DUMMY-MSG>"
            ],
            [
              "1603804975226670989",
              "<DUMMY-MSG>"
            ],
            [
              "1603804975159244651",
              "<DUMMY-MSG>"
            ],
            [
              "1603804964699517500",
              "<DUMMY-MSG>"
            ],
            [
              "1603804964662348128",
              "<DUMMY-MSG>"
            ],
            [
              "1603804964657599079",
              "<DUMMY-MSG>"
            ]
          ]
        }
      ],
      "stats": {
        "summary": {
          "bytesProcessedPerSecond": 16750480,
          "linesProcessedPerSecond": 31113,
          "totalBytesProcessed": 43608,
          "totalLinesProcessed": 81,
          "execTime": 0.002603388
        },
        "store": {
          "totalChunksRef": 0,
          "totalChunksDownloaded": 0,
          "chunksDownloadTime": 0,
          "headChunkBytes": 0,
          "headChunkLines": 0,
          "decompressedBytes": 0,
          "decompressedLines": 0,
          "compressedBytes": 0,
          "totalDuplicates": 0
        },
        "ingester": {
          "totalReached": 2,
          "totalChunksMatched": 1,
          "totalBatches": 0,
          "totalLinesSent": 0,
          "headChunkBytes": 43608,
          "headChunkLines": 81,
          "decompressedBytes": 0,
          "decompressedLines": 0,
          "compressedBytes": 0,
          "totalDuplicates": 0
        }
      }
    }
  }
}
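
To make the violated boundary explicit, here is a minimal sketch (assuming the query inspector JSON above is saved to response.json) that counts how many returned entries fall before start and how many fall after end:

import json

START = 1603805090000000000
END = 1603805151000000000

with open("response.json") as f:
    data = json.load(f)["response"]["data"]

before_start = after_end = 0
for stream in data["result"]:
    for ts, _line in stream["values"]:
        ts = int(ts)
        if ts < START:
            before_start += 1  # violates the "from" boundary
        elif ts > END:
            after_end += 1     # violates the "to" boundary

# For the response above this prints "before start: 10, after end: 0",
# i.e. only the start of the range is being ignored.
print(f"before start: {before_start}, after end: {after_end}")
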
slim-bean commented 3 years ago

Thanks @KaloyanSimitchiyski, this is very helpful info! We will see what we can figure out.

AMoghrabi commented 3 years ago

Just wanted to say I'm seeing the same thing. I'm using Azure Blob Storage and get the same behavior. Thanks!

periklis commented 3 years ago

@slim-bean Will take a look!

sandeepsukhani commented 3 years ago

I have opened a PR to fix the issue. I believe the problem should occur only when the query start time is within the last ~1.5h (for both relative and absolute time ranges). If the start time is earlier than that, there should not be a problem.
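
Based on that, a quick heuristic for checking whether a given query could be affected (the ~1.5h cutoff is an assumption taken from the comment above, not a value read from the Loki code):

import time

# ~1.5h in nanoseconds; assumption per the comment above.
AFFECTED_WINDOW_NS = 90 * 60 * 1_000_000_000

def query_may_be_affected(start_ns):
    # True if the query's start time falls within the last ~1.5h,
    # i.e. the window in which this bug is expected to show up.
    return start_ns >= time.time_ns() - AFFECTED_WINDOW_NS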

pcbl commented 3 years ago

Hi Folks,

Is this fix in the 2.0 release that is currently available for download? As far as I can see on the releases page, it dates from Oct 26, but this fix seems to have come a few days later.

Is there any place where I could download it already compiled, or should I compile it on my end?

BR Pedro