ing-bank / scruid

Scala + Druid: Scruid. A library that allows you to compose queries in Scala, and parse the result back into typesafe classes.
Apache License 2.0

Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp) #55

Open hariramesh9a opened 5 years ago

hariramesh9a commented 5 years ago

The issue happens when there is no data for the queried intervals and the granularity is minutes/seconds, e.g. with DoubleMaxAggregation(fieldName = "temperature", name = "max_temp")

Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

Full log:

Execution exception[[anon$2: Double: DownField(max_temp)]]
    at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:251)
    at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:178)
    at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:382)
    at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:380)
    at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
    at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
    at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

hariramesh9a commented 5 years ago

val query = TimeSeriesQuery(
  aggregations = List(
    DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
    DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
    DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
    DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
    DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
    DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
  ),
  granularity = gran,
  intervals = List(intervalStr)
).execute()

query.map(_.results).foreach(println(_))

val result = query.map(_.series[TimeseriesCount].map(x => TimeseriesRes(x._1.format(formatter), x._2.head)))

The results foreach works fine; series throws the error.
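For reference, this error message is what circe produces when it tries to decode a JSON null into a plain Double. If Druid returns empty buckets with null aggregation values for the fine-grained intervals, a non-optional max_temp: Double field would fail with exactly this DecodingFailure, while an Option[Double] field would not. A minimal, Druid-independent sketch (the Agg/AggOpt case classes are made up for illustration):

import io.circe.generic.auto._
import io.circe.parser.decode

// Non-optional Double field: decoding a null value fails with
// DecodingFailure(Double, DownField(max_temp)).
case class Agg(max_temp: Double)
println(decode[Agg]("""{ "max_temp": null }"""))    // Left(DecodingFailure(...))

// The same payload decodes fine when the field is optional.
case class AggOpt(max_temp: Option[Double])
println(decode[AggOpt]("""{ "max_temp": null }""")) // Right(AggOpt(None))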

anskarl commented 5 years ago

Hi @hariramesh9a

I cannot reproduce the issue. I wrote the following code:

import java.time.format.DateTimeFormatter
import ing.wbaa.druid._
import ing.wbaa.druid.definitions._
import io.circe.generic.auto._
import io.circe.syntax._
import io.circe._
import scala.concurrent.Future

implicit val druidConf = DruidConfig(host = "127.0.0.1", port = 8082)
implicit val system = DruidClient.system
implicit val materializer = DruidClient.materializer
implicit val ec = system.dispatcher

val formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy")

case class TimeseriesCount(
  max_temp: Double,
  min_temp: Double,
  max_torque: Double,
  min_torque: Double,
  max_humidity: Double,
  min_humidity: Double
)

case class TimeseriesRes(date: String, record: TimeseriesCount)

val query: DruidQuery = TimeSeriesQuery(
    aggregations = List(
      DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
      DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
      DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
      DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
      DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
      DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
    ),
    granularity = GranularityType.Minute,
    intervals = List("2011-06-01/2017-06-01")
  )

val request: Future[DruidResponse] = query.execute()

request.map(_.results).foreach(println(_))

// according to @hariramesh9a the following code should fail
val result: Future[Iterable[TimeseriesRes]] = request.map{ response =>
  response.series[TimeseriesCount].map{ case (zdt, entries) =>
    TimeseriesRes(formatter.format(zdt), entries.head)
  }
}

result.foreach(_.foreach(println))

Which version of Scruid are you using? I am testing on the latest v2.1.0 with Circe v0.10.1.
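For completeness, a sketch of an sbt setup matching those versions, assuming the coordinates from the project README (Scruid pulls in a compatible Circe transitively, so the explicit Circe pin is only needed when aligning with other Circe usage in the project):

// build.sbt (sketch)
libraryDependencies ++= Seq(
  "ing.wbaa.druid" %% "scruid" % "2.1.0",
  "io.circe"       %% "circe-generic" % "0.10.1" // optional explicit pin
)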

Fokko commented 5 years ago

@hariramesh9a ping!