Closed: @charlesb closed this issue 1 year ago
Hello @charlesb, thanks for reporting the bug. A couple of questions: a) What version of the connector are you using? b) As I understand it, you have a `real` data-type column that you are selecting as-is, without any transform operation. Is that correct?
Hi @ag-ramachandran, I have faced this both with the Maven artifact com.microsoft.azure.kusto:kusto-spark_3.0_2.12:3.1.4 and after installing the latest jar with dependencies (kusto-spark_3.0_2.12-3.1.6-jar-with-dependencies.jar).
I'm simply reading a table like this:
```python
query = "InsightsMetrics | take 100"
log_ws_1 = spark.read.format("com.microsoft.kusto.spark.datasource") \
    .option("kustoCluster", "") \
    .option("kustoDatabase", "myworkspace") \
    .option("kustoQuery", query) \
    .option("kustoAadAppId", "") \
    .option("kustoAadAppSecret", "") \
    .option("kustoAadAuthorityID", "") \
    .option("readMode", "ForceSingleMode") \
    .load()
```
The workaround is to explicitly cast the `real`-typed column to `int` in the query:

```
| project toint(real_type_column)
```
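Scripted, the workaround amounts to rewriting the KQL string before handing it to the connector. A minimal sketch (the helper name `apply_real_cast_workaround` and the column list are hypothetical, not part of the connector API):

```python
# Hypothetical helper: append a projection that casts real-typed
# columns to int, working around the ClassCastException.
def apply_real_cast_workaround(base_query, real_columns):
    casts = ", ".join(f"toint({col})" for col in real_columns)
    return f"{base_query} | project {casts}"

patched = apply_real_cast_workaround("InsightsMetrics | take 100",
                                     ["real_type_column"])
# patched == "InsightsMetrics | take 100 | project toint(real_type_column)"
```

The patched string is then passed as the `kustoQuery` option in place of the original query.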
It only fails when querying Log Analytics workspaces.
Hello @charlesb, we have just released 3.1.7. It should be in Maven Central in a few hours. cc: @asaharn @ohadbitt
When querying a table that contains a `real` data-type column, the connector throws this exception: `java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Double`
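As an illustration of the kind of mismatch behind the exception (an assumption for illustration, not a confirmed root-cause analysis of the connector): a `real` column whose values happen to be whole numbers can deserialize from JSON as integers, and code that expects every value to already be a floating-point number then fails. In Python terms:

```python
import json

# A `real` column value of 1 arrives as a JSON integer, not a float.
row = json.loads('{"real_type_column": 1}')
value = row["real_type_column"]

# The deserialized value is an int even though the column type is `real`;
# a reader assuming float here hits the type mismatch.
print(type(value).__name__)

# An explicit conversion (analogous to the toint()/cast workaround)
# sidesteps the mismatch.
safe = float(value)
print(safe)
```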