Open eliviu opened 6 years ago
Hi,
Decimal logical types are read into the DataFrame as binary, and the values are shown in hexadecimal.
Ex. for "3.12":

```scala
org.apache.spark.sql.DataFrame = [col1: binary]

df.select("col1").show()
+-------+
|   col1|
+-------+
|[01 38]|
+-------+
```
If I convert the column to string, I get a different value than the correct one (e.g. 8 instead of 3.12):

```scala
df.withColumn("col2", 'col1.cast("String")).select("col2").show()
+----+
|col2|
+----+
|   8|
+----+
```
Is there any way to convert these columns to a String data type inside the DataFrame (or to some other type, such as BigDecimal or Decimal(p,s))?
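For what it's worth, casting the binary column to String does not decode the decimal; it just interprets the raw bytes as UTF-8 (0x38 is the character '8', and 0x01 is a non-printing control character), which is why "8" shows up. Avro stores a decimal as its unscaled value in two's-complement big-endian bytes, with the scale carried in the schema. Below is a minimal sketch of decoding those bytes outside of Spark; `decodeAvroDecimal` is a hypothetical helper name, and a scale of 2 is assumed from the writer schema of the "3.12" example.

```scala
import java.math.{BigDecimal => JBigDecimal, BigInteger}

// Decode an Avro "decimal" logical type stored as raw bytes.
// Avro encodes the unscaled value as a two's-complement,
// big-endian integer; the scale comes from the Avro schema.
def decodeAvroDecimal(bytes: Array[Byte], scale: Int): JBigDecimal =
  new JBigDecimal(new BigInteger(bytes), scale)

// The bytes [0x01, 0x38] are 312 as a big-endian integer;
// with scale 2 this gives 3.12, the value from the example above.
val value = decodeAvroDecimal(Array[Byte](0x01, 0x38), 2)
println(value) // 3.12
```

In a DataFrame you could presumably wrap the same logic in a UDF over the binary column, hard-coding the scale you know from the Avro schema, and cast the result to a string or DecimalType as needed.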
+1
Hello, I'm looking for a solution too. Is it possible to extract the logical type during DataFrame creation?