Closed agolovenko closed 1 year ago
OK yeah, I see the problem. Java allows `BigDecimal`s with a scale greater than the precision, but Spark doesn't like having a decimal type like that. It will ultimately accept it fine if I return `Decimal` rather than `BigDecimal` internally. I will submit a PR shortly.
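For context, here's a minimal Java sketch of the mismatch described above (my illustration, not code from the PR): parsing `"0.0000"` yields a `BigDecimal` whose reported precision is 1 (the precision of a zero value) while its scale is 4, so scale > precision — legal for `java.math.BigDecimal`, but not for Spark's `DecimalType`, which requires scale ≤ precision.

```java
import java.math.BigDecimal;

public class DecimalScaleDemo {
    public static void main(String[] args) {
        // "0.0000" parses to unscaled value 0 with scale 4.
        // The Javadoc defines the precision of a zero BigDecimal as 1,
        // so here scale (4) > precision (1) -- fine in Java, but Spark's
        // DecimalType(precision, scale) requires scale <= precision.
        BigDecimal zero = new BigDecimal("0.0000");
        System.out.println("precision = " + zero.precision()); // 1
        System.out.println("scale     = " + zero.scale());     // 4
    }
}
```

This is why a value like `0.0000` can trip up a `DecimalType(7, 4)` column even though it fits the type on paper.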
Thanks for a prompt fix @srowen !
I'm using the `from_xml` function to parse messages on Spark 3.3.0, spark-xml 0.15.0, and Scala 2.12. Oftentimes I see errors while parsing `DecimalType` for some values. I attached the spec that fails with the exception: parsing `0.0000` to `DecimalType(7, 4)` should work but fails with the above-mentioned exception. I also attached a `json`-format case that works just fine on the same schema/value combination.