Open · jiangjiangtian opened this issue 2 months ago
@kecookier
Maybe we should not remove the `PromotePrecision(Cast(...))`; otherwise, we will always have this bug. Besides, maybe we can also override `DecimalPrecision` in Gluten to make it consistent with higher Spark versions.
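To sketch the idea (the names and the rule body below are hypothetical, and hooking in via `SparkSessionExtensions.injectResolutionRule` is only an assumption, not Gluten's actual mechanism):

```scala
import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical Gluten-side replacement for DecimalPrecision: re-derive decimal
// result types the way newer Spark versions do, instead of depending on
// PromotePrecision(Cast(...)) surviving until plan transformation.
case class GlutenDecimalPrecision(spark: SparkSession) extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = {
    // Placeholder: real logic would rewrite decimal arithmetic expressions here.
    plan
  }
}

// Would be registered through spark.sql.extensions.
class GlutenDecimalExtensions extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectResolutionRule(GlutenDecimalPrecision.apply)
  }
}
```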
After some investigation, I found that if there are literals in decimal arithmetic expressions, we may get a wrong result type. The reason is that constant folding prevents us from getting the original type of the literal.
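A sketch of what I mean, assuming a spark-shell session (`spark`); the exact plan strings depend on the Spark version, but comparing the analyzed and optimized plans shows the cast around the literal being folded away:

```scala
// After optimization, ConstantFolding collapses the cast around the literal,
// so anything that rewrites the optimized plan only sees the folded literal's
// (already rescaled) type, not the literal's original type.
val df = spark.sql(
  "SELECT col1 + 0.00001 FROM VALUES (CAST(72 AS DECIMAL(20, 0))) AS t(col1)")

// Analyzed plan: the literal is still wrapped in a cast to the common decimal type.
println(df.queryExecution.analyzed.treeString)

// Optimized plan: the cast of the literal has been folded into a single literal.
println(df.queryExecution.optimizedPlan.treeString)
```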
Hi @jiangjiangtian, would you also post the expected result and incorrect result in this issue? Thanks.
Thanks for reminding me. I added the results to the issue.
For now, we pass the result type when we create the decimal function, but this is not an elegant solution; we will come up with a better one.
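Roughly, the idea is the following (all names below are illustrative, not Gluten's actual API): instead of letting the backend re-derive precision and scale from the operand types, we thread the `dataType` that Spark already resolved into the function we build.

```scala
import org.apache.spark.sql.types.DecimalType

// Illustrative only: a function spec that carries Spark's resolved result type,
// so the backend does not recompute precision/scale from the operand types.
final case class DecimalFunctionSpec(
    name: String,
    operandTypes: Seq[DecimalType],
    resultType: DecimalType) // taken directly from the Spark expression's dataType

def decimalDivide(
    left: DecimalType,
    right: DecimalType,
    sparkResultType: DecimalType): DecimalFunctionSpec =
  DecimalFunctionSpec("decimal_divide", Seq(left, right), sparkResultType)
```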
Backend
VL (Velox)
Bug description
The following SQL produces a wrong result:
where col0 and col1 are `decimal(20, 0)`. With col0 = 25 and col1 = 72, Gluten returns 3.4722217399692027821 while Spark returns 0.3472221739969202782. The correct result type is `decimal(38, 19)`, but during the calculation the actual result type is `decimal(38, 20)`:

The reason is that the type of the subexpression `col1 + 0.00001` is rescaled to `decimal(27, 5)`:

But we don't notice it when we rescale the topmost expression:
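If I read Spark's `DecimalPrecision` rule correctly, the division result type is derived roughly as below (a sketch; Spark additionally caps or adjusts the result to at most 38 digits). It shows why rescaling the divisor subexpression `col1 + 0.00001` also changes the type of the enclosing division:

```scala
// Sketch of Spark's decimal division type derivation (before the 38-digit
// adjustment): the divisor's precision and scale feed directly into the result
// type, so a rescaled `col1 + 0.00001` changes the topmost result type too.
final case class DecType(precision: Int, scale: Int)

def divideResultType(dividend: DecType, divisor: DecType): DecType = {
  val scale = math.max(6, dividend.scale + divisor.precision + 1)
  val precision = dividend.precision - dividend.scale + divisor.scale + scale
  DecType(precision, scale) // Spark then bounds/adjusts this to fit 38 digits
}
```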
Spark version
None
Spark configurations
No response
System information
No response
Relevant logs
No response