zxf216 opened this issue 3 months ago
Can I fix this bug like this, at com/starrocks/sql/analyzer/ExpressionAnalyzer.java:677?
```java
if (node.getOp().getPos() == ArithmeticExpr.OperatorPosition.BINARY_INFIX) {
    if (com.starrocks.common.Config.is_decimal_cast_default_decimal128_when_scale_beyond_the_limit) {
        // select cast(2.001 as decimal(38,22)) / CAST(2.34 AS decimal(38,18));
        // is rewritten to
        // select cast(cast(2.001 as decimal(38,22)) as DEFAULT_DECIMAL128)
        //        / cast(CAST(2.34 AS decimal(38,18)) as DEFAULT_DECIMAL128);
        Type t1 = node.getChild(0).getType().getNumResultType();
        Type t2 = node.getChild(1).getType().getNumResultType();
        if (node.getOp() == ArithmeticExpr.Operator.DIVIDE || node.getOp() == ArithmeticExpr.Operator.MULTIPLY) {
            // Logic consistent with the code at com/starrocks/analysis/ArithmeticExpr.java:297.
            // Corresponding case: select 1/cast(2.0 as decimal(38,37));
            int lhsScale = ((ScalarType) t1).getScalarScale();
            if (lhsScale <= 6) {
                lhsScale = lhsScale + 6;
            } else if (lhsScale <= 12) {
                lhsScale = 12;
            }
            int sum = lhsScale + ((ScalarType) t2).getScalarScale();
            int maxPrecision = PrimitiveType.getMaxPrecisionOfDecimal(PrimitiveType.DECIMAL128);
            if (sum > maxPrecision) {
                // Fall back to the default DECIMAL128 type instead of failing the analysis.
                node.setChild(0, TypeManager.addCastExpr(node.getChild(0), DEFAULT_DECIMAL128));
                node.setChild(1, TypeManager.addCastExpr(node.getChild(1), DEFAULT_DECIMAL128));
            }
        }
    }
}
```
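For clarity, here is a minimal standalone sketch of the decision the snippet above makes, assuming the same widening rule it borrows from ArithmeticExpr.java. The class and method names (DecimalDivideScaleCheck, needsDecimal128Fallback) are illustrative only and not part of StarRocks:

```java
// Standalone sketch, not StarRocks source: shows when the proposed patch
// would cast both operands to the default DECIMAL128 type instead of
// letting the analyzer raise the scale-overflow error.
public class DecimalDivideScaleCheck {

    // Maximum precision of a DECIMAL128 value.
    static final int MAX_DECIMAL128_PRECISION = 38;

    // Dividend-scale widening as in the proposed patch:
    // scales up to 6 gain 6 extra digits, scales up to 12 are widened to 12,
    // larger scales are kept unchanged.
    static int widenDividendScale(int lhsScale) {
        if (lhsScale <= 6) {
            return lhsScale + 6;
        } else if (lhsScale <= 12) {
            return 12;
        }
        return lhsScale;
    }

    // True when the combined scale would exceed DECIMAL128 precision,
    // i.e. the case where the patch rewrites both operands.
    static boolean needsDecimal128Fallback(int lhsScale, int rhsScale) {
        return widenDividendScale(lhsScale) + rhsScale > MAX_DECIMAL128_PRECISION;
    }

    public static void main(String[] args) {
        // decimal(38,22) / decimal(38,18): 22 + 18 = 40 > 38 -> fallback needed.
        System.out.println(needsDecimal128Fallback(22, 18)); // true
        // Small dividend scale against decimal(38,37): 6 + 37 = 43 > 38 -> fallback needed.
        System.out.println(needsDecimal128Fallback(0, 37));  // true
        // decimal(38,2) / decimal(38,18): 8 + 18 = 26 <= 38 -> no rewrite.
        System.out.println(needsDecimal128Fallback(2, 18));  // false
    }
}
```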
Steps to reproduce the behavior (Required)
SQL dialect: set global sql_dialect = 'trino'
Column types: col1 decimal(38, 18), col2 decimal(38, 18); sum(col1) * 1.0000 has type decimal(38, 22)
Query: SELECT (sum(col1) * 1.0000) / sum(col2) AS expr FROM hive_catalog.test.tab
or, equivalently: select cast(2.0 as decimal(38,22)) / cast(2.0 as decimal(38,18));
Result: ERROR 1064 (HY000): Getting analyzing error. Detail message: Dividend fails to adjust scale to 40 that exceeds maximum value(38).
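For reference, the value 40 in the error message follows from the dividend-widening rule referenced above; this is a worked trace assuming the intermediate scales 22 and 18 from the column description, not StarRocks source:

```java
int lhsScale = 22;   // scale of sum(col1) * 1.0000, i.e. decimal(38, 22)
int rhsScale = 18;   // scale of sum(col2), i.e. decimal(38, 18)
// Dividend widening: scale <= 6 gains 6 digits, scale <= 12 becomes 12, larger scales stay as-is.
int adjustedLhsScale = lhsScale <= 6 ? lhsScale + 6 : (lhsScale <= 12 ? 12 : lhsScale); // 22
int requiredScale = adjustedLhsScale + rhsScale; // 40
// 40 exceeds the maximum DECIMAL128 precision of 38, producing:
// "Dividend fails to adjust scale to 40 that exceeds maximum value(38)."
```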
Expected behavior (Required)
The query should succeed, as Trino can calculate the result for the same expression.
Real behavior (Required)
StarRocks fails with the analyzing error shown above ("Dividend fails to adjust scale to 40 that exceeds maximum value(38)"), while Trino calculates the result.
StarRocks version (Required)
StarRocks-3.2.5