Open timo-abele opened 8 months ago
Thanks for the suggestion @timo-abele ! The current recipe is implemented as a Refaster-style recipe, which replaces `new BigDecimal(d)` with `BigDecimal.valueOf(d)`.
Do I understand correctly that you're looking to then go from `BigDecimal.valueOf(1.00)` to `new BigDecimal("1.0")` with an explicit recipe?
I'm not entirely sure that's what folks would expect these days, especially as we're moving other number constructors over to `valueOf` in https://docs.openrewrite.org/recipes/staticanalysis/primitivewrapperclassconstructortovalueof. Reintroducing explicit constructors for BigDecimal might then be confusing; any thoughts on that?
Hi, in my project the scale of a `BigDecimal` is relevant, and my takeaway is that initialization with a double literal should be avoided altogether. Take this snippet:
```java
List.of(
    new BigDecimal(1.00),
    BigDecimal.valueOf(1.00),
    new BigDecimal("1.00")
).forEach(bd -> System.out.printf("value: %-4s, scale: %s%n", bd, bd.scale()));
```
that prints

```
value: 1   , scale: 0
value: 1.0 , scale: 1
value: 1.00, scale: 2
```
Only the (argument of the) string constructor clearly indicates the BigDecimal value that is created.
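As a side note on why the scale matters beyond printing (a small sketch added here for illustration, not from the original comment): `BigDecimal#equals` compares scale as well as value, while `compareTo` is purely numeric.

```java
import java.math.BigDecimal;

public class ScaleEqualityDemo {
    public static void main(String[] args) {
        BigDecimal a = BigDecimal.valueOf(1.00); // equal to new BigDecimal("1.0"), scale 1
        BigDecimal b = new BigDecimal("1.00");   // scale 2
        System.out.println(a.equals(b));         // false: equals() also compares scale
        System.out.println(a.compareTo(b) == 0); // true: compareTo() ignores scale
    }
}
```

So two "1.00"-looking values produced by different initializations can fail an `equals` check purely because of scale.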
I authored this issue before I refactored all `BigDecimal` initializations in my project, so that argument wasn't as clear to me then and is missing from my original description. Having refactored every `BigDecimal` initialization from a literal in my repo, I believe it is best to always use the string constructor. (And the int constructor is allowed as a shorter form iff a scale of 0 is intended¹.)
Right now (v1.3.1) `BigDecimalDoubleConstructorRecipe` is already changing semantics when it rewrites `new BigDecimal(1.00)` -> `BigDecimal.valueOf(1.00)`.
It assumes that a dev who wrote `new BigDecimal(1.00)`, which equals `new BigDecimal("1")`, really meant² to write `BigDecimal.valueOf(1.00)`, which equals `new BigDecimal("1.0")`. That's OK, IntelliJ suggests the same.
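The two equivalences claimed here can be checked directly (a quick sketch, not part of the original comment):

```java
import java.math.BigDecimal;

public class ConstructorVsValueOf {
    public static void main(String[] args) {
        // The literal 1.00 is exactly representable as a double, so the
        // double constructor produces an unscaled 1 (scale 0)...
        System.out.println(new BigDecimal(1.00).equals(new BigDecimal("1")));       // true
        // ...while valueOf goes through Double.toString(1.00) == "1.0" (scale 1).
        System.out.println(BigDecimal.valueOf(1.00).equals(new BigDecimal("1.0"))); // true
    }
}
```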
Similarly, I believe that a dev who writes `new BigDecimal(1.00)` or `BigDecimal.valueOf(1.00)` expects `new BigDecimal("1.00")` to happen but doesn't know better. And reviewers and debuggers will be less confused too if the created precision matches the literal.
¹ There is also `long`, but we never use a long literal to create a BigDecimal in my project. I personally prefer a constructor call with `new` to `valueOf` because then it is more obvious that a new object is created. And I believe that an official recommendation to prefer `valueOf` on a wrapper type does not carry an implication to prefer `BigDecimal#valueOf`, as BigDecimal doesn't wrap anything. But those opinions ultimately reduce to "I think this looks better"; I'm open to compromise here :smiley:
² That's an assumption on my part, of course. At the very least, however, it assumes that the semantics are worth changing.
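For context on the wrapper recommendation mentioned in footnote ¹ (a quick illustration, not from the original thread): `Integer.valueOf` may return cached instances, which a constructor never can, and that is the main reason `valueOf` is preferred on the boxed primitives.

```java
public class WrapperCachingDemo {
    public static void main(String[] args) {
        // JLS 5.1.7 guarantees that values in -128..127 are boxed to
        // cached instances, so valueOf can reuse objects where a
        // constructor call would always allocate a new one.
        System.out.println(Integer.valueOf(100) == Integer.valueOf(100)); // true (cached)
    }
}
```

That rationale indeed does not transfer to `BigDecimal.valueOf(double)`, which is about the string conversion rather than caching.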
What problem are you trying to solve?

The String constructor of `BigDecimal` is much nicer than `BigDecimal.valueOf(doubleLiteral)` because

Describe the solution you'd like
Per the API docs, `BigDecimal.valueOf(double)` is defined in terms of `Double.toString`. In addition to transforming `new BigDecimal(existingDouble)` to `valueOf`, `BigDecimalDoubleConstructorRecipe` should convert `BigDecimal.valueOf(doubleLiteral)` to `new BigDecimal(doubleString)` where `doubleString = Double.toString(doubleLiteral)`, e.g. `BigDecimal.valueOf(1.00)` -> `new BigDecimal("1.0")`.
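A minimal sketch of the intended equivalence (my addition; `doubleLiteral` stands in for whatever literal the recipe finds in source):

```java
import java.math.BigDecimal;

public class ProposedRewrite {
    public static void main(String[] args) {
        double doubleLiteral = 1.00;
        // The replacement string the recipe would emit:
        String doubleString = Double.toString(doubleLiteral);
        System.out.println(doubleString); // "1.0"
        // The proposed rewrite preserves both value and scale:
        System.out.println(new BigDecimal(doubleString)
                .equals(BigDecimal.valueOf(doubleLiteral))); // true
    }
}
```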
Have you considered any alternatives or workarounds?
Additional context
Are you interested in contributing this feature to OpenRewrite?
Iff the idea is approved.