Open seakayone opened 1 year ago
@seakayone How about `42e0`, `4.2e1`, `420e-1`?
I would argue that any integer number in any valid JSON number notation should be converted to an `Int`, as long as it is in the range of valid `Int` values. What do you think?
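All three notations above denote the mathematical integer 42. One way to test that property without precision loss is `BigDecimal` (a plain-Scala sketch; `asWholeInt` is a hypothetical helper, not part of zio-json):

```scala
// Accept any JSON number notation that denotes a whole number
// in Int range; reject everything else.
def asWholeInt(s: String): Option[Int] = {
  val bd = BigDecimal(s)               // exact, no floating-point rounding
  if (bd.isValidInt) Some(bd.toIntExact) else None
}
```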
I think we should start from understanding the use case.
Which problem are we going to solve in general? Is it integration with some other system that has that strange kind of formatting for integers? Should the parser on our side report an error or silently round if a fractional part is detected? Could accepted values be limited to some range of mantissa and exponent values?
I am currently migrating from spray to zio-json and noticed the difference in the behaviour.
My observation came from a "system that has that strange kind of formatting for integers" reporting an image's `height` and `width` with values in a notation with a trailing `.0`. Nonetheless, these values are always integers.
Which problem are we going to solve in general?
I am under the impression that the more lenient approach when consuming the JSON number would be helpful in cases like above as fringe as they may be.
Scala knows that `42.0` is not an `Int`, i.e. `"42.0".toInt` fails with a `NumberFormatException`. Humans, other programming languages, and JSON parsers may nonetheless understand the integer nature of this number.
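The asymmetry can be reproduced in plain Scala, independent of any JSON library:

```scala
import scala.util.Try

// Direct String => Int parsing rejects the trailing ".0" ...
assert(Try("42.0".toInt).isFailure)   // NumberFormatException
// ... while routing through Double first accepts it:
assert("42.0".toDouble.toInt == 42)
```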
Should parser on our side report an error or just round silently in case of some fraction part detected
IMHO, if the JSON number is not an integer in the mathematical sense, or it does not fit into an `Int`, then decoding should fail.
Could accepted values be limited for some range of mantissa and exponent values?
Not really? E.g. `Int.MaxValue` is just one less than 2^31.
To be honest, I understand my case is an edge case, and casting JSON number representations which are technically not `Int` values can also be surprising to Scala developers. I do not have a very strong opinion on whether this should be implemented or not. Mathematically it would be "more" correct, but less close to Scala's types. My proposal is also closer to the robustness principle ("be conservative in what you send, be liberal in what you accept"), but that does not mean it is better for zio-json.
@seakayone For your case the following decoder should work if pasted into the scope of codec generation for classes with those `width` and `height` fields:
```scala
import zio.json.{ JsonDecoder, JsonError }
import zio.json.JsonDecoder.UnsafeJson
import zio.json.internal.{ Lexer, RetractReader }

implicit val d: JsonDecoder[Int] = (trace: List[JsonError], in: RetractReader) => {
  val x = Lexer.double(trace, in)  // parse any JSON number as a Double
  val y = x.toInt
  if (y.toDouble == x) y           // accept only values that round-trip exactly
  else throw UnsafeJson(JsonError.Message("32-bit int expected") :: trace)
}
```
@plokhotnyuk thank you very much. I appreciate your effort to help solve my immediate problem. I will actually use a slightly adapted version of your solution, as I am uncomfortable using the internal classes.
```scala
implicit val anyWholeNumber: JsonDecoder[Int] = JsonDecoder[Double].mapOrFail { d =>
  val i = d.toInt
  if (d == i.toDouble) Right(i)    // a whole number within Int range
  else Left("32-bit int expected")
}
```
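The round-trip check at the heart of this decoder can be exercised in isolation, without the zio-json dependency. A stand-alone replica (the name `wholeInt` is my own) also shows that it behaves correctly just past the `Int` range, where `Double.toInt` saturates at `Int.MaxValue` on the JVM:

```scala
// Same logic as the mapOrFail body above, as a plain function.
def wholeInt(d: Double): Either[String, Int] = {
  val i = d.toInt                  // saturates at Int.MaxValue for larger doubles
  if (d == i.toDouble) Right(i)    // round-trips exactly => genuine 32-bit int
  else Left("32-bit int expected")
}
```

Because `2147483648.0` saturates to `2147483647` on conversion, the round-trip comparison fails and the overflow is rejected rather than silently wrapped.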
Currently a valid JSON number like `42.0` (with only zeroes after the decimal point) cannot be parsed as an `Int` with zio-json. I would like to argue that zio-json could be more lenient and allow such de facto integer numbers. At least this is the way spray-json used to handle this. An example test suite (test 3 currently fails and IMHO could pass as well):