Closed jrudolph closed 5 years ago
@jrudolph I don't know if it should be resolved in this PR, but BigDecimal values have yet another vulnerability.
Even after successful parsing of BigDecimal values like 1e1000000000 or 1e-1000000000,
users can be affected by any subsequent operation such as +, %, longValue, etc.
Just try how this code works in your Scala REPL:
scala> val f = (x: BigDecimal) => x + 1
f: BigDecimal => scala.math.BigDecimal = $$Lambda$1210/93981118@790ac3e0
scala> f(BigDecimal("1e1000000000"))
or
scala> val g = (x: BigDecimal) => 1 + x
g: BigDecimal => scala.math.BigDecimal = $$Lambda$1091/1954133542@e8ea697
scala> g(BigDecimal("1e-1000000000", java.math.MathContext.UNLIMITED))
To prevent this, the parser should by default avoid returning a BigDecimal
with too large an exponent (or scale) or with MathContext.UNLIMITED.
BTW, in jsoniter-scala java.math.MathContext.DECIMAL128
and a corresponding ~6K limit for the scale were selected as safe defaults:
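A minimal sketch of such safe defaults, assuming a hypothetical `SafeBigDecimal.parse` helper and a scale cap close to jsoniter-scala's ~6K default (the exact limit value here is an assumption, not the library's actual constant):

```java
import java.math.BigDecimal;
import java.math.MathContext;

// Hypothetical helper mirroring the safe defaults described above:
// bounded DECIMAL128 precision (never MathContext.UNLIMITED) plus a scale cap,
// so later operations like + or % cannot blow up on huge exponents.
public class SafeBigDecimal {
    // Assumed cap, chosen close to jsoniter-scala's ~6K default scale limit.
    static final int MAX_SCALE = 6178;

    static BigDecimal parse(String s) {
        // DECIMAL128 bounds the precision of the parsed value.
        BigDecimal x = new BigDecimal(s, MathContext.DECIMAL128);
        // Reject values whose scale (positive or negative) is out of range.
        if (x.scale() > MAX_SCALE || x.scale() < -MAX_SCALE)
            throw new NumberFormatException("scale out of range: " + x.scale());
        return x;
    }
}
```

With these checks, `SafeBigDecimal.parse("1e1000000000")` fails fast at parse time instead of handing callers a value whose first `+ 1` would try to materialize a billion-digit number.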
@plokhotnyuk good point. I'll think about whether we want to do that in the same sweep as the other changes.
Created #287 to track these extra issues.
BigInteger/BigDecimal seems to have approximately quadratic runtime when instantiating big numbers from Strings. Lacking a better solution, we introduce a character limit for numbers. According to the benchmarks from #278, at 100 digits the constant/linear parts still predominate over the quadratic slowdowns seen with 10000+ digits.
/cc @plokhotnyuk
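The character limit described above can be sketched as a length check applied before the literal ever reaches the BigDecimal constructor (the helper name and the exact threshold handling are illustrative, not this PR's actual code):

```java
import java.math.BigDecimal;

// Sketch of the mitigation described above: reject overly long number
// literals up front, so the roughly quadratic String-to-BigDecimal
// construction cost stays bounded. The 100-character threshold follows
// the observation from the #278 benchmarks.
public class DigitLimit {
    static final int MAX_CHARS = 100;

    static BigDecimal parseNumber(String s) {
        if (s.length() > MAX_CHARS)
            throw new NumberFormatException(
                "number literal too long: " + s.length() + " > " + MAX_CHARS + " chars");
        return new BigDecimal(s);
    }
}
```

The check is O(1) on the already-materialized String, so the limit adds no measurable parsing overhead while capping the worst case.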