ScorexFoundation / sigmastate-interpreter

ErgoScript compiler and ErgoTree Interpreter implementation for Ergo blockchain

Impossibility to deserialize 249+ bits big integers #991

Closed by kushti 3 months ago

kushti commented 3 months ago

There is a bug in CoreDataSerializer when parsing an SBigInt value:

if (size > SBigInt.MaxSizeInBytes) {
    throw new SerializerException(s"BigInt value doesn't fit into ${SBigInt.MaxSizeInBytes} bytes: $size")
}

where MaxSizeInBytes == 32, while during serialization (see BigInteger.toByteArray):

        int byteLen = bitLength()/8 + 1;
        byte[] byteArray = new byte[byteLen];

Thus only big integers which can be written with <= 248 bits are supported.
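
To make the mismatch concrete, here is a minimal standalone sketch (plain JDK calls, not part of the serializer) showing that toByteArray returns 33 bytes once bitLength() reaches 256, which trips the size > 32 check above:

import java.math.BigInteger

object ToByteArraySizeDemo extends App {
  val maxSizeInBytes = 32                        // SBigInt.MaxSizeInBytes
  val v = BigInteger.ONE.shiftLeft(255)          // v.bitLength() == 256
  val encoded = v.toByteArray                    // 256/8 + 1 == 33 bytes
  println(s"bitLength = ${v.bitLength()}, bytes = ${encoded.length}")
  println(s"passes deserializer check: ${encoded.length <= maxSizeInBytes}") // false
}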

Possible fixes:

1) Add a versioned check in the CoreDataSerializer code (see the sketch after this list). Then in 7.x the check can be changed in a way that is a soft fork for 6.x but a hard fork for 5.x. Add a Global.deserializeBigInt method (or modify the deserializeTo method to use a custom check for size > 33 bytes instead of > 32?). Also change the check in the modQ method so that the result satisfies x.bitLength() <= 256 instead of x.bitLength() <= 255 as now (the current check is also a mistake caused by a wrong understanding of big-integer serialization).

2) Introduce a BigIntModQ type. As a drawback, this leaves less room for other new types.
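
For option (1), a hypothetical sketch of what a version-dependent size check could look like (the names checkBigIntSize and isV7Activated are placeholders, not existing APIs):

// Placeholder for the version-gated limit: 33 bytes covers the two's-complement
// encoding of any 256-bit value, while 32 bytes keeps the pre-7.x behavior.
def checkBigIntSize(size: Int, isV7Activated: Boolean): Unit = {
  val maxSize = if (isV7Activated) 33 else 32
  if (size > maxSize)
    throw new IllegalArgumentException(s"BigInt value doesn't fit into $maxSize bytes: $size")
}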

aslesarenko commented 3 months ago

The current BigInt is implemented as a signed 256-bit integer, semantically the same as Short, Int, Long. Thus it represents any value in the range [MinValue ... MaxValue], where MinValue = new BigInteger("-80" + "00" * 31, 16) and MaxValue = new BigInteger("7F" + "ff" * 31, 16).

This is exactly what is checked here https://github.com/ScorexFoundation/sigmastate-interpreter/blob/dd91fa3a926beaae91bd786b51fa4eecf341d371/core/shared/src/main/scala/sigma/util/Extensions.scala#L207

The problem with the current BigInt is that a 256-bit signed integer happens to be unsuitable for cryptography.

A BigInt defined this way can be represented in 32 bytes, which is why the deserialization check is correct. The problem may be due to the asymmetry of MinValue and MaxValue, i.e. -MinValue == MaxValue + 1. As a result, -MinValue doesn't fit into 32 bytes.
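
The asymmetry is easy to reproduce with plain JDK calls (a standalone check, using the MinValue/MaxValue definitions above):

import java.math.BigInteger

val MinValue = new BigInteger("-80" + "00" * 31, 16)  // -2^255
val MaxValue = new BigInteger("7F" + "ff" * 31, 16)   //  2^255 - 1
println(MaxValue.toByteArray.length)           // 32 bytes, fits
println(MinValue.toByteArray.length)           // 32 bytes, fits
println(MinValue.negate().toByteArray.length)  // 33 bytes: -MinValue == MaxValue + 1 doesn't fit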

This is what happens when such a BigInt is passed to CoreDataSerializer: the serialize method just uses toByteArray, which doesn't know about any size limitation.

So the solution is just to add a to256BitValueExact check in the serialize method, to prevent oversized BigInts from ever being serialized.
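
A minimal sketch of that serialize-side guard, assuming a check equivalent to to256BitValueExact that rejects values outside the signed 256-bit range (the method name serializeBigInt and the writer callback are illustrative only):

import java.math.BigInteger

def serializeBigInt(v: BigInteger, writeBytes: Array[Byte] => Unit): Unit = {
  // bitLength() <= 255 holds exactly for the signed 256-bit range
  // [-2^255 ... 2^255 - 1]; it stands in here for to256BitValueExact.
  require(v.bitLength() <= 255, s"BigInt value out of 256-bit range: $v")
  writeBytes(v.toByteArray)  // now guaranteed to be at most 32 bytes
}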

This is not part of consensus and shouldn't cause issues. Apps also shouldn't fail, as oversized BigInt values are not supported by the network anyway (due to the deserialization check and the to256BitValueExact checks).