jefffhaynes / BinarySerializer

A declarative serialization framework for controlling formatting of data at the byte and bit level using field bindings, converters, and code.
MIT License

[Feature Request] SerializedType.Int3 and SerializedType.UInt3 #189

Closed: sn4k3 closed this issue 1 year ago

sn4k3 commented 2 years ago

I need a 24-bit uint. Since I can't find a way to do this with the current FieldBitLength(24), it would be super useful to have SerializedType.Int3 and SerializedType.UInt3, as I have a ton of 24-bit fields to decode/encode from a file format.

Example:

[FieldOrder(0)] [SerializeAs(SerializedType.UInt3)] public uint WaitTime { get; set; }

This way it would read/write 3 bytes from/to the stream but use a uint to store the value in C#. I think an easy way to implement this would be to replicate the Int4 and UInt4 code but strip the first/last byte depending on endianness.
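
As a rough, hypothetical sketch of that byte-stripping idea (illustration only, not library code; BitConverter reflects the machine's byte order, assumed little-endian here):

using System;
using System.Linq;

uint value = 0x00112233;

// 4-byte representation on a little-endian machine: { 0x33, 0x22, 0x11, 0x00 }
byte[] bytes = BitConverter.GetBytes(value);

// Keep the low 3 bytes for a little-endian uint24, or reverse them for big-endian.
byte[] uint24Little = bytes.Take(3).ToArray();            // { 0x33, 0x22, 0x11 }
byte[] uint24Big = bytes.Take(3).Reverse().ToArray();      // { 0x11, 0x22, 0x33 }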

This request is related to: https://github.com/jefffhaynes/BinarySerializer/issues/186

jefffhaynes commented 2 years ago

My concern with this is that a 24-bit integer is not a well-defined thing. For example, I would be making up an arbitrary definition of endianness, and in general I try to avoid imposing artificial rules in the serializer where possible.

sn4k3 commented 2 years ago

It does not look like a int but with less a possible byte?

For example, a uint24's max value would be 16777215 == 0xFFFFFFFF & 0x00FFFFFF. And for big-endian, wouldn't it just be Convert.ToUInt32(read3bytes.Prepend(0).Reverse())?
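
A quick check of that reasoning (a sketch only; Prepend and Reverse come from System.Linq, and BitConverter is used instead of Convert because it accepts a byte array, assuming a little-endian machine):

using System;
using System.Linq;

// Three bytes read from the stream in big-endian order, i.e. 0x010203.
byte[] read3bytes = { 0x01, 0x02, 0x03 };

// Prepend a zero high byte, then reverse into little-endian layout for BitConverter.
uint value = BitConverter.ToUInt32(read3bytes.Prepend((byte)0).Reverse().ToArray(), 0);

Console.WriteLine(value == 0x010203);     // True
Console.WriteLine(0xFFFFFF == 16777215);  // True - the uint24 maximum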

So far I have been working with several non-aligned data types and uint24 across multiple files, and with 010 Editor templates I just need to declare the field as uint fieldname:bitcount, e.g. uint fieldname:24. No matter the bit count and/or the file's endianness, it has never reported a wrong value to me.

Example of those files:

[screenshot: 010Editor_2022-07-24_01-04-25]

sn4k3 commented 2 years ago

To overcome this I have created this class to use with BinarySerialization:

using BinarySerialization;

public sealed class UInt24BigEndian
{
    // Raw bytes exactly as they appear in the stream, in big-endian order.
    [FieldOrder(0)] [FieldCount(3)] public byte[] Bytes { get; set; } = new byte[3];

    // Converted value; BitExtensions is a separate helper class (not shown here).
    [Ignore]
    public uint Value
    {
        get => BitExtensions.ToUIntBigEndian(0, Bytes[0], Bytes[1], Bytes[2]);
        set
        {
            var bytes = BitExtensions.ToBytesBigEndian(value);
            Bytes[0] = bytes[1];
            Bytes[1] = bytes[2];
            Bytes[2] = bytes[3];
        }
    }

    public UInt24BigEndian() { }

    public UInt24BigEndian(uint value)
    {
        Value = value;
    }

    public override string ToString() => Value.ToString();

    private bool Equals(UInt24BigEndian other)
    {
        return Value == other.Value;
    }

    public override bool Equals(object? obj)
    {
        return ReferenceEquals(this, obj) || obj is UInt24BigEndian other && Equals(other);
    }

    public override int GetHashCode()
    {
        return (int)Value;
    }
}
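
For reference, a hypothetical usage as a field (the Header type and WaitTime name below are invented for illustration):

public class Header
{
    // Serialized as 3 raw bytes via the nested Bytes array; the uint is exposed through Value.
    [FieldOrder(0)]
    public UInt24BigEndian WaitTime { get; set; } = new UInt24BigEndian();
}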

It works well, but it is a clunky solution and raises other problems, such as not being usable as an int type when declaring another field's length.

jefffhaynes commented 1 year ago

Again, the issue is that I could make up a way to serialize 24-bit integers, but it would just be my interpretation versus a well-accepted standard. I've had to deal with 24-bit "ints" myself in the past, so I realize it's frustrating, but I don't think the answer is for me to invent a definition. I would suggest using something derived from IBinarySerializable. Here is an example of a variable-length integer.
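
A minimal sketch of that suggestion, assuming the IBinarySerializable Serialize/Deserialize signatures (Stream, Endianness, BinarySerializationContext) shown in the BinarySerializer documentation; the UInt24 class and its members are illustrative, not library code:

using System.IO;
using BinarySerialization;

public class UInt24 : IBinarySerializable
{
    [Ignore]
    public uint Value { get; set; }

    public void Serialize(Stream stream, Endianness endianness, BinarySerializationContext serializationContext)
    {
        // Write exactly three bytes in the requested byte order.
        if (endianness == Endianness.Big)
        {
            stream.WriteByte((byte)(Value >> 16));
            stream.WriteByte((byte)(Value >> 8));
            stream.WriteByte((byte)Value);
        }
        else
        {
            stream.WriteByte((byte)Value);
            stream.WriteByte((byte)(Value >> 8));
            stream.WriteByte((byte)(Value >> 16));
        }
    }

    public void Deserialize(Stream stream, Endianness endianness, BinarySerializationContext serializationContext)
    {
        var b0 = (uint)stream.ReadByte();
        var b1 = (uint)stream.ReadByte();
        var b2 = (uint)stream.ReadByte();

        // Assemble the three bytes according to the requested byte order.
        Value = endianness == Endianness.Big
            ? (b0 << 16) | (b1 << 8) | b2
            : (b2 << 16) | (b1 << 8) | b0;
    }
}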