DarkWanderer / ClickHouse.Client

.NET client for ClickHouse
MIT License
315 stars 62 forks

The DecimalType scale defaults to 0 #450

Closed Myron5787 closed 6 months ago

Myron5787 commented 6 months ago

When inserting decimal parameters through Dapper, the digits after the decimal point are lost.

DarkWanderer commented 6 months ago

Can you provide a minimal reproducible example please?

Myron5787 commented 6 months ago

Dapper: 2.1.35
ClickHouse.Client: 7.1.1

using ClickHouse.Client.ADO;
using Dapper;

var sql = @"INSERT INTO Record
            (
                MemberID, 
                Balance
            )
            VALUES
            (
                @MemberID, -- Int64
                @Balance   -- Decimal(22,4)
            )";

await using var conn = new ClickHouseConnection(connstr);
await conn.ExecuteAsync(sql, new
{
    MemberID = 1,
    Balance = 1.5M
});
Myron5787 commented 6 months ago

The reverse mapping for a decimal value is `new Decimal128Type()`, and since the `Decimal128Type` constructor does not set the scale, it defaults to 0.

DarkWanderer commented 6 months ago

This is an interesting one. The issue happens because of the difference between the .NET and ClickHouse concepts of a decimal. .NET's System.Decimal is essentially a floating-point number: there is no specific 'scale' associated with it, whereas a ClickHouse decimal always carries a specific precision and scale as part of its type information. Dapper cannot infer anything about the underlying ClickHouse type; it just passes the decimal to DbParameterCollection, which gets translated into a default value.
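To make the mismatch concrete, here is a rough sketch of how a fixed-scale decimal ends up represented; this is an illustration only, not ClickHouse.Client's actual serializer, and the names are hypothetical:

```csharp
using System;

class ScaleDemo
{
    // ClickHouse stores Decimal(P, S) as an integer equal to value * 10^S.
    // If the client assumes S = 0, the fractional digits are dropped
    // before the value ever reaches the server.
    static long ToScaledInteger(decimal value, int scale)
    {
        decimal factor = 1m;
        for (int i = 0; i < scale; i++) factor *= 10m;
        return (long)decimal.Truncate(value * factor);
    }

    static void Main()
    {
        Console.WriteLine(ToScaledInteger(1.5m, 4)); // 15000 -> reads back as 1.5 in Decimal(22,4)
        Console.WriteLine(ToScaledInteger(1.5m, 0)); // 1     -> the .5 is lost, as in this issue
    }
}
```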

The problem is that whatever default is chosen for the scale, some decimal values will either overflow (be too big) or be truncated. In other words, it is not strictly possible to deduce the ClickHouse type of the target table from 'here's a decimal, figure it out'. Normally, one would solve this by using BulkCopy or explicit ClickHouse types in the query; however, Dapper provides a facility for neither.
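One workaround along those lines is to bypass Dapper for the insert and spell out the ClickHouse type in the query itself, using `ClickHouseCommand` directly. The sketch below assumes ClickHouse.Client's `{name:Type}` parameter syntax and its `AddParameter` helper (from `ClickHouse.Client.Utility`), with the table and connection string from the repro above:

```csharp
using ClickHouse.Client.ADO;
using ClickHouse.Client.Utility;

await using var conn = new ClickHouseConnection(connstr);
await conn.OpenAsync();

await using var cmd = conn.CreateCommand();
// Declaring Decimal(22,4) explicitly means the server parses the value
// with the correct scale, instead of an inferred default scale of 0.
cmd.CommandText = @"INSERT INTO Record (MemberID, Balance)
                    VALUES ({memberId:Int64}, {balance:Decimal(22,4)})";
cmd.AddParameter("memberId", 1L);
cmd.AddParameter("balance", 1.5m);
await cmd.ExecuteNonQueryAsync();
```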

The only 'honest' fix I can think of right now is to disallow such a mapping completely, to avoid giving false expectations. Other ideas are welcome.

DarkWanderer commented 6 months ago

Thinking more about it, there might be a way to fix this using ClickHouse's own data conversion. The corner cases are tricky, but I should have a PR soon.

DarkWanderer commented 6 months ago

Please check release 7.2.0

Myron5787 commented 6 months ago

After testing, I can confirm the data is now written correctly. Thank you.