yarmenteros closed this issue 2 years ago
@yarmenteros Thanks for reporting. This may explain https://github.com/aspnet/EntityFramework/issues/6538.
@yarmenteros we believe the unexpected behavior manifests only if the precision and scale configured in your model don't match the actual precision and scale of the column in the database. That is, if you create the database using EF Core with the code-first approach and you don't configure the precision and scale explicitly, the column will be created with type decimal(18,2), which is the default picked by EF Core.
In other words, the workaround you found is the right solution: if you tell EF Core what the actual precision and scale of the column are, updates should work regardless of how many rows you update.
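For example, a minimal sketch of that configuration (the entity name Item is assumed here; the Cost property and the decimal(9,3) column type come from the original report below):

modelBuilder.Entity<Item>().Property(x => x.Cost).HasColumnType("decimal(9,3)");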
The reason this doesn't manifest if you update a single row is that we produce completely different SQL in that case, which doesn't rely on a table variable. The precision and scale are actually never set on the parameter, so we let the provider (SqlClient in this case) decide what parameter facets to use based on the value passed.
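As an illustrative sketch of what that looks like at the ADO.NET level (this is not EF Core's actual code):

using System.Data;
using System.Data.SqlClient;

// Precision and Scale are left at their default of 0, so SqlClient derives
// facets that fit the value (effectively decimal(3,3) for 0.589) when the
// command is sent, and nothing is rounded.
var p = new SqlParameter("@p71", SqlDbType.Decimal) { Value = 0.589m };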
Note for triage: There are a few things we could consider doing here:
With regard to 2, possibly the update pipeline could use increased precision/scale if no explicit mapping has been set without changing the default used for DDL. However, I'm not sure if this can be done cleanly.
@ajcvickers Yes, we already special case 'rowversion' there.
@AndriySvyryd Why is it necessary to special case rowversion?
@AndriySvyryd Never mind--I just looked at the code. :-)
Wrong button. It's not supported in a temporary table.
> With regard to 2, possibly the update pipeline could use increased precision/scale if no explicit mapping has been set without changing the default used for DDL. However, I'm not sure if this can be done cleanly.
I think that could help decrease the chances of truncation, but I don't think it would work for all values: precision has limits, and a scale that is too high limits the number of integer digits that can be represented (for example, SQL Server caps decimal precision at 38, so decimal(38,30) leaves room for only 8 integer digits). So we could end up needing to make it user-configurable. It seemed to me that just changing the default precision and scale at the property level (which is already user-configurable) was more compelling.
Anyway, I think point 3 could be interesting, especially because I am not sure if what we are doing now is fragmenting the query cache.
Hello, I have an issue when working with Always Encrypted columns (type decimal(6,2)). When I try to add/update an entry in my DB (I use DbContext for DB manipulation), I get the following error:
Operand type clash: decimal(1,0) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = '****', column_encryption_key_database_name = '****') is incompatible with decimal(6,2) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = '*****', column_encryption_key_database_name = '****')
I use the following model:
using System.ComponentModel.DataAnnotations.Schema;

public class Model
{
    [Column(TypeName = "nvarchar(MAX)")]
    public string Name { get; set; }

    [Column(TypeName = "nvarchar(MAX)")]
    public string Description { get; set; }

    // The encrypted column whose type clashes at update time.
    [Column(TypeName = "decimal(6,2)")]
    public decimal Fee { get; set; }
}
I also tried to specify the decimal format in the OnModelCreating method:
builder.Entity<Model>().Property(x => x.Fee).HasColumnType("decimal(6,2)");
Thanks for any advice
@3axap-4 This looks like a different (although related) case than was tracked by this issue--can you please post a new issue including a runnable project/solution or complete code listing that demonstrates what you are seeing?
Closing old issue as this is no longer something we intend to implement.
The issue
I have problems with decimal types when they are saved to the database. The problem is caused by a conflict between the decimal precision (decimal(18,2)) used for the temporary table in the batch and the decimal precision (decimal(3,3)) used to define the corresponding parameter in the same batch.
In my case the decimal value is 0.589, and in the database it's saved as a rounded value: 0.590. The column type in my table is decimal(9,3).
My question is: if the correct type and precision were determined for the parameter, why can't the same type and precision be used for the same column in the temporary table used in the batch?
The Batch:
Affected column: Cost
Column and type defined in the temporary table: [Cost] decimal(18, 2)
Parameter type and value: @p71 decimal(3,3)
Value before running the batch: @p71 = 0.589
Saved value: 0.590
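For reference, a minimal ADO.NET sketch that reproduces the rounding (the connection string is a placeholder; this mimics the table variable, it is not the actual SQL EF Core generates):

using System;
using System.Data;
using System.Data.SqlClient;

class Repro
{
    static void Main()
    {
        using (var conn = new SqlConnection("<your connection string>"))
        {
            conn.Open();
            var cmd = conn.CreateCommand();
            // The table variable's column is decimal(18,2), so the value is
            // rounded to two decimal places on INSERT, regardless of the
            // parameter's own precision/scale.
            cmd.CommandText = @"
DECLARE @t TABLE ([Cost] decimal(18, 2));
INSERT INTO @t ([Cost]) VALUES (@p71);
SELECT [Cost] FROM @t;";
            var p = cmd.Parameters.Add("@p71", SqlDbType.Decimal);
            p.Precision = 3;
            p.Scale = 3;
            p.Value = 0.589m;
            Console.WriteLine(cmd.ExecuteScalar()); // prints 0.59
        }
    }
}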
Workaround
To work around this issue I have explicitly configured the column type for the affected properties in my DbContext, like this:
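A minimal sketch of that configuration (the entity name Item is assumed; Cost and decimal(9,3) are the column and type from the report above):

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Map Cost to the real column type decimal(9,3) so the batch's
    // temporary table no longer falls back to the default decimal(18,2).
    modelBuilder.Entity<Item>()
        .Property(x => x.Cost)
        .HasColumnType("decimal(9,3)");
}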
Thanks for your time