Blackbaud-JasonBodnar opened 9 months ago
Thanks for filing this GitHub issue on BigDecimal not serializing properly and losing precision, @Blackbaud-JasonBodnar - someone from the Cosmos SDK team will follow up.
@Blackbaud-JasonBodnar - I believe this issue has been fixed in the latest version of the SDK. Can you please try upgrading to the latest version?
@kushagraThapar When you say latest version, do you mean 5.x or 3.x?
Also, there seems to be a regression between 3.18 and 3.42. In 3.18, `SimpleCosmosRepository.saveAll()`:
```java
public <S extends T> Iterable<S> saveAll(Iterable<S> entities) {
    Assert.notNull(entities, "Iterable entities should not be null");
    final List<S> savedEntities = new ArrayList<>();
    entities.forEach(entity -> {
        final S savedEntity = this.save(entity);
        savedEntities.add(savedEntity);
    });
    return savedEntities;
}
```
By calling `save()` on each entity of the `Iterable` passed in, any auditing information was filled in (i.e., `@CreatedBy`, `@LastModifiedBy`, etc.).
But in 3.48:
```java
public <S extends T> Iterable<S> saveAll(Iterable<S> entities) {
    Assert.notNull(entities, "Iterable entities should not be null");
    if (information.getPartitionKeyFieldName() != null) {
        return operation.insertAll(this.information, entities);
    } else {
        final List<S> savedEntities = new ArrayList<>();
        entities.forEach(entity -> {
            final S savedEntity = this.save(entity);
            savedEntities.add(savedEntity);
        });
        return savedEntities;
    }
}
```
If there's a partition key, `operation.insertAll(this.information, entities)` does not add auditing information.
3.42 did not fix this.
@kushagraThapar: Any more info on this? Is this supposed to be fixed in the latest 3.x, or only 5.x? What about the issue with `@Created*` and `@LastModified*` not working when saving a collection of entities?
@kushagraThapar Here is a repo demonstrating the issues (BigDecimal loses scale, auditable fields not populated when saving a collection) with 3.42.0.
@kushagraThapar I also created a branch with azure-spring-data-cosmos 5.x and can confirm that neither the BigDecimal issue nor the auditable-fields issue is fixed in that version either.
@Blackbaud-JasonBodnar I was out of office; I will look at this by this weekend at the earliest.
@kushagraThapar : Have you been able to look at this?
@Blackbaud-JasonBodnar - I haven't been able to look at the issue above yet (and probably won't be able to this week), but the only way to use BigDecimal in any JSON-based database like Cosmos DB is to store it as a string. JSON itself simply does not support decimal numbers at the precision/range BigDecimal uses - JSON implementations usually normalize around IEEE 754 - which means you will lose precision for BigDecimal in any JSON-based database. The usual approach is to use serializers that convert to/from string for properties holding BigDecimal values, or any integers outside the range (-2^53)+1 to (2^53)-1.
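A minimal sketch of that usual approach using plain Jackson annotations (the entity and field names here are made up for illustration):

```java
import java.math.BigDecimal;

import com.fasterxml.jackson.annotation.JsonFormat;

public class Invoice {
    private String id;

    // Serialized as the JSON string "43769.30" rather than a JSON number,
    // so neither precision nor scale can be lost to number normalization.
    @JsonFormat(shape = JsonFormat.Shape.STRING)
    private BigDecimal amount;

    // getters and setters omitted for brevity
}
```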
Yes, azure-spring-data-cosmos should be storing these as a string - that's the issue here (plus the regression with auditable fields in newer versions). FYI, we are switching from Cosmos DB in MongoDB mode to SQL/Document DB mode, and there were no problems with BigDecimal in MongoDB mode using spring-data-mongodb.
@Blackbaud-JasonBodnar - yes, Mongo does not use JSON but the proprietary BSON format, which addresses some of the gaps JSON has around numbers - see https://bsonspec.org/spec.html and the decimal128 binary type.
That's fine. My point is that because `BigDecimal` is a standard Java class, I shouldn't have to write my own custom serializer/deserializer to save it to and retrieve it from Cosmos DB. azure-spring-data-cosmos should handle it correctly, even if that means converting it to a string before saving to the db. Since the ObjectMapper used for serialization/deserialization isn't exposed in azure-spring-data-cosmos, it seems you need to add something like what is mentioned in this comment, or at least provide a configuration setting to enable something like that.
I discovered yesterday, while going through the SDK code, that it will use an `ObjectMapper` `@Bean`, and when I supplied one with `BigDecimal` configured to use `Shape.STRING`, I was able to save an entity with a `BigDecimal`, retrieve it, and have the two `BigDecimal` values be `equals()` (a sketch of such a bean follows the list below). This unblocks my team from moving forward with an older version of the SDK (i.e., one from before the audit-fields regression was introduced). But I still believe this should be fixed in the SDK, because:
1) If I save something to a database using a high-level SDK like this, which deals strictly with object entities, and I retrieve that entity using the SDK, the retrieved entity should equal the entity I saved (except, of course, for fields that the SDK populates -- e.g., autogenerated ids, audit fields, etc.).
2) As the consumer of such an SDK, I should not have to know or care about how the SDK serializes and deserializes my entities, especially when they contain fields that are native to Java, such as `BigDecimal`.
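For reference, a minimal sketch of the kind of bean described above. The bean name `cosmosObjectMapper` and the assumption that the SDK picks it up by that qualifier come from reading the SDK source, so treat both as assumptions; the config-override call itself is standard Jackson:

```java
import java.math.BigDecimal;

import com.fasterxml.jackson.annotation.JsonFormat;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class CosmosJacksonConfig {

    // Bean name is an assumption based on reading the SDK source; the SDK
    // appears to look up a qualified ObjectMapper rather than the default one.
    @Bean("cosmosObjectMapper")
    public ObjectMapper cosmosObjectMapper() {
        ObjectMapper mapper = new ObjectMapper();
        // Write every BigDecimal as a JSON string and read it back from a
        // string, preserving both value and scale on the round trip.
        mapper.configOverride(BigDecimal.class)
              .setFormat(JsonFormat.Value.forShape(JsonFormat.Shape.STRING));
        return mapper;
    }
}
```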
@Blackbaud-JasonBodnar thanks for the feedback - we will investigate this further and fix it in azure-spring-data-cosmos if possible. If you have any details on how this can be fixed, or would like to contribute to the SDK, please feel free to do so; we would appreciate it.
@kushagraThapar PR created: https://github.com/Azure/azure-sdk-for-java/pull/39302
@Blackbaud-JasonBodnar PR #40239 was just merged, which may fix this issue; after the next release, please let me know if the error persists.
Describe the bug
When azure-spring-data-cosmos serializes an entity with a BigDecimal field, trailing zeros in the decimal portion are lost, thus ignoring the scale. For example, 43769.30 is serialized as 43769.3.
Exception or Stack Trace
There is none.
To Reproduce
Create an entity with a BigDecimal field. Set the value to 43769.30 and save it to Cosmos. Look in Cosmos, or retrieve the entity: the value is 43769.3. The important thing is the scale, because according to `BigDecimal.equals()`:
"this method considers two BigDecimal objects equal only if they are equal in value and scale (thus 2.0 is not equal to 2.00 when compared by this method)"
Expected behavior
43769.30 is saved as 43769.30.
Additional context
In MappingCosmosConverter:
```java
final String valueAsString = objectMapper.writeValueAsString(sourceEntity);
```
`valueAsString` has the correct string representation, i.e., 43769.30, but:
```java
cosmosObjectNode = (ObjectNode) objectMapper.readTree(valueAsString);
```
ends up with a child node with a value of 43769.3.
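The loss can be reproduced with plain Jackson, independent of the SDK: by default, `readTree` turns floating-point literals into doubles. A standalone sketch (the settings shown are standard Jackson; whether the SDK could simply flip them without side effects is a separate question):

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.JsonNodeFactory;

public class ReadTreeDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"amount\":43769.30}";

        // Default mapper: the literal is parsed as a double, so the
        // trailing zero (and with it the scale) is gone from the tree.
        ObjectMapper plain = new ObjectMapper();
        System.out.println(plain.readTree(json).get("amount")); // 43769.3

        // Parse decimals as BigDecimal and keep them exact in the tree.
        ObjectMapper exact = new ObjectMapper()
                .enable(DeserializationFeature.USE_BIG_DECIMAL_FOR_FLOATS)
                .setNodeFactory(JsonNodeFactory.withExactBigDecimals(true));
        System.out.println(exact.readTree(json).get("amount")); // 43769.30
    }
}
```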