spring-projects / spring-data-relational

Spring Data Relational. Home of Spring Data JDBC and Spring Data R2DBC.
https://spring.io/projects/spring-data-jdbc
Apache License 2.0

The logic for converting values for writing is confusing and I think very wrong. #1136

Open schauder opened 2 years ago

schauder commented 2 years ago

This became obvious when analysing #1127. Part of this problem is hopefully fixed by a fix for #1089.

We have multiple checks for determining which data type a value should get converted to, and I don't think they add up to something that I understand.

BasicRelationalConverter.writeValue(@Nullable Object value, TypeInformation<?> type) converts the value twice when it is a simple type and only once otherwise. At the very least this should be refactored so that it is clear why it is doing that.
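To make the oddity concrete, here is a minimal sketch of the shape of the problem — not the real Spring code, all names are illustrative stand-ins:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: for "simple" types the value passes through the conversion
// step twice, for everything else only once. Illustrative names only.
public class DoubleConversionSketch {

    static final List<String> conversionLog = new ArrayList<>();

    // Stand-in for a ConversionService.convert(...) call.
    static Object convert(Object value, Class<?> target) {
        conversionLog.add(value.getClass().getSimpleName() + " -> " + target.getSimpleName());
        return value; // identity conversion; we only count invocations
    }

    static boolean isSimpleType(Class<?> type) {
        return type == Long.class || type == String.class;
    }

    static Object writeValue(Object value, Class<?> type) {
        if (isSimpleType(type)) {
            Object intermediate = convert(value, type); // first conversion
            return convert(intermediate, type);         // second conversion - unclear why
        }
        return convert(value, type);                    // non-simple: converted once
    }
}
```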

Concept for a rewrite of the writing conversion logic.

The goal is to have a clean and conceptually consistent process for converting Java values to database values.

The process should look like this:

  1. Identify the conversion target. The conversion target is a Java type plus a java.sql.SQLType (typically a java.sql.JDBCType).

  2. Perform the conversion.
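The conversion target from step 1 could be modeled as a small value type pairing a Java type with a java.sql.SQLType. The following is a hypothetical sketch, not an actual Spring Data class:

```java
import java.sql.JDBCType;
import java.sql.SQLType;
import java.time.LocalDate;

// Hypothetical value type for the conversion target described above: a Java
// type paired with a java.sql.SQLType. Not an actual Spring Data class.
record JdbcColumnType(Class<?> javaType, SQLType sqlType) {

    // Step 1: identify the conversion target for a given source type.
    // A real resolver would consult the mapping context and custom
    // conversions; this sketch hard-codes a single mapping.
    static JdbcColumnType resolve(Class<?> sourceType) {
        if (sourceType == LocalDate.class) {
            return new JdbcColumnType(java.sql.Date.class, JDBCType.DATE);
        }
        return new JdbcColumnType(sourceType, JDBCType.OTHER);
    }
}
```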

Part of the challenge, and the reason the current implementation is so convoluted, is that we have different sets of information going into the process of identifying the conversion target:

  1. for save operations we do have the property.
  2. for derived queries we do have the property.
  3. for annotated query methods we do have the method parameter.
  4. I'm not even sure what we do have for SpEL expressions.

The class structure should lean heavily on the one in Spring Data Cassandra, but with a combination of Java type + SQLType, instead of just a Java type, as the conversion target.

https://github.com/spring-projects/spring-data-cassandra/blob/main/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/core/convert/ColumnType.java
https://github.com/spring-projects/spring-data-cassandra/blob/main/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/core/convert/ColumnTypeResolver.java
https://github.com/spring-projects/spring-data-cassandra/blob/main/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/core/convert/DefaultColumnTypeResolver.java
https://github.com/spring-projects/spring-data-cassandra/blob/main/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/core/convert/MappingCassandraConverter.java
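An analogue of Cassandra's ColumnTypeResolver for JDBC might offer one overload per kind of input listed above. This is a hypothetical sketch; the parameter types are simplified stand-ins, not the real Spring Data classes:

```java
import java.sql.JDBCType;
import java.sql.SQLType;

// Hypothetical analogue of Cassandra's ColumnTypeResolver, with one overload
// per kind of available input. Simplified stand-in parameter types.
interface WriteTargetResolver {

    // save operations and derived queries: we have the persistent property
    SQLType resolveFromProperty(String propertyName, Class<?> propertyType);

    // annotated query methods: we only have the method parameter
    SQLType resolveFromParameter(Class<?> parameterType);
}

// Trivial implementation so the sketch is runnable.
class NaiveWriteTargetResolver implements WriteTargetResolver {

    public SQLType resolveFromProperty(String propertyName, Class<?> propertyType) {
        return resolveFromParameter(propertyType); // fall back to the type alone
    }

    public SQLType resolveFromParameter(Class<?> parameterType) {
        return parameterType == Long.class ? JDBCType.BIGINT : JDBCType.OTHER;
    }
}
```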

Related issues

The following issues might either get fixed as a side effect of this issue, or become much easier to fix once it is resolved.

https://github.com/spring-projects/spring-data-relational/issues/787
https://github.com/spring-projects/spring-data-relational/issues/1023
https://github.com/spring-projects/spring-data-relational/issues/725
https://github.com/spring-projects/spring-data-relational/issues/629

schauder commented 2 years ago

#629 is also caused by this.

schauder commented 2 years ago

The current process of BasicJdbcConverter for writing a value is:

  1. determine the target type (T0) and the SQL type
  2. convert to the target type if possible
  3. check if there is an applicable writing custom conversion
  4. if so, convert to the target type of that custom conversion (T1)

=> the SQL type determined in the beginning does not match the target type T1, resulting in a whole bunch of errors.
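The mismatch can be sketched in a few lines. This is not the real BasicJdbcConverter code; the names and the UUID -> String custom conversion are purely illustrative:

```java
import java.sql.JDBCType;
import java.util.Map;
import java.util.Optional;

// Sketch of the current (broken) ordering described above.
public class CurrentWriteProcess {

    // An example writing custom conversion: UUID -> String (illustrative).
    static final Map<Class<?>, Class<?>> customConversions =
            Map.of(java.util.UUID.class, String.class);

    static JDBCType sqlTypeFor(Class<?> javaType) {
        if (javaType == java.util.UUID.class) return JDBCType.BINARY;
        if (javaType == String.class) return JDBCType.VARCHAR;
        return JDBCType.OTHER;
    }

    // Returns the (javaType, sqlType) pair the current ordering produces.
    static String write(Object value) {
        Class<?> t0 = value.getClass();
        JDBCType sqlType = sqlTypeFor(t0);  // 1. SQL type determined up front
        // 2.-4. conversion and the custom-conversion check happen afterwards
        Class<?> t1 = Optional.ofNullable(customConversions.get(t0)).orElse(t0);
        // the value is now a T1, but sqlType still describes T0 -> mismatch
        return t1.getSimpleName() + "/" + sqlType;
    }
}
```

For a UUID this yields a String value paired with a BINARY SQL type — exactly the kind of inconsistency described above.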

Side problem: null values don't get converted by Spring Framework converters

Better process:

  1. check if there is a custom conversion
  2. if so, apply it and determine the SQL type from the conversion target
  3. if there is no custom conversion, use the configured target type and SQL type, and let the conversion service convert to them

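The improved ordering from the list above can be sketched like this — again hypothetical names, with the same illustrative UUID -> String custom conversion:

```java
import java.sql.JDBCType;
import java.util.Map;

// Sketch of the proposed ordering: check custom conversions first, and derive
// the SQL type from whatever type the value actually ends up with.
public class ImprovedWriteProcess {

    // Illustrative custom conversion: UUID -> String.
    static final Map<Class<?>, Class<?>> customConversions =
            Map.of(java.util.UUID.class, String.class);

    static JDBCType sqlTypeFor(Class<?> javaType) {
        return javaType == String.class ? JDBCType.VARCHAR : JDBCType.OTHER;
    }

    static String write(Object value) {
        Class<?> source = value.getClass();
        Class<?> customTarget = customConversions.get(source);
        if (customTarget != null) {
            // 1./2. the custom conversion wins; the SQL type follows its target
            return customTarget.getSimpleName() + "/" + sqlTypeFor(customTarget);
        }
        // 3. no custom conversion: use the configured target and SQL type
        return source.getSimpleName() + "/" + sqlTypeFor(source);
    }
}
```

Here the Java type and the SQL type always agree, because the SQL type is only determined after the final conversion target is known.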
mipo256 commented 2 years ago

Jens @schauder, I think I can try to make this work. I have already invested some time in understanding the guts of this process, so I think it will make sense. May I take a look at it?

schauder commented 2 years ago

Hi Mikhail,

of course you may take a look! But as a fair warning: this is a pretty central part of Spring Data JDBC, and it is perfectly possible that we have some ideas that we haven't properly documented yet, which could lead to a failure to accept an otherwise good PR.

If you are fine with that risk, please go ahead. And please let us know if you do so we can leave it to you at least for some time.

mipo256 commented 2 years ago

Yes, I think I am perfectly fine with that. I will try to fix this issue :) If any debatable questions come up during implementation, I will reach out to you :)

schauder commented 1 year ago

Things also to consider:

schauder commented 1 week ago

#1828 is also related to this.

At least it helps me understand why there are two conversions. See #1829