r2dbc / r2dbc-h2

R2DBC H2 Implementation

Cannot read binary data (blob) into byte[] field. No converter. #115

Closed gm2552 closed 4 years ago

gm2552 commented 4 years ago

New issue, related to issue #79. After applying the updates contributed through the spring-data-r2dbc issue https://github.com/spring-projects/spring-data-r2dbc/issues/186, the H2 driver cannot read a Blob database column into a byte[] field via a ReactiveCrudRepository.

This sample project is an update of the sample previously provided in #79, using the latest spring-data-r2dbc milestone as well as the latest io.r2dbc libraries:

https://github.com/gm2552/r2dbc-binarydata-sample.git
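For reference, the mapping in the sample is roughly of the following shape (entity and repository names here are illustrative assumptions; the actual code is in the linked repository). The failing column is the BLOB-backed cert field:

import org.springframework.data.annotation.Id;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

// Illustrative entity: a byte[] field backed by a BLOB column in H2.
// Class and repository names are assumptions; see the linked sample for the real code.
class Certificate {

    @Id
    Long id;

    byte[] cert; // reading this column produces the MappingException below
}

interface CertificateRepository extends ReactiveCrudRepository<Certificate, Long> {
}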

Running the sample from the IDE produces the following (abbreviated) stack trace:


java.lang.AssertionError: expectation "assertNext" failed (expected: onNext(); actual: onError(org.springframework.data.mapping.MappingException: Couldn't read column cert from Row.))
    at reactor.test.MessageFormatter.assertionError(MessageFormatter.java:115)
    at reactor.test.MessageFormatter.failPrefix(MessageFormatter.java:104)
    at reactor.test.MessageFormatter.fail(MessageFormatter.java:73)
    at reactor.test.MessageFormatter.failOptional(MessageFormatter.java:88)
....
    Suppressed: org.springframework.data.mapping.MappingException: Couldn't read column cert from Row.
        at org.springframework.data.r2dbc.convert.MappingR2dbcConverter$RowParameterValueProvider.getParameterValue(MappingR2dbcConverter.java:488)
        at org.springframework.data.relational.core.conversion.BasicRelationalConverter$ConvertingParameterValueProvider.getParameterValue(BasicRelationalConverter.java:256)
        at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.extractInvocationArguments(ClassGeneratingEntityInstantiator.java:250)
...
    Caused by: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [io.r2dbc.h2.codecs.ValueLobBlob] to type [byte[]]
        at org.springframework.core.convert.support.GenericConversionService.handleConverterNotFound(GenericConversionService.java:321)
        at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:194)
        at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:174)
        at org.springframework.data.r2dbc.convert.MappingR2dbcConverter$RowParameterValueProvider.getParameterValue(MappingR2dbcConverter.java:486)
        ... 95 more
mp911de commented 4 years ago

That isn't a driver issue per se, but rather an issue in Spring Data R2DBC. See spring-projects/spring-data-r2dbc#196 for a related issue that deals with Clob-to-String conversion; Blob to byte[] is analogous.

This issue is an indicator that the R2DBC spec assumes (large) binary/character objects are primarily consumed via streaming. Per the spec, BLOB- and CLOB-like column types default to returning Blob and Clob wrappers when Row.get(…) is called without a type hint.
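As a minimal sketch of what that streaming-first consumption looks like at the SPI level (the query, table, and column names are assumed for illustration), a BLOB column is read as an io.r2dbc.spi.Blob and its ByteBuffer chunks are assembled manually:

import java.io.ByteArrayOutputStream;

import io.r2dbc.spi.Blob;
import io.r2dbc.spi.Connection;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class BlobStreamingSketch {

    // Reads a BLOB column the way the SPI intends: fetch the Blob wrapper,
    // then stream its ByteBuffer chunks and assemble them into a byte[].
    Mono<byte[]> readCert(Connection connection) {
        return Flux.from(connection
                .createStatement("SELECT cert FROM certificate") // illustrative query
                .execute())
            .flatMap(result -> result.map((row, metadata) -> row.get("cert", Blob.class)))
            .flatMap(blob -> Flux.from(blob.stream())
                .reduce(new ByteArrayOutputStream(), (out, buffer) -> {
                    byte[] chunk = new byte[buffer.remaining()];
                    buffer.get(chunk);
                    out.write(chunk, 0, chunk.length);
                    return out;
                }))
            .map(ByteArrayOutputStream::toByteArray)
            .next();
    }
}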

Cross-checking with JDBC, drivers return byte[] for BYTEA (Postgres), VARBINARY(MAX) (SQL Server) and BLOB (MySQL).

While handling Blob and Clob is a matter of deferred consumption in a client's mapping layer, it is clear that most applications would rather consume that data through scalar types such as byte[] and String. Can you file a ticket against the R2DBC SPI so that BLOB and CLOB column types map by default to ByteBuffer/byte[] and String?

ByteBuffer/byte[] are limited to 2 GB, so we should document that limitation: consuming data larger than 2 GB requires the Blob/Clob wrappers.

gm2552 commented 4 years ago

Yes, thanks for the more in-depth insight. I will get another issue filed this morning.

gm2552 commented 4 years ago

New issue filed: https://github.com/r2dbc/r2dbc-spi/issues/130

mp911de commented 4 years ago

The actual conversion issue is that a streaming type was handed to Spring Data R2DBC's mapping layer and the mapping layer was unable to consume it. That is addressed with #119: LOB types now default to ByteBuffer/String instead of Blob/Clob.
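A rough sketch of what the post-#119 defaults mean for callers (the cert column name is illustrative): without a type hint a BLOB column now materializes as a ByteBuffer, while data larger than 2 GB can still be requested as a Blob via a type hint.

import java.nio.ByteBuffer;

import io.r2dbc.spi.Blob;
import io.r2dbc.spi.Result;
import reactor.core.publisher.Flux;

class LobDefaultsSketch {

    // After #119, row.get(...) without a type hint yields a ByteBuffer for a BLOB
    // column, which the mapping layer can convert straight into byte[].
    Flux<byte[]> asBytes(Result result) {
        return Flux.from(result.map((row, metadata) -> {
            ByteBuffer buffer = (ByteBuffer) row.get("cert");
            byte[] content = new byte[buffer.remaining()];
            buffer.get(content);
            return content;
        }));
    }

    // Content larger than 2 GB still has to be consumed through the streaming wrapper.
    Flux<Blob> asBlob(Result result) {
        return Flux.from(result.map((row, metadata) -> row.get("cert", Blob.class)));
    }
}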