spring-projects / spring-data-relational

Spring Data Relational. Home of Spring Data JDBC and Spring Data R2DBC.
https://spring.io/projects/spring-data-jdbc

Can't read ByteBuffer / ByteArray or Blob in Kotlin R2DBC CoroutineCrudRepository #1726

Open pete-setchell-kubra opened 10 months ago

pete-setchell-kubra commented 10 months ago

When using Kotlin and the CoroutineCrudRepository, I expect the Postgres bytea type to map to a ByteBuffer, and in a debugger I can see it is already returned as a java.nio.HeapByteBuffer in the query response. Since HeapByteBuffer is a ByteBuffer subclass, I don't see why this shouldn't work.

import java.nio.ByteBuffer
import org.springframework.data.r2dbc.repository.Query
import org.springframework.data.repository.kotlin.CoroutineCrudRepository

interface AssetRepository : CoroutineCrudRepository<Asset, Long> {
    @Query("SELECT '\\xDEADBEEF'::bytea")
    suspend fun getDeadBeef(): ByteBuffer
}

Test:

  @Test
  fun testByteBufferRead() {
      runBlocking {
          val buf = assetRepository.getDeadBeef()
      }
  }

Throws:

org.springframework.data.mapping.MappingException: Expected to read Document RowDocumentAccessor[document=RowDocument{bytea=java.nio.HeapByteBuffer[pos=0 lim=4 cap=4]}] into type class java.nio.ByteBuffer but didn't find a PersistentEntity for the latter

    at _COROUTINE._BOUNDARY._(CoroutineDebugging.kt:46)
    at com.r2dbcpostgisbug.repro.repository.AssetRepositoryTest$testByteBufferRead$1.invokeSuspend(AssetRepositoryTest.kt:47)
Caused by: org.springframework.data.mapping.MappingException: Expected to read Document RowDocumentAccessor[document=RowDocument{bytea=java.nio.HeapByteBuffer[pos=0 lim=4 cap=4]}] into type class java.nio.ByteBuffer but didn't find a PersistentEntity for the latter
    at org.springframework.data.relational.core.conversion.MappingRelationalConverter.readAggregate(MappingRelationalConverter.java:344)
    at org.springframework.data.relational.core.conversion.MappingRelationalConverter.readAggregate(MappingRelationalConverter.java:311)
    at org.springframework.data.relational.core.conversion.MappingRelationalConverter.read(MappingRelationalConverter.java:298)
    at org.springframework.data.relational.core.conversion.MappingRelationalConverter.read(MappingRelationalConverter.java:294)
    at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.read(MappingR2dbcConverter.java:110)
    at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:42)
    at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:29)
    at org.springframework.data.r2dbc.core.R2dbcEntityTemplate.lambda$getRowsFetchSpec$16(R2dbcEntityTemplate.java:846)
    at io.r2dbc.postgresql.PostgresqlResult.lambda$map$2(PostgresqlResult.java:129)
    at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:179)
    at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:670)
    at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:748)
    at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onNext(FluxWindowPredicate.java:790)
    at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:268)
    at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91)
    at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107)
    at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:880)
    at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:805)
    at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:163)
    at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:684)
    at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:936)
    at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:810)
    at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:716)
    at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:129)
    at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854)
    at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
    at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
    at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:294)
    at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:403)
    at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:426)
    at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:114)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:333)
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:454)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:1583)

Repro case here: https://github.com/pete-setchell-kubra/r2dbcrepro

pete-setchell-kubra commented 9 months ago

OK, this may just be user error. ByteBuffer does map correctly when it is a field in a DTO; it only fails when you try to return it directly from the query method, as you can with Int, String, etc.
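
For anyone hitting the same thing, here is a minimal sketch of that DTO-based workaround. The projection class and method name (BytesProjection, getDeadBeefWrapped) are just illustrative; the point is wrapping the ByteBuffer in a DTO whose property matches the selected column name.

import java.nio.ByteBuffer
import org.springframework.data.r2dbc.repository.Query
import org.springframework.data.repository.kotlin.CoroutineCrudRepository

// Illustrative projection type; the property name must match the column alias in the query.
data class BytesProjection(val bytea: ByteBuffer)

interface AssetRepository : CoroutineCrudRepository<Asset, Long> {
    // Returning a DTO with a ByteBuffer field maps fine,
    // whereas returning ByteBuffer directly throws the MappingException above.
    @Query("SELECT '\\xDEADBEEF'::bytea AS bytea")
    suspend fun getDeadBeefWrapped(): BytesProjection
}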