chrisvettese closed this issue 2 years ago.
I am also getting a second error trying to get the Geometry/Point type working:
2022-04-02 19:55:17.784 WARN 18080 --- [atcher-worker-1] n.g.e.SimpleDataFetcherExceptionHandler : Exception while fetching data (/createShipment) : Sub-entity without @ForeignKey is not supported: storeLocation
org.springframework.data.mapping.MappingException: Sub-entity without @ForeignKey is not supported: storeLocation
at net.lecousin.reactive.data.relational.mapping.LcEntityReader.readEntityProperty(LcEntityReader.java:169) ~[core-0.8.1.jar:na]
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
*__checkpoint ⇢ SELECT FROM ShipmentEntity AS e WHERE e.id EQUALS 21
Original Stack Trace:
at net.lecousin.reactive.data.relational.mapping.LcEntityReader.readEntityProperty(LcEntityReader.java:169) ~[core-0.8.1.jar:na]
at net.lecousin.reactive.data.relational.mapping.LcEntityReader.readProperty(LcEntityReader.java:113) ~[core-0.8.1.jar:na]
at net.lecousin.reactive.data.relational.mapping.LcEntityReader.read(LcEntityReader.java:99) ~[core-0.8.1.jar:na]
at net.lecousin.reactive.data.relational.query.SelectExecution$JoinStatus.readNewInstance(SelectExecution.java:549) ~[core-0.8.1.jar:na]
at net.lecousin.reactive.data.relational.query.SelectExecution$RowHandler.handleRow(SelectExecution.java:584) ~[core-0.8.1.jar:na]
at reactor.core.publisher.LambdaSubscriber.onNext(LambdaSubscriber.java:160) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxUsingWhen$UsingWhenSubscriber.onNext(FluxUsingWhen.java:358) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxFlatMap$FlatMapMain.tryEmit(FluxFlatMap.java:543) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxFlatMap$FlatMapInner.onNext(FluxFlatMap.java:984) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:119) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.MonoFlatMapMany$FlatMapManyInner.onNext(MonoFlatMapMany.java:250) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:184) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onNext(FluxFilterFuseable.java:337) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.13.jar:3.4.13]
at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:130) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:119) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:793) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:718) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:154) ~[reactor-core-3.4.13.jar:3.4.13]
at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:635) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:887) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:761) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:667) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:119) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:279) ~[reactor-netty-core-1.0.14.jar:1.0.14]
at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:388) ~[reactor-netty-core-1.0.14.jar:1.0.14]
at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404) ~[reactor-netty-core-1.0.14.jar:1.0.14]
at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) ~[reactor-netty-core-1.0.14.jar:1.0.14]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324) ~[netty-codec-4.1.72.Final.jar:4.1.72.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:311) ~[netty-codec-4.1.72.Final.jar:4.1.72.Final]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:432) ~[netty-codec-4.1.72.Final.jar:4.1.72.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276) ~[netty-codec-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) ~[netty-transport-4.1.72.Final.jar:4.1.72.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) ~[netty-common-4.1.72.Final.jar:4.1.72.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.72.Final.jar:4.1.72.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.72.Final.jar:4.1.72.Final]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
I have tried switching storeLocation to org.locationtech.jts.geom.Geometry but I'm still getting the error. This support was added to r2dbc recently (https://github.com/earlbread/r2dbc-postgresql/pull/1), so I'm not sure whether it's compatible with your library or whether I'm doing something wrong.
Version 0.9.0 should fix the issue with customerId not being updated. However, Geometry types are not yet supported by this library. I'll look into adding this feature.
Hi Lecousin, updating is still not working for me. Even though my entity has an ID (I fetch it by email, then change some property and save it), the final SQL is an INSERT, which fails on a UNIQUE constraint.
I updated to version 0.9.0; I am using Postgres and Kotlin.
The entity is defined as:
@Id
@GeneratedValue(strategy = GeneratedValue.Strategy.SEQUENCE, sequence = "personal_user_id_seq")
var id: Long? = null,
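For context, here is a minimal sketch of the flow I described (the repository interface, the findByEmail derived query, the renameUser helper, and the field names other than id are assumptions for illustration, based on the column names in the error message below; the import paths are also assumptions):
import net.lecousin.reactive.data.relational.annotations.GeneratedValue
import net.lecousin.reactive.data.relational.repository.LcR2dbcRepository
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table
import reactor.core.publisher.Mono

@Table("personal_user")
data class PersonalUser(
    @Id
    @GeneratedValue(strategy = GeneratedValue.Strategy.SEQUENCE, sequence = "personal_user_id_seq")
    var id: Long? = null,
    var email: String = "",
    var firstName: String = "",
)

interface PersonalUserRepository : LcR2dbcRepository<PersonalUser, Long> {
    fun findByEmail(email: String): Mono<PersonalUser>
}

// Fetch an existing row by email, change a property, and save it back.
// Because the loaded entity already has a non-null id, save() should
// produce an UPDATE, not the INSERT shown in the error message below.
fun renameUser(repo: PersonalUserRepository, email: String, newName: String): Mono<PersonalUser> =
    repo.findByEmail(email)
        .map { it.apply { firstName = newName } }
        .flatMap { repo.save(it) }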
While debugging, I can see that the entity object also has an additional property _lcState = null, in case this helps.
Debug message: Caused by: org.springframework.dao.DataIntegrityViolationException: executeMany; SQL [INSERT INTO personal_user (id, first_name, last_name, password, registration_date, email, username) VALUES (NEXTVAL('personal_user_id_seq'), $1, $2, $3, $4, $5, $6)]; duplicate key value violates unique constraint "personal_user_email_key"; nested exception is io.r2dbc.postgresql.ExceptionFactory$PostgresqlDataIntegrityViolationException: [23505] duplicate key value violates unique constraint "personal_user_email_key"
It looks like the problem is on my side; my previous solution using plain R2DBC isn't working either. I will let you know where the problem is.
@chrisvettese For the geometry types, even though they are not yet fully supported by this library, if you don't need schema generation you can already use the specific types from the R2DBC PostgreSQL driver: io.r2dbc.postgresql.codec.Point
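For illustration, a minimal Kotlin sketch of an entity column typed with the driver's Point codec (the exact field layout and table name are assumptions, not taken from your code):
import io.r2dbc.postgresql.codec.Point
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table

// Columns typed directly with the driver's Point codec; the PostgreSQL R2DBC
// driver can read and write this type, but schema generation for it is not
// supported by this library yet.
@Table("shipment")
data class ShipmentEntity(
    @Id var id: Long? = null,
    var customerId: Long? = null,
    var customerLocation: Point? = null,
    var storeLocation: Point? = null,
)

// Constructing a value before saving (x/y as used by the driver's Point.of factory):
val exampleLocation: Point = Point.of(-79.38, 43.65)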
... This seems to work.
Good to know, thank you for investigating these issues.
Hello again, here is my situation:
The repository has not been customized.
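Roughly, it is just the base interface with no extra query methods (a minimal sketch, assuming the library's LcR2dbcRepository base type and a ShipmentEntity keyed by a Long id; the interface name is illustrative):
import net.lecousin.reactive.data.relational.repository.LcR2dbcRepository

// No custom query methods: only what the base reactive repository provides.
interface ShipmentRepository : LcR2dbcRepository<ShipmentEntity, Long>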
This is the initial save, which works as expected.
For debugging purposes, these are the corresponding SQL statements run by r2dbc (they look good to me):
Then some updates are made to the shipment entity (customerId, customerLocation, and storeLocation are changed) and it is saved again.
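In outline, the update step looks roughly like this (a sketch only; the coordinates, helper name, and repository type are placeholders, not my actual code):
import io.r2dbc.postgresql.codec.Point
import reactor.core.publisher.Mono

// Reload the saved shipment, change customerId and the two locations, then save.
// The expectation is an UPDATE on shipment in addition to the new history row.
fun updateShipment(repo: ShipmentRepository, shipmentId: Long, newCustomerId: Long): Mono<ShipmentEntity> =
    repo.findById(shipmentId)
        .map { shipment ->
            shipment.apply {
                customerId = newCustomerId
                customerLocation = Point.of(-79.40, 43.66)
                storeLocation = Point.of(-79.38, 43.65)
            }
        }
        .flatMap { repo.save(it) }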
I understand that the geometry point updates may be a unique case; however, customerId is a simple bigint/long, and that is not updating either. Here is the corresponding SQL:
As you can see, the new history element is generated, but nothing in shipment is updated, even though customerId, customerLocation, and storeLocation have been changed. Note that I updated the r2dbc driver to org.postgresql:r2dbc-postgresql:0.9.1.RELEASE to support the PostGIS extension (for the geometry type). Am I doing anything wrong here?