dkublik opened 3 years ago
I am a bit puzzled by your application setup. You're using Reactor types, but the code is actually running on WebMVC. Is that intentional? The Spring Boot Spring HATEOAS starter pulls in the web starter, which might be unexpected. I.e. what looks like a WebFlux application is actually run by the MVC pipeline, which doesn't know about the Reactor types and doesn't inspect the nested declared types for content negotiation.
To investigate further, I'll have to figure out how to exclude the WebMVC dependency in a Gradle project. If you have a hint, that'd be appreciated.
I got a bit further once I excluded WebMVC. Note that there's currently no HATEOAS / WebFlux auto-configuration; you need to enable that explicitly through @EnableHypermediaSupport. That done, it still doesn't work, unfortunately.
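For reference, a minimal sketch of what enabling it explicitly might look like in a WebFlux-only application (class name assumed, not taken from the example project):

import org.springframework.context.annotation.Configuration;
import org.springframework.hateoas.config.EnableHypermediaSupport;
import org.springframework.hateoas.config.EnableHypermediaSupport.HypermediaType;

// Sketch only: hypermedia support has to be switched on by hand because there is
// no HATEOAS / WebFlux auto-configuration at the moment.
@Configuration
@EnableHypermediaSupport(type = HypermediaType.HAL)
class HypermediaConfiguration {
}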
It looks like this is a bug in the ObjectMapper selection for encoding in WebFlux. In AbstractMessageWriterResultHandler.writeBody(…) the lookup of supported media types uses the element type and finds the ObjectMapper customized to produce HAL, and thus includes application/hal+json. During the actual writing, AbstractJackson2Encoder.encode(…) however constructs a ResolvableType equivalent to List<TestDto> to hand into encodeValue(…). This in turn causes the default ObjectMapper to be selected, as the customization wasn't registered for List.
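Roughly illustrated (TestDto taken from the example project; the two lookups shown side by side):

// The media type lookup resolves against the element type and finds the HAL-customized ObjectMapper ...
ResolvableType elementType = ResolvableType.forClass(TestDto.class);
// ... while the encoder later resolves against List<TestDto>, for which no customization is
// registered, so the default ObjectMapper wins.
ResolvableType listType = ResolvableType.forClassWithGenerics(List.class, TestDto.class);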
I have to take that back to the core Framework team for further consideration.
Hi @odrotbohm, this wasn't intentional - I was just generating this example too quickly - I've corrected it in the repo now. Thank you for the investigation! I'm waiting for the outcome of your considerations :)
No worries. We're debating this internally already. After giving this some more thought, I even think the error is actually not the format returned, but the Content-Type header in the list-flux case. The custom configuration you apply even makes things worse, I'd argue. The reason for that is that a top-level array is not valid HAL in the first place; HAL documents are always JSON objects. I.e. the additional configuration "tricks" MVC and WebFlux into selecting the ObjectMapper that can produce HAL documents for the elements, but still causes a top-level array to be used. This is because the HAL-configured ObjectMapper does not and cannot have a special serializer registered for Lists, as that would affect all lists anywhere in the tree. That's exactly the reason CollectionModel exists in the first place.
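For completeness, a hedged sketch of what the CollectionModel-based variant could look like (endpoint path and sample data assumed):

@GetMapping(path = "list-mono", produces = MediaTypes.HAL_JSON_VALUE)
public Mono<CollectionModel<TestDto>> listMono() {
    // Collecting the elements and wrapping them in a CollectionModel yields a single
    // top-level JSON object (with an _embedded section), which is what valid HAL requires.
    return Flux.just(new TestDto("a"), new TestDto("b"))
            .collectList()
            .map(dtos -> CollectionModel.of(dtos));
}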
I can see that using a Flux as the primary return type might be useful to leverage the streaming rendering capabilities. For that to work, you still need to return a top-level array so that the server can render the elements individually and write the already computed ones to the output stream. However, that approach shouldn't advertise a Content-Type of HAL but plain application/json. The ObjectMapper to be used for the individual elements would then have to be looked up by element type and application/json, and it would indeed find the right one, as we register the ObjectMapper of the first media type enabled in @EnableHypermediaSupport for application/json, too.
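In code, a hedged sketch of that approach (the element-type-based lookup it relies on is exactly what is being discussed here, so treat it as a target state rather than something that works today):

@GetMapping(path = "list-flux", produces = MediaType.APPLICATION_JSON_VALUE)
public Flux<TestDto> listFluxAsPlainJson() {
    // The top-level array is advertised as plain application/json; the individual elements
    // are RepresentationModel subtypes and would be serialized by the HAL-aware ObjectMapper
    // once the element-type-based lookup selects it.
    return Flux.just(new TestDto("a"), new TestDto("b"));
}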
I.e. assuming @EnableHypermediaSupport(type = { HAL, COLLECTION_JSON }), the configuration setup of both the WebFlux encoder and the WebMVC converters looks as follows:
defaultObjectMapper : defaultObjectMapper // no Jackson customizations applied
customObjectMappers : {
    RepresentationModel, application/hal+json, application/json -> HAL ObjectMapper
    RepresentationModel, application/vnd.collection+json        -> Collection/JSON ObjectMapper
}
As you can see, the HAL ObjectMapper would also get selected for a RepresentationModel and application/json.
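That mapping roughly corresponds to the per-type ObjectMapper registration Spring Framework 5.3 exposes on its Jackson codecs; a hedged illustration (halObjectMapper and collectionJsonObjectMapper are assumed, pre-configured mappers):

Jackson2JsonEncoder encoder = new Jackson2JsonEncoder();
encoder.registerObjectMappersForType(RepresentationModel.class, registrar -> {
    registrar.put(MediaTypes.HAL_JSON, halObjectMapper);
    registrar.put(MediaType.APPLICATION_JSON, halObjectMapper); // first enabled format also serves application/json
    registrar.put(MediaTypes.COLLECTION_JSON, collectionJsonObjectMapper);
});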
Thanks for the detailed response, Oliver. I agree 100%. I was worried after the first part that we would be forced to use CollectionModel and render the whole list at once, but I'm glad that you also see the use case for rendering elements one by one in the stream. So, as you wrote: if we can select the HAL ObjectMapper for individual elements in the array (when they extend RepresentationModel), even if the returned content type is application/json, that would totally work for us.
Just to be precise: a standard Flux rendering does not cause any streaming rendering. AbstractJackson2Encoder.encode(…) collects the Flux into a List and renders that. I know too little about why that is, but I assume it is to enable ObjectMappers to customize the List rendering.
If you want real streaming of results, you need to request a streaming media type like NDJSON or Server-Sent Events.
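For example (a hedged sketch; note that per the follow-up below, streaming currently works for plain DTOs but fails for RepresentationModel subtypes):

@GetMapping(path = "list-ndjson", produces = MediaType.APPLICATION_NDJSON_VALUE)
public Flux<TestDto> listNdjson() {
    // A streaming media type such as NDJSON makes WebFlux write each element as it is
    // produced instead of first collecting the Flux into a List.
    return Flux.interval(Duration.ofSeconds(1))
            .map(i -> new TestDto(i.toString()));
}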
Yep, I'm aware that currently this just returns a List, but I thought the same would be true if I add a text/event-stream header and change this response to a stream:
@GetMapping(path = "list-flux", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<TestDto> listFlux() {
    return Flux.interval(Duration.ofSeconds(1))
            .map(d -> new TestDto(d.toString()).add(googleLink));
}
But it appears that this works for plain DTOs, and not for ones inheriting from RepresentationModel. Because when they do inherit from RepresentationModel, I get:
java.lang.IllegalStateException: No ObjectMapper for pl.dk.halproblems.TestDto
I hope this is not a distraction from the main thread :). For me the most important thing is answering the question whether we can expect the HAL ObjectMapper to be selected based on the individual element type in the List, or whether we should look for workarounds :)
This happened when switching from Spring Boot 2.4.8 to 2.5.1.
Before, all the links in a RepresentationModel were rendered as "_links". Now it depends.
examples: having this dto
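A hedged sketch of what that DTO presumably looks like (field name and constructor assumed; the actual class is in the repo linked at the bottom):

public class TestDto extends RepresentationModel<TestDto> {

    private final String name;

    public TestDto(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}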
for single case -> _links
for Flux case -> links
for Mono List case -> links
Of course we can go with CollectionModel (which extends RepresentationModel) to get _links, but the response is different and that would require returning a Mono, not a Flux.
This happens because for lists the CollectionSerializer is used, whereas for a RepresentationModel it's the HalResourcesSerializer.
I think this behavior changed with this commit -> https://github.com/spring-projects/spring-hateoas/commit/c561822a4520d8cd4f920b6bea8efff0d2d120a4#diff-c8f508ec153181e5bdf7967247928cf01d783525741b6c6b2c1561187f220e10
I was able to make it work using this -> https://github.com/dkublik/halproblems/blob/master/src/main/java/pl/dk/halproblems/HalListMappingInformation.java but I'm not sure how safe it is.
All these examples can be found in this repo: https://github.com/dkublik/halproblems