neos / contentrepository-development-collection

Work in Progress on the Event Sourced Content Repository as installable set of packages

Feature/160/traversable node as extension point #187

Closed nezaniel closed 3 years ago

nezaniel commented 3 years ago

This will provide the capability for custom TraversableNodeInterface implementations via NodeType configuration, similar to how the old CR did it.

The main change is that there is now a central factory for creating TraversableNodes. It is used internally by the ContentSubgraph, which now communicates exclusively via TraversableNodeInterface.

There are a few remaining occurrences of direct TraversableNode instantiation whose removal has yet to be discussed.
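A rough sketch of the factory's resolution logic (namespace imports elided; class names, constructor signature and the configuration source are illustrative, not the actual code of this PR):

```php
<?php
// Hypothetical sketch only: a factory that resolves the implementation class
// per NodeType, falling back to the generic TraversableNode. The class map
// could be fed from NodeTypes.yaml, similar to the old CR's "class" setting.
final class TraversableNodeFactory
{
    /**
     * @var array<string,string> NodeType name => implementation class name
     */
    private $customClassNames;

    public function __construct(array $customClassNames)
    {
        $this->customClassNames = $customClassNames;
    }

    public function createTraversableNode(NodeInterface $node, ContentSubgraphInterface $subgraph): TraversableNodeInterface
    {
        $className = $this->customClassNames[(string)$node->getNodeTypeName()] ?? TraversableNode::class;

        return new $className($node, $subgraph);
    }
}
```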

bwaidelich commented 3 years ago

To be honest, I'm a bit skeptical about TraversableNodeInterface as an extension point. In the old CR that was the reason for some ugly quirks and reduced flexibility on our side (whereas, as long as it's a "readonly interface", we can always extend it without breaking changes). Could you share the intentions for this feature?

nezaniel commented 3 years ago

The idea is to have implementations of TraversableNodeInterface that implement other interfaces as well. They should still be read-only, of course, but can provide more meaningful accessor logic than getProperty(). Think of a product catalog in the CR, with a product interface declaring getRrp(): PriceSpecification. Storing such value objects in the new CR is already supported; exposing them can only help with integration. I use this feature in the old CR rather often and enjoy it quite a lot, and I've never run into any issues that cannot be resolved with the new API, e.g. having a replaceable ContentSubgraphInterface for in-memory support while still keeping the desired TraversableNode implementations. Which past quirks and flexibility reduction do you have in mind?
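PS: to make the product example concrete, roughly something like this (names made up, and assuming the implementation may extend the generic TraversableNode):

```php
<?php
// Sketch of such a read model (namespace imports elided): still a read-only
// traversable node, but with typed accessors on top of getProperty().
interface ProductInterface
{
    public function getRrp(): PriceSpecification;
}

class ProductNode extends TraversableNode implements ProductInterface
{
    public function getRrp(): PriceSpecification
    {
        // the new CR already supports storing such value objects, so
        // getProperty() hands back the deserialized PriceSpecification
        return $this->getProperty('rrp');
    }
}
```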

bwaidelich commented 3 years ago

Thanks for your feedback.

Which past quirks and flexibility reduction do you have in mind?

I vaguely remember that we had to take some extra steps in the past to keep the API compatible in case someone implemented a custom node. I also came across some parts that didn't work with custom implementations because the core relied on some magic behavior of the implementation that wasn't specified by the interface – I don't remember exactly what the issue was, but I think it was related to the format of the node identifier.

Here's my – frankly somewhat radical – view (viewer discretion is advised *g): I think the CR is already quite complex, and IMO we could tackle that complexity better if we drastically focused its scope on the core domain. This undertaking could go even further than the current implementation, reducing the allowed property types to simple scalars (read: JSON) and sparing us a lot of complexity and type juggling (PropertyCollection vs SerializedPropertyValues, ...).

Building a bridge to your example: wouldn't it make much more sense to have a Product entity that either wraps a node (via composition) or – maybe even more flexibly – is built from the properties of one or multiple nodes?
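Roughly like this, as a sketch of the composition variant (all names made up, including the fromArray() constructor):

```php
<?php
// A plain entity built from a node's properties, never implementing the
// node interface itself. Namespace imports elided.
final class Product
{
    /**
     * @var PriceSpecification
     */
    private $rrp;

    private function __construct(PriceSpecification $rrp)
    {
        $this->rrp = $rrp;
    }

    public static function fromNode(TraversableNodeInterface $node): self
    {
        // with property types reduced to scalars/JSON, the mapping to value
        // objects would happen here, outside the CR core
        return new self(PriceSpecification::fromArray($node->getProperty('rrp')));
    }

    public function getRrp(): PriceSpecification
    {
        return $this->rrp;
    }
}
```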

Personally I would even suggest to get rid of the interface in the long run, but maybe that takes it a bit far :)

nezaniel commented 3 years ago

Well, those examples of weird behavior seem to be related to the old implementation, don't they? E.g. NodeAggregateIdentifiers are now freed from any traversal logic (which was the point of "stable node identifiers" afaik), and our new interfaces are a lot more comprehensive than the old ones.

My personal opinion is that the interfaces give us a lot of sensible extension points for further developments. Think of a DAM where assets are stored as nodes but reside in a custom subgraph implementation that communicates with a CDN for fetching the metadata. And supporting value objects as property types makes a lot of sense to me: currently we rely on rather limited frontend validation, while in the backend we can only hope for the best. I'd rather go another step in that direction and move validation via VOs into the NodeAggregateCommandHandler, e.g. in handleSetProperties (sketched below); custom PropertyCollections can be used for combined validation. I think the current approach – albeit more complex – makes a lot of sense.
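An indicative sketch of that write-side idea (command, helper and method names are not the actual API):

```php
<?php
// The handler constructs the declared value object for each property before
// publishing anything, so invalid values reject the whole command.
final class NodeAggregateCommandHandler
{
    public function handleSetProperties(SetNodeProperties $command): void
    {
        // resolving the node type of the affected aggregate is elided here
        $nodeType = $this->requireNodeType($command);

        foreach ($command->getPropertyValues() as $propertyName => $rawValue) {
            $declaredType = $nodeType->getPropertyType($propertyName);
            if (class_exists($declaredType)) {
                // a hypothetical fromArray() constructor throws on invalid
                // input, so no event for this command ever gets recorded
                $declaredType::fromArray($rawValue);
            }
        }

        // ... only now record and publish NodePropertiesWereSet
    }
}
```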

As for the product entity: I'd rather compose it from the NodeInterfaceProxy, as TraversableNode does, for two reasons. First, the product would otherwise have to be initialized everywhere its node is provided as an entry point, which is rather cumbersome. Second, we already wrap Node around PropertyCollection, then TraversableNode around Node; Product would be the third layer of wrapping just to access properties. Imho that's at least one layer too much

skurfuerst commented 3 years ago

@bwaidelich @nezaniel to chime in here - I can relate to both of your arguments, and am really undecided on this one ;-)

bwaidelich commented 3 years ago

Product would be the third layer of wrapping just to access properties. Imho that's at least one layer too much

Without the NodeData abstraction we save one layer though :) Just kidding. I'm also skeptical when it comes to many layers, and I don't have a crystal-clear vision.

I don't think that we're far apart actually: I fully agree that the CR should be usable for things other than documents and content elements. And I also think that in most cases you want to work with them on a higher abstraction level (e.g. use value objects etc.). I just don't think that this higher level should be part of the CR core (commands, events, (core) projections) but should reside on a ...well... higher level *g. So that from the CR's perspective a node is a node is a node, but in "userland" it could be treated as something more concrete (e.g. a DAM asset).

I could imagine your implementation being applied on a different level, for example when retrieving nodes via FlowQuery. That way you could still easily implement custom behavior, but with the additional benefit of reduced complexity in the CR core and better performance if you don't require property deserialization / custom class mapping.
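For illustration, a sketch of such a FlowQuery operation (the read model factory is an assumed userland service, not an existing API):

```php
<?php
use Neos\Eel\FlowQuery\FlowQuery;
use Neos\Eel\FlowQuery\Operations\AbstractOperation;
use Neos\Flow\Annotations as Flow;

// The node-to-read-model mapping lives in a FlowQuery operation, outside
// the CR core; nodes stay plain until userland asks otherwise.
class AsReadModelOperation extends AbstractOperation
{
    protected static $shortName = 'asReadModel';

    /**
     * @Flow\Inject
     * @var NodeBasedReadModelFactory hypothetical userland service
     */
    protected $readModelFactory;

    public function evaluate(FlowQuery $flowQuery, array $arguments)
    {
        $readModels = [];
        foreach ($flowQuery->getContext() as $node) {
            // the factory resolves the class configured for the node's
            // type and wraps the node accordingly
            $readModels[] = $this->readModelFactory->createFromNode($node);
        }
        $flowQuery->setContext($readModels);
    }
}
```

In Fusion this could then read something like q(node).children().asReadModel().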

nezaniel commented 3 years ago

Oh, I think I see a possible compromise here. Beginning on the read side: the whole idea of a TraversableNode is something that Neos needs, not the CR in the first place. I could imagine a separate package (used by Neos) that negotiates the interactions between applications and the CR. It knows about NodeAddress and has some kind of central factory for node-based read models like Neos' TraversableNode or userland Product entities (which don't necessarily need to be traversable, btw), so that userland code can still configure/implement stuff at a single entry point instead of all over the place. On the write side it would translate complex commands into simpler ones.

This would not reduce overall complexity, but it would separate the concerns of the versioned graph and the object-to-versioned-graph mapping better.
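In code, that package's central factory could look roughly like this (a sketch: the subgraph lookup mirrors the current dev collection API, everything else is made up, and namespace imports are elided):

```php
<?php
// Resolves a NodeAddress against the CR and wraps the resulting node in
// whatever read model is configured for its NodeType.
final class NodeBasedReadModelFactory
{
    /**
     * @var ContentGraphInterface
     */
    private $contentGraph;

    /**
     * @var array<string,string> NodeType name => read model class name
     */
    private $readModelClassNames;

    public function createFromAddress(NodeAddress $nodeAddress)
    {
        $subgraph = $this->contentGraph->getSubgraphByIdentifier(
            $nodeAddress->contentStreamIdentifier,
            $nodeAddress->dimensionSpacePoint,
            VisibilityConstraints::frontend()
        );
        $node = $subgraph->findNodeByNodeAggregateIdentifier($nodeAddress->nodeAggregateIdentifier);

        $className = $this->readModelClassNames[(string)$node->getNodeTypeName()]
            ?? TraversableNode::class;

        // could be a TraversableNode for Neos content, a Product for the
        // catalog, a DAM asset, ... none of which the CR core knows about
        return new $className($node, $subgraph);
    }
}
```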

Does this better fit what you have in mind?

bwaidelich commented 3 years ago

Does this better fit what you have in mind?

Awesome! I think that is exactly what my confused mind thought and I failed to express with words: I want that extensible high level (especially on the read side) but I want to be able to interact with the (simpler & faster) low level API, too.

This would not reduce overall complexity, but it would separate the concerns [...]

Right. And I think that this would be an important win, since each of the concerns would become a little simpler to handle & maintain on its own.

As suggested above, I could even imagine the whole property (de)serialization to be moved to that new layer at some point.

I could imagine a separate package (used by Neos) [...]

Yes, I think that would be perfect, since it allows for a "fully fledged" API to be used out of the box (for Neos or Flow apps that want to interact with that) while allowing lower level (e.g. standalone) usages of the "bare" CR functionality in other cases.

Just realizing: I think this whole topic has some important overlaps with the (non PHP?) API design – this is a topic for another issue, but we should keep it in the back of our heads.

I think this whole topic is crucial. Maybe we can find some time next week to discuss it "face-to-face", @nezaniel & @skurfuerst !?

nezaniel commented 3 years ago

I think this whole topic is crucial. Maybe we can find some time next week to discuss it "face-to-face", @nezaniel & @skurfuerst !?

Yes, please! There are a lot of concerns to be addressed here, and judging by the number I've already discovered myself, I guess there are even more. So a modeling session is an absolute necessity imho.

skurfuerst commented 3 years ago

@nezaniel @bwaidelich awesome thoughts! I think we're getting somewhere :) :) The abstract upper layer could also be the place where the "Node Mutator" API (which is more high level than raw commands) could live.

:) Looking forward to our discussion!

skurfuerst commented 3 years ago

Superseded by #190