aerogear / graphback

Graphback - Out of the box GraphQL server and client
https://graphback.dev
Apache License 2.0
409 stars 73 forks

Ability to publish changes upon creation/update/deletion without necessarily creating the subscriptions GraphQL query /resolvers #1896

Open machi1990 opened 4 years ago

machi1990 commented 4 years ago

Is your feature request related to a problem? Please describe.

Being able to publish (over an external pub/sub queue, e.g. a Kafka topic) from one Graphback process, while letting subscriptions be handled by one or more completely separate, lightweight Graphback processes dedicated to subscriptions only.

Describe the solution you'd like

See the subXXX knobs in https://graphback.dev/docs/next/model/annotations#arguments

Having fine-grained pub/sub configuration knobs.

Currently we have subCreate, subDelete and subUpdate, which handle both publishing and subscribing without being able to opt in to one or the other.

Essentially, what I'd like is to split the subXXX knobs into:

- pubXXX (e.g. pubCreate): whether the process publishes change events
- subXXX (e.g. subCreate): whether the process exposes the corresponding Subscription field and resolver

This will enable more lightweight processes:

- one process handling mutations and publishing events, with no subscriptions
- one or more dedicated subscription-only processes consuming those events

The two processes will have to have a strict "event" contract between them to smooth the communication.

Describe alternatives you've considered

This can still be achieved with the current version, but it is not as fine-grained as I would have hoped.
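The proposed split could be sketched like this. Note that the pubXXX keys and the helper below are hypothetical, not part of the current Graphback API:

```typescript
// Hypothetical annotation flags after the proposed split:
// pubXXX controls publishing change events, subXXX controls generating
// the Subscription fields/resolvers. Neither implies the other.
interface ModelPubSubConfig {
  pubCreate?: boolean;
  pubUpdate?: boolean;
  pubDelete?: boolean;
  subCreate?: boolean;
  subUpdate?: boolean;
  subDelete?: boolean;
}

interface ModelCapabilities {
  publishes: Array<"CREATE" | "UPDATE" | "DELETE">;
  subscribes: Array<"CREATE" | "UPDATE" | "DELETE">;
}

// Derive what a given Graphback process should wire up for a model.
function capabilitiesFor(config: ModelPubSubConfig): ModelCapabilities {
  const publishes: ModelCapabilities["publishes"] = [];
  const subscribes: ModelCapabilities["subscribes"] = [];
  if (config.pubCreate) publishes.push("CREATE");
  if (config.pubUpdate) publishes.push("UPDATE");
  if (config.pubDelete) publishes.push("DELETE");
  if (config.subCreate) subscribes.push("CREATE");
  if (config.subUpdate) subscribes.push("UPDATE");
  if (config.subDelete) subscribes.push("DELETE");
  return { publishes, subscribes };
}

// A subscription-only process: serves newNote subscriptions, publishes nothing.
const subOnly = capabilitiesFor({ subCreate: true, pubCreate: false });
// subOnly.subscribes → ["CREATE"], subOnly.publishes → []
```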

machi1990 commented 4 years ago

/cc @craicoverflow, @wtrocki Automatically generated comment to notify maintainers

craicoverflow commented 4 years ago

This makes a lot of sense, thank you for the issue! Do you see subXXX overriding pubXXX if pubXXX is false? As subCreate on its own would not have any use.

machi1990 commented 4 years ago

I see them working independently.

This makes a lot of sense, thank you for the issue! Do you see subXXX overriding pubXXX if pubXXX is false? As subCreate on its own would not have any use.

Taking the following schema:


"""
@model(subCreate: true, create: false, update: false, pubCreate: false ....)
"""
type Note {
  id: ID!
}

In this schema, subCreate would create:

type Subscription {
  newNote(filter: NoteSubscriptionFilter): Note!
}

and the corresponding resolver, subscribing to the specific Note creation queue. How events get published to the queue would be up to the publisher: another Graphback process, another GraphQLCrud process, etc. (so long as they conform to the same event contract).
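The event contract mentioned above could be as small as an agreed envelope that every publisher emits to the topic. A hypothetical sketch follows; the field names are illustrative, not a Graphback-defined format:

```typescript
// Hypothetical event envelope both sides agree on.
type MutationType = "CREATE" | "UPDATE" | "DELETE";

interface ModelChangeEvent<T> {
  model: string;          // e.g. "Note"
  mutation: MutationType; // which CRUD operation occurred
  payload: T;             // the record after (or before, for deletes) the change
  timestamp: number;      // epoch millis, for ordering/debugging
}

// Publisher side: serialize to the agreed shape before pushing to the topic.
function toWireFormat<T>(model: string, mutation: MutationType, payload: T): string {
  const event: ModelChangeEvent<T> = { model, mutation, payload, timestamp: Date.now() };
  return JSON.stringify(event);
}

// Subscriber side: validate and parse before resolving the GraphQL subscription.
function fromWireFormat<T>(raw: string): ModelChangeEvent<T> {
  const event = JSON.parse(raw) as ModelChangeEvent<T>;
  if (!event.model || !event.mutation || event.payload === undefined) {
    throw new Error("event does not conform to the contract");
  }
  return event;
}
```

Any process that can produce this envelope (Graphback, Debezium plus a small adapter, or something else entirely) can then feed the subscription-only process.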

machi1990 commented 4 years ago

/cc @wtrocki

wtrocki commented 4 years ago

We also need the opposite situation:

Having subscription handlers available but not publishing any events on CREATE, UPDATE and DELETE.

A workaround exists for now:

What we need is to define the model and disable all CRUD operations on it apart from subscriptions. Then we use the Kafka pubsub to listen to events (topics are configurable and documented), and it should work with Debezium.

We need to build a sample template to demo this better.

Also, currently events (topics) are just an internal part of Graphback - if we move to an event streaming solution we will need the extra capability to specify topics directly in the config or schema.

Moving to a generic streaming platform will enable us to process changes using Debezium directly from the database or from external event sources.
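Debezium change events carry a standard envelope with an `op` code (`c`/`u`/`d`/`r`) and `before`/`after` row images; a consumer process could map those onto the mutation types the generated subscriptions expect. A sketch under those assumptions (the `publish` callback is hypothetical glue, not a Graphback API):

```typescript
// Minimal shape of a Debezium change event value (simplified).
interface DebeziumEnvelope<T> {
  op: "c" | "u" | "d" | "r"; // create, update, delete, snapshot read
  before: T | null;
  after: T | null;
}

type GraphbackMutation = "CREATE" | "UPDATE" | "DELETE";

// Map a Debezium operation onto a mutation type.
// Snapshot reads ("r") replay existing rows, not changes, so they are skipped.
function mapDebeziumOp(op: "c" | "u" | "d" | "r"): GraphbackMutation | null {
  if (op === "c") return "CREATE";
  if (op === "u") return "UPDATE";
  if (op === "d") return "DELETE";
  return null; // "r"
}

// Consumer loop sketch: push into whatever pubsub the generated
// subscription resolvers listen on (the publish fn is assumed).
function handleDebeziumMessage<T>(
  envelope: DebeziumEnvelope<T>,
  publish: (mutation: GraphbackMutation, payload: T) => void
): void {
  const mutation = mapDebeziumOp(envelope.op);
  if (mutation === null) return;
  // For deletes the row only exists in the "before" image.
  const payload = mutation === "DELETE" ? envelope.before : envelope.after;
  if (payload !== null) publish(mutation, payload);
}
```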

wtrocki commented 4 years ago

I trelloized this (added it to Trello): 7-datasync-kafka-debezium-enabled-event-streaming-approach

machi1990 commented 4 years ago

We also need the opposite situation:

Having subscription handlers available but not publishing any events on CREATE, UPDATE and DELETE.

Yes, this is described in the issue description.

A workaround exists for now:

What we need is to define the model and disable all CRUD operations on it apart from subscriptions. Then we use the Kafka pubsub to listen to events (topics are configurable and documented), and it should work with Debezium.

Indeed.

We need to build a sample template to demo this better.

Also, currently events (topics) are just an internal part of Graphback - if we move to an event streaming solution we will need the extra capability to specify topics directly in the config or schema.

Moving to a generic streaming platform will enable us to process changes using Debezium directly from the database or from external event sources.

+1 on this, plus the ability to specify any pre-processing operation (e.g. payload transformation) that needs to be applied before the received event is sent to the subscribing client.

wtrocki commented 4 years ago

For transformation we have a separate feature for content mapping.

machi1990 commented 4 years ago

For transformation we have a separate feature for content mapping.

Nice. Does it apply in this context too? E.g. suppose the source of the events is Debezium (which pushes events to a Kafka topic); on the Graphback side we'll be subscribing to this topic, and what's desirable is not merely the subscription but also to supply a transformation function applied to each event before it is sent to the client.
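The transformation step asked about here could sit between the topic and the client, e.g. by wrapping the async iterator that backs the subscription. A hypothetical sketch (this is not the content-mapping feature itself; `fakeTopic` and its field names are illustrative):

```typescript
// Wrap an async iterable of raw events with a per-event transform,
// applied before each event reaches the subscribing client.
async function* transformEvents<In, Out>(
  source: AsyncIterable<In>,
  transform: (event: In) => Out
): AsyncGenerator<Out> {
  for await (const event of source) {
    yield transform(event);
  }
}

// Stand-in for a Kafka topic carrying Debezium-style events.
async function* fakeTopic() {
  yield { after: { id: "1", __source: "dbz" } };
}

// Strip source metadata down to the fields the subscription returns.
async function demo(): Promise<string[]> {
  const ids: string[] = [];
  const notes = transformEvents(fakeTopic(), (e) => ({ id: e.after.id }));
  for await (const note of notes) ids.push(note.id);
  return ids;
}
```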

craicoverflow commented 4 years ago

Considering this would be a breaking change and a new feature, do we see this happening for a 0.17.x release?

wtrocki commented 4 years ago

what's desirable is not merely the subscription but also to supply a transformation function applied to each event before it is sent to the client.

Yep. Since this logic will need to be applied to queries/mutations and subscriptions alike, we can reuse it.

Considering this would be a breaking change and a new feature, do we see this happening for a 0.17.x release?

Post 1.0 release. https://trello.com/c/1lH9SqKu/7-datasync-kafka-debezium-enabled-event-streaming-approach

The approach will be to do a POC (same as datasync) without touching core or affecting Graphback releases.

craicoverflow commented 4 years ago

This appears to be resolved, closing (reopen if I am wrong)

machi1990 commented 4 years ago

This solves only a part of it: there is still no ability to avoid publishing changes from within the application.

See https://github.com/aerogear/graphback/blob/1f51f285a2c4dc22a5d8eba6f8b0487f9a5f7e60/packages/graphback-core/src/runtime/CRUDService.ts#L54

You can play with this repository, especially this commit: https://github.com/aerogear/datasync-example/blob/245b324a08f6ad72ff5ed728273e9700f8b69952.

This line https://github.com/aerogear/datasync-example/blob/245b324a08f6ad72ff5ed728273e9700f8b69952/graphback-debezium-integeration/src/kafka-sub.ts#L30 should never be called.

wtrocki commented 4 years ago

In a production-ready scenario, streaming platforms will never support edits (because the data is a stream), so we will always have two pubsub engines: one for the classical pubsub, and one built specifically for models that work with streaming only. If we are going to get this into an officially supported scenario, separate annotations might be needed.
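The "two pubsub engines" idea could be expressed as a per-model registry that routes each model either to the classical in-process pubsub or to a streaming-backed one. A sketch with hypothetical names (not Graphback's actual plugin interfaces):

```typescript
// Minimal pubsub surface both engines would implement.
interface PubSubLike {
  publish(topic: string, payload: unknown): void;
}

// Trivial in-memory engine, standing in for the classical pubsub.
class InMemoryPubSub implements PubSubLike {
  events: Array<[string, unknown]> = [];
  publish(topic: string, payload: unknown): void {
    this.events.push([topic, payload]);
  }
}

// Route each model to its configured engine; unknown models fall back
// to the default (classical) engine.
class PubSubRouter {
  constructor(
    private defaultEngine: PubSubLike,
    private perModel: Map<string, PubSubLike>
  ) {}

  publish(model: string, topic: string, payload: unknown): void {
    (this.perModel.get(model) ?? this.defaultEngine).publish(topic, payload);
  }
}

const classical = new InMemoryPubSub();
const streaming = new InMemoryPubSub(); // would be Kafka-backed in practice
const router = new PubSubRouter(classical, new Map([["Note", streaming]]));
router.publish("Note", "note-changes", { id: "1" });  // → streaming engine
router.publish("Comment", "comment-changes", { id: "2" }); // → classical engine
```

Per-model annotations would then only need to name which engine (and optionally which topic) a model uses.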

machi1990 commented 4 years ago

... If we are going to get this into an officially supported scenario, separate annotations might be needed.

I think we should support this use case by splitting the subXXX annotation key into two: pubXXX, controlling the publishing of change events, and subXXX, controlling the generation of the subscription queries/resolvers.

The current situation is that subXXX is responsible for both publishing and subscribing.

wtrocki commented 4 years ago

Yep. This is how our competition seems to be doing subscriptions at the moment.