chunsli opened 5 years ago
I think we can do 1).
2) has the following nuances:
i) You can't natively insert into, update, or delete from just any view (only simple views are updatable)
ii) A nicer abstraction would be to trigger a webhook when the underlying table changes, though this is impossible to detect via a view because it is not materialized. So the workaround is to set up the trigger on the underlying table(s) and then query your view from the webhook.
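A minimal sketch of that workaround, assuming a hypothetical `author_books_view` and a payload that carries the base row's primary key (the webhook would run the returned SQL against Postgres; the view name and payload shape are illustrative only):

```python
# Sketch: the event trigger fires on the base table, and the webhook
# then re-queries the (non-materialized) view for the row that changed.
def view_query_for_event(payload, view_name="author_books_view"):
    """Build a parameterized SQL query against the view, keyed by the
    base table's primary key carried in the event payload."""
    row = payload["event"]["data"]["new"]
    return (
        f"SELECT * FROM {view_name} WHERE id = %s",
        (row["id"],),
    )

# Example payload shaped like a Hasura insert event on the base table.
query, params = view_query_for_event(
    {"event": {"data": {"new": {"id": 42, "title": "Dune"}}}}
)
```

The webhook would then pass `query` and `params` to its database driver (e.g. `cursor.execute(query, params)`).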
I was thinking about this as well. Part of our workflow is that when a trigger occurs and, say, a field is marked approved, we want to take that data, with its relationships, and perform a task. Right now we have a Lambda function that calls the API with a role of "function" to collect the data to transform.
It would be ideal if we could configure events to include that relationship automatically, reducing the need to call the API again. I see your point though; this can get messy quickly.
Thanks @tirumaraiselvan for our chat on Discord :). I wanted to add a few things here so they don't get lost in the chat.
How're you envisioning this feature working?
I have a similar use case to 1) but I'm wondering, could we extend that to be any GraphQL query? Furthermore, could the changed record produced by the mutation be passed as an argument to that query?
Here's an example of what I had in mind. Let's say we have:
type Message {
  id: uuid
  text: String
  sender: User # (through sender_id: uuid or something like that)
}

query getRelatedStuff($sender_id: uuid) {
  user(where: { id: { _eq: $sender_id } }) {
    name
  }
}
Then it could be attached to the event's payload in some extra field.
What do you think about that?
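One way to picture that suggestion, continuing the getRelatedStuff example: the query result rides along in an extra top-level field of the event payload. A minimal sketch as a Python dict, where the `related` key and its placement are hypothetical, not an existing Hasura field:

```python
# Hypothetical event payload with the configured query's result attached.
# The top-level "related" key is illustrative only; the real Hasura
# payload stops at event.data.old / event.data.new.
payload = {
    "event": {
        "op": "INSERT",
        "data": {
            "old": None,
            "new": {"id": "m1", "text": "hi", "sender_id": "u1"},
        },
    },
    # Result of running getRelatedStuff with the new row's sender_id.
    "related": {"user": [{"name": "Ada"}]},
}

# The webhook could then read related data without a second round-trip.
sender_name = payload["related"]["user"][0]["name"]
```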
I really like the idea of having it attached to the event's payload. I think it would be error-prone to have it under new and old, specifically because of name collisions. Maybe it could be nested under _relationships? Having it as a top-level key under data makes sense as well, perhaps under a key of related with new and old?
@jasonmccallister Wouldn't you expect it to be under the name of the relationship? i.e. inside new or old I'd have extra relationship objects with keys like booksByAuthorId?
@tirumaraiselvan I think the existing behavior is tied to table actions, so having the table fields in the new/old sections makes total sense. However, if we were to add relationships under those keys, it's not really a "table event" anymore.
In our use case, we are logging all changes to tables and while we care about the relationships for other actions (like updating a search index) having the relationships in another object makes a lot of sense. From a feature standpoint it would keep the existing behavior as it currently exists.
@tirumaraiselvan thinking about that some more, there are two things that concern me:
Maybe it makes more sense to have it under event.data.relationship? If we had the posts table with authors as the relationship, maybe it would have event.data.author?
Came looking for something similar... subscribed! ✔
Any news about this? According to https://docs.hasura.io/1.0/graphql/manual/event-triggers/payload.html, once the event is triggered we then need to query Hasura again for the linked data before we can use it. We could get away with fewer calls to Hasura if it provided all the info.
As suggested before, I think we should be able to attach a GraphQL query to the event, which Hasura would execute before sending the event.
For the payload I suggest this:
{
  "event": {
    "session_variables": <session-variables>,
    "op": "<op-name>",
    "data": {
      "old": <column-values>,
      "new": <column-values>
    }
  },
  "additional_data": "<result of the query based on new values>",
  "created_at": "<timestamp>",
  "id": "<uuid>",
  "trigger": {
    "name": "<name-of-trigger>"
  },
  "table": {
    "schema": "<schema-name>",
    "name": "<table-name>"
  }
}
One of the biggest concerns with having relationship data in event triggers is how you reason about consistency in an async system.
E.g. Say, you send a payload with a table and its relationship. Before the event is processed by the webhook, the relationship gets modified. Now, when your webhook is processing the data, should it or should it not check for the current state of the relationship before execution?
This is why the most "consistent" (in the ACID sense) pattern for event trigger is to use it as a notification system when a model changes (with its primary, unique keys). And fetch the required state in the execution. This is obviously not needed if your data is static or can be reasoned about in other ways.
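The "notification, then fetch" pattern described above can be sketched as a webhook that treats the payload only as a pointer (a primary key) and reads the current state at execution time. Here `fetch_current` is a stand-in for a GraphQL call back to Hasura; the payload shape follows the Hasura event format, everything else is illustrative:

```python
def handle_event(payload, fetch_current):
    """Treat the event as a notification: take only the primary key from
    the payload and fetch the *current* state of the row (including its
    relationships) at execution time, instead of trusting a snapshot."""
    row_id = payload["event"]["data"]["new"]["id"]
    current = fetch_current(row_id)  # e.g. a GraphQL query against Hasura
    if current is None:
        return None  # the row was deleted before the webhook ran
    return current

# Stubbed fetcher standing in for the round-trip to Hasura. Note the row
# changed between the event firing (status "pending") and processing.
db = {"s1": {"id": "s1", "status": "approved", "author": {"name": "Ada"}}}
result = handle_event(
    {"event": {"data": {"new": {"id": "s1", "status": "pending"}}}},
    lambda rid: db.get(rid),
)
```

The point of the stub: the payload's snapshot says "pending", but the webhook acts on the current state, "approved".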
That is why it would help if people interested in this feature could tell us the exact use-case/application they are trying to build (hence I have tagged it as triage/0-needs-info).
Thanks for clarifying what was missing in this issue. I understand the concern about whether the data is up to date, but from my point of view that takes care of itself.
Here is an example:
Of course, step 4 triggers this workflow again (or another one, if it is based on a different field/table).
So to answer your question, the handling function does not need to check the data, because it will simply be triggered a second time.
For my use case, I have something similar to the first post. Let's say we have these tables:
When a book is added or updated, I would like to have the author's name in the webhook instead of making another call to fetch it.
What could be done is to make the GraphQL query optional, which both delivers this feature and keeps the "consistent" pattern you talked about.
I hope I am clear enough ;)
Hey @tirumaraiselvan,
The most common use case for us was to fetch relatively static data like a user's name or email to be able to send an email.
I agree that if the data is dynamic it becomes more complex, since the lambda could be called at different points in time through the retry mechanism.
What about acknowledging that the data is a snapshot at the time when the event was triggered?
Why not make it optional? If I know my relation data isn't going to change, I add the related tables to the event trigger; if the data from the relation might change, I tell the trigger not to send the related data. So in cases where you need a user's email, like @dariocravero mentioned above, you don't need to query Hasura a second time.
This is an interesting problem!
I can appreciate the reticence based on concerns for atomic behaviour in transactions. But having landed in this issue after suffering a similar problem, I can also see the use case!
We are left with three options:
My use case is this: I want to send an email to a user when a new row appears in the session table. This table has a location_uuid fk field to the location table. So in the email we want to say:
Hello ${event.data.new.uploader_email}!
Your assets from ${event.data.new.location.display_name} are ready to collect!
Does that seem like a valid and clear use case?
I'm really keen to see how this moves along.
Any workarounds for this? So far my alternative is Purchase. Is that the only way?
For now I think it is, @tcaraccia. This discussion is about avoiding having to call Hasura back for something you wish you knew beforehand ;)
My specific use case on this one (I guess it's already clear, but in case someone else is facing the same inconvenience):
I would say it is a nice-to-have.
I'm also in need of this feature. How far away is it to be able to include relational data with the trigger payloads?
Our use case is hooking events up in our Discord channel.
When a users subscribes to a session, we want to be able to fire off a message in our channel to let us know that stuff is happening in our app.
The messages looks like:
User 000-000-000-000 subscribed to session with title "hello world"
000-000-000-000 is the user_id included in the payload Hasura posted to our webhook. We don't want the user_id, we want the actual name of the user, which is contained in the user relationship field.
Right now in our webhook processor, we need to query the user table with the id 000-000-000-000 just to get their name, whereas if it were included in the payload, all the webhook would need to do is post the message to Discord.
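The round-trip described above looks roughly like this; `lookup_user_name` stands in for the extra query back to Hasura that an enriched payload would make unnecessary (all names and the payload shape beyond Hasura's `event.data.new` are illustrative):

```python
def discord_message(payload, lookup_user_name):
    """Build the Discord message for a new subscription event."""
    new = payload["event"]["data"]["new"]
    # Extra round-trip: resolve user_id -> name, because the payload only
    # carries the foreign key, not the `user` relationship.
    name = lookup_user_name(new["user_id"])
    return f'User {name} subscribed to session with title "{new["title"]}"'

# Stub standing in for the query back to Hasura's user table.
users = {"000-000-000-000": "Ada Lovelace"}
msg = discord_message(
    {"event": {"data": {"new": {"user_id": "000-000-000-000",
                                "title": "hello world"}}}},
    lambda uid: users[uid],
)
```

If the payload carried the `user` relationship directly, `lookup_user_name` (and the service call behind it) would disappear.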
I have what I think is an obvious use case and it's a confirmation email for an order.
2 tables - Order and Line Items. Line Items related back via order_id
Currently the trigger sends the new order on insert to a serverless function for processing, but I have to query back again to retrieve the line items and user info. This would be unnecessary if we could attach the relationships on the data sent by the trigger.
The line items are final for the order so there is no need to worry about inconsistency.
This capability should be possible on actions as well.
Any news on this feature? I also think sending a lot of separate requests to Hasura only to get the relationship data is not performant for most scenarios. Maybe a flag in the event definition page to enable/disable relationship data in the payload would be useful for a lot of use cases.
This feature would save me a lot of custom endpoints, bumping
Yeah, almost every trigger of mine requires related data. It would be great to have object-relationship data in the event trigger.
We should be able to define an optional GraphQL query for an event trigger to fetch related data from other tables. This query would run before the event transform.
Great feature to have. Subscribed.
Our use-case for this is sending notifications to users. Our system tracks a series of "ticket" type records that users can have open, these tickets go through workflows and often change, with updates being pushed to the user.
Currently when a "ticket" changes, we have to receive that change as a webhook in a microservice, make another GraphQL call back to Hasura to get things like:
Effectively this doubles the back-end load for every change as an additional entire round-trip GraphQL call needs to be made. Ideally we could just get all this related data in the original webhook request.
Note: Our app includes 4 different types of entities that all require similar-ish behaviour to this, but with mildly different relational data for each. So we actually have to implement this 4x times.
Bump +1
Bump +1
Bump +2
Bump +3
This should already be considered; it would make for an excellent feature. I don't think there should be any concern around the additional or related data changing after the call. Currently we use a Postgres trigger and the http extension in Postgres to send all needed data to the endpoint, which works very well. The only missing piece is the replay ability the Hasura event trigger provides.
Bump +4
Thank you everyone for the request and comments for this feature. We would like to inform you that this is on our roadmap but we do not have a timeline at present. Please continue to follow this Github issue. We plan to publish on this issue a detailed RFC that covers all use cases and limitations of the feature. We welcome more detailed feedback from you once we provide those details.
I've now stumbled upon the same issue in my use case (grabbing a user's name). Are there any updates on this?
Bump +5
Bump +6
In the interest of keeping this from becoming stale. I encountered this today and it would be a great feature to have!
Bump
Any plan for this feature request after 6 years? 🤔
Currently, when creating triggers in the Event tab of Hasura console, there's no way to either:
Will there be any plan to add support for either of the options above? There are many use cases, e.g. in the example, sending an email to the author (with their name in the email) through Zapier when an article is published, supposing article and author name are stored in different tables.