PostgREST / postgrest-docs

This repo is archived and will be merged into postgrest/postgrest soon.
http://postgrest.org
MIT License

How to hook on operations aka Where goes business logic? #71

Open christiaanwesterbeek opened 7 years ago

christiaanwesterbeek commented 7 years ago

Assume that when a record is created in a table, an email needs to be sent out. Also, some front-end admin tool (like admin-on-rest) is operating against the API created by PostgREST. Where would the logic go that sends the email after the record is created?

I assume in this case we need an additional backend (Node.js, for example) that proxies the REST interface exposed by PostgREST (effectively sitting between the PostgREST API and the admin tool) and intercepts some operations to do additional stuff (like sending the email).

Can you share some thoughts on this topic?

begriffs commented 7 years ago

Hi @devotis, you can use a PostgreSQL trigger to perform actions before/after certain db operations. Inside the trigger you can emit a SQL NOTIFY command and have an external bridge program listen for those events. Programs like these can put the events into a queue of your choice.
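
For illustration, a minimal sketch of such a trigger; the users table, function name, and channel name are placeholders for the example, not anything PostgREST prescribes:

-- fires after each insert on a hypothetical "users" table
CREATE FUNCTION notify_user_created() RETURNS trigger AS $$
BEGIN
  -- pg_notify takes (channel, payload); the payload must be text
  PERFORM pg_notify('postgres_notify_channel_name', row_to_json(NEW)::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER user_created_notify
  AFTER INSERT ON users
  FOR EACH ROW EXECUTE PROCEDURE notify_user_created();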

You can also listen for NOTIFY events directly from, say, a Node.js program like:

// pg-pubsub manages a LISTEN subscription over a dedicated connection
var PGPubsub = require('pg-pubsub');

// expect exactly one argument: the database connection URL
if (process.argv.length !== 3) {
  console.log("USAGE: DB_URL");
  process.exit(2);
}

var url = process.argv[2],
    ps  = new PGPubsub(url);

// subscribe to a channel; change console.log to your own handler
ps.addChannel('postgres_notify_channel_name', console.log);
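
Run it with the connection URL as its only argument, e.g. node listen.js postgres://localhost/mydb (the file name here is made up). The callback then receives the payload of each NOTIFY on that channel.
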
christiaanwesterbeek commented 7 years ago

I hadn't considered that yet. Thanks @begriffs !

I'm working out the architecture for a new platform of applications. I expect a lot of them to need additional tasks upon record creation/modification/deletion. The route of 1. creating a trigger, 2. emitting the NOTIFY command (with the payload), 3. listening for that on the server and processing it, and 4. having my admin tool listen to the server when it's done processing the notification, seems like an awfully long route to implement for each and every minor hook. And I haven't even mentioned an optional intermediate queue for the notifications yet.

And as much as I may come to like Postgres, using this route across many use cases ties me to Postgres. This may be a minor issue, but it's a fact nonetheless.

I'm not saying that I won't try your suggestion, because I probably will. I also like the idea of having a separate (Node.js) app that is responsible just for handling database events. What do you think about my reservations?

PierreRochard commented 7 years ago

With some templating you could churn out steps 1 and 2 with https://github.com/PierreRochard/postgrest-boilerplate/blob/master/sql/create_triggers.sql and https://github.com/PierreRochard/postgrest-boilerplate/blob/master/sql/create_functions.sql

christiaanwesterbeek commented 7 years ago

Sounds good. Instead of notifying directly in the triggered function, I'm thinking about using the triggers to first store the row_to_json output and the trigger action name in an event table. An insert trigger on the event table would then emit a SQL NOTIFY whose payload carries the event id along with the original record as JSON. The listening server would update the event record, marking it as "heard".
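
For concreteness, a rough sketch of that idea; every table, column, and channel name below is an assumption of mine, not something prescribed anywhere:

-- one row per captured change
CREATE TABLE event (
  id      serial      PRIMARY KEY,
  action  text        NOT NULL,              -- TG_OP: INSERT / UPDATE / DELETE
  payload json        NOT NULL,              -- row_to_json of the affected row
  heard   boolean     NOT NULL DEFAULT false,
  created timestamptz NOT NULL DEFAULT now()
);

-- attach this to each business table to observe
-- (a DELETE trigger would serialize OLD instead of NEW)
CREATE FUNCTION record_event() RETURNS trigger AS $$
BEGIN
  INSERT INTO event (action, payload) VALUES (TG_OP, row_to_json(NEW));
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- the insert trigger on the event table does the actual NOTIFY
CREATE FUNCTION notify_event() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('events',
    json_build_object('id', NEW.id, 'payload', NEW.payload)::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER event_notify AFTER INSERT ON event
  FOR EACH ROW EXECUTE PROCEDURE notify_event();

-- the listener acknowledges with: UPDATE event SET heard = true WHERE id = ...;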

A dead listening server will result in event records that are never marked as "heard". I can then run a fallback poll on the event table and resend events that weren't heard the first time.
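
Under the same assumed schema, the fallback poll can reuse pg_notify to re-emit anything that was never acknowledged:

-- resend events the listener never marked as heard
SELECT pg_notify('events',
  json_build_object('id', id, 'payload', payload)::text)
FROM event
WHERE NOT heard
  AND created < now() - interval '1 minute';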

This approach overcomes the problem of a dead server that wasn't listening when the notification was sent and therefore lost it. It also seems easier than setting up RabbitMQ (which I've never done before, btw). And RabbitMQ can be down too, right? In which case the notification is lost just as well.

Again, I welcome your thoughts :)

PS: These are some links I found while studying the topic: