leptonix / decoding-json

logical decoding output plugin for postgresql

Include database in output #1

Closed jmealo closed 9 years ago

jmealo commented 9 years ago

Kudos on decoding-json!

I see that you recently added the schema to the output. Could you also add the database?

I would imagine something like this:

{ "database": "databaseName", "table": "tableName", "schema": "schemaName", ...}

Thanks, Jeff

jmealo commented 9 years ago

I guess this isn't really needed as you connect specifying the database :-)

leptonix commented 9 years ago

I'm glad you like the plugin.

I have added the database to the output (in a separate branch, feature-database-property, for now).

jmealo commented 9 years ago

I have found that string munging is the fastest way to do logical decoding. I don't know how safe it is in the long run. For the time being, I have chosen decoding_json for use in Lapidus. Please check it out and let me know what you think.

leptonix commented 9 years ago

Lapidus looks very interesting.

I wrote this plugin to implement something like the _changes feed of CouchDB and use that to build master-to-master replication. But that is still work in progress.

I first tried to use "test_decoding" and then parse the information I would need, but that didn't work that well. So I modified it into "decoding-json".

If there are other missing (or awkward) properties, just let me know.

jmealo commented 9 years ago

I aim for Lapidus to be a high-performance, opinionated, generic interface for streaming changes from any database that supports replication. It's in Node.js, but a primary use case would be piping it to another process that consumes line-delimited JSON.
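A consumer of such a stream can be sketched like this. This is a minimal sketch, not Lapidus code; the function names are made up, and it only assumes one JSON document per line, as described above:

```javascript
// Minimal sketch of a line-delimited JSON consumer (hypothetical names).
// Splits a buffered chunk into complete lines, parses each line as JSON,
// and carries any trailing partial line over to the next chunk.
function splitLines(buffer, chunk) {
  const lines = (buffer + chunk).split('\n');
  const rest = lines.pop(); // possibly incomplete last line
  return { lines: lines.filter((l) => l.length > 0), rest };
}

function handleChunk(state, chunk, onChange) {
  const { lines, rest } = splitLines(state.buffer, chunk);
  for (const line of lines) {
    onChange(JSON.parse(line)); // one change record per line
  }
  return { buffer: rest };
}
```

In a real pipeline this would sit on a stream's `data` handler, with `state.buffer` carried between events.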

I'd like to support common use cases out of the box when run as a daemon to minimize parsing and reparsing of JSON, but it's still very lightweight compared to what people are likely to do with triggers or other naughty things.

decoding_json is very fast; it's much faster than wal2json or decoder_json. I'd like to see us find a way to test it with custom data types, Unicode, and other things that are likely to trip up building the strings naively rather than using a JSON library.
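One way to see the "naive string building" concern concretely: values containing quotes, backslashes, or control characters must be escaped exactly as the JSON grammar requires, which a hand-rolled formatter can easily miss. A quick illustration in plain JavaScript (not the plugin's C code, just the same failure mode):

```javascript
// Hand-rolled quoting, similar in spirit to concatenating strings in a
// C output plugin. Deliberately naive: it escapes nothing.
function naiveQuote(value) {
  return '"' + value + '"';
}

// Check whether an encoded value is still valid JSON.
function roundTrips(encoded) {
  try {
    JSON.parse(encoded);
    return true;
  } catch (e) {
    return false;
  }
}
```

A value with an embedded quote breaks `naiveQuote` but round-trips fine through a real JSON encoder such as `JSON.stringify`.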

Are there any known issues or likely failure scenarios that I should know about?

leptonix commented 9 years ago

I have only tested the data types I am currently using. The JSONB data type works OK without any special code in this plugin, except that it is rendered as a serialized string. Other types might also work out of the box, but it would be nice to have a way to test that. I have to think about that one; let me know if you have suggestions.
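The note about JSONB being rendered as a serialized string means a consumer has to parse twice: once for the change record, once for the column value. A hedged sketch, where the `data` and `profile` field names are illustrative rather than the plugin's exact schema:

```javascript
// A jsonb column arrives as a JSON-encoded string inside the
// already-JSON change record, so its value needs a second parse.
// The "data"/"profile" field names are made up for illustration.
function decodeJsonbColumn(changeLine, columnName) {
  const change = JSON.parse(changeLine);      // first parse: the record
  return JSON.parse(change.data[columnName]); // second parse: the column
}
```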

The only issue I know about was fixed in September (table names with quotes). If there are other change types besides "INSERT", "UPDATE", and "DELETE", something might go wrong. The "change" property in the output will then have the value "FIXME".
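Given the three documented change types plus the "FIXME" fallback, a defensive consumer could dispatch like this. The handler names are hypothetical:

```javascript
// Dispatch on the "change" property described above. "FIXME" (or any
// unknown value) is routed to an explicit path instead of being
// dropped silently. Handler names are illustrative.
function dispatchChange(record, handlers) {
  switch (record.change) {
    case 'INSERT':
      return handlers.onInsert(record);
    case 'UPDATE':
      return handlers.onUpdate(record);
    case 'DELETE':
      return handlers.onDelete(record);
    default:
      // covers "FIXME" and anything unexpected
      return handlers.onUnknown(record);
  }
}
```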

jmealo commented 9 years ago

@leptonix: any idea when we'd get "unchanged-toast-datum"?

leptonix commented 9 years ago

That one I copied verbatim from test_decoding. I have not yet seen it in the output.