Closed stuartsaltzman closed 8 years ago
I also would appreciate this ability
If I'm not mistaken, this plugin uses whatever serializer is registered for a given type: https://github.com/okumin/akka-persistence-sql-async/blob/ec6c977d702fc2a2bc78842be1f62bb75a6f4b31/core/src/main/scala/akka/persistence/journal/sqlasync/ScalikeJDBCWriteJournal.scala#L24-L31
Would this mean that one could register their own JSON serializers in application.conf for the built-in Akka Persistence data structures, as well as JSON serializers for their own event types, and everything would work?
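In principle, that registration could look something like the following application.conf sketch. This is only an illustration of Akka's standard serializer-binding mechanism; the `com.example` class and event names are hypothetical, not part of the plugin:

```hocon
akka {
  actor {
    serializers {
      # Hypothetical custom serializer that writes events as JSON bytes
      json = "com.example.OrderPlacedJsonSerializer"
    }
    serialization-bindings {
      # Hypothetical event type bound to the serializer above
      "com.example.OrderPlaced" = json
    }
  }
}
```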
I suppose one would also want to change the message column types in the docs from BYTEA to JSON. You could use JSONB, but that seems less appropriate for most use cases. Both give you Postgres sugar for querying, but JSONB adds encoding overhead, which doesn't carry its weight if the vast majority of querying will go through Akka Persistence.
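For reference, the column-type change described above might look something like this in Postgres. A sketch only: the `journal` table and `message` column names are assumed for illustration, not taken from the plugin's actual schema, and the `USING` clause assumes the stored bytes are valid UTF-8 JSON:

```sql
-- Convert an existing BYTEA column to JSON (assumed names, Postgres syntax)
ALTER TABLE journal
  ALTER COLUMN message TYPE JSON
  USING convert_from(message, 'UTF8')::JSON;
```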
Okay, I think I'm going to give this a shot, unless someone like @okumin with real knowledge reports back with a reason to think this wouldn't work, or might not be advisable. For my application, I'm expecting throughput to be on the order of an event per second max, so I'm not super concerned about latency at this point. The idea of being able to directly examine the underlying store and also to manage deployment of a familiar DB is really attractive for my purposes. I'll report back!
@stuartsaltzman @acjay Excuse me for my late response.
You can serialize events or state with your own custom serializers, as in the sample.
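A minimal sketch of what such a custom JSON serializer might do is below. It deliberately leaves Akka out so the round trip can be shown standalone; in a real application the class would extend `akka.serialization.Serializer` (or `SerializerWithStringManifest`) and be registered in application.conf. `OrderPlaced` and the hand-rolled JSON encoding are illustrative assumptions — a real serializer would use a JSON library such as play-json, circe, or json4s:

```scala
// Hypothetical event type for illustration only.
final case class OrderPlaced(orderId: String, amount: Int)

object OrderPlacedJsonSerializer {
  // Encode the event as human-readable JSON bytes. This is what would
  // end up in the journal's message column, instead of opaque binary.
  def toBinary(e: OrderPlaced): Array[Byte] =
    s"""{"orderId":"${e.orderId}","amount":${e.amount}}""".getBytes("UTF-8")

  // Decode the JSON bytes back into the event. Hand-rolled parsing for
  // brevity; a real implementation would use a JSON library.
  def fromBinary(bytes: Array[Byte]): OrderPlaced = {
    val Pattern = """\{"orderId":"(.*)","amount":(\d+)\}""".r
    new String(bytes, "UTF-8") match {
      case Pattern(id, amt) => OrderPlaced(id, amt.toInt)
    }
  }
}

// Round trip: the bytes stored in the journal are readable JSON.
val event = OrderPlaced("o-42", 100)
val bytes = OrderPlacedJsonSerializer.toBinary(event)
println(new String(bytes, "UTF-8")) // {"orderId":"o-42","amount":100}
assert(OrderPlacedJsonSerializer.fromBinary(bytes) == event)
```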
However, you cannot currently store them as special column types such as JSONB.
Table schemas should be defined per application according to its performance characteristics, so I'm planning to make table schemas customizable.
Is it possible to serialize the payload as JSON using this persistence plugin? I think the plugin would need to honor any configured serializers? Thanks!