Using PostgreSQL can certainly solve the problem of database distribution, but in terms of transformation costs it would be better to consider this from the beginning; after all, you will have massive amounts of data in the future.
I think you're likely right about the transformation costs of using PostgreSQL. I haven't yet committed my database token logger, but it's already mostly written. If I were going to redo the entire logger, I would likely use MongoDB or KDB+ because of the insane speeds. I'd never heard of Surreal until now, and I'm a little reluctant to jump on that horse so soon, although it does look like a great answer to the problem.
My current design for the logger mainly just offloads data from Redis after the tracking period is over, then slowly inserts all the data into Postgres to persist it. It's not the most elegant and can definitely be improved. If you want to give the implementation a crack, I'd be open to your changes.
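For context, here is a minimal sketch of the offload step described above: drain a finished tracking window from Redis and bulk-insert it into Postgres. The key name, table name, and columns are assumptions for illustration, not the actual implementation.

```python
import json
import redis
import psycopg2
from psycopg2.extras import execute_values

r = redis.Redis(host="localhost", port=6379, db=0)
pg = psycopg2.connect("dbname=tokens user=logger")

def offload_window(window_key: str) -> None:
    # Pull every record logged during the tracking window
    # (assumed to be stored as a Redis list of JSON blobs).
    raw_records = r.lrange(window_key, 0, -1)
    rows = [
        (rec["token"], rec["price"], rec["ts"])
        for rec in (json.loads(x) for x in raw_records)
    ]
    with pg, pg.cursor() as cur:
        # Batch insert instead of one INSERT per row so the
        # persistence step stays slow and steady on Postgres.
        execute_values(
            cur,
            "INSERT INTO token_log (token, price, logged_at) VALUES %s",
            rows,
            page_size=500,
        )
    # Drop the window from Redis once it has been persisted.
    r.delete(window_key)
```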
My current design is more focused on live data streaming, so I appreciate your input.
I'm not very familiar with KDB+, but I still recommend using a time-series database for the implementation, as it will result in lower data storage costs and make queries easier. For example: ClickHouse.
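As a rough illustration of the kind of schema a time-series store encourages, here is a sketch using ClickHouse via the clickhouse-driver package. The table and column names are made up for the example.

```python
from clickhouse_driver import Client

client = Client(host="localhost")

client.execute("""
    CREATE TABLE IF NOT EXISTS token_prices (
        token String,
        price Float64,
        logged_at DateTime
    )
    ENGINE = MergeTree()
    PARTITION BY toYYYYMM(logged_at)   -- cheap pruning/dropping of old months
    ORDER BY (token, logged_at)        -- time-range scans per token stay fast
""")

# Typical query this layout makes cheap: recent history for one token.
rows = client.execute(
    "SELECT logged_at, price FROM token_prices "
    "WHERE token = %(token)s AND logged_at >= now() - INTERVAL 1 DAY "
    "ORDER BY logged_at",
    {"token": "SOL"},
)
```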
Yea, after some consideration, I think I'm going to use TimescaleDB. It seems like it fits my needs the best.
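Since TimescaleDB sits on top of Postgres, the existing driver and insert path carry over. A minimal sketch of what the switch might look like, assuming the timescaledb extension is already installed and reusing the hypothetical `token_log` table from the earlier example:

```python
import psycopg2

pg = psycopg2.connect("dbname=tokens user=logger")
with pg, pg.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS token_log (
            token      TEXT        NOT NULL,
            price      DOUBLE PRECISION,
            logged_at  TIMESTAMPTZ NOT NULL
        )
    """)
    # Promote the plain table to a hypertable chunked on the time column;
    # the existing INSERT path from the Redis offload keeps working unchanged.
    cur.execute(
        "SELECT create_hypertable('token_log', 'logged_at', if_not_exists => TRUE)"
    )
```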