Is your feature request related to a problem? Please describe.
I'm using ioBroker with a PostgreSQL database to store the history, and Grafana to visualize the data. The SQL adapter stores the timestamp in milliseconds, but when querying a timeframe in Grafana (e.g. the last 3 hours), Grafana can only filter on seconds or nanoseconds. I solved this by dividing the saved timestamp by 1000, but that causes a huge performance issue: the query takes 9 seconds instead of 60 ms because of the conversion. This is becoming a problem as the database grows.
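The slowdown most likely comes from applying the division to the stored column itself, which prevents PostgreSQL from using the index on the timestamp and forces a full scan. A sketch of the difference, assuming the adapter's default `ts_number` table with a millisecond `ts` column (names may differ in your installation) and a raw-SQL Grafana query:

```sql
-- Slow: dividing the indexed ts column means the index on ts cannot be
-- used for the range filter, so every row is scanned and converted.
SELECT ts / 1000 AS time_sec, val
FROM ts_number
WHERE ts / 1000 BETWEEN 1700000000 AND 1700010800;

-- Faster: convert the constant bounds instead, so the range filter
-- applies to the bare ts column and the existing index is usable.
SELECT ts / 1000 AS time_sec, val
FROM ts_number
WHERE ts BETWEEN 1700000000 * 1000 AND 1700010800 * 1000;
```

This only helps where you control the generated SQL; Grafana macros that wrap the column in an expression still hit the slow path.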
Describe the solution you'd like
My idea is that the SQL adapter provides a setting to define the precision of the stored timestamp: seconds, milliseconds, or nanoseconds.
Describe alternatives you've considered
Alternatively, the adapter could provide a text field for a formula like "/1000" that is applied to the timestamp before it is stored.
Any workaround on the database level is also appreciated; I was thinking about a trigger that converts the value directly when it is saved.
I found a workaround which is fine for me but maybe not feasible for all users: I created a generated column in PostgreSQL which converts the timestamp to seconds directly when the row is inserted.
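For reference, a minimal sketch of that generated-column workaround (requires PostgreSQL 12+ for stored generated columns); the table and column names `ts_number` and `ts` assume the SQL adapter's default schema, and `ts_sec` is a name I chose:

```sql
-- Stored generated column: computed once at insert/update time,
-- so queries pay no per-row conversion cost.
ALTER TABLE ts_number
  ADD COLUMN ts_sec BIGINT
  GENERATED ALWAYS AS (ts / 1000) STORED;

-- An index on the generated column lets second-based range filters
-- from Grafana use an index scan instead of a full table scan.
CREATE INDEX ts_number_ts_sec_idx ON ts_number (ts_sec);
```

Grafana queries can then filter on `ts_sec` directly without any expression on the column.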