georgysavva / scany

Library for scanning data from a database into Go structs and more
MIT License

hybrid mapping struct / map[string]float #81

Closed · xorduna closed this issue 2 years ago

xorduna commented 2 years ago

Hi,

First of all, thank you very much for contributing this library to the community. It seems to be quite easy to use.

I am trying to get arbitrary data out of a TimescaleDB database, and I would like to mix struct and map[string] scanning.

Given those structs:

type Data struct {
    PhConTot float64
    PhGenTot float64
}

type Metric struct {
    Time     time.Time
    DeviceId string
    Data     Data
}

I am correctly scanning data with

    var metrics []*Metric
    if err := pgxscan.Select(ctx, conn, &metrics, `
        select time, device_id, ph_con_tot as "data.ph_con_tot", ph_gen_tot as "data.ph_gen_tot"
        from metrics_computer_wide
          WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
          time > '2022-04-01' and time <= '2022-04-05'
        `); err != nil {
        fmt.Fprintf(os.Stderr, "Unable to query database: %v\n", err)
    }

But since Data can contain a huge number of possible fields, I would like it to be a map[string]float64 instead. For example:

type MetricMap struct {
    Time     time.Time
    DeviceId string
    Data     map[string]float64
}

And querying like this:

    var metricsMap []*MetricMap
    if err := pgxscan.Select(ctx, conn, &metricsMap, `
        select time, device_id, ph_con_tot as "data.ph_con_tot", ph_gen_tot as "data.ph_gen_tot"
        from metrics_computer_wide
          WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
          time > '2022-04-01' and time <= '2022-04-05' limit 10
        `); err != nil {
        fmt.Fprintf(os.Stderr, "Unable to query database: %v\n", err)
    }

Then I am getting

Unable to query database: scany: column: 'data.ph_con_tot': no corresponding field found, or it's unexported in main.MetricMap

Is there any way to mix structs and map[string]? Or can I extend the scanner?

Thank you very much!

xorduna commented 2 years ago

I just found issue #76 and I thought my problem was solved. So I defined:

type CustomMetric struct {
    Time     time.Time
    DeviceId string
    Data     MyData `db:"data.*"`
}

type MyData struct {
    Data map[string]float64
}

And

func (m *MyData) Scan(v interface{}) error {
    fmt.Println(v)
    return nil
}

But it seems that the problem persists (scany: column: 'data.ph_con_tot': no corresponding field found, or it's unexported in main.CustomMetric), and Scan is not even called.

I think this should be the way to go, but something is still wrong. Can anybody give me a clue?

Thank you very much!

georgysavva commented 2 years ago

Hi. Thank you for your issue! Right now "data.ph_con_tot" and "data.ph_gen_tot" are two distinct columns, and there is no way for scany to scan them into the .Data field. You need to combine these two columns into a single JSON object:

        select time, device_id, json_build_object('ph_con_tot', ph_con_tot, 'ph_gen_tot', ph_gen_tot) as data
        from metrics_computer_wide
          WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
          time > '2022-04-01' and time <= '2022-04-05' limit 10 

This way, scany passes the .Data map to the underlying library, and that library parses the JSON object into it. Let me know if you have any problems with it!
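
Putting the pieces together, a minimal end-to-end sketch of this approach might look roughly like the following (the connection string, table, and column names are simply copied from the snippets above, so adjust them as needed):

package main

import (
    "context"
    "fmt"
    "os"
    "time"

    "github.com/georgysavva/scany/pgxscan"
    "github.com/jackc/pgx/v4"
)

type MetricMap struct {
    Time     time.Time
    DeviceId string
    Data     map[string]float64
}

func main() {
    ctx := context.Background()

    conn, err := pgx.Connect(ctx, "postgres://postgres:postgres@localhost:5430/postgres")
    if err != nil {
        fmt.Fprintf(os.Stderr, "Unable to connect: %v\n", err)
        return
    }
    defer conn.Close(ctx)

    // The wide columns are folded into a single JSON column named "data",
    // which maps onto the Data field of MetricMap.
    var metrics []*MetricMap
    err = pgxscan.Select(ctx, conn, &metrics, `
        select time, device_id,
               json_build_object('ph_con_tot', ph_con_tot, 'ph_gen_tot', ph_gen_tot) as data
        from metrics_computer_wide
        where device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null
          and time > '2022-04-01' and time <= '2022-04-05'
        limit 10`)
    if err != nil {
        fmt.Fprintf(os.Stderr, "Unable to query database: %v\n", err)
        return
    }

    for _, m := range metrics {
        fmt.Println(m.Time, m.DeviceId, m.Data["ph_con_tot"], m.Data["ph_gen_tot"])
    }
}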

xorduna commented 2 years ago

Yeah, it worked perfectly!

Thank you very much!

xorduna commented 2 years ago

Hi Georgy,

I am trying to re-open this issue because I am facing an unexpected behaviour. Some days ago I told you that my POC worked correctly, but now I am getting a weird error.

This is my code:

package main

import (
    "context"
    "database/sql"
    "fmt"
    "github.com/georgysavva/scany/pgxscan"
    "github.com/georgysavva/scany/sqlscan"
    "github.com/jackc/pgx/v4"
    _ "github.com/lib/pq"
    "time"
)

type MetricMap struct {
    Time     time.Time
    DeviceId string
    Data     map[string]interface{}
}

func main() {
    fmt.Println("Starting timescale playground")
    ctx := context.Background()
    connStr := "postgres://postgres:postgres@localhost:5430/postgres?sslmode=disable"
    db, err := sql.Open("postgres", connStr)
    if err != nil {
        fmt.Println(err)
        return
    }
    sqlJson := `
        select time, device_id, json_build_object('ph_con_tot', ph_con_tot , 'ph_gen_tot', ph_gen_tot) as "data"
        from metrics_computer_wide
          WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
          time > '2022-04-01' and time <= '2022-04-05' limit 10
        `
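    // Note: the line below overrides sqlJson; the metrics_base query is the one actually executed.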
    sqlJson = "SELECT time, device_id, json_build_object('ph_con_tot',ph_con_tot,'ph_gen_tot',ph_gen_tot) as data FROM metrics_base WHERE device_id = '08c210ca-7077-4907-8ea7-a98b77d4df0c' AND time >= '2022-05-02 13:13:56' AND time <= '2022-05-02 13:14:56'"
    var metrics []MetricMap

    err = sqlscan.Select(ctx, db, &metrics, sqlJson)
    if err != nil {
        fmt.Println(err)
    } else {
        for _, metric := range metrics {
            fmt.Printf("%s %s %f %f\n", metric.Time, metric.DeviceId, metric.Data["ph_con_tot"], metric.Data["ph_gen_tot"])
        }
    }

    // Now with pgxscan
    dbPgx, err := pgx.Connect(ctx, connStr)
    if err != nil {
        fmt.Println(err)
        return
    }

    err = pgxscan.Select(ctx, dbPgx, &metrics, sqlJson)
    if err != nil {
        fmt.Println(err)
    } else {
        for _, metric := range metrics {
            fmt.Printf("%s %s %f %f\n", metric.Time, metric.DeviceId, metric.Data["ph_con_tot"], metric.Data["ph_gen_tot"])
        }
    }
}

In both cases the error is:

Scan error on column index 2, name "data": unsupported Scan, storing driver.Value type []uint8 into type *map[string]interface {}

As you can see, the code is identical to the one we discussed earlier, but it seems there is some problem with data conversion now.

Can you shed some light on the error?
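
For reference, one common workaround on the database/sql side (only a sketch, not something from this thread; the JSONMap name is purely illustrative) is to give the map its own sql.Scanner implementation that unmarshals the raw JSON bytes the driver delivers:

import (
    "encoding/json"
    "fmt"
)

// JSONMap implements database/sql's Scanner interface so a JSON column,
// delivered by lib/pq as []byte, can be unmarshalled into a map instead of
// failing the default driver.Value conversion.
type JSONMap map[string]interface{}

func (m *JSONMap) Scan(src interface{}) error {
    switch v := src.(type) {
    case nil:
        *m = nil
        return nil
    case []byte:
        return json.Unmarshal(v, m)
    case string:
        return json.Unmarshal([]byte(v), m)
    default:
        return fmt.Errorf("JSONMap: cannot scan %T into map", src)
    }
}

With Data declared as JSONMap instead of map[string]interface{}, sqlscan should hand the column value to this Scan method rather than to database/sql's default converter, which only sees a []uint8 it cannot assign.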

Thank you very much!

georgysavva commented 2 years ago

@xorduna I just replied in the new issue.