doug-martin / goqu

SQL builder and query library for golang
http://doug-martin.github.io/goqu/
MIT License
2.35k stars 205 forks

Cannot handle Array types in Postgres #215

Open teejays opened 4 years ago

teejays commented 4 years ago

Describe the bug When using the postgres dialect, inserting a row with an array SQL column type (e.g. text[]) does not generate correct SQL. For the values corresponding to that column, goqu generates ($1, $2), whereas Postgres expects something of the form {$1, $2}.

To Reproduce Create a table (SQL):

CREATE TABLE posts (
   title text NOT NULL PRIMARY KEY,
   tags  text[]
);

Construct InsertDataset (Go, goqu):

dialect := goqu.Dialect("postgres")
builder := dialect.Insert("posts").Cols("title", "tags").Vals(goqu.Vals{"New Post", []string{"a", "b"}})
query, _, _ := builder.ToSQL()
fmt.Println(query)

Query generated is:

INSERT INTO "posts" ("title", "tags") VALUES ( 'New Post', ('a', 'b'))

Running it in Postgres gives an error:

 column "tags" is of type text[] but expression is of type record

Expected behavior I expect goqu to generate the following, which is what Postgres expects.

INSERT INTO "posts" ("title", "tags") VALUES ( 'New Post', '{"a", "b"}')

Dialect: postgres

doug-martin commented 4 years ago

@teejays you should check out https://godoc.org/github.com/lib/pq#Array, which handles formatting values in the Postgres array format.

rajphoenix20 commented 2 years ago

@teejays, has any workaround been found for this issue yet? Please advise, thank you.

MDobak commented 2 years ago

@teejays you should check out https://godoc.org/github.com/lib/pq#Array, which handles formatting values in the Postgres array format.

If I understand correctly, there is no way to use it with the Insert method provided by goqu (at least not without wrapping every slice in a custom type). Native support for PostgreSQL arrays would be really useful.

alyhegazy commented 2 years ago

I ran into the same problem but in a SELECT query, trying to use array contains in the WHERE clause, and was able to get around it by using goqu.L("(tags @> ?)", fmt.Sprintf("{%s, %s}", "tag1", "tag2"))

ori-shalom commented 2 years ago

Following @doug-martin's suggestion, this solution using pq.Array works:

dialect := goqu.Dialect("postgres")
builder := dialect.Insert("posts").
    Cols("title", "tags").
    Vals(goqu.Vals{
        "New Post",
        pq.Array([]string{"a", "b"}), // wrap value with pq.Array
    })
query, _, _ := builder.ToSQL()
fmt.Println(query) // INSERT INTO "posts" ("title", "tags") VALUES ('New Post', '{"a","b"}')

steeling commented 2 years ago

@teejays you should check out https://godoc.org/github.com/lib/pq#Array, which handles formatting values in the Postgres array format.

@doug-martin this doesn't work when trying to insert structs via goqu.Insert("posts").Rows(myPosts), unless the fields on the struct are themselves an array type, which means we can't use type aliases on our structs.

rgalanakis commented 1 year ago

@steeling as per #137 here is a workaround you can use with structs containing slice fields:

https://gist.github.com/rgalanakis/a2d5b8b61ee9540981bb43a103bf59c7

// PGArrayExpr takes a slice of items and returns a literal expression
// with a Postgres array that can be used by goqu.
// It does this by using goqu natively and then manipulating the string
// to surround it in an array literal.
//
// To use this in a struct, you can do something like:
//
//     type MyModel struct {
//       Tags   []string        `json:"tags" db:"-"`
//       DbTags goqu.Expression `json:"-" db:"tags"`
//     }
//     body := `{"tags":["x", "y"]}`
//     m := MyModel{}
//     _ = json.Unmarshal([]byte(body), &m)
//     m.DbTags = goqux.PGArrayExpr(m.Tags)
//     sql, _, _ := goqu.Insert("modeltable").Rows(m).ToSQL()
func PGArrayExpr[T any](arr []T) goqu.Expression {
  if len(arr) == 0 {
    return goqu.L("'{}'")
  }
  lit := goqu.V(arr)
  selectSql, _, err := goqu.From(lit).ToSQL()
  if err != nil {
    panic(err)
  }
  valuesSql := strings.TrimPrefix(selectSql, "SELECT * FROM ")
  if valuesSql == selectSql {
    panic("expected goqu to output an (invalid) 'SELECT * FROM (x, y, z)' for the slice")
  }
  if valuesSql[0] != '(' || valuesSql[len(valuesSql)-1] != ')' {
    panic("expected goqu to output '(x, y, z)' but is missing parens")
  }
  arraySql := fmt.Sprintf("ARRAY[%s]", valuesSql[1:len(valuesSql)-1])
  return goqu.L(arraySql)
}

func ExamplePGArrayExprWithModel() {
    type MyModel struct {
        Tags   []string        `json:"tags" db:"-"`
        DbTags goqu.Expression `json:"-" db:"tags"`
    }
    body := `{"tags":["x", "y"]}`
    m := MyModel{}
    _ = json.Unmarshal([]byte(body), &m)
    m.DbTags = goqux.PGArrayExpr(m.Tags)
    sql, _, _ := goqu.Insert("modeltable").Rows(m).ToSQL()
    fmt.Println(sql)

    // Output:
    // INSERT INTO "modeltable" ("tags") VALUES (ARRAY['x', 'y'])
}

mscienski commented 1 year ago

I am also scanning the inserted values into a struct, so I had to do an additional step.

With either the pq.Array solution or @rgalanakis's solution, both of which work great for inserting, I had to scan into a different struct than I inserted with. Using pq.Array, the Tags on insert would be the interface return value of pq.Array(), and on returning would be a pq.StringArray.

type MyModel struct {
    ID string `db:"id"`
}

type MyModelToInsert struct {
    MyModel
    Tags interface {
        driver.Valuer
        sql.Scanner
    } `db:"tags"`
}

type MyInsertedModel struct {
    MyModel
    Tags pq.StringArray `db:"tags"`
}

myModelToInsert := MyModelToInsert{
    MyModel: MyModel{
        ID: "some_id",
    },
}
myModelToInsert.Tags = pq.Array([]string{"a", "b"})

var myInsertedModel MyInsertedModel

saved, err := txn.
        Insert(goqu.T("modeltable")).
        Rows(myModelToInsert).
        Returning(goqu.T("modeltable").All()).
        Executor().
        ScanStructContext(ctx, &myInsertedModel)

...

fmt.Println(myInsertedModel.Tags)

// Output:
// [a b]

This only really works when the array type can be read back into one of the pq-defined types (pq.StringArray here). For arrays of custom types, a custom slice type that implements the driver.Valuer and sql.Scanner interfaces is needed to convert between the custom type and the database representation.

rgalanakis commented 1 year ago

Nice solution! I ran into the same limitation after I posted my solution and did some more work (and needed to start scanning, not just inserting). I am using pgx and couldn't drop down to database/sql directly, so I needed yet another solution for that as well; in the end I could only use goqu to generate the SQL string.

I can try to clean up the code to post, but it's probably not suitable for all use cases. I just wanted to mention that I didn't find a solution that didn't involve writing a significant amount of reflection shimming code to get this working right.

federico-hero commented 11 months ago

This also works:

type Tags []string

// Scan must use a pointer receiver so the decoded values are written back;
// converting through *[]string lets pq.Array use its pq.StringArray path.
func (t *Tags) Scan(src any) error {
    return pq.Array((*[]string)(t)).Scan(src)
}

func (t Tags) Value() (driver.Value, error) {
    return pq.Array([]string(t)).Value()
}