Open · teejays opened this issue 4 years ago
@teejays you should check out https://godoc.org/github.com/lib/pq#Array, which handles formatting values using the Postgres array format.
@teejays, has any workaround been found for this issue yet? Please advise, thank you.
If I understand correctly, there is no way to use it with the `Insert` method provided by goqu (at least not without wrapping every slice in a custom type). Native support for PostgreSQL arrays would be really useful.
I ran into the same problem, but in a `SELECT` query: trying to use array containment in the `WHERE` clause. I was able to get around it by using `goqu.L("(tags @> ?)", fmt.Sprintf("{%s, %s}", "tag1", "tag2"))`.
Following @doug-martin's suggestion, this solution with `pq.Array` works:

```go
dialect := goqu.Dialect("postgres")
builder := dialect.Insert("posts").
	Cols("title", "tags").
	Vals(goqu.Vals{
		"New Post",
		pq.Array([]string{"a", "b"}), // wrap the value with pq.Array
	})
query, _, _ := builder.ToSQL()
fmt.Println(query) // INSERT INTO "posts" ("title", "tags") VALUES ('New Post', '{"a","b"}');
```
@doug-martin this doesn't work when trying to insert structs via `goqu.Insert("posts").Rows(myPosts)`, unless the fields on the struct are themselves array wrapper types, but this means that we can't use type aliasing on our structs now.
@steeling as per #137 here is a workaround you can use with structs containing slice fields:
https://gist.github.com/rgalanakis/a2d5b8b61ee9540981bb43a103bf59c7
```go
// PGArrayExpr takes a slice of items and returns a literal expression
// with a Postgres array that can be used by goqu.
// It does this by rendering the slice with goqu natively and then
// manipulating the string to surround it in an array literal.
//
// To use this in a struct, you can do something like:
//
//	type MyModel struct {
//		Tags   []string        `json:"tags" db:"-"`
//		DbTags goqu.Expression `json:"-" db:"tags"`
//	}
//	body := `{"tags":["x", "y"]}`
//	m := MyModel{}
//	_ = json.Unmarshal([]byte(body), &m)
//	m.DbTags = goqux.PGArrayExpr(m.Tags)
//	sql, _, _ := goqu.Insert("modeltable").Rows(m).ToSQL()
func PGArrayExpr[T any](arr []T) goqu.Expression {
	if len(arr) == 0 {
		return goqu.L("'{}'")
	}
	lit := goqu.V(arr)
	selectSql, _, err := goqu.From(lit).ToSQL()
	if err != nil {
		panic(err)
	}
	valuesSql := strings.TrimPrefix(selectSql, "SELECT * FROM ")
	if valuesSql == selectSql {
		panic("expected goqu to output an (invalid) 'SELECT * FROM (x, y, z)' for the slice")
	}
	if valuesSql[0] != '(' || valuesSql[len(valuesSql)-1] != ')' {
		panic("expected goqu to output '(x, y, z)' but it is missing parens")
	}
	arraySql := fmt.Sprintf("ARRAY[%s]", valuesSql[1:len(valuesSql)-1])
	return goqu.L(arraySql)
}
```
```go
func ExamplePGArrayExprWithModel() {
	type MyModel struct {
		Tags   []string        `json:"tags" db:"-"`
		DbTags goqu.Expression `json:"-" db:"tags"`
	}
	body := `{"tags":["x", "y"]}`
	m := MyModel{}
	_ = json.Unmarshal([]byte(body), &m)
	m.DbTags = goqux.PGArrayExpr(m.Tags)
	sql, _, _ := goqu.Insert("modeltable").Rows(m).ToSQL()
	fmt.Println(sql)
	// Output:
	// INSERT INTO "modeltable" ("tags") VALUES (ARRAY['x', 'y'])
}
```
I am also scanning the inserted values into a struct, so I had to do an additional step. With either the `pq.Array` solution or @rgalanakis's solution, both of which work great for inserting, I had to scan into a different struct than I inserted with. Using `pq.Array`, the Tags field on insert would be the interface value returned by `pq.Array()`, and on return it would be a `pq.StringArray`.
```go
type MyModel struct {
	ID string `db:"id"`
}

type MyModelToInsert struct {
	MyModel
	Tags interface {
		driver.Valuer
		sql.Scanner
	} `db:"tags"`
}

type MyInsertedModel struct {
	MyModel
	Tags pq.StringArray `db:"tags"`
}

myModelToInsert := MyModelToInsert{
	MyModel: MyModel{
		ID: "some_id",
	},
}
myModelToInsert.Tags = pq.Array([]string{"a", "b"})

var myInsertedModel MyInsertedModel
saved, err := txn.
	Insert(goqu.T("modeltable")).
	Rows(myModelToInsert).
	Returning(goqu.T("modeltable").All()).
	Executor().
	ScanStructContext(ctx, &myInsertedModel)
// ...
fmt.Println(myInsertedModel.Tags)
// Output:
// [a b]
```
This only really works when the array type can be read back into one of the pq-defined types (`pq.StringArray` here). For arrays of custom types, a custom slice type that implements the `driver.Valuer` and `sql.Scanner` interfaces would be needed to convert between the custom type and bytes.
Nice solution! I ran into the same limitation after I posted my solution and did some more work (I needed to start scanning, not just inserting). I am using pgx and couldn't drop into `sql` directly, so I needed yet another solution for that too; I could only use `goqu` to get the SQL string. Ultimately what I ended up with was:

- a plain `[]string` field on the struct;
- scanning via `pgxscan.Select`, which worked fine with the `[]string` field;
- building inserts with `goqu.Insert().Cols().Values()`, rather than working with records/structs;
- applying `goqux.PGArrayExpr` when it detects a slice.

I can try to clean up the code to post, but it's also probably not suitable for all use cases. I just wanted to mention that I didn't find a solution that didn't involve writing a significant amount of reflection shimming code to get this to work right.
This also works (note that `Scan` needs a pointer receiver, otherwise it scans into a copy and the caller never sees the result):

```go
type Tags []string

// Scan uses a pointer receiver so the scanned values are stored
// back into the caller's slice.
func (t *Tags) Scan(src any) error {
	return (*pq.StringArray)(t).Scan(src)
}

func (t Tags) Value() (driver.Value, error) {
	return pq.StringArray(t).Value()
}
```
**Describe the bug**
When using the postgres dialect, inserting a row with an array SQL column type (e.g. `text[]`) does not generate the correct SQL. For the values corresponding to that column, it generates `($1, $2)`, whereas Postgres expects something of the form `{$1, $2}`.

**To Reproduce**
Create a table (SQL):

Construct an InsertDataset (Go, goqu):

Query generated is:

Running it in Postgres gives an error:

**Expected behavior**
I expect goqu to generate the below, which is expected by Postgres.

**Dialect:**