jschaf / pggen

Generate type-safe Go for any Postgres query. If Postgres can run the query, pggen can generate code for it.
MIT License
282 stars 26 forks

Using custom types + arrays as pggen.arg() #29

Closed aight8 closed 2 years ago

aight8 commented 3 years ago

Actual:

When using a custom type (or an array of one) as an argument, the generated code just passes the generated []CustomType Go argument to QueryRow()

select pggen.arg('my_types')::custom_type[]

However, []CustomType is not encodable by pgx, so the query fails.

Expected: pggen should create an encodable pgtype Value (e.g. with pgtype.NewArrayType) before passing it to the query.

jschaf commented 3 years ago

I think I just ran into the same bug.

https://github.com/jschaf/pggen/commit/6a49d2bfe551abee5ca51c08bd8dbe035f04b02a lays some groundwork for fixing it. I think the fix requires adjusting the code generation when we have an array type of a composite type.

jschaf commented 3 years ago

I think I misunderstood. I've fixed a couple of bugs related to custom types and arrays in outputs, but nothing related to inputs.

I started to look into supporting input fields, and it's pretty painful. pggen would have to convert the input type into a []interface{} where each element is a child field. Here's a snippet from a working example I cobbled together.

    p := newProductImageTypeArrayDecoder()
    buf := make([]byte, 0, 256)
    cn := q.conn.(*pgx.Conn)
    // Each composite element is passed as a []interface{} of its fields,
    // with nested composites as nested []interface{} values.
    if err := p.Set([][]interface{}{
        {imgs[0].Source, []interface{}{imgs[0].Dimensions.Width, imgs[0].Dimensions.Height}},
        {imgs[1].Source, []interface{}{imgs[1].Dimensions.Width, imgs[1].Dimensions.Height}},
    }); err != nil {
        return nil, fmt.Errorf("set parameter: %w", err)
    }
    // Encode the value to the text format and pass the string as the query arg.
    text, err := p.EncodeText(cn.ConnInfo(), buf)
    if err != nil {
        return nil, err
    }
    row := q.conn.QueryRow(ctx, arrayNested2ParamSQL, string(text))
aight8 commented 3 years ago

Isn't it cobbled together like in the scan function? I thought the generated encoder type (or Value, as pgtype calls it) could be available in the package namespace like the generated enum types. Couldn't the same Type/ArrayType be used, calling .Set() on it before it is passed to the Query function?

(The current version does not generate my composite type Go types, since commit 31e795, "Introduce DeclarerSet...")

jschaf commented 3 years ago

I switched all decoders over to pgtype.ValueTranscoder, which has a Set method to set the value and an AssignTo method to assign the value to a Go type. I'm using ValueTranscoder because the more flexible CompositeFields type doesn't support nested arrays or composite types.

current version does not generate my composite types

I moved all composite type generation into factory functions in the leader file, like newProductImageType since they get reused across functions.

The problem is that the Set method is much less flexible than AssignTo. For composite types (pgtype.NewCompositeType), you can AssignTo any Go struct and pgx will use reflection to do the field mapping. However, the same composite type only supports Set with a []interface{}.

I'm guessing the asymmetry exists because it's pretty rare to use structs as query parameters. It's much more common to decode Postgres values into a Go struct.

I'll investigate how pgx encodes query parameters but I don't think there's another mechanism.

aight8 commented 3 years ago

current version does not generate my composite types

I mean: when I have multiple query SQL files, structs (incl. decoder functions) are only generated for the first one. It ends in corrupt Go code where the Querier functions call unknown functions or return nonexistent types. The tests only use a single query file, so this case isn't caught.

aight8 commented 3 years ago

Okay, I understand the issue.

another idea would be:

advantages

Honestly, I need to learn more about pgx/pgtype; I don't have an exact overview of how things work inside it.

jschaf commented 3 years ago

Yea, I figured out a way to do this similar to what you proposed.

type textEncoder struct {
    pgtype.ValueTranscoder
}

func (t textEncoder) PreferredParamFormat() int16 { return pgtype.TextFormatCode }

func encodeDimensions(d Dimensions) textEncoder {
    dec := newDimensionsDecoder()
    dec.Set([]interface{}{d.Width, d.Height})
    return textEncoder{ValueTranscoder: dec}
}
jschaf commented 3 years ago

As expected, this was pretty painful to implement. It works for composite types and arrays of composite types. Give it a go.

It breaks if you have triply nested composite types due to a pgx quoting bug: https://github.com/jackc/pgx/issues/874

aight8 commented 3 years ago

OK, I understand the inner workings better now. Yes, the Value.Set function requires the interface slice, but we provide a filled data struct. It could actually reflect the struct fields (by field order or json tag) and pack an interface slice. Is there a reason why pgtype.CompositeType.Set doesn't provide this in its "magic" setter? However, your implementation is better anyway, because it avoids reflection and it's strict.

I have two OT questions..

  1. Should initial DB operations (prepare) be explicitly avoided, so the library doesn't depend on any pre-query work to be "working"? I ask because of the unknown OIDs for the binary format.

  2. What were your thoughts when you decided not to include ConnInfo?

I'm currently testing pggen in a more complex environment, especially the new argument encoder, and will give some feedback then. Once pgx fixes the escaping problem, it could be pretty neat.

jschaf commented 3 years ago

however your implementation is better anyway, because it avoids reflection and it's strict.

Yea, I think I might switch the encoder over to []interface{} as well. Since the code is generated, there's no reason to use reflection.

should initial DB operations (prepare) be explicitly avoided, so the library doesn't depend on any pre-query work to be "working"? I ask because of the unknown OIDs for the binary format.

I don't think using Prepare affects how the parameters would be encoded since we've already constructed the pgtype.Value.

what were your thoughts when you decided not to include ConnInfo?

Do you mean using it to look up how to encode parameters? I didn't think about it that much. It's not needed; pggen knows how to encode all the types since they're all generated from the database.

jschaf commented 3 years ago

After https://github.com/jackc/pgtype/pull/100, I think I can drop all the helpers to a single function.

what were your thoughts when you decided not to include ConnInfo?

I think I can use this to get the OID and enable binary decoding. Is that what you were getting at?

aight8 commented 3 years ago

Hmm, I don't get it. Status is either Undefined, Null, or Present. How do you want to get the OID out of that?


I just know that ConnInfo acts as a central type registry. It is mainly used when the destination type is not defined; for example, pgx's rows.Values() returns a []interface{} with the concrete types of the column values. I still need to check whether rows.Scan uses it (maybe in some situations). With the somewhat hidden pgxtype helper package, the common way to define types (typically in pgx's AfterConnect hook) is:

// pgx automatically creates a DataType (OID, name, transcoder type) via a DB lookup (also for composite types)
dataType, err := pgxtype.LoadDataType(ctx, conn, connInfo, typeName)
// then register it...
connInfo.RegisterDataType(dataType)
// now define the default Go destination type for that Postgres type
conn.ConnInfo().RegisterDefaultPgType(new(Person), "person")

There are no real debug functions for ConnInfo (except pretty-printing it). I have forked pgtype to get information about the internals of this registry (-> #31). If that could help you with debugging, let me know.

jschaf commented 2 years ago

The original issue of an array of composite types works. Going to close.

For example, Blocks is a composite type in this query: https://github.com/jschaf/pggen/blob/d158a4584001b069a4c684d58c052be608769f61/example/composite/query.sql.go#L19