Closed · evanfuller closed this 2 years ago
Hi. It's not actually a bug; it's more of an implementation limitation. The problem is that `sql.NullInt64` is actually a struct, and when you pass it to scany it is handled as the struct case (scany looks for fields that should correspond to columns). When you pass an `int64`, that's the primitive-type case for scany and it is handled differently (the single column value is just assigned). The workaround here is to wrap the `sql.NullInt64` in a struct:

```go
type Row struct {
	Sum sql.NullInt64
}
```

and pass that struct to scany instead.
See https://github.com/georgysavva/scany/issues/33 and https://github.com/georgysavva/scany/issues/63 for details.
This workaround only appears to work with sufficiently small numbers, I guess depending on how the database chooses to represent the integer. When I tried this workaround locally, it appeared to work. However, when I run it in my pre-production environment, I get the following error:
```
scany: scan row into struct fields: can't scan into dest[0]: converting driver.Value type string ("60640252e0") to a int64: invalid syntax
```
To confirm, I manually ran the SQL query in a DB shell and got the following:

```
notifications_stage=> select sum(my_column) from my_table where ...;
   sum
----------
 60640252
(1 row)
```
Is this a bug?
EDIT: actually, this may not have worked with small numbers either; it appears to only work when the sum is 0. Otherwise, the same error is emitted, e.g. `scany: scan row into struct fields: can't scan into dest[0]: converting driver.Value type string ("45e0") to a int64: invalid syntax`.
This error actually has nothing to do with scany: scany successfully passes the `sql.NullInt64` type to the underlying database library, and the database library itself fails to scan into it. From the error message, it seems the database library treats your data as a string. If you passed `sql.NullInt64` directly to that library, you should get the same error.

Got it; I am surprised that the pgx library is choosing to return the integer data as a string in this case, but the `::int` workaround appears to work. Thanks.

I am glad it works for you now with scany! I think you should file a bug with the pgx library in that case!
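For reference, the `::int` cast mentioned above amounts to something like the following (a hypothetical query; `my_column` and `my_table` are the placeholder names from this thread). The cast asks PostgreSQL to return a plain integer rather than a `numeric`, which the driver may otherwise deliver as text:

```sql
-- hypothetical; cast the result of sum() to an integer type
select sum(my_column)::int from my_table where ...;
```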
@evanfuller JFYI I've released a new version of scany that now includes this feature.
I just tried plugging in this library for my use case with pgx; for the most part, it works great. One bug I have run into is for a code snippet like this:

In this case, I would expect the `SUM()` aggregate to be parsed and then populate the `sql.NullInt64` value. However, this query produces the error: `scany: column: 'sum': no corresponding field found, or it's unexported in sql.NullInt64`. I noticed that changing the declaration to `var mySum int64` appeared to solve the problem.

I am guessing that the bug might be that the `sql.Null*` types do not populate correctly when aggregate functions are at play, so I would expect those to be populated appropriately rather than throwing an error.