Closed azaviyalov closed 2 years ago
Hi. Yes, it should work with scany. Could you please provide the code where you define your struct, how you pass it to scan and the SQL query?
Yes, sure
```go
package main

import (
	"context"
	"fmt"

	"github.com/georgysavva/scany/pgxscan"
	"github.com/jackc/pgx/v4/pgxpool"
)

type User struct {
	// no columns defined here
}

/*
create table users
(
    id serial constraint users_pk primary key,
    name varchar not null
);

create unique index users_id_uindex on users (id);

insert into users (id, name)
values (1, 'john'),
       (2, 'jacob'),
       (3, 'joshua');
*/

// Scan never seems to be called; if it were,
// scany should not report any errors.
func (m *User) Scan(data interface{}) error {
	return nil
}

func main() {
	ctx := context.Background()
	db, err := pgxpool.Connect(ctx, "postgres://postgres:mysecretpassword@localhost:5432/postgres")
	if err != nil {
		panic(err)
	}
	var users []*User
	err = pgxscan.Select(ctx, db, &users, `SELECT id, name FROM users`)
	if err != nil {
		panic(err)
	}
	fmt.Println(users)
}
```
The output is: `panic: scany: column: 'id': no corresponding field found, or it's unexported in main.User`
Scany supports custom types that implement the sql.Scanner interface (see this for more details on how sql.Scanner works with database libraries).
Here is the corresponding scany docs section: https://pkg.go.dev/github.com/georgysavva/scany@v0.2.9/dbscan#hdr-NULLs_and_custom_types.
Generally, they should be used as fields, not as the top-level model, because scany just passes them to the underlying database library for the corresponding field, and the database library handles the custom .Scan() call.
For example:
```go
package main

import (
	"context"
	"fmt"

	"github.com/georgysavva/scany/pgxscan"
	"github.com/jackc/pgx/v4/pgxpool"
)

type User struct {
	ID   string   `db:"id"`
	Name MyString `db:"name"`
}

type MyString struct {
	text string
}

func (ms *MyString) Scan(data interface{}) error {
	// Fill ms.text and do other stuff here.
	return nil
}

func main() {
	ctx := context.Background()
	db, err := pgxpool.Connect(ctx, "postgres://postgres:mysecretpassword@localhost:5432/postgres")
	if err != nil {
		panic(err)
	}
	var users []*User
	err = pgxscan.Select(ctx, db, &users, `SELECT id, name FROM users`)
	if err != nil {
		panic(err)
	}
	fmt.Println(users)
}
```
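To make the "database library handles the custom .Scan() call" point concrete, here is a minimal standalone sketch that simulates what a driver does for the `name` column, with no database required. The type switch shown is an assumption about typical driver behavior (drivers commonly hand over strings, byte slices, or nil), not scany's or pgx's actual code:

```go
package main

import (
	"database/sql"
	"fmt"
)

// MyString mirrors the field type from the example above.
type MyString struct {
	text string
}

// Scan implements sql.Scanner. Handling string, []byte, and nil
// is a common pattern for driver-supplied values.
func (ms *MyString) Scan(data interface{}) error {
	switch v := data.(type) {
	case string:
		ms.text = v
	case []byte:
		ms.text = string(v)
	case nil:
		ms.text = ""
	default:
		return fmt.Errorf("MyString: unsupported type %T", data)
	}
	return nil
}

// Compile-time check that *MyString satisfies sql.Scanner.
var _ sql.Scanner = (*MyString)(nil)

func main() {
	var ms MyString
	// This is roughly what the database library does per column:
	// it calls Scan with the raw driver value.
	if err := ms.Scan([]byte("john")); err != nil {
		panic(err)
	}
	fmt.Println(ms.text) // prints "john"
}
```

Note that the compile-time assertion uses a pointer receiver: only *MyString implements sql.Scanner, which is why scany needs addressable struct fields.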
}
However, it can be used as you intended, but only in the single-column case, because calling the custom .Scan() method only makes sense for individual columns:
```go
package main

import (
	"context"
	"fmt"

	"github.com/georgysavva/scany/pgxscan"
	"github.com/jackc/pgx/v4/pgxpool"
)

type MyString struct {
	text string
}

func (ms *MyString) Scan(data interface{}) error {
	// Fill ms.text and do other stuff here.
	return nil
}

func main() {
	ctx := context.Background()
	db, err := pgxpool.Connect(ctx, "postgres://postgres:mysecretpassword@localhost:5432/postgres")
	if err != nil {
		panic(err)
	}
	var strings []MyString
	err = pgxscan.Select(ctx, db, &strings, `SELECT name FROM users`)
	if err != nil {
		panic(err)
	}
	fmt.Println(strings)
}
```
Note that the latter example won't work with the latest released scany version; that support is only in the master branch and hasn't been released yet.
Thank you, the problem is solved!
I'm moving from sqlx to scany because I'm using pgx as a postgres driver. I have a custom implementation of sql.Scanner for my model, but it doesn't seem to work with scany. When selecting rows, I get errors like:
`scany: column: 'body': no corresponding field found, or it's unexported in model.MyModel`
because MyModel does not really have a body column, but a custom Scan instead. Is it possible to add a check for whether models implement some Scanner interface?
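For reference, such a check is straightforward with reflection. A minimal sketch follows; MyModel and the helper implementsScanner are hypothetical stand-ins for illustration, not scany's actual implementation:

```go
package main

import (
	"database/sql"
	"fmt"
	"reflect"
)

// MyModel has a custom Scan with a pointer receiver,
// so *MyModel implements sql.Scanner.
type MyModel struct{ body string }

func (m *MyModel) Scan(data interface{}) error { return nil }

// Plain is an ordinary struct without a Scan method.
type Plain struct{ ID int }

var scannerType = reflect.TypeOf((*sql.Scanner)(nil)).Elem()

// implementsScanner reports whether v, or a pointer to it,
// satisfies the sql.Scanner interface.
func implementsScanner(v interface{}) bool {
	t := reflect.TypeOf(v)
	return t.Implements(scannerType) || reflect.PtrTo(t).Implements(scannerType)
}

func main() {
	fmt.Println(implementsScanner(MyModel{})) // true: *MyModel has Scan
	fmt.Println(implementsScanner(Plain{}))   // false
}
```

The pointer check matters because Scan is almost always declared on a pointer receiver, so the value type alone does not satisfy the interface.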