I know that with regular JSON arrays, stringifying them beforehand works, but I don't think that'd hold for arrays of JSON. Does #598 offer any leads?
Unfortunately, neither stringifying the items in the array nor stringifying the whole array works. I looked at #598 and it offered some clues, but I did not look too deeply into the documentation for that "toPostgres" function.
Is that "toPostgres" function something I add to the item
object in my example and it will get run in the save function?
Yes in the abstract, but it doesn't work for this case. You need to turn the pgFormatting driver option on in order to bypass pg-promise's parameter processing, which also means you can't use named parameters. If you don't care about that, you can pass it in the driverConfig when you initialize Massive; otherwise, enable it and then disable it once the insert has succeeded or failed.
Tagging @vitaly-t in case you're interested :wink:
it('inserts arrays of json/jsonb', function* () {
  db.instance.$config.options.pgFormatting = true;

  return db.normal_pk.insert({
    field1: 'nu',
    array_of_json: [{val: 'one'}, {val: 'two', nested: {val: 'three'}}]
  }, {}).then(res => {
    assert.equal(res.field1, 'nu');
    assert.deepEqual(res.array_of_json, [
      {val: 'one'},
      {val: 'two', nested: {val: 'three'}}
    ]);
  });
});
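For the init-time route, a rough sketch (assuming Massive's massive(connectionInfo, loaderConfig, driverConfig) signature; check it against your version):

const massive = require('massive');

// pgFormatting is a pg-promise initialization option; setting it here applies to
// every query for the life of the connection, so named parameters stay unavailable.
massive(process.env.DB_CONNECTION, {}, {pgFormatting: true}).then(db => {
  // run inserts as in the test above
});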
Yes in the abstract, but it doesn't work for this case.
Are you sure? Custom Type Formatting can custom-format any value; it can handle any possible case.
If you can provide an example of what kind of format is expected, I can show how it can be done.
Use of the pgFormatting option is an extreme measure; I don't see how it is going to help, and I would advise against it anyway. I have been tempted to remove that option completely in future versions of the driver, as obsolete.
From what I can see, pg-promise escapes everything correctly by default. And if you add a ::jsonb[] type cast, the server will understand the type correctly, and it will work:
const obj = [{a: 'something'}, {b: 'hello'}];
db.none('insert into things(data) values($1::jsonb[])', [obj])
//=> insert into things(data) values(array['{"a":"something"}','{"b":"hello"}']::jsonb[])
And if you want it to happen automatically, based on the formatting parameter, you can use Custom Type Formatting:
const arrJSONB = a => ({toPostgres: () => pgp.as.format('$1::jsonb[]', [a]), rawType: true});
So instead of passing in obj, you would pass in arrJSONB(obj) as the value, and then no type casting is needed inside the SQL template:
db.none('insert into things(data) values($1)', [arrJSONB(obj)])
//=> insert into things(data) values(array['{"a":"something"}','{"b":"hello"}']::jsonb[])
Thanks @dmfay and @vitaly-t.
I like the approach you outlined, @vitaly-t; however, when I try to apply it to the insert function, it doesn't seem like the toPostgres function is being called. Here is my setup:
const item = {
  data: [{
    a: "something"
  }, {
    b: "hello"
  }],
  toPostgres: (p) => db.pgp.as.format('$1::jsonb[]', [p.data]),
  rawtype: true
};

db.items.save(item);
I have also tried with db.items.insert(item). I am following the example in #598. No matter what I put into toPostgres, it doesn't seem to fire, so maybe I am putting it in the wrong place? What do you think?
@nkramaric what is db.pgp? :smile: The method is pgp.as.format.
I suggest that you use pg-monitor in your project, so you can see what exactly is being executed :wink: Then we can nail the issue :wink:
db is the instance of the Massive.js connection:
massive(process.env.DB_CONNECTION).then(db => {
});
As per my previous post, if we can't see what's being executed, then it's all speculation. You should integrate pg-monitor, then we can get to the bottom of it fast :wink:
Makes sense! Below is the entire example. I will see if I have time to create a mini repo so you can see what is happening. (The tables are the same as described in the original post).
const massive = require('massive');
const monitor = require('pg-monitor');
const debug = require('debug')('db');

const m = massive(process.env.DB_CONNECTION).then(db => {
  debug('connected to database.');

  if (process.env.MONITOR_SQL) {
    monitor.attach(db.driverConfig);
  }

  const item = {
    data: [{
      a: "something"
    }, {
      b: "hello"
    }],
    toPostgres: (p) => {
      return db.pgp.as.format('$1::jsonb[]', [p.data]);
    },
    rawtype: true
  };

  db.things.save(item);

  return db;
});
/*
monitor output:
INSERT INTO "things" ("data") VALUES (array['{ "a": "something"}','{ "b": "hello"}']) RETURNING *
10:06:42 error: column "data" is of type jsonb[] but expression is of type text[]
*/
I put a breakpoint in the toPostgres function and I can see that it is never firing.
This looks wrong, i.e. the input doesn't correspond to the output, because rawType is misspelled as rawtype. Please make sure you got this one right :wink:
I tried with the rawType correction. Still getting the following response:
INSERT INTO "things" ("data") VALUES (array['{"a":"something"}','{"b":"hello"}']) RETURNING *
12:40:26 error: column "data" is of type jsonb[] but expression is of type text[]
query: INSERT INTO "things" ("data") VALUES (array['{"a":"something"}','{"b":"hello"}']) RETURNING *
It just doesn't seem to run the toPostgres function.
When I do db.query it works. However, when I tried db.none like you mentioned above, it did not work, as none was undefined. From what I have gathered, Massive.js uses pg-promise to craft and execute the queries; is the toPostgres function not exposed in the insert and save methods?
You would want db.instance.none; that's a pg-promise method, not a Massive method. I tried with rawType and toPostgres yesterday and it didn't seem to be running the custom formatter, but I haven't had time to fully dig into it yet.
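For reference, the direct pg-promise call through Massive would look something like this (a sketch reusing the obj example from earlier in the thread):

const obj = [{a: 'something'}, {b: 'hello'}];

// db.instance is the underlying pg-promise Database object, so the earlier
// ::jsonb[] cast example runs through it unchanged.
db.instance.none('insert into things(data) values($1::jsonb[])', [obj]);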
@nkramaric your use of toPostgres is slightly off in your example. Since each field's value in the map becomes a parameter in a prepared statement, you have to define the value of the JSON array field itself as an object implementing toPostgres. rawType is necessary, and you can make the formatter reusable by defining a data attribute like so:
it.only('inserts arrays of json/jsonb', function* () {
  return db.normal_pk.insert({
    field1: 'nu',
    array_of_json: {
      data: [{val: 'one'}, {val: 'two', nested: {val: 'three'}}],
      rawType: true,
      toPostgres: (p) => {
        return db.pgp.as.format('$1::jsonb[]', [p.data]);
      }
    }
  }).then(res => {
    assert.equal(res.field1, 'nu');
    assert.deepEqual(res.array_of_json, [
      {val: 'one'},
      {val: 'two', nested: {val: 'three'}}
    ]);
  });
});
In this case you can also replace this line:
return db.pgp.as.format('$1::jsonb[]', [p.data]);
with this:
return db.pgp.as.array(p.data) + '::jsonb[]';
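Putting that together with the field-level placement from the test above, a reusable wrapper could look something like this (just a sketch; the jsonbArray name is made up):

// Wraps a plain JS array so Massive/pg-promise formats it as a jsonb[] literal.
const jsonbArray = data => ({
  data,
  rawType: true,
  toPostgres: p => db.pgp.as.array(p.data) + '::jsonb[]'
});

db.things.insert({data: jsonbArray([{a: 'something'}, {b: 'hello'}])});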
Awesome! This works great. Thanks for all the help.
Summary
When I attempt to save the object [{a: "something"}] to a column of type jsonb[], the expression is interpreted as type text[].
I appreciate that this is a complex scenario, but I am wondering if there is a way to specify that the object I am passing to this column is of type jsonb[].
The specific error I get is:
error: column "data" is of type jsonb[] but expression is of type text[]
Example
Database setup
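A minimal setup consistent with the error below (table and column names assumed from the rest of the thread):

// Hypothetical table: "data" is declared as jsonb[].
db.query('create table things (id serial primary key, data jsonb[])');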
Code demonstrating the behavior
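A minimal reproduction along the lines described above (again, names assumed from the thread):

// Passing a plain JS array of objects through save(); it is formatted as a
// text[] literal, which the jsonb[] column rejects.
db.things.save({data: [{a: 'something'}]})
  .catch(err => console.error(err.message));
// => column "data" is of type jsonb[] but expression is of type text[]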
Expected behavior
The value is saved to the column in the database.
Actual behavior
Error