Closed: wesleytodd closed this issue 6 years ago
The mysql package can do that because it doesn't use real parameterized queries. pg, on the other hand, passes the parameters on to PostgreSQL. Instead, you can use a query builder like Knex, a formatter like pg-format that does roughly what mysql does, column arrays, an array of jsonb rows…
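For example, a minimal sketch of the pg-format route (the table and column names here are made up, and the generated SQL shown in the comment is approximate):

const format = require('pg-format')

// Rows to bulk-insert; "items", "name" and "quantity" are made-up names.
const rows = [
  ['widget', 2],
  ['gadget', 5]
]

// %I escapes an identifier and %L escapes literals; pg-format expands a
// nested array into a grouped list, yielding one multi-row INSERT.
const sql = format('INSERT INTO %I (name, quantity) VALUES %L', 'items', rows)
// produces something like:
// INSERT INTO items (name, quantity) VALUES ('widget', 2), ('gadget', 5)

// The resulting string can then be passed to pool.query(sql) or client.query(sql).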
If I understand your answer and the code I dug through, you are creating prepared statements for any parameterized query. Is this documented somewhere? That seems like an important thing for people to understand.
That’s what parameterized queries are.
Lol, I see that :)
As a user of the library I thought it was important to know that, and it is different from other libraries I have used. I was hoping to find a reference to this in the docs to make it clear how parameterized queries work. Probably in this section:
If you are passing parameters to your queries you will want to avoid string concatenating parameters into the query text directly. This can (and often does) lead to SQL injection vulnerabilities. node-postgres supports parameterized queries, passing your query text unaltered as well as your parameters to the PostgreSQL server where the parameters are safely substituted into the query with battle-tested parameter substitution code within the server itself.
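For concreteness, the parameterized style looks roughly like this (a sketch, assuming a pg Pool named pool and a made-up users table):

const email = 'alice@example.com'  // e.g. untrusted user input

// Unsafe: concatenating the value into the query text invites SQL injection.
// const result = await pool.query("SELECT * FROM users WHERE email = '" + email + "'")

// Parameterized: the query text and the values travel to the server separately,
// and the server substitutes $1 itself.
const result = await pool.query('SELECT * FROM users WHERE email = $1', [email])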
There is a performance trade-off when the library uses prepared statements under the hood that is not covered here. Would you accept a PR to add "Under the hood, parameterized queries are implemented as prepared statements. This can have performance impacts you should be aware of when using them."?
I think that note would be vague and misleading, and that the text you quoted already describes how it works.
Do you have a better suggestion? For me it was unexpected to find out that this was not being done in the client and instead was using prepared statements. Maybe changing the sentence to read:
passing your query text unaltered as well as your parameters (via prepared statements) to the PostgreSQL server where the parameters are safely substituted into the query
That change alone would have tipped me off enough to not have it be a surprise.
The original:
passing your query text unaltered as well as your parameters to the PostgreSQL server
already means it’s not being done in the client, no?
It did not mean that to me when I first read it. Maybe other people read more closely than I do, but in either case, calling out the specific technique being used would be helpful to users. At least it would have been helpful to me, because then I wouldn't have thought the feature mentioned in my OP was possible here.
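For readers who want explicit control over this, pg also exposes named prepared statements directly. A minimal sketch, assuming a pg Pool named pool and a made-up users table:

// Supplying a name asks the server to parse and plan the statement once per
// connection and reuse it on later calls that pass the same name.
const result = await pool.query({
  name: 'fetch-user-by-id',                    // hypothetical statement name
  text: 'SELECT * FROM users WHERE id = $1',
  values: [42]
})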
@wesleytodd The best solution for multi-row inserts.
Another option: https://stackoverflow.com/a/37445088/3163120
const obj = [
  { operation: "U", taxCode: 1000 },
  { operation: "U", taxCode: 10001 }
]

// json_to_recordset() expands the JSON array into rows, so the whole
// multi-row insert needs only a single parameter. "table" is a placeholder
// name here (quoted because it is a reserved word); `pool` is assumed to
// be a pg Pool instance.
await pool.query(`
  INSERT INTO "table" (operation, taxCode)
  SELECT *
  FROM json_to_recordset($1)
  AS x("operation" text, "taxCode" int);
`,
  [JSON.stringify(obj)]
)
Result
----------------------------
| id | operation | taxCode |
----------------------------
|  1 | "U"       |    1000 |
|  2 | "U"       |   10001 |
----------------------------
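The column-array option mentioned earlier in the thread keeps everything inside ordinary parameterization as well. A minimal sketch using unnest(), assuming a pg Pool named pool and the same placeholder table:

// One array parameter per column; unnest() zips the arrays back into rows,
// so the bulk insert is still a plain parameterized query.
await pool.query(`
  INSERT INTO "table" (operation, taxCode)
  SELECT * FROM unnest($1::text[], $2::int[])
`, [
  ['U', 'U'],       // operation values
  [1000, 10001]     // taxCode values
])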
Would you be interested in building support for multi-row inserts into the parameterization semantics?
The mysql package has something like the nested-array bulk insert sketched below, which results in a single multi-row query being sent to the server:
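A rough sketch of that mysql API, with made-up table and column names (the nested-array expansion is what turns one placeholder into multiple row tuples):

const mysql = require('mysql')
const connection = mysql.createConnection({ /* connection options */ })

// "items", "name" and "quantity" are made-up names for illustration.
const rows = [
  ['widget', 2],
  ['gadget', 5]
]

// A nested array bound to a single ? placeholder is expanded into grouped row tuples.
connection.query('INSERT INTO items (name, quantity) VALUES ?', [rows], (err, result) => {
  // the query sent to the server looks roughly like:
  // INSERT INTO items (name, quantity) VALUES ('widget', 2), ('gadget', 5)
})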
Would you be interested in adding this feature to pg? If so I can take a look at what it would take. Probably it cannot be the same API, but some version would be nice IMO.

Other related issues: #530 #1644