arackaf opened 1 year ago
The issue currently is needing support for this on the server. I plan on adding some multi statement API but haven't gotten to it yet. It'd be more like a pipeline where the server would just execute them sequentially to avoid round trips.
Oof - look forward to seeing that! Thanks!
Can I ask what the status of this is? @dgraham marked it closed as completed, but I can't for the life of me see anything in the docs about how to do this. @mattrobenolt mentioned this issue elsewhere, but never with an indication that it was done.
Correct, I'm not sure why this is closed either, I'm guessing in error.
I assume once we get the server support, it'll be easy to add in here.
Gotcha - just glad I'm not crazy. I don't have access to re-open, so feel free to, if you don't have this work tracked elsewhere.
Also just curious how high of a priority this is these days. Seems like it'd be a huge perf win, letting you do batch operations (i.e., insert ids into a temp table, then run two queries against it) without server roundtrips.
It's right behind "getting the API public".
Beyond that, I haven't fully decided on what the API looks like for failures. I was going to research a bit on what other pipelines do and what the native protocol does in multi statement mode.
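For reference, direct MySQL connections via mysql2 expose the protocol's native multi-statement mode behind an opt-in flag (it is disabled by default because it widens the SQL-injection surface). A minimal sketch of what that looks like — the `toBatch` helper and the table/column names are my own illustration, not part of any planned PlanetScale API:

```ts
// Join individual statements into one multi-statement string for the
// native protocol's multi-statement mode.
function toBatch(statements: string[]): string {
  return statements.map((s) => s.replace(/;\s*$/, "")).join("; ") + ";";
}

// Usage sketch against a direct mysql2 connection (requires
// `multipleStatements: true` in the connection config; both statements
// then travel in a single round trip, and the result is one result set
// per statement):
//
// const connection = await mysql.createConnection({
//   uri: process.env.DATABASE_URL,
//   multipleStatements: true,
// });
// const [results] = await connection.query(
//   toBatch([
//     "INSERT INTO users (`name`) VALUES ('example')",
//     "SELECT LAST_INSERT_ID()",
//   ])
// );
```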
Yeah I guess that can be tricky. If I send a batch of 5 queries, the first 4 succeed, the 5th throws an error ....
Or you send 5 and the third fails, but the second statement was a BEGIN, and now there's an aborted transaction. We shall figure it out though. I agree that it's an important feature.
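To make that failure mode concrete, here is a hypothetical batch where statement 3 fails after statement 2 has opened a transaction (table and values are made up):

```sql
SELECT 1;                            -- 1: succeeds
BEGIN;                               -- 2: opens a transaction
INSERT INTO t (id) VALUES (1), (1);  -- 3: fails (duplicate key)
UPDATE t SET x = 1;                  -- 4: run it, skip it, or error?
COMMIT;                              -- 5: commit what, exactly?
```

The open design question is whether the server stops at statement 3, rolls the transaction back, or keeps executing the remainder.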
fwiw I've had an internal issue open for the server side implementation for this feature. So it's definitely on my mind.
Not 100% related to running multiple different queries simultaneously, but in case anyone else has to manually migrate data into Planetscale, or just has any other use case where they want to insert multiple records at the same time, here's a non-edge solution using `mysql2`:
utils/database.ts

```ts
import * as mysql from "mysql2/promise";

export const createDirectConnection = async (): Promise<
  mysql.Connection & { close: () => Promise<void> }
> => {
  // DATABASE_URL is assumed to be a standard mysql:// connection string.
  const connection = await mysql.createConnection(process.env.DATABASE_URL!);
  return Object.assign(connection, {
    close: connection.end,
  });
};
```
index.ts

```ts
import { createDirectConnection } from "./utils/database";

const database = await createDirectConnection();

const inserts = [
  [1, "your", "values", "here"],
  [2, "your", "values", "here"],
];

// A single `?` placeholder receives the full list of rows.
await database.query(
  "INSERT INTO `table` (`id`, `column1`, `column2`, `column3`) VALUES ?",
  [inserts]
);
```
I've tried the above solution with batches of 1000 records each, and it works flawlessly. Keep in mind, though, that larger batches take considerably longer to run.
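Splitting a large migration into fixed-size batches keeps each INSERT bounded. A minimal sketch of a chunking helper to pair with the snippet above — the `chunk` name and the batch size of 1000 are my own choices, not anything from mysql2:

```ts
// Split rows into fixed-size batches so each INSERT stays a manageable size.
function chunk<T>(rows: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Usage sketch (connection as in the snippet above):
// for (const batch of chunk(inserts, 1000)) {
//   await database.query(
//     "INSERT INTO `table` (`id`, `column1`, `column2`, `column3`) VALUES ?",
//     [batch]
//   );
// }
```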
Also, inside the `database.query` call, `inserts` is wrapped in another pair of `[]`. This is very important for inserting multiple records simultaneously with `mysql2`! (Meaning, the second parameter is actually `[ [ [ 1, ... ], [ 2, ... ] ] ]`.)
I'm debating hosting my own proxy on Cloud Run that handles batch inserts exclusively, as this is a crucial feature for any database.
In my project, I have to insert data into multiple tables simultaneously. This goes up to 25 records, taking up to 1500ms per transaction, which is anything but acceptable for an API.
Reopening because this isn't implemented and I still would like to see this through.
The server-side API to support this still needs to ship before it can be implemented here, but the client implementation should be fairly simple.
I would like to see this as well.
Same here, I'd like to be able to batch inserts and updates.
Apologies if this is already supported, but would it be possible to get the ability to run multiple statements in one request? Something like
Obviously you can achieve that with separate queries, but that would require blocking until the insert is done, and then blocking again while you grab the inserted id.
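The kind of batch being asked for here is presumably along these lines (hypothetical table and column names) — an insert followed immediately by reading the generated id, sent as one request:

```sql
INSERT INTO users (`name`) VALUES ('example');
SELECT LAST_INSERT_ID();
```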