aidenwallis opened this issue 2 years ago
Actually, all `.batch()` calls to D1 are transparently wrapped in transactions, if that helps.
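For example, a single `batch()` call like this runs both statements atomically (the `account` table and bound values are hypothetical, just to illustrate the shape of the API):

```ts
// Both statements run in one implicit transaction; if either fails, both roll back.
const [debit, credit] = await env.DB.batch([
  env.DB.prepare("UPDATE account SET balance = balance - ?1 WHERE id = ?2").bind(10, 1),
  env.DB.prepare("UPDATE account SET balance = balance + ?1 WHERE id = ?2").bind(10, 2),
]);
```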
Yeah, I've considered that, but I'm unsure how the correct behaviour could be implemented within the context of Kysely's API right now. If you have any ideas, feel free to create a PR and we can go from there!
@geelen Looking into this a little further, as it currently stands I don't think it's possible to implement transactions in kysely-d1 due to the `.batch()` API for the Workers runtime.
When Kysely opens a transaction, it expects each subsequent database call during that transaction to return a response. Because D1's batch API makes a single roundtrip in order to get an atomic transaction, there's no way to support this "mid-transaction response" behaviour from Kysely. We would have to either return no results (which isn't how most Kysely dialects work), or hold all promises until the transaction commits (which would make for a really weird developer experience).
Unless D1 gains a way to support transactions that aren't limited to a single roundtrip, I don't see a decent way to add transaction support to this library.
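For illustration, this is roughly the shape Kysely's transaction API expects with dialects that do support it (assuming a Kysely instance `db` over a hypothetical `user`/`profile` schema): a later statement can depend on the result of an earlier one, which a single-roundtrip `batch()` cannot provide.

```ts
await db.transaction().execute(async (trx) => {
  // Kysely resolves this insert's result mid-transaction...
  const user = await trx
    .insertInto("user")
    .values({ name: "alice" })
    .returningAll()
    .executeTakeFirstOrThrow();

  // ...so the next statement can use it. D1's batch() sends everything in one
  // roundtrip, so no response is available at this point.
  await trx
    .insertInto("profile")
    .values({ user_id: user.id })
    .execute();
});
```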
Yeah, sounds a bit awkward. Thanks for digging into this.
We do have an alternative API coming up for JS-based transactions: https://blog.cloudflare.com/whats-new-with-d1/#sneak-peek-how-we-re-bringing-javascript-transactions-to-d1
Basically, we don't want to support arbitrary JS transactions from any Worker because that locks the primary DB for all users, so we're testing this mechanism for invoking a "procedure", which is effectively an RPC call into your same Worker but running right next to the primary. It's a little tricky to get your head around (but the blog post does a decent job of explaining it, I think?). I'll let you know once we have it open for testing; maybe that can work better.
I was looking at transaction support in D1 and found this. Keen to know if you have any updates on that front, @geelen?
Not a solution to this, but as a workaround in the meantime I got type-checking working with Kysely and `D1#batch` like this:
```ts
import { type InferResult, type Compilable } from "kysely";

/**
 * Make a `batch` function for your database that you can pass Kysely statements to.
 *
 * @param d1 Your D1 database
 * @returns A `batch` function that you can pass Kysely statements to, which will run them in a transaction.
 */
export const batchFactory =
  (d1: D1Database) =>
  /**
   * Batch-call a D1 database with Kysely statements, which ensures the statements run in a transaction.
   *
   * @param queries The un-compiled Kysely statements to run in a transaction
   * @returns A 2-dimensional array of the results of each statement, in the supplied order.
   */
  async <
    // Use a `const` type parameter to narrow this to a tuple when one is supplied
    const Q extends Compilable<any>[],
  >(
    ...queries: Q
  ) => {
    const statements = queries
      .map((query) => query.compile())
      .map(({ sql, parameters }) => d1.prepare(sql).bind(...parameters));
    // This type signature is compatible with D1, but it'll narrow to the length and types of the tuple
    return (await d1.batch(statements)).map(({ results }) => results) as {
      [I in keyof Q]: InferResult<Q[I]>;
    };
  };
```
This uses `const` type parameters and mapped tuples to narrow the `D1#batch` types a bit further than they currently do :)
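For reference, usage of `batchFactory` in a Worker might look something like this (the `Database` schema, tables, and bindings here are hypothetical):

```ts
import { Kysely } from "kysely";
import { D1Dialect } from "kysely-d1";

// Hypothetical schema, purely for illustration.
interface Database {
  person: { id: number; name: string };
  pet: { id: number; owner_id: number; name: string };
}

export default {
  async fetch(_request: Request, env: { DB: D1Database }) {
    const db = new Kysely<Database>({ dialect: new D1Dialect({ database: env.DB }) });
    const batch = batchFactory(env.DB);

    // Both statements go out in a single D1 batch (and therefore one transaction);
    // the result narrows to a tuple: [person rows, pet rows].
    const [people, pets] = await batch(
      db.selectFrom("person").selectAll(),
      db.selectFrom("pet").select(["name", "owner_id"]),
    );

    return Response.json({ people, pets });
  },
};
```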
Transactions are not currently supported in D1. Once they are, this package needs support added for them.