kysely-org / kysely

A type-safe TypeScript SQL query builder
https://kysely.dev
MIT License

In a migration, how to refer to the type created by db.schema.createType #1269

Open jroitgrund opened 6 days ago

jroitgrund commented 6 days ago

When using a custom schema

import { PostgresDialect, WithSchemaPlugin } from 'kysely'
import { defineConfig } from 'kysely-ctl'
import { Pool } from 'pg'

export default defineConfig({
  dialect: new PostgresDialect({
    pool: new Pool({
      connectionString: process.env.DATABASE_URL,
    }),
  }),
  plugins: [new WithSchemaPlugin('custom-schema')],
  migrations: {
    migrationTableSchema: 'custom-schema',
    migrationFolder: 'src/db/migrations',
  },
})

the following migration fails:

  await db.schema
    .createType('enum_type')
    .asEnum([
      'value'
    ])
    .execute()
  await db.schema
    .createTable('projects')
    .addColumn('enum', sql`enum_type`))
    .execute()

I get

    error: type "enum_type" does not exist

(presumably because the type needs to be prefixed with the schema).

How can I fix this? I don't know how to access the schema name from the migration to manually prefix it.

Note that none of the sql helpers (.ref, .table, etc.) seem to fix this.
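
For example, a sketch of that kind of attempt with sql.ref (one of the helpers mentioned); it fails with the same "type does not exist" error, since the identifier still isn't schema-prefixed:

  // Sketch only: sql.ref quotes the identifier, but WithSchemaPlugin leaves it
  // without the schema prefix, so Postgres again can't find "enum_type".
  await db.schema
    .createTable('projects')
    .addColumn('enum', sql.ref('enum_type'))
    .execute()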

jroitgrund commented 6 days ago

It looks like (db as any).schema.executor.plugins[0].transformer.schema exposes the schema, but the private properties are only accessible in a debugger.

yrahcaz commented 5 days ago

Try wrapping the type in double quotes:

.addColumn('enum', sql`"enum_type"`)
jroitgrund commented 5 days ago

I get the same error with double quotes.

yrahcaz commented 5 days ago

You've also got an extra ) in your example; maybe that's it. I'm doing exactly what you're proposing with success.

jroitgrund commented 5 days ago

Sorry - I wrote but didn't run the minimal repro above, hence the syntax error.

I've now tried running the actual minimal repro as the only migration on a fresh DB:

import { PostgresDialect, WithSchemaPlugin } from 'kysely'
import { defineConfig } from 'kysely-ctl'
import { Pool } from 'pg'

export default defineConfig({
  dialect: new PostgresDialect({
    pool: new Pool({
      connectionString: process.env.DATABASE_URL,
    }),
  }),
  plugins: [new WithSchemaPlugin('other')],
  migrations: {
    migrationTableSchema: 'other',
    migrationFolder: 'src/db/migrations',
  },
})

import { type Kysely, sql } from 'kysely'

export async function up(db: Kysely<any>): Promise<void> {
  await db.schema.createType('enum_type').asEnum(['value']).execute()
  await db.schema
    .createTable('projects')
    .addColumn('enum', sql`enum_type`)
    .execute()
}

export async function down(db: Kysely<any>): Promise<void> {}

(this is the only migration in src/db/migrations and the DB is brand new)

$ DATABASE_URL=postgresql://postgres:postgres@localhost:5432/db pnpm kysely migrate:latest
◐ Starting migration to latest
✖ Migration failed with error: type "enum_type" does not exist @ "1732135764590_project"

This happens both with and without double quotes.

If I remove the createTable statement, the type is correctly created in the target schema:

db=# SET search_path TO other;
SET
db=# \dT
        List of data types
 Schema |   Name    | Description
--------+-----------+-------------
 other  | enum_type |
(1 row)
"kysely": "^0.27.4",
jroitgrund commented 5 days ago

Here's a self-contained repro:

https://github.com/jroitgrund/kysely-repo

clone and run

$ docker compose up -d
$ pnpm kysely migrate:latest

BTW, with the following diff

diff --git a/migrations/1732136775207_migration.ts b/migrations/1732136775207_migration.ts
index 19fe917..3d863d5 100644
--- a/migrations/1732136775207_migration.ts
+++ b/migrations/1732136775207_migration.ts
@@ -4,7 +4,7 @@ export async function up(db: Kysely<any>): Promise<void> {
   await db.schema.createType("enum_type").asEnum(["value"]).execute();
   await db.schema
     .createTable("projects")
-    .addColumn("enum", sql`"enum_type"`)
+    .addColumn("enum", sql`other.enum_type`)
     .execute();
 }

the migration completes, which is what's leading me to believe that kysely isn't prepending the schema name to the custom type in the createTable statement.

koskimas commented 4 days ago

Kysely doesn't parse raw SQL and therefore has no idea what you wrote inside your raw SQL snippet. For that reason, it can't prefix it with the schema.

We already have the sql.id function that can be used to create raw identifiers in a way Kysely knows it's an identifier (table name, column name, type name, etc.). But we don't prefix those either, unfortunately.
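
For reference, a sketch of that route with the migration from this thread; sql.id quotes the identifier, but, as noted, it still isn't prefixed with the schema:

  await db.schema
    .createTable('projects')
    // Emits "enum_type", not "other"."enum_type": the plugin doesn't touch
    // identifiers created with sql.id either.
    .addColumn('enum', sql.id('enum_type'))
    .execute()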

We should add something like sql.schemableId to be able to handle these cases. It's definitely not a clean solution.

Alternatively we could allow ANY string in the second argument of addColumn as I think @igalklebanov suggested somewhere. Then we could treat any type as a "schemable id" automatically.

jroitgrund commented 4 days ago

Thanks for taking a look! Do you have a suggested workaround in the meantime? I tried setting the schema as a field on the DB instance before migrating so I can manually prepend it to the type name, but since the DB object passed to the migrator is actually a transaction, this doesn't work.

> Then we could treat any type as a "schemable id" automatically.

My hunch is that schemableId would work better than implicitly treating anything 'unrecognized' as schemable, because some of the types which currently require raw SQL aren't "schemable" (e.g. types added by extensions, array types).

For example,

.addColumn('list', sql.raw(`${schema}.integer[]`), (col) => col.notNull())

✖ Migration failed with error: type "other.integer[]" does not exist @ "1731672569480_project"                                                                  
koskimas commented 4 days ago

Yeah there's a way. I can write a https://kyse.link for you in a bit.

koskimas commented 4 days ago

Actually I couldn't figure out a way to do that currently 😞

I think the only way is to somehow pass in the schema using an environment variable or a global variable and glue it to the type manually 😬
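
A minimal sketch of that environment-variable route (MIGRATION_SCHEMA is an assumed variable name, not something kysely or kysely-ctl defines; it has to match the schema given to WithSchemaPlugin):

import { type Kysely, sql } from 'kysely'

// Assumed convention: the schema name is exported before running migrations,
// e.g. MIGRATION_SCHEMA=other pnpm kysely migrate:latest
const schema = process.env.MIGRATION_SCHEMA ?? 'other'

export async function up(db: Kysely<any>): Promise<void> {
  await db.schema.createType('enum_type').asEnum(['value']).execute()
  await db.schema
    .createTable('projects')
    // Glue the schema onto the type manually, mirroring the working diff above.
    .addColumn('enum', sql.raw(`"${schema}"."enum_type"`))
    .execute()
}

export async function down(db: Kysely<any>): Promise<void> {}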

jroitgrund commented 4 days ago

Unfortunately, my use case is one schema per test, with tests running concurrently. I'll keep an eye on the issue; thanks for the quick replies :)