sebastianwessel / surrealdb-client-generator

Tool which generates a TypeScript client and Zod schemas for a given SurrealDB database
MIT License

SDK string conversion issues #50

Open · hammo92 opened this issue 4 months ago

hammo92 commented 4 months ago

The JS SDK adds the `s` prefix to every value provided as a string; this prevents Surreal from inferring types and forces a difference between a table's input and output schemas. This causes errors with our schema definitions, because we expect to be able to pass a date string to datetime fields, but doing so throws an error:

await db.query(`DEFINE TABLE test SCHEMAFULL; DEFINE FIELD time ON test TYPE datetime; DEFINE FIELD relation ON test TYPE record<test>;`)

// fails: the ISO string is sent as a plain string, so the datetime field rejects it
await db.create("test", {
    relation: new RecordId("test", "123"),
    time: new Date().toISOString()
})
// succeeds: a Date instance is encoded as a SurrealDB datetime
await db.create("test", {
    relation: new RecordId("test", "123"),
    time: new Date()
})

But our generated schema for this table would be:

 z.object({
    relation: recordId('test'),
    time: z.string().datetime()
})

Barring a change to the SDK, I think the best approach is to create a custom Zod type for dates, similar to the recordId type, which applies a transform that converts the value to a Date; this would let us pass either a string or a Date object for datetime fields.
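
A minimal sketch of what such a custom type could look like (the name `datetimeType` is just a placeholder, and this is not wired into the generator yet):

import { z } from "zod"

// Accepts an ISO string or a Date and always hands a Date instance to the SDK.
const datetimeType = () =>
  z.union([z.string().datetime(), z.date()])
    .transform((value) => (value instanceof Date ? value : new Date(value)))

// Usage in a generated input schema, alongside the existing recordId() helper:
// const testInputSchema = z.object({
//   relation: recordId("test"),
//   time: datetimeType(),
// })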

sebastianwessel commented 4 months ago

Hey @hammo92, maybe we should open an issue on the SDK repo to clarify all the different possibilities like date, datetime, time, time range, date range and datetime range? It doesn't really make sense for the SDK to accept only Date while everything around it works with JSON date strings.

In general, a custom type that accepts both string and Date would be the way to go on our side, from my point of view. That would be the best DX imo. But it might become complex if assertions are involved?

hammo92 commented 4 months ago

@sebastianwessel apparently it's a side effect of the transition from JSON to CBOR. With the new implementation it seems that quite a few types expect class instances:

// Classes exported by the JS SDK (import path may be "surrealdb" or "surrealdb.js", depending on the version):
import { RecordId, Decimal, Duration, GeometryLine, GeometryPoint, Table, Uuid } from "surrealdb";

await db.create("person", {
    rid: new RecordId("person", "tobie"),
    dec: new Decimal("3.333333"),
    dur: new Duration("1d2h"),
    geo: new GeometryLine([
        new GeometryPoint([1, 2]),
        new GeometryPoint([3, 4]),
    ]),
    tb: new Table("table_name"),
    uuid: new Uuid("92b84bde-39c8-4b4b-92f7-626096d6c4d9"),
    date: new Date("2024-05-06T17:44:57.085Z"),
    undef: undefined,
    null: null,
    num: 123,
    float: 123.456,
    true: true,
    false: false,
    string: "I am a string",
});

I guess for the best DX we would have to create custom types for each of these, or change our schema declaration to separate input and output schemas.

sebastianwessel commented 4 months ago

As long as these classes are available in the SDK, we can use z.instanceof(Decimal) as a first step. This can be used in both schemas.
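
For example, a minimal sketch (schema and field names are placeholders; the import path depends on the SDK version):

import { z } from "zod"
import { RecordId, Decimal } from "surrealdb"

// Works in both the input and the output schema, since the SDK expects and returns class instances.
const testSchema = z.object({
  relation: z.instanceof(RecordId),
  dec: z.instanceof(Decimal),
})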

I would prefer to decouple validation from transformation.

It might be a good idea to have some helper in the future, something like transformToInputSchema, which takes the native representations and creates the class instances as needed from strings, numbers, objects, dates and so on. This would leave it to the developer whether they want to work with the SDK classes or with more native values like strings. Also, the maintenance of this lib would still stay manageable.
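
A very rough sketch of what such a helper could look like for a single table (nothing like this exists in the generator yet; field names and the import path are assumptions):

import { Decimal } from "surrealdb"

// Hypothetical helper: accepts native values and wraps them in SDK classes where the SDK requires them,
// so the developer can keep working with strings and Dates.
function transformToInputSchema(input: { time: string | Date; dec: string | Decimal }) {
  return {
    time: input.time instanceof Date ? input.time : new Date(input.time),
    dec: input.dec instanceof Decimal ? input.dec : new Decimal(input.dec),
  }
}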

hammo92 commented 4 months ago

Do we not still have the issue of the input and output schema having different types if we are using the classes? It's fine for the record, as we're returned a RecordId instance, but we're never returned a Date instance by the SDK; we always get a string value back:

   await db.query(`DEFINE TABLE test SCHEMAFULL; DEFINE FIELD time ON test TYPE datetime; DEFINE FIELD relation ON test TYPE record<test>;`)

    const a = await db.create("test", {
        relation: new RecordId("test", "123"),
        time: new Date()
    })

    // logged result:
    [
      {
        id: RecordId { tb: 'test', id: '5r5qyfp1hp9mhaj6ar98' },
        relation: RecordId { tb: 'test', id: '123' },
        time: 2024-07-25T16:13:44.124Z
      }
    ]

sebastianwessel commented 4 months ago

Tbh, I need to look into the SDK and try out the new version. In the best case, we pass the returned data through 1:1 and only do validation.

I wonder if it really makes sense to validate during reads from the db. Maybe it is better to have Zod schema generation for writes and only TypeScript type generation for reads?

hammo92 commented 4 months ago

I think you're right that validation on read is excessive. Are you thinking of generating the read types without inferring them from Zod?

sebastianwessel commented 4 months ago

Yes. It might also be simpler, as we only need to figure out the type and whether it is optional/required, but not something like "is email". Also, it does not really make sense to have the full schema for reads, as the result of a read is usually very custom, depending on the query. Most of the time, nobody is reading the full entry from the db - maybe only for simple structures and basic CRUD.
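
The generated output for a table could then look roughly like this (just an illustration of the split, not actual generator output):

import { z } from "zod"
import { RecordId } from "surrealdb"

// Write side: Zod schema, so inputs are validated before they hit the db.
const testCreateSchema = z.object({
  relation: z.instanceof(RecordId),
  time: z.date(),
})
type TestCreate = z.infer<typeof testCreateSchema>

// Read side: plain TypeScript type, no runtime validation of query results.
type Test = {
  id: RecordId
  relation: RecordId
  time: Date // or string, depending on how the SDK decodes datetimes
}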

hammo92 commented 4 months ago

Makes sense; it should be simpler to implement that way as well.