Hey @MentalGear thanks for the PR! A tighter integration with specs like JSON Schema, or at least exporting options, is something we've talked about, so we're excited to see it coming from the community.
Taking a look through the PR now.
@MentalGear this looks great. I may fix some types so our builds run properly and rebase before merging.
Is there a way you would like this consumed? One that comes to mind would be in our CLI: `triplit schema print --format=json-schema > schema.json` (which would export to a file). Curious how you feel it would be best consumed for your (and other) use cases.
> Hey @MentalGear thanks for the PR! A tighter integration with specs like JSON Schema, or at least exporting options, is something we've talked about, so we're excited to see it coming from the community.
You're welcome! I regard triplit as an excellent piece of engineering / API design, so I'm happy to contribute as time allows.
> Curious how you feel it would be best consumed for your (and other) use cases.

`triplit schema print --format=json-schema > schema.json` makes sense, just a tad long. Maybe we could introduce a more straightforward `triplit export` with the defaults `--path=jsonSchema.json --collection=all --format=jsonSchema`?
I originally planned to have a watcher on the triplit schema and run the export each time the schema file changes.
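A minimal sketch of that watcher, assuming Node 18+ and the `triplit schema print --format=json-schema` invocation proposed above (not a shipped command yet):

```ts
// Re-run the JSON Schema export whenever the Triplit schema file changes.
// The schema path and CLI flags are assumptions from this thread, not a
// final API.
import { watch } from 'node:fs';
import { execSync } from 'node:child_process';

watch('./triplit/schema.ts', () => {
  execSync('triplit schema print --format=json-schema > schema.json');
});
```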
Yet, it would be optimal to have this done internally and have a more integrated solution that could simply be used via:

```js
import { jsonSchema } from '@triplit/db'
```

with individual collections accessible via `jsonSchema.collectionName`.
PS: I would also be interested to know if there is a proposal / RFC for the data validation feature. It'd be nice to have something we can plug the (augmented) jsonSchema into, e.g.:
```js
// triplit schema
const schema = {
  user: {
    schema: S.Schema({
      email: S.String(),
      // ...
    }),
  },
}
```
```js
// dataValidation lib pseudocode
const validationSchema = getDataValLibSchemaFromJsonSchema(jsonSchema)
validationSchema.email.addRule(["isEmail"])

// validation file
const validation = {
  user: {
    email: function (input, userSchema, allSchema) {
      return validationSchema.validate("email", input) ?? false
    },
  },
}
```
Or even better:

```js
// validation file
const validation = {
  user: validationSchema.parse(user),
}
```
With a config option to set `validationRequired: true`, so that no fields can be set that do not have a data validation rule.
Of course, we would also need to consider schema changes / migrations: validation would need to be versioned along with the schema version, I assume.
Looking forward to the team's comments on the above, and also to using the export together with zod to set up a SSoST for my application stack. (If I have time and there is interest, I might write a post on it.)
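For that SSoST pipeline, one option would be a codegen step that turns the exported JSON Schema into a Zod module, e.g. with the community json-schema-to-zod package (treat the exact API below as an assumption):

```ts
// Generate a Zod validator module from the exported JSON Schema, so the
// Triplit schema stays the single source of truth.
import { readFileSync, writeFileSync } from 'node:fs';
import { jsonSchemaToZod } from 'json-schema-to-zod';

const exported = JSON.parse(readFileSync('./schema.json', 'utf8'));
// `module: 'esm'` asks the converter to emit a complete importable module.
writeFileSync('./validation.ts', jsonSchemaToZod(exported, { module: 'esm' }));
```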
Might we see the integration (at least at the CLI level) land in Friday's update?
Thanks for the kind words and for the info!
The first integration with the CLI will probably fit with the current API, but we're always looking to make hot paths easier.
The watcher sounds like a good start. For runtime access, to start we could probably export some of the export methods:
```js
import { schema } from './triplit/schema';
import { schemaToJSONSchema } from '@triplit/db';

const jsonSchema = schemaToJSONSchema(schema);
```
Or maybe include a `jsonSchema` export at runtime:

```js
import { schema } from './triplit/schema';

const jsonSchema = schema.export('jsonSchema');
```
I think there's a good chance this lands this week.
I think the discussions section would be a good place to discuss our support for validation and an API that makes sense. Your suggestions certainly seem on track with how we'd probably think about it. We don't have any specifics yet, but my guess is it will come after we release an official hooks API, which could be used to perform validation. We generally have a philosophy of "make it possible, then make it easy (i.e. add it into the API)", so hooks should be the first step in making it possible. But input like this definitely helps with the "make it easy" part.
> make it possible, then make it easy
I like that: give a minimum to make it possible (hooks API), so the community can already experiment, and together figure out the easy way.
Last question: any ETA on the hooks API? (I assume it's 'Data Mutation Triggers', listed for Q2 on the roadmap?)
Exactly!
Don't want to put an official ETA on it, but it's very likely the next major feature we're planning on tackling.
Looking forward to it! I have opened a discussion to exchange thoughts on hooks/validation: https://github.com/aspen-cloud/triplit/discussions/168
JSON Schema compliant export functions for both single and all collections.

### Unlocks

Ability to use a Single Source of Schema Truth (SSoST) across the whole application stack.

### How it works

A transformation pipeline is applied to the Triplit JSON data to transform it into valid JSON Schema. The output is also validated against JSON Schema Draft 07 to ensure compliance.
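A Draft 07 compliance check of this kind can be done with Ajv, which targets draft-07 by default (the sketch below is illustrative, not the PR's exact code):

```ts
// Assert that the pipeline's output is itself a valid draft-07 schema.
import Ajv from 'ajv';

const ajv = new Ajv();

function assertDraft07Compliant(exported: object): void {
  if (!ajv.validateSchema(exported)) {
    throw new Error(`Export is not a valid JSON Schema: ${ajv.errorsText()}`);
  }
}
```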
### Usage as a SSoST

### Application Examples