CatGuardian closed this 6 years ago
Oh wow, that is quite the oversight on my part, thanks for the PR!
One small question: do you think it would make more sense if the plugin's validators appeared in Model.validators, or if they did not? If yes, then I think the right place to stick them is in the Model.loadExternal startup method (appending/prepending them to the type's validators); if not, then just loading them in the helper should work perfectly.
The reason I'm pondering this is that I kinda like the idea of being able to view all the information related to something in "one place" rather than having to manually aggregate it. Not that I think there are many use cases for asking "what validation plugins are registered for this Model?" but still.
On a side note, I've actually spent a bit of time today looking at how to update the validation layer to support MongoDB 3.6's validation functionality - with the objective of being able to use the same validation attributes you provide in your models to populate the DB's validation config (if you so wish). To achieve that I will likely be looking at moving to JSONSchema - however I'm not sure how much demand there is for complex client-side validation functions (i.e. stuff that can't be done easily with JSONSchema). What are your thoughts there?
Yeah, I think you are right that I misplaced where the hookup for the plugin validators goes. The reason is this method from Model.ts:
/**
 * Gets the custom validation types available for this model. These validators are added to the
 * default skmatc validators, as well as those available through plugins, for use when checking
 * your instances.
 */
get validators() {
    return this._validators;
}
That comment indicates to me that it's meant to include the plugin validators too. However, the comment is also a bit wrong, in that the global skmatc isn't altered to include the validators, so I will update it to make that clearer as well.
I'll attach the validators to the model before it gets to the model helper. I'll do that in loadExternal.
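A rough sketch of that hookup might look like the following. The `PluginLike` shape, the `Validator` type, and the function name here are my assumptions for illustration, not Iridium's actual internals; the idea is just that, on startup, plugin-provided validators are appended to the model's own list so that Model.validators reflects both sources.

```typescript
// A validator here is simplified to a predicate; skmatc's real validators
// are richer, but the merge logic is the same.
type Validator = (value: any) => boolean;

// Assumed plugin shape: plugins may optionally contribute validators.
interface PluginLike {
    validate?: Validator[];
}

function collectValidators(own: Validator[], plugins: PluginLike[]): Validator[] {
    // Copy the model's own validators first, then append each plugin's.
    return plugins.reduce((all, p) => all.concat(p.validate || []), own.slice());
}
```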
Hang tight for that quick update.
As for your native MongoDB validation question, I'll talk about that in the next comment.
Wow, I didn't know about JSONSchema. I wish you had told me two weeks ago! At work we were just struggling with how to validate the various filters and sorts when extending our endpoints to support paging, and we ended up with the npm package joi.
Joi is really cool and neat! I don't use Iridium at work, but I do personally, and once I discovered joi I wrote a validation wrapper so it works with Iridium:
import * as Joi from 'joi';
import { IValidationHandler } from 'skmatc';

export function JoiValidator(joiSchema: Joi.SchemaLike, message?: string, options?: Joi.ValidationOptions): IValidationHandler {
    return function (schema, data, path: string) {
        // joi returns { error: null } when validation succeeds
        return this.assert(Joi.validate(data, joiSchema, options).error === null, message);
    };
}
It doesn't make use of joi's error messaging framework, but you can pass in your own message for when something fails.
Here is how to use it:
@Iridium.Validate('location', JoiValidator(Joi.string().allow('').max(1000).optional(), 'Location must be 1,000 or less characters'))
@Iridium.Validate('biography', JoiValidator(Joi.string().allow('').max(10000).optional(), 'Biography must be 10,000 or less characters'))
export class User extends BaseInstance<IUserDocument, User> implements IUserDocument {
...
}
Anyway, back to what you were saying. Yes, I agree that skmatc is kind of limited and eventually it would be best to move to a better validation framework. I definitely like the idea of putting the validation onto the MongoDB server, because then the validation will apply no matter how you access MongoDB, whether it's with Iridium or something else. This could be good for enterprises with various teams who like to use different libraries to access the same MongoDB server.
I definitely think moving to JSON Schema is a great idea, because it looks like it's soon going to become an official standard (json-schema.org), and the fact that MongoDB supports it is a huge plus. Another benefit is that you can use the same validation on your endpoints as in your database code, so there aren't two validation technologies in play; only one.
Here are my thoughts on the concern about things that can't easily be done with JSONSchema. First, the current validation layer can't do asynchronous operations, and neither can JSONSchema, so that is a wash. The other thing the current validation system can't do is check that a property is valid based on information about other properties, meaning you can't validate that Sandwhich.type = 'Peanut Butter and Jelly' is correct based on what the sandwich is made of (you can't use Sandwhich.ingredients to validate Sandwhich.type). Well, you can if Sandwhich is a subproperty of a bigger document, because you can define a custom validator that receives the whole sandwich object; but right now the current validation system only has property validators, not a class validator. And I don't think JSON Schema can do that either, so that is the same again.
TL;DR: switching to the JSON Schema validation built into MongoDB gives us the same capabilities as the current validation system Iridium uses, and more. So we should definitely do that.
Here is my suggestion: allow the JSON Schema validation that MongoDB supports to be wrapped up in a new decorator, or update the @Collection decorator to allow MongoDB's JSON Schema to be specified, and document it so that everyone knows MongoDB will be doing the validation. Then keep Iridium's validation as it is for now, but make it clear that that validation executes before anything goes to the DB. I would also suggest having the docs recommend using the MongoDB validation whenever possible, as the Iridium validation layer might be deprecated; later you can always deprecate it and remove that code if it is never used.
I was just about to say that I am not opposed to removing the validation layer right now, BUT I just found an example in my own code that I don't think JSONSchema can handle. I want to store just the 'YYYY-MM-DD' of a date, because I need my users to be of a certain age, and at my current job I discovered it is not a good idea to use '2017-09-23 00:00:00' (or a JavaScript Date object) as a way to store plain dates ('YYYY-MM-DD') because of all the headaches that come with it. So I want to store the date as a string but still validate that the user is older than a certain age, which means I need Iridium to execute a custom validator so I can use moment to figure out if they are old enough:
@Validate('dateOfBirth', UserValidators.DateOfBirthValidator)
So this is a great find. Thank you for sharing it; I will probably start using it myself instead of joi now, and I will see if my buddy at work wants to change over to JSON Schema as well.
As for the JSON implementation, I was considering doing something like this: @Property(type) maps type to one of a set of "lookup" types ([String]: { type: "string" }, etc.) so that a number of existing, simple property types continue to work as normal. @Validate would create new mappings for this, so you could define your own shortcuts if you wish. Of course, you could also provide a normal JSONSchema entry there, so @Property({ type: "string" }) would work, or something more complex if needed.
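A rough sketch of that lookup table, assuming the mapping described above; the table and helper name are illustrative, not Iridium's actual code:

```typescript
// Maps well-known constructors to simple JSONSchema fragments.
const typeLookup = new Map<any, object>([
    [String, { type: "string" }],
    [Number, { type: "number" }],
    [Boolean, { type: "boolean" }],
]);

function toSchemaFragment(type: any): object {
    // Known constructors resolve through the lookup table; anything else is
    // treated as a raw JSONSchema entry and passed through unchanged.
    return typeLookup.get(type) || type;
}
```

So `toSchemaFragment(String)` resolves to `{ type: "string" }`, while `toSchemaFragment({ type: "string", maxLength: 10 })` passes the custom schema through untouched.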
The @Property decorator would then continue to populate .schema on the instance class; however, it would do so within this object structure:
static schema = {
    type: "object",
    bsonType: "object",
    required: ["_id"],
    properties: {
        _id: {
            type: "string",
            bsonType: "ObjectId"
        }
    },
    additionalProperties: false
}
So adding @Property(String) to name would set the schema.properties.name field and (since required is the default) add name to schema.required.
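A minimal sketch of how such a decorator could populate the schema; this is an assumption about the design discussed above, not Iridium's implementation (it is applied as a plain function call here so the example doesn't depend on decorator compiler settings):

```typescript
// Property decorator factory: records the fragment under properties[key]
// and, unless opted out, marks the key as required.
function Property(fragment: object, required = true) {
    return (target: any, key: string) => {
        const ctor = target.constructor;
        ctor.schema = ctor.schema || {
            type: "object",
            properties: {} as Record<string, object>,
            required: [] as string[],
            additionalProperties: false,
        };
        ctor.schema.properties[key] = fragment;
        if (required) ctor.schema.required.push(key);
    };
}

class UserExample {
    static schema: any;
    name!: string;
}

// Equivalent to decorating `name` with @Property({ type: "string" }):
Property({ type: "string" })(UserExample.prototype, "name");
```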
This has the added benefit of making it really easy for us to support @Property(MySubInstance) by injecting the .schema property from that type, which then allows you to use @Property for nested documents as well (which should be really helpful).
Coming back to the idea of being able to validate cross-property values, JSONSchema allows that using the anyOf and allOf options on the schema. With the .schema field you could add that kind of additional validation pretty easily, so in your example:
schema = {
    type: "object",
    anyOf: [
        {
            properties: {
                type: {
                    type: "string",
                    pattern: "Peanut Butter and Jelly"
                },
                ingredients: {
                    type: "array",
                    items: {
                        type: "string",
                        enum: ["Peanut Butter", "Jelly"]
                    }
                }
            }
        }
    ]
}
Of course, that doesn't deal with async validation or the need for custom functions; however, JSONSchema allows us to define additional fields if we wish, so maybe we can just populate a $clientSide field on our property definitions with a method that Iridium calls to perform that validation 🤔. I'd also want to embed a JSONSchema validator and run that validation within Iridium anyway (i.e. not rely purely on the DB), since that will allow us to support MongoDB <3.6 as well as use cases where you don't want to put the load of invalid queries on the DB.
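The $clientSide idea above could be sketched as follows; this is an assumption about the proposed design, not an implemented Iridium feature, and the names are illustrative:

```typescript
// Property definitions may carry an extra $clientSide predicate alongside
// their normal JSONSchema keywords; Iridium would invoke these during its
// own client-side validation pass, before anything reaches the DB.
type PropertySchema = { [key: string]: any; $clientSide?: (value: any) => boolean };

function runClientSideChecks(
    properties: Record<string, PropertySchema>,
    doc: Record<string, any>
): string[] {
    const failures: string[] = [];
    for (const name of Object.keys(properties)) {
        const prop = properties[name];
        // Only properties that declare a $clientSide check are evaluated here;
        // everything else is left to the JSONSchema validator.
        if (prop.$clientSide && !prop.$clientSide(doc[name])) failures.push(name);
    }
    return failures;
}
```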
Ok the code is updated.
I am very interested in reading your response about the validators but I must defer that to later.
I made another change for a missing typing which was causing a build failure when using alpha-13, so please look at my comments on the last commit I did.
Resolves issue #119