Open · not-my-profile opened this issue 1 year ago
There are some interesting points here. I'm not sure a strict mode is a good idea, because some of these checks would be hard to implement.
For example, there are many ways to select and copy a file:

```ts
site.copy([".jpg"]);
site.copy("image/one.jpg");
site.copy("image");
```
The same file can match multiple conditions, and this is okay. For example, you may want to copy all JPG files to a specific directory, except a couple of files that have a different destination. The only thing we can do is show a warning if the same condition was already used (for example, calling `site.copy("image")` twice).
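That duplicate-condition warning could work roughly like this (a self-contained toy sketch, not Lume's actual implementation; the `copy` wrapper and `warnings` array are invented for illustration):

```typescript
// Track every copy condition that has been registered; if the exact
// same condition appears again, record a warning instead of failing.
const seen = new Set<string>();
const warnings: string[] = [];

function copy(condition: string | string[]): void {
  const key = JSON.stringify(condition);
  if (seen.has(key)) {
    warnings.push(`Duplicate copy condition: ${key}`);
  }
  seen.add(key);
}

copy("image");
copy([".jpg"]);
copy("image"); // same condition a second time → one warning
console.log(warnings);
```

Overlapping conditions (a directory and an extension matching the same file) would deliberately not trigger this, since that overlap is legitimate.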
> Making a typo in a variable in a Nunjucks template (i.e. referencing undefined data).
Nunjucks already has an option for that (`throwOnUndefined`), which is `false` by default.
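Enabling it could look like this (a sketch; this assumes Lume forwards the Nunjucks `options` object to `nunjucks.configure`, so the exact configuration shape may differ in your Lume version):

```ts
// _config.ts — sketch, assuming `options` is passed through to nunjucks.configure
import lume from "lume/mod.ts";

const site = lume({}, {
  nunjucks: {
    options: {
      throwOnUndefined: true, // error when a template references an undefined variable
    },
  },
});

export default site;
```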
> Making a typo when specifying a tag name (either in the metadata or when calling e.g. `search.pages`).
I don't understand this. It's okay to filter by non-existing fields. For example, you may want to get all pages with the category "lume", so you run `search.pages("category=lume")`. There may be pages without the category defined (because they are different types of pages that don't need it). What do you think should happen here?
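The point can be sketched with plain data (illustrative only; `filterPages` is a made-up helper, not Lume's actual query engine):

```typescript
// Toy model of an equality query: pages missing the field are simply
// not matched — that is normal filtering behavior, not an error.
type Page = Record<string, unknown>;

function filterPages(pages: Page[], field: string, value: unknown): Page[] {
  return pages.filter((page) => page[field] === value);
}

const pages: Page[] = [
  { title: "Post A", category: "lume" },
  { title: "Post B", category: "deno" },
  { title: "About" }, // no category at all — still a valid page
];

const result = filterPages(pages, "category", "lume");
console.log(result.map((p) => p.title)); // only "Post A" matches
```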
I think the best way to address the pitfall of the current `Site.copy` API is to enable copying one file to multiple destinations and perform the deprecations suggested in https://github.com/lumeland/lume/issues/428#issuecomment-1585699760.
> I don't understand this.
I agree that it's okay to filter on non-existing fields. I was talking about a way of validating the metadata defined in front matter. I guess this could be achieved with an event handler.
If you were to define the schema for your data with e.g. JSON Schema, we could even validate queries. E.g. if you have declared that the only tags are `foo`, `bar`, and `baz`, then filtering for a tag `orange` could be detected as an error. But yeah, validating data is of course more important than validating queries.
Schema validation is a nice idea for a plugin. It could be a preprocessor that validates all pages before rendering and shows a warning when a page doesn't match the schema.
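A rough sketch of what such a preprocessor's core check could do (self-contained toy code; the `Schema`/`PageData`/`validatePage` shapes are invented for illustration — wiring it up would go through Lume's real preprocessor/event API):

```typescript
// Toy schema: each declared field maps to the typeof result it should have.
type Schema = Record<string, "string" | "number" | "boolean" | "object">;

interface PageData {
  url: string;
  data: Record<string, unknown>;
}

// Returns warning messages for fields that are present but have the wrong
// type. Missing fields are allowed, matching the discussion above about
// pages that simply don't define a given field.
function validatePage(page: PageData, schema: Schema): string[] {
  const warnings: string[] = [];
  for (const [field, expected] of Object.entries(schema)) {
    const value = page.data[field];
    if (value !== undefined && typeof value !== expected) {
      warnings.push(
        `${page.url}: field "${field}" should be ${expected}, got ${typeof value}`,
      );
    }
  }
  return warnings;
}

const schema: Schema = { title: "string", draft: "boolean" };
const page: PageData = { url: "/post/", data: { title: 42, draft: false } };
console.log(validatePage(page, schema)); // one warning, about "title"
```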
With Lume there are currently many opportunities to make small mistakes that will not be reported. For example:

- `Site.copy` (if the file doesn't exist, nothing will happen).
- Calling `Site.copy` multiple times for the same source path with different destinations (only the last call will take effect).
- Making a typo in a variable in a Nunjucks template (i.e. referencing undefined data).
- Making a typo when specifying a tag name (either in the metadata or when calling e.g. `search.pages`).

I think it would be nice if such likely usage errors were detected and reported. The CLI could even have a `--strict` mode that fails if any such problem is detected, for use within CIs. Detecting the mistyped tags would require the ability to optionally declare them with e.g. `lume({tags: ["post"]}, ...)`.

What do you think about this matter?
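The tag-declaration check could look roughly like this (a toy sketch; `checkTags` and the declared-tags option are hypothetical, not an existing Lume API):

```typescript
// Hypothetical strict check: compare the tags actually used on pages
// against an explicitly declared list, and report any unknown ones.
function checkTags(declared: string[], used: string[]): string[] {
  const known = new Set(declared);
  return used.filter((tag) => !known.has(tag));
}

const declared = ["post", "note"];
const usedOnPages = ["post", "psot", "note"]; // "psot" is a typo

const unknown = checkTags(declared, usedOnPages);
if (unknown.length > 0) {
  // In a --strict mode this could make the build fail instead of just warn.
  console.error(`Unknown tags: ${unknown.join(", ")}`);
}
```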