rashfael / eslint-plugin-vue-pug

linting your pug templates in vue single file components

Consider directly requiring tokenizer in base config #24

Open urkle opened 8 months ago

urkle commented 8 months ago

I tried adding this plugin to my Yarn project, which uses PnP, and it failed with several dependency errors.

The first issue is that vue-eslint-parser is configured as a devDependency instead of a dependency.

Yarn error:

Parsing error: vue-eslint-parser tried to access vue-eslint-parser-template-tokenizer-pug, but it isn't declared in its dependencies; this makes the require call ambiguous and unsound.

Moving it to the dependencies section instead of devDependencies fixes this error.
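For reference, a rough sketch of the relevant part of eslint-plugin-vue-pug's package.json after that move (the version ranges here are placeholders, not the actual ones):

{
  "dependencies": {
    "vue-eslint-parser": "*",
    "vue-eslint-parser-template-tokenizer-pug": "*"
  }
}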

Then we get to the next issue: the inclusion of vue-eslint-parser-template-tokenizer-pug. After fixing the first error I receive this message:

Parsing error: vue-eslint-parser tried to access vue-eslint-parser-template-tokenizer-pug, but it isn't declared in its dependencies; this makes the require call ambiguous and unsound.

This happens because vue-eslint-parser tries to require that package but has no dependency on it (eslint-plugin-vue-pug does, though).
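As a rough illustration of why this matters under PnP (this is not vue-eslint-parser's actual source, just the gist): when the tokenizer is given as a string, vue-eslint-parser has to require() it by name from its own package, which PnP rejects because that name is not among its declared dependencies; a module passed in directly needs no lookup at all.

// illustrative sketch only, not vue-eslint-parser's real code
const configured = parserOptions.templateTokenizer.pug
const PugTokenizer = typeof configured === 'string'
    ? require(configured) // name-based: resolved from vue-eslint-parser's own context, rejected by PnP
    : configured          // module passed directly: no require call for PnP to reject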

One way that should fix this is to not specify the tokenizer as a string in the options this package provides, but instead to pass the tokenizer module directly.

That is, using this instead:

const PugTokenizer = require('vue-eslint-parser-template-tokenizer-pug');
module.exports = {
    // parser: ...,
    parserOptions: {
        // other options
        templateTokenizer: { pug: PugTokenizer }
    },
    // other options
}
rashfael commented 8 months ago

I only officially support npm, since that is what I use and I don't have the bandwidth to extensively test other package managers.

Configuring just the dependency name seems to be common practice in ESLint and eslint-plugin-vue; defining the parser in parserOptions also works this way (probably to support JSON-based configs).
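For comparison, the name-based form the shipped base config relies on (and the only form a pure JSON config can express) looks roughly like this; the actual base config may differ in its details:

module.exports = {
    parser: 'vue-eslint-parser',
    parserOptions: {
        // names only, so this survives JSON serialization
        templateTokenizer: { pug: 'vue-eslint-parser-template-tokenizer-pug' }
    }
}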

It's great though that you've found a solution for your setup. I will consider referencing the tokenizer directly instead of by name in the default config, but I will need to test first whether this impacts JSON configs at all.