Belco90 opened 2 years ago
> We could even derive `supportedFrameworks` from the `recommendedConfig` meta.
@MichaelDeBoey when the rules are enabled in a recommended config, yes, but I'm afraid we can't rely on this when the rule isn't enabled in any shareable config (e.g. `no-manual-cleanup` or `prefer-explicit-assert`). So we would need this new `supportedFrameworks` meta.
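To illustrate the gap, here is a minimal sketch assuming a simplified config shape (the objects and the `deriveSupportedFrameworks` helper are illustrative, not the plugin's actual code): a rule enabled in a recommended config can have its frameworks derived, but a rule absent from every config yields nothing.

```js
// Illustrative only: per-framework recommended configs with enabled rules.
const recommendedConfigs = {
  dom: { 'testing-library/await-async-utils': 'error' },
  react: { 'testing-library/await-async-utils': 'error' },
};

// Derive the frameworks whose recommended config enables the given rule.
function deriveSupportedFrameworks(ruleName) {
  return Object.keys(recommendedConfigs).filter(
    (framework) => `testing-library/${ruleName}` in recommendedConfigs[framework]
  );
}

deriveSupportedFrameworks('await-async-utils'); // ['dom', 'react']
deriveSupportedFrameworks('no-manual-cleanup'); // [] (nothing to derive)
```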
> We should even be able to run all rules on all testing frameworks though. I currently only did this when the rule is available in the config for that specific testing framework.
That made me think my approach might not be right, actually. For example, in `no-manual-cleanup` there are different outputs depending on the testing framework. This is a scenario we couldn't cover with my proposal, so perhaps it has to be a callback that receives the framework being tested, so things can be done conditionally when needed, rather than just replacing a placeholder. Something like:
```js
// test/lib/rules/foo.js
// Proposed shape: test cases come from a callback that receives the
// testing framework under test, so a case can vary per framework.
ruleTester.run(RULE_NAME, rule, ({ testingFramework }) => ({
  invalid: [
    {
      code: `
        import { fireEvent } from '${testingFramework}'
        await fireEvent.click(element)
      `,
      // placeholder messageId; `output` only applies to invalid cases
      errors: [{ messageId: 'someError' }],
      output: testingFramework.startsWith('@marko') ? 'foo' : 'bar',
    },
  ],
}));
```
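For context, a thin wrapper over ESLint's standard `RuleTester` could drive that callback form, running it once per framework. This is a sketch; `runForAllFrameworks` is a hypothetical name, not an existing helper in the plugin.

```js
const { RuleTester } = require('eslint');

const ruleTester = new RuleTester();

// Hypothetical wrapper: invoke the cases callback once per framework and
// hand the resulting cases to the standard RuleTester.
function runForAllFrameworks(ruleName, rule, casesCallback, frameworks) {
  for (const testingFramework of frameworks) {
    const { valid = [], invalid = [] } = casesCallback({ testingFramework });
    ruleTester.run(`${ruleName} (${testingFramework})`, rule, { valid, invalid });
  }
}
```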
---

### Plugin version

v5
### What problem do you want to solve?

Quoting myself from #588:
> Thinking about all the changes we did to test all rules for the corresponding Testing Library frameworks, I think it makes sense to add some improvements to our `TestingLibraryRuleMeta` and `createRuleTester` to automatically generate the test case variations for the indicated frameworks.

### Your take on the correct solution?
In my mind, it could look like this:
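Roughly, something along these lines as a sketch (the `<TESTING_FRAMEWORK>` placeholder and the `expandCases` helper are illustrative, not a final API):

```js
// 1. Each rule declares the frameworks it supports in its meta:
const testingLibraryRuleMeta = {
  supportedFrameworks: [
    '@testing-library/dom',
    '@testing-library/react',
    '@testing-library/vue',
    '@marko/testing-library',
  ],
};

// 2. createRuleTester expands every test case once per supported framework,
// replacing the placeholder with the concrete module name:
function expandCases(cases, frameworks) {
  return frameworks.flatMap((framework) =>
    cases.map((testCase) => ({
      ...testCase,
      code: testCase.code.replaceAll('<TESTING_FRAMEWORK>', framework),
    }))
  );
}
```

A case written once with `import { render } from '<TESTING_FRAMEWORK>'` would then be tested against every framework the rule declares.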
### Anything else?

No response

### Do you want to submit a pull request to implement this change?

Yes