It is interesting indeed.
However, right now my priority has to be documentation instead.
If you find unit tests that fail, does that mean we need to update the code to make it right? At this point, as I said in #56, I can't focus on that myself, so I would need people to help with the work this might generate.
Before you start, we need to recognize that since this is a security tool, its philosophy is about limiting the number of false negatives. Reducing false positives would be good, but the tests should focus on making sure we're not missing something we should have caught.
We also need to agree on the difference between unit tests and the tests.php file. Right now I have the requirement that every sniff raises at least one warning/error in the tests.php file. I also want people to be able to run the tool normally (as a scan) on tests.php and see the results. I don't know how much what you have in mind would clash with that, but I see a difference between running a test suite and offering a file that shows the results of an example scan hitting all rules. We could simply keep both solutions and, at best, generate the file from the test cases if that's easy/possible.
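Something along these lines might be all that's needed to generate it, assuming the test cases end up split into per-sniff `.inc` files (the paths/layout here are hypothetical):

```php
<?php
// Hypothetical generator: rebuild tests.php by concatenating the
// split-out per-sniff test case files. Paths/layout are assumptions.
$output = "<?php\n";
foreach (glob(__DIR__ . '/Security/Tests/*/*UnitTest.inc') as $file) {
    // Strip the opening PHP tag from each test case file before appending.
    $output .= preg_replace('`^<\?php\s*`', '', file_get_contents($file)) . "\n";
}
file_put_contents(__DIR__ . '/tests.php', $output);
```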
Thanks!
> Before you start, we need to recognize that since this is a security tool, its philosophy is about limiting the number of false negatives. Reducing false positives would be good, but the tests should focus on making sure we're not missing something we should have caught.
IMO the tests should cover both: preventing false positives and ensuring that what needs to be reported is reported (no false negatives).
> We also need to agree on the difference between unit tests and the tests.php file. Right now I have the requirement that every sniff raises at least one warning/error in the tests.php file. I also want people to be able to run the tool normally (as a scan) on tests.php and see the results. I don't know how much what you have in mind would clash with that, but I see a difference between running a test suite and offering a file that shows the results of an example scan hitting all rules. We could simply keep both solutions and, at best, generate the file from the test cases if that's easy/possible.
Well, as a start I would split the test cases out of tests.php into their own files.
Keeping both would make maintenance more involved, as keeping the two in sync wouldn't be easily enforced (while having the individual unit test files can be).
All the same, if I understand you correctly, your concern is mostly that it should be easy for people to see the results of an example scan. Correct?
With the split-out test cases, that would still be possible with only a minimal adjustment to the command: `phpcs ./Security/Tests/ --standard=Security --extensions=inc` would run the scan over all the test case files.
Basically, if we'd base the unit tests off the PHPCS native test suite, each sniff will get a `SniffNameUnitTest.inc` file with test cases for that specific sniff and a `SniffNameUnitTest.php` file with basically just two simple data providers stating on which lines to expect errors and on which lines to expect warnings (and how many on each line).
The PHPCS native TestCase will then run the sniff over the test case file and check that all lines where errors/warnings are expected have those and that all other lines do not have any errors/warnings.
If you like, have a look here for an example of what that looks like: https://github.com/PHPCSStandards/PHPCSExtra/tree/develop/Universal/Tests/Arrays
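To give a rough idea, such a test class could look something like the sketch below. The sniff name, namespace and line numbers are placeholders, not the actual implementation; the pattern follows the PHPCS native `AbstractSniffUnitTest` convention:

```php
<?php

namespace PHPCS_SecurityAudit\Security\Tests\BadFunctions;

use PHP_CodeSniffer\Tests\Standards\AbstractSniffUnitTest;

/**
 * Unit test class for a hypothetical "EasyRFI" sniff.
 *
 * The line numbers below refer to lines in the companion
 * EasyRFIUnitTest.inc test case file and are purely illustrative.
 */
class EasyRFIUnitTest extends AbstractSniffUnitTest
{
    /**
     * Returns the lines on which errors should occur.
     *
     * @return array Key is the line number, value is the number of expected errors.
     */
    public function getErrorList()
    {
        return array(
            3 => 1,
            7 => 2,
        );
    }

    /**
     * Returns the lines on which warnings should occur.
     *
     * @return array Key is the line number, value is the number of expected warnings.
     */
    public function getWarningList()
    {
        return array(
            12 => 1,
        );
    }
}
```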
@jmarcil How about I start with just adding the unit test setup and then go through every sniff individually (separate PR for each sniff) to set up the unit tests and make any quick fixes for false positives/false negatives I see?
I can open issues for any false positives/negatives I come across which require more extensive fixes/refactoring of the sniffs.
@jmarcil FYI: I've created an initial setup with tests for one sniff so far.
You can have a look at it here: https://github.com/jrfnl/phpcs-security-audit/tree/feature/initial-unit-test-setup and see a passing Travis build which includes running the unit tests here: https://travis-ci.org/jrfnl/phpcs-security-audit/builds/661271491
> If you find unit tests that fail, does that mean we need to update the code to make it right? At this point, as I said in #56, I can't focus on that myself, so I would need people to help with the work this might generate.
Oh and just to be clear: help is exactly what I'm offering.
Now that we have one unit test done with #70, can we please close this issue?
Right now there are too many threads for me to follow left and right, and I'm trying to cut that number down where possible.
As far as I can tell, we have answered the original questions and everything else can be tracked in their respective issues.
As partially discussed in #50, now that the namespace has been fixed, sniff-specific unit tests could be added based on the PHPCS native unit test framework.
I'd be willing to do the initial setup for this if it helps.
Open questions: