Our customers are asking us to notify them when they add a new file (or, for example, a new function) but forget to write tests for it, or when they simply do not upload a coverage report for the newly created file or method.
Here's an example PR, where I've added a new "model" to my application, but never executed the unit tests. https://github.com/vlad-ko/laravel-stripe-app-gh/pull/56
In this PR I've made some YAML changes and created a new PHP file, yet I didn't upload any coverage report that would cover the new file. I also don't have CFF that would have knowledge of this addition. https://app.codecov.io/github/vlad-ko/laravel-stripe-app-gh/pull/56
However, Codecov does know that some "new lines" were added in the PR; we determine this by looking at the git diff. https://patch-diff.githubusercontent.com/raw/vlad-ko/laravel-stripe-app-gh/pull/56.diff
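The "new lines" detection above can be sketched roughly as a unified-diff scan. This is a simplified illustration, not Codecov's actual implementation; the diff snippet below is an invented example in the spirit of the linked PR.

```python
import re

def added_lines_by_file(diff_text: str) -> dict[str, int]:
    """Count added lines per file in a unified git diff (rough sketch)."""
    counts: dict[str, int] = {}
    current = None
    for line in diff_text.splitlines():
        # "+++ b/path/to/file" marks the new side of each file's diff
        m = re.match(r"\+\+\+ b/(.+)", line)
        if m:
            current = m.group(1)
            counts.setdefault(current, 0)
        elif current and line.startswith("+") and not line.startswith("+++"):
            counts[current] += 1
    return counts

diff = """\
diff --git a/app/Models/NewFile.php b/app/Models/NewFile.php
--- /dev/null
+++ b/app/Models/NewFile.php
@@ -0,0 +1,2 @@
+<?php
+class NewFile {}
"""
print(added_lines_by_file(diff))  # {'app/Models/NewFile.php': 2}
```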
The caveat is that I have testable changes in my PHP file and untestable changes in the yaml file.
A brute-force approach could be: if we see new lines in the diff but have no uploaded tests, we'd show the customer a warning along the lines of:
"We detected new lines, however Codecov doesn't have any coverage data. Please review this PR carefully"
Potentially, we could restrict this to all known executable file types (i.e. only look for changes in .php, .js, .py, .ts, .java, etc.); the process would be the same. If we see new lines in a PHP file but no tests, we should signal a "caution" to the customer.
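That extension filter could look something like the sketch below. The allowlist and function names are hypothetical; it takes the per-file added-line counts from the diff and flags executable files when no coverage data was uploaded.

```python
from pathlib import Path

# Hypothetical allowlist of "executable" source extensions
EXECUTABLE_EXTENSIONS = {".php", ".js", ".py", ".ts", ".java"}

def needs_coverage_warning(changed_files: dict[str, int],
                           has_coverage: bool) -> list[str]:
    """Return files with added lines that look executable but lack coverage."""
    if has_coverage:
        return []
    return [
        path for path, added in changed_files.items()
        if added > 0 and Path(path).suffix in EXECUTABLE_EXTENSIONS
    ]

changed = {"app/Models/NewFile.php": 2, ".github/workflows/ci.yml": 5}
print(needs_coverage_warning(changed, has_coverage=False))
# ['app/Models/NewFile.php']  (the YAML change is ignored as untestable)
```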
Additionally, we could ask the customer to define, via include or exclude paths, which lines should be tested. The difficulty will be knowing which lines within a file are testable and which are, for example, comments or blank lines. This feature would also require some static-analysis implementation.
Future improvements:
Look for changes in the existing and uploaded reports to match the file name of the addition against the tests that were executed, where we have coverage data. For example, if we have coverage data for User.php in the same directory as the new file and we see a corresponding UserTest.php, but nothing for NewFileTest.php, there is a good chance that tests are missing.
Leverage static analysis to accurately determine whether the lines are "testable"
Look at the "neighboring" tests, if available (i.e. files in the same test directory), and attempt to determine whether a test should be written for the new code
Study the path of the tests and determine if one should exist for the newly created file
Just for fun, leverage AI to tell Codecov what could be tested and how to improve the overall code. It's actually quite accurate and even knows to suggest the right testing approach for the framework I am using. But even if we only ship point 1, it would still be remarkably helpful.
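The naming-convention heuristic in the first improvement above could be sketched as follows. The User.php/UserTest.php mapping assumes PHPUnit's common ClassTest.php convention; function names are hypothetical.

```python
from pathlib import Path

def expected_test_name(source_path: str) -> str:
    """Map a source file to its conventional test file name,
    e.g. User.php -> UserTest.php (PHPUnit-style convention assumed)."""
    p = Path(source_path)
    return f"{p.stem}Test{p.suffix}"

def missing_tests(new_files: list[str], test_files: set[str]) -> list[str]:
    """Flag new source files whose expected test file was not uploaded."""
    test_names = {Path(t).name for t in test_files}
    return [f for f in new_files if expected_test_name(f) not in test_names]

tests = {"tests/Unit/UserTest.php"}
print(missing_tests(["app/Models/NewFile.php"], tests))
# ['app/Models/NewFile.php']  -> good chance tests are missing
```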