Using the automated tests can give us good confidence that proposed PRs will not break things.
Unfortunately, the number of different config options gives us a combinatorial explosion: realistically, not all combinations can even be compile-tested, let alone have functional tests written for them.
The project should therefore define a small number of configuration option sets that are tested. These sets should ideally also have a method for automated functional testing in an isolated environment, although just proving that the combination compiles is still useful.
Each config set will be compiled approximately six times, so increasing the number of sets can have a significant impact on how long the automated tests run (with the current test speeds, even three option sets could be excessive).
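To make the combinatorial argument concrete, here is a back-of-the-envelope sketch. The option counts used are illustrative assumptions, not the project's actual numbers:

```python
# Illustrative arithmetic only -- the option counts here are assumptions,
# not taken from the project's BuildConfig doc.

def exhaustive_builds(num_boolean_options: int) -> int:
    """Builds needed to compile-test every combination of N boolean options."""
    return 2 ** num_boolean_options

def curated_builds(num_sets: int, compiles_per_set: int = 6) -> int:
    """Builds needed when only a fixed list of option sets is tested.

    Each set is compiled roughly six times, per the CI behaviour
    described above.
    """
    return num_sets * compiles_per_set

print(exhaustive_builds(10))  # 1024 builds for just 10 boolean options
print(curated_builds(3))      # 18 builds for three curated option sets
```

Even at ten boolean options, exhaustive compile testing is three orders of magnitude more work than a short curated list, which is why picking a small set of representative configurations matters.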
As far as I know, all the build-time config options to choose from are documented in the BuildConfig doc.
This ticket is for discussion, so if you have a set of options you feel should be tested, or any good ways to perform the relevant automated functional testing, please add your voice here.