This is a good idea. I don't think I would focus on warnings, and even code quality might be narrow...but the overall software capability and flexibility is something worth looking at. Let me talk to my colleague Enrique about this.
I have added a column (far to the right) for notes on Software. I'm not ready to start scoring software; as projects become more advanced, I might develop some criteria. Thanks for the suggestion.
I think, beyond academic interest, this quick and simple check can be reduced to the most significant question:
"is there a transparent automation aka CI/CD to create controller flash file from sources?"
Implications:
CI/CD automation is an integral part of platforms like GitHub and GitLab, so the check should be easy to automate.
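As a rough illustration of how that check could be automated, here is a minimal sketch that only verifies whether a repository defines GitHub Actions workflow files at all. The repository name, the use of the `requests` library, and limiting the check to GitHub are assumptions; GitLab projects would need an analogous check, e.g. for a `.gitlab-ci.yml` file.

```python
import requests

def has_ci_workflow(owner: str, repo: str) -> bool:
    """Return True if the GitHub repository contains Actions workflow files."""
    # The contents API returns 200 with a JSON array when the directory exists,
    # and 404 when it does not (i.e. no workflow-based automation is defined).
    url = f"https://api.github.com/repos/{owner}/{repo}/contents/.github/workflows"
    resp = requests.get(url, timeout=10)
    return resp.status_code == 200 and len(resp.json()) > 0

if __name__ == "__main__":
    # "example-org/example-controller-firmware" is a placeholder repository.
    print(has_ci_workflow("example-org", "example-controller-firmware"))
```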
Controller code bases introduce their own complexities; therefore it would be great to have an additional column and a testing workstream to grade code quality.
There is a variety of tools; we could start with simple linting to get an idea of static code quality.
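To make the linting idea concrete, a sketch along these lines could produce a first, crude score. It assumes `cppcheck` is installed and that the firmware sources are C/C++; the source path is a placeholder, and any other linter with machine-readable output would work the same way.

```python
import subprocess

def count_lint_findings(source_dir: str) -> int:
    """Run cppcheck over a source tree and return a rough count of findings."""
    result = subprocess.run(
        ["cppcheck", "--enable=warning,style", "--quiet", source_dir],
        capture_output=True,
        text=True,
    )
    # cppcheck reports its findings on stderr; counting non-empty lines gives a
    # crude first "lint score" that could go into an additional column.
    return sum(1 for line in result.stderr.splitlines() if line.strip())

if __name__ == "__main__":
    print(count_lint_findings("./firmware/src"))  # placeholder path
```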
I am open to discuss further details and implementation in this issue thread.
For example:
* Provided documentation must contain instructions on how to set up the compilation environment, install the required dependencies, run the compilation, and what the result should be. Ideally, these steps should be shared as part of the codebase in the form of a test script, like a Dockerfile or similar (see the sketch below).
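Purely as an illustration of the "test script" idea, the following sketch runs documented build steps and checks that the expected flash image appears. The build commands, paths, and artifact name are all hypothetical, and a Dockerfile doing the same would serve equally well.

```python
import pathlib
import subprocess

# Assumed build steps and output file -- in a real project these would come
# from the documented instructions (or live in a Dockerfile instead).
BUILD_STEPS = [["make", "clean"], ["make", "all"]]
EXPECTED_ARTIFACT = "build/controller.hex"

def build_produces_flash_file(source_dir: str = ".") -> bool:
    """Run the documented build steps and verify the flash image is produced."""
    for step in BUILD_STEPS:
        if subprocess.run(step, cwd=source_dir).returncode != 0:
            return False
    return (pathlib.Path(source_dir) / EXPECTED_ARTIFACT).exists()

if __name__ == "__main__":
    print("build OK" if build_produces_flash_file() else "build FAILED")
```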
A further metric could be code coverage, although 100% code coverage is very expensive and rarely seen in practice.
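If coverage data is available, grading it could be as simple as summing the records of an lcov tracefile. This sketch assumes the project produces a `coverage.info` file via gcov/lcov; the file name is an assumption.

```python
def line_coverage_percent(tracefile: str = "coverage.info") -> float:
    """Sum the LH (lines hit) and LF (lines found) records of an lcov tracefile."""
    hit = found = 0
    with open(tracefile) as fh:
        for line in fh:
            if line.startswith("LH:"):
                hit += int(line[3:])
            elif line.startswith("LF:"):
                found += int(line[3:])
    return 100.0 * hit / found if found else 0.0

if __name__ == "__main__":
    print(f"line coverage: {line_coverage_percent():.1f}%")
```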
Further inspiration for graded code quality could come from TIOBE (a commercial tool, but they have a great quality framework and many industrial customers; I have reached out to their CEO in this context): https://www.tiobe.com/tqi/awards/