Motivation
We have various code quality tools in place, but they are mostly informational and do not block anything, i.e. they do not actually enforce quality:
CheckStyle (mostly default ruleset, blocking)
PMD (mostly default ruleset, non-blocking)
SpotBugs (mostly default ruleset, non-blocking)
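For context, the blocking vs. non-blocking distinction above typically comes down to a single flag in Gradle's built-in quality plugins. A minimal sketch, assuming our build applies these plugins via Gradle's Kotlin DSL (the exact setup in our actual build scripts may differ, and the SpotBugs plugin version is an assumption):

```kotlin
// build.gradle.kts -- hypothetical sketch, not our actual build file
plugins {
    checkstyle
    pmd
    id("com.github.spotbugs") version "5.0.14" // third-party plugin; version is an assumption
}

checkstyle {
    // blocking: a rule violation fails the build (the default behavior)
    isIgnoreFailures = false
}

pmd {
    // non-blocking: violations are reported, but the build still passes
    isIgnoreFailures = true
}

spotbugs {
    // non-blocking as well
    ignoreFailures.set(true)
}
```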
We have no real overview of the hotspots, criticality, and low-hanging fruit among the issues in our code base.
@sfavello suggested Code Climate as a substitute for our old LGTM badges in #5095
Code Climate, similar to SonarCloud, is a code quality analysis tool. It might help us get better insight into the state of our code base and how to tackle the quality issues we have, enabling functional improvements and refactoring with better readability, less technical debt, etc.
Proposal
Step 1:
Hook up Code Climate Analysis for our PRs
Optionally hook up other alternatives such as SonarCloud
Evaluate (optionally compare) the resulting analysis findings
true/false positives
criticality/severity information
detailed information on the identified issue and the problem it poses
guidance on how to fix the issue, e.g. code samples, links for further reading, etc.
replaceability / overlap with existing tooling
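To make Step 1 concrete, hooking up Code Climate for PRs is usually a matter of installing its GitHub app on the repo and committing a config file. A hypothetical starting point is sketched below; the enabled plugins and exclude patterns are assumptions to be validated during the evaluation:

```yaml
# .codeclimate.yml -- hypothetical starting point, to be refined during evaluation
version: "2"
plugins:
  checkstyle:
    enabled: true
  pmd:
    enabled: true
exclude_patterns:
  - "**/build/"
  - "**/test/"
```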
Step 2:
Deliberately seed additional issues we want the tooling to be able to find
Evaluate (optionally compare) the resulting analysis findings
true/false negatives
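For Step 2, "seeding" issues means committing a small class with deliberate defects and checking whether the analysis flags them. A hypothetical example with two classic Java pitfalls that default PMD/SpotBugs rulesets commonly catch (the cited bug-pattern names are examples of likely matches, not guaranteed findings):

```java
// SeededDefects.java -- deliberately flawed code to probe for false negatives
public class SeededDefects {

    // Defect 1: reference comparison of strings. Works for interned literals,
    // breaks for runtime-constructed strings; should be flagged
    // (e.g. SpotBugs ES_COMPARING_STRINGS_WITH_EQ).
    static boolean sameName(String a, String b) {
        return a == b; // should be a.equals(b)
    }

    // Defect 2: ignored return value. String.trim() returns a new string and
    // leaves the original unchanged (e.g. SpotBugs RV_RETURN_VALUE_IGNORED).
    static String normalize(String input) {
        input.trim(); // result discarded; this line has no effect
        return input;
    }

    public static void main(String[] args) {
        String copy = new String("Terasology");
        System.out.println(sameName("Terasology", copy)); // false despite equal content
        System.out.println("[" + normalize("  padded  ") + "]"); // still padded
    }
}
```

If a tool misses either defect with its default ruleset, that counts as a false negative for the comparison.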
Step 3:
Consider additional requirements such as customizability, ease of setup (ideally automatable / configurable at the org level - necessary for ModuleLand), maintainability (recurring tasks / maintenance load), etc.
Decide which tool to integrate
Set it up for ModuleLand and other relevant repos as well (e.g. TeraNui, website, launcher, etc.)
Definition of Done:
Tooling evaluated with respect to true/false positives/negatives, usability, and ease of setup
Chosen tooling hooked up for push validation in engine and module repos, as well as any other relevant repo (see above)
Configuration adjusted to replace existing tools and any repo-specific intricacies
Any necessary maintenance tasks documented in the wiki
Existing tools are removed
Concerns / Open Questions
no dependencies and no conflicts expected
Contributors need some understanding of the quality issues that can occur in Java
When introducing quality tooling, we need to make sure it doesn't block us until all issues are resolved - ideally a delta approach as in SonarCloud: newly introduced issues in a PR are blocking, while existing issues are flagged but not blocking
Does the tooling also cover Groovy, Kotlin, JSON, etc.? Our code base is not solely Java.
Are there further alternatives we should think about? (If none come to mind, let's just start with Code Climate and SonarCloud for comparison.)