A 2016 brainstorming session was held in New York City to identify "best practices" (though perhaps "recommended practices" would be a more accurate term).
Below are notes from that session; review them to identify potential criteria (at any level).
Practices of what? Not just code.
The top-level categories were:
Good documentation
Community management
Make it easy for people to contribute
Testing
Discoverability
Release control
Security disclosure/response
License your project (OSS)
CLAs
Governance
Security development lifecycle
Operational security of project (project IT)
Dependency management (ecosystem)
Should packages "measure" best practices, e.g., for use at install time?
Difficult to apply one universal evaluation. Subjective, different concerns, etc.
One challenge is that lessons learned don't spread.
Some practices are difficult to use because of signal-to-noise problems:
make check generating thousands of legacy "problems"
no power to assert "zero warnings" on other people's projects
AI: perhaps have Debian (say) turn "make check" on by default, so that it has to be disabled selectively (rather than enabled)
AI: commonly-used build systems could be updated to do ASAN (AddressSanitizer) builds and checks
Best practices can be trumped by business/commercial considerations
AI: distributions/packaging mechanisms would convey "best practice" attributes to worthy packages (TBD what counts as "worthy")
Below are more detailed recommended practices (these are raw notes from the brainstorming session).
Release control
Stable release branches - yes, but in a different way. Instead of "branches" (which is git-specific), we require the more general notion of tagging, covered in version_tags.
Version numbers for releases - yes, in version_unique
Use semantic versioning - yes, in version_semver
Version control - yes, in repo_public, repo_track, and repo_distributed
Release notes (major changes) for each release - yes, release_notes
Intermediate versions are publicly released (no surprises in release) - yes, repo_interim
Issue tracker (GitHub, Bugzilla, etc.) - yes, report_process and report_tracker.
Good documentation & design
Documentation - yes, documentation_basics and documentation_interface
Published roadmap for future improvements - proposed as documentation_roadmap
A short intro about the project on the webpage & README - yes, documentation_basics
API guidelines / style guide - for the API, see documentation_interface. For code, proposed under coding_standards.
Design documents - added. This is related to know_secure_design; some of it is covered by the proposed implement_secure_design and documentation_security criteria.
Dependency management
Accurate makefile dependencies - If a project uses makefiles (or something like them) and a few dependencies are inaccurate, those are just bugs; we don't want them, but projects can find and fix bugs. To be a criterion, we need a general rule that people should follow, that is widely agreed on as an improvement, and that has evidence to support it. This is harder. We could say "maximally automate dependencies", but it's hard to measure "maximal". We could say "avoid recursive make", citing "Recursive Make Considered Harmful" by Peter Miller. Note that "Non-recursive Make Considered Harmful" agrees that recursive make approaches are bad; its argument is that for large projects you should use a tool other than make (which is fine - we're agnostic about the build system; BadgeApp uses rake). Something like: "The project MUST NOT use recursive-subdirectory build systems where there are cross-dependencies in the subdirectories." - draft criterion build_non_recursive (a sketch follows at the end of this section).
External dependencies listed and traceable - Added as external_dependencies. It doesn't specifically say "traceable", but we think it gets the point across.
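To make the draft build_non_recursive criterion above concrete, here is a minimal sketch of a non-recursive make layout; all file and directory names are hypothetical. A single top-level Makefile includes per-directory rule fragments instead of recursively invoking sub-makes, so one make invocation sees the complete dependency graph, including cross-directory dependencies.

```make
# Top-level Makefile (sketch; all paths are illustrative).
# Subdirectories contribute rule fragments rather than running sub-makes.
include src/module.mk   # defines the rule for src/main.o
include lib/module.mk   # defines the rule for lib/util.o

# Because everything runs in one make instance, this cross-directory
# dependency is tracked correctly.
app: src/main.o lib/util.o
	$(CC) $(LDFLAGS) -o $@ $^
```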
Code review
Code reviews - yes, see proposed two_person_review and security_review. Maybe more specific code review criteria could be added, suggestions welcome.
Community Management
Code of conduct (CoC) - code_of_conduct.
Be nice - this is not universally agreed on. Wheeler suggests you should be nice to people and hard on code, but many people have trouble seeing the difference. Also, there's a difference between "I'm offended" and "I'm attacked" - the latter is the real issue. We suggest that the key issue here is addressed by code_of_conduct.
Discoverability - presumably, in this context, this means "can I find the project?" and "can I find out how to interact with it?" The first part is primarily addressed by criterion description_good - once that's done, search engines can help people find it. Getting a badge also helps with the first part. The second part is helped by criteria interact, contribution, and contribution_requirements.
Have project & repo URL - already in criteria.
Other
Feature roadmap - proposed in criterion documentation_roadmap.
KISS (Keep it Simple) - worthy goal, but very difficult to measure and determine whether or not it's achieved. We do require that people be aware of this, as part of criterion know_secure_design.
Use code formatters - proposed in criterion coding_standards_enforced. We already had coding_standards, but based on this comment have split out coding_standards_enforced as a separate item to emphasize enforcement. We don't specifically require the use of a code formatter - many projects simply use a checker, instead of a reformatter, so we simply require enforcement and let the project decide how to enforce it.
Security disclosure/reporting
Bug reporting instructions - yes, already there in criterion report_process.
Explain how to report vulnerabilities / A way to report security bugs / Security vulnerability reporting procedure - yes, already there in criterion vulnerability_report_process.
Incident response SLA - yes, report_responses, vulnerability_report_response, and vulnerabilities_fixed_60_days.
Operational security of project
Two-factor authentication for developers - proposed as two_factor_authentication (passing+2)
Email/issue tracker security - Just saying "keep your project's development infrastructure secure" is too nebulous - people will agree, but will say they're already doing it. Email security is challenging; GnuPG is used, but many find it difficult to deploy in practice, especially to less technically savvy people. Mandating a specific technique, like GnuPG, doesn't seem like a good approach. We're not sure how to turn this into a specific criterion.
HTTPS for project & repo sites - yes, sites_https
Signed releases - yes, proposed as signed_releases (a signing sketch follows below)
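To illustrate signed_releases, here is one possible sketch (the criterion does not mandate GnuPG or any particular tool; the target and artifact names below are hypothetical):

```make
# Hypothetical "sign" target: create a detached, ASCII-armored GnuPG
# signature (producing myproject-1.2.3.tar.gz.asc) for a release tarball.
sign: myproject-1.2.3.tar.gz
	gpg --armor --detach-sign myproject-1.2.3.tar.gz

# Users can then verify the download with:
#   gpg --verify myproject-1.2.3.tar.gz.asc myproject-1.2.3.tar.gz
```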
Make it easy for people to contribute
Issues marked for new contributors - yes, proposed small_tasks
Contributing guide/doc - yes, in criteria interact, contribution, and contribution_requirements
Documented process for how patches get accepted - yes, in criteria interact, contribution, and contribution_requirements
Low barrier to entry (tools, workflows, etc.) - added as new potential criterion installation_development_quick
Acknowledge bug reports (don’t just sit there) - yes, criterion report_responses. This only requires a majority, not every single report.
Acknowledge/credit contributions & contributors - Added vulnerability_report_credit for giving credit for vulnerability reports. For generic contributions, trying to do this separately can get very long, and many successful projects don't try. In addition, version control systems already record this (e.g., "git log" and "git blame"). In conclusion, it's not clear that adding this as a separate criterion is a universal good for general contributions.
Assume good intention - give commit access soon, revoke & revert if needed - No, because different successful projects disagree on this. Node.js prefers this approach; however, many other successful projects (such as the Linux kernel) expressly do not do this. In addition, from a security point of view, assuming good intentions is not always realistic, especially since "revoke and revert" can be difficult if the committer is actively malicious. Projects can choose whether or not to do this.
Testing!
Add tests when add major new functionality - yes, test_policy and tests_are_added
Code coverage / test coverage >=N% (statement? branch? other?) (2x) - yes, proposed as test_statement_coverage80, test_statement_coverage90, and test_branch_coverage80 (a coverage sketch appears after this list)
Automated test suite - yes, criterion test
Make check with ASAN - yes, dynamic_analysis_unsafe (an ASAN sketch appears after this list)
Continuous integration (2x) - yes, proposed criteria continuous_integration and automated_integration_testing
Can build it - criteria build and build_common_tools; see also build_repeatable and build_reproducible
For parsers, etc.: FUZZ - yes in general, though we don't require fuzzing specifically. Criterion dynamic_analysis.
Use dynamic analysis tools - Criterion dynamic_analysis.
Use static analysis tools / static analysis coverage of code - yes for the first part - criterion static_analysis. Unclear what the author meant by "static analysis coverage of code"; if what was meant was test coverage, see proposed criteria test_statement_coverage80, test_statement_coverage90, and test_branch_coverage80.
Use warning flags - yes, criterion warning_flags and proposed warnings_strict (a warning-flags sketch appears after this list)
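Below are a few sketches for the testing items above, assuming a C project built with make and GCC; all target names are hypothetical. First, for the coverage items: GCC's --coverage option instruments the build, and gcov then reports statement and branch coverage, which a project could compare against the proposed 80%/90% thresholds.

```make
# Hypothetical "coverage" target using GCC and gcov.
# --coverage instruments compiling and linking; "gcov -b" reports branch
# coverage in addition to per-line (statement) coverage.
coverage:
	$(MAKE) clean
	$(MAKE) CFLAGS="--coverage -O0 -g" LDFLAGS="--coverage" check
	gcov -b src/*.c
```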
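For the "make check with ASAN" item, a similar sketch using the standard GCC/Clang AddressSanitizer flags:

```make
# Hypothetical "check-asan" target: rebuild with AddressSanitizer enabled
# and rerun the existing test suite, catching memory errors (use-after-free,
# buffer overflows) that a plain test run would miss.
check-asan:
	$(MAKE) clean
	$(MAKE) CFLAGS="-fsanitize=address -fno-omit-frame-pointer -g" \
	        LDFLAGS="-fsanitize=address" check
```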
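And for warning flags, one typical GCC/Clang starting point (the exact flag set is the project's choice; the criteria require only that warnings be enabled and addressed):

```make
# Common GCC/Clang warning flags; adding -Werror turns every warning into
# an error, one way to enforce the proposed warnings_strict criterion.
CFLAGS += -Wall -Wextra -Wformat=2 -Werror
```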
How to get best practices applied
These are not changes to the criteria, but ideas on how to get the criteria more easily applied.
We've walked through the entire NYC brainstorm list and tried to extract criteria. The list of ways to get more badges is now tracked in a separate issue, so we're closing this issue.