coreinfrastructure / best-practices-badge

🏆 Open Source Security Foundation (OpenSSF) Best Practices Badge (formerly Core Infrastructure Initiative (CII) Best Practices Badge)
https://www.bestpractices.dev
MIT License

Add more about vulnerability reporting (as a new criterion or more details about a criterion) #1204

Open · david-a-wheeler opened 5 years ago

david-a-wheeler commented 5 years ago

It might be good to extend the criteria, or at least add more explanatory material about vulnerability reporting, per this Ars Technica article: "New open source effort: Legal code to make reporting security bugs safer" (subtitle: "The Disclose.io framework seeks to standardize 'safe harbor' language for security researchers") by Sean Gallagher, 2018-08-02. Quote:

The lack of consistency in companies' bug-disclosure programs—and the absence of "safe harbor" language that protects well-intended hackers from legal action in many of them—can discourage anyone who discovers a security bug from reporting it. And vague language in a disclosure program can not only discourage cooperation but can also lead to public-relations disasters and a damaged reputation with the security community, as happened with drone maker DJI last November.

This is related to the criterion vulnerability_report_process in the "passing" level.

david-a-wheeler commented 5 years ago

Current criterion text:

The project MUST publish the process for reporting vulnerabilities on the project site. (URL required for "met".) [vulnerability_report_process]

Details: E.g., a clearly designated mailing address on https://PROJECTSITE/security, often in the form security@example.org. This MAY be the same as its bug reporting process. Vulnerability reports MAY always be public, but many projects have a private vulnerability reporting mechanism.
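To make the criterion concrete, here is a hypothetical `SECURITY.md` sketch showing one way a project might publish its reporting process together with simple safe-harbor language. The file name, address, and wording below are illustrative only, not part of the criterion text:

```markdown
# Security Policy (illustrative example)

## Reporting a Vulnerability

Please email suspected vulnerabilities to security@example.org
(a hypothetical address). You may report privately; we will
acknowledge your report and keep you informed as we investigate.

## Safe Harbor

If you follow this process in good faith to report a vulnerability,
we will not pursue legal action against you for your research or
your report.
```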

We could easily add to the details. Maybe something like this to the end?:

Since the project MUST publish a process for reporting vulnerabilities, it follows that a project CANNOT sue a vulnerability reporter for following that process to report a vulnerability (since the reporter is simply following what the project said to do).

What that does NOT make clear is that if a project doesn't quickly fix a vulnerability, a vulnerability reporter can (and ethically SHOULD) disclose the lapse to the public; otherwise the public is exposed to known vulnerabilities it isn't being made aware of. We should probably address that too, but that's probably a new criterion, and wording it will be tricky. We want projects that are actively concerned about security to have some private time to fix issues if they need it, while still dealing with the bad actors who would rather sue people than protect their customers.