Problem to solve
Apple provides default settings for macOS: many settings are enabled by default after a factory reset. We can consider these to be Apple's own security baseline for the average consumer. When a third-party organisation publishes a set of security guidelines for the macOS platform, it should clearly indicate which guidelines deviate from Apple's defaults (and why the organisation disagrees with Apple's choice).
Our org's IT-Sec team's position is that if a company's baseline requires a deviation from Apple's defaults (for example, enabling the application firewall), those policies demand a higher level of understanding from the technicians (or users) responsible for implementing them than policies that align with Apple's defaults do (for example, preventing Gatekeeper from being disabled). It is therefore important to be as clear as possible about which default settings are considered inadequate.
Intended users
Anyone who wants to understand the guidelines and their impact better.
Further details
The goal is full transparency about which guidelines differ from Apple's defaults.
Proposal
Adding a distinct key to the output of the macOS Security Guidelines, indicating whether a guideline matches Apple's default, would give an IT administrator a clear prompt as to which guidelines they really need to understand in depth.
Although the existing notes fields of some guidelines (for the CIS Benchmarks, at least) do indicate whether the recommendation matches Apple's default, this does not appear to be done systematically, and there is no easy way to audit it at a glance.
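As a rough sketch of how such a key could be audited (assuming a hypothetical matches_apple_default key in each guideline's YAML file; the key name and directory layout are illustrative, not the project's actual schema):

```python
# Sketch only: assumes each guideline is a YAML file carrying a hypothetical
# "matches_apple_default" key. Key name and directory layout are illustrative.
from pathlib import Path

import yaml  # PyYAML

RULES_DIR = Path("rules")  # hypothetical location of the guideline YAML files

for rule_file in sorted(RULES_DIR.glob("*.yaml")):
    rule = yaml.safe_load(rule_file.read_text())
    # Treat a missing key as "unknown" so gaps in coverage are visible at a glance.
    matches = rule.get("matches_apple_default", "unknown")
    if matches is not True:
        print(f"{rule_file.name}: matches_apple_default = {matches}")
```

A one-pass script like this would make the "which guidelines deviate?" question answerable in seconds, rather than by reading every notes field.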
Testing
The only risk I can see is that Apple's defaults may change with each OS release, but checking them should be part of the normal cycle of checks when a new OS ships anyway, since any change Apple makes will be for a reason. A good example is the current CIS benchmark for locking the account after a maximum number of password attempts, which does not take into account the built-in delays imposed after a certain number of failed attempts; in our IT-Sec team's opinion, this renders the guideline obsolete (and bad practice). A spot check along these lines is sketched below.
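For example, a per-release spot check of one default might look like the following (the application firewall's state can be read via /usr/libexec/ApplicationFirewall/socketfilterfw; the "expected default" is an assumption for illustration, not a verified baseline):

```python
# Sketch only: spot-check whether the application firewall default has changed
# on a new OS release. The expected default (firewall off after a clean
# install) is an assumption for illustration and should be verified per release.
import subprocess

result = subprocess.run(
    ["/usr/libexec/ApplicationFirewall/socketfilterfw", "--getglobalstate"],
    capture_output=True, text=True, check=True,
)
expected_default = "disabled"  # assumed factory default; re-verify per OS release
state = result.stdout.strip()
print(state)
if expected_default not in state.lower():
    print("NOTE: observed state differs from the recorded default; re-audit this rule.")
```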
What does success look like, and how can we measure that?
Success is a field in the data output giving a simple Yes/No answer to the question "Does this recommendation match Apple's default setting?" This could then become a distinct column in CSV outputs, reports, and GUI apps such as the Jamf Compliance Editor.
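A minimal sketch of what that CSV column could look like (the rule IDs and field name below are illustrative, not an existing schema):

```python
# Sketch only: emitting the proposed Yes/No field as a distinct CSV column.
# Rule IDs and the field name are illustrative placeholders.
import csv

rules = [
    {"id": "os_gatekeeper_enable", "matches_apple_default": True},
    {"id": "os_firewall_enable", "matches_apple_default": False},
]

with open("baseline.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "matches_apple_default"])
    writer.writeheader()
    for rule in rules:
        writer.writerow({
            "id": rule["id"],
            # A plain Yes/No keeps the column human-readable in reports and GUIs.
            "matches_apple_default": "Yes" if rule["matches_apple_default"] else "No",
        })
```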