OWASP / Top10

Official OWASP Top 10 Document Repository

What should the OWASP Top 10 end result look like? (Summit Process session, Mon AM1) #8

Closed: tghosth closed this issue 6 years ago

tghosth commented 7 years ago

Possible options:

  1. Stay as it is - a top 10 list of application security risks, based on some aggregation of categories?
  2. Change to a "league table" of specific CWEs, purely based on data gathered? (A sketch of this idea follows below.)
  3. Evolve to consider wider issues in application security - this seems to have been the rationale behind "2017 RC1 A7 - Insufficient Attack Protection"?
  4. Something else...?

My personal preference is option 2 with greater focus given to the OWASP Top 10 Proactive Controls or failing that, option 1.
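
Option 2 is essentially a counting exercise. As a rough, hypothetical sketch (in Java; the flat list-of-CWE-IDs input format is an assumption for illustration, not how the project actually collects data), a purely data-driven league table could be computed like this:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of option 2: rank specific CWEs purely by how often
// they appear across submitted datasets. The input format is assumed.
public class CweLeagueTable {

    // Count occurrences of each CWE and return the top ten by frequency.
    public static List<Map.Entry<String, Integer>> rank(List<String> reportedCwes) {
        Map<String, Integer> counts = new HashMap<>();
        for (String cwe : reportedCwes) {
            counts.merge(cwe, 1, Integer::sum);
        }
        List<Map.Entry<String, Integer>> table = new ArrayList<>(counts.entrySet());
        table.sort(Map.Entry.<String, Integer>comparingByValue().reversed());
        return table.subList(0, Math.min(10, table.size()));
    }

    public static void main(String[] args) {
        List<String> reports = List.of("CWE-79", "CWE-89", "CWE-79",
                "CWE-611", "CWE-89", "CWE-79");
        rank(reports).forEach(e -> System.out.println(e.getKey() + ": " + e.getValue()));
    }
}
```

In practice the raw counts would need normalising across submitters (sample sizes, testing methodologies), which connects to the data-skew concern raised further down the thread.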

jmanico commented 7 years ago

1) Brian Glas and Sync published very solid AppSec data analysis that we should all read. I like the idea of making the OWASP Top Ten data-driven.

2) XXE has been a significant method of attack in real breaches and has been a top ten finding from some of the data sent to the project.

Plus, the defense is unique: "configuring XML parsers for safety". (A hardening sketch follows at the end of this comment.)

3) On that note: if we have the data and publish the top ten, it would be nice if the publication "keeps going beyond the 10" and lists other top items.

4) Last, per the data: what about releasing a Top Ten breach reasons list à la Sync's work, as well as a Top Ten findings list à la Brian Glas's work? It's where the data goes...

Just my 4 cents.

Aloha, Jim

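As a concrete illustration of the "configuring XML parsers for safety" defence from point 2 above, here is a minimal hardening sketch for a JAXP DocumentBuilderFactory in Java. It is a sketch under the assumption of a Xerces-based parser; disallowing DTDs outright is the simplest option where the application doesn't need them:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class SafeXmlParser {

    // Build a DocumentBuilderFactory hardened against XXE.
    public static DocumentBuilderFactory newHardenedFactory()
            throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();

        // Disallow DTDs entirely: this blocks external entities and
        // entity-expansion ("billion laughs") attacks in one step.
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);

        // Belt-and-braces settings for parsers where DTDs must stay enabled.
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);

        return dbf;
    }

    public static void main(String[] args) throws Exception {
        DocumentBuilder builder = newHardenedFactory().newDocumentBuilder();
        // builder.parse(...) will now reject documents containing a DOCTYPE.
        System.out.println("Hardened parser ready: " + builder);
    }
}
```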

raesene commented 7 years ago

There is a challenge, of course, in placing too much reliance on a data-driven model: only findings for which data has been gathered get considered for inclusion in the Top 10, and depending on what that basis is, the result might be overly restrictive. This leads to a potential chicken-and-egg scenario where data isn't gathered widely enough on new and emerging issues, leading to them not being eligible for inclusion in the Top 10.

To provide a couple of concrete examples: 2013's new A9, "Using components with known vulnerabilities", got quite a bit of pushback as I recall, as there wasn't at the time a lot of historical data supporting its inclusion.

However, if you look at the period from 2013 to now, I'd suggest that the wide range of high-profile issues we've had which would fall into that category makes it seem like it was a good choice for inclusion.

Also looking at things like the current draft's proposed A7, this type of proactive control would never be eligible under a model which used vulnerability data as the primary source of metrics to decide inclusion...

Not to say that data doesn't have a place in analysing what's happening in the AppSec world and what makes sense for inclusion, but just to sound a note of caution on the potential downsides of placing more reliance on it.

jmanico commented 7 years ago

I think that is a totally fair concern, Rory. A data-only approach also makes abuse easy: submitters' data can skew the results.

At the time, A9 looked strange at best. Looking back, it was spot on.

By the same token, how can we best motivate developers to improve their defense in 2017? By configuring their XML parsers correctly or by taking on the amorphous and complex topic that is A7? I think we serve the community better by encouraging developers to configure their XML parsers to stop XXE and similar. A7 is very difficult and expensive to address and does not address specific risks. So again, I'm leery of this category when we have more pressing and specific issues we could be highlighting.

Aloha, Jim


raesene commented 7 years ago

Well, I guess I'd look at universal applicability as part of the criteria (so what percentage of applications is potentially affected).

If we take XML parsers as an example, issues with them obviously only apply where an XML parser is used, so it's a subset of applications that could potentially have the issue. Anecdotally, from my perspective as an application security tester, I'd say I'm seeing fewer of those than I used to (several stacks are focusing more on JSON/REST setups), but then perhaps prevalence is something we could gather data about.

By comparison, pretty much every application I test lacks any form of active defence or response capability, and I know for a fact that my life as a "pseudo bad-guy" would be made far more difficult if they deployed even basic mechanisms to deter automated attacks.

If I were comparing AppSec to the overall infosec industry, I would point to the realisation in the general space that preventative controls alone are not enough, and that focus needs to be placed on detective and reactive controls to provide a strong security model.

AppSec is unfortunately badly lacking in detective and reactive controls, in my experience, and that's where I think more could be done to drive developer awareness.

Cheers

Rory

jmanico commented 7 years ago

Good thoughts, Rory.

> Well, I guess I'd look at universal applicability as part of the criteria (so what percentage of applications is potentially affected).
>
> If we take XML parsers as an example, issues with them obviously only apply where an XML parser is used, so it's a subset of applications that could potentially have the issue. Anecdotally, from my perspective as an application security tester, I'd say I'm seeing fewer of those than I used to (several stacks are focusing more on JSON/REST setups), but then perhaps prevalence is something we could gather data about.

FWIW, I'm a LOT more concerned about addressing top risks in the real world than worrying about universal applicability. For example, not all applications use SQL, but SQL injection is a very common path for real-world exploitation. Please note that two of the data submitters suggested we add this (XXE/XML parsing) category.

> By comparison, pretty much every application I test lacks any form of active defence or response capability, and I know for a fact that my life as a "pseudo bad-guy" would be made far more difficult if they deployed even basic mechanisms to deter automated attacks.

Again, this is a very amorphous topic. There are many ways to evade this type of technology. I respectfully do not buy this argument.

> AppSec is unfortunately badly lacking in detective and reactive controls, in my experience, and that's where I think more could be done to drive developer awareness.

Another reason why I do not like this as a Top 10 topic is "ability to deploy". These technologies either cost a great deal of money or a great deal of developer time, in ways that do not always address specific risks. It's valuable to detect attacks. But as a top-ten developer task? I am not sure I buy it.

Rory, even if we disagree, these comments are intended with respect.

Aloha, Jim


raesene commented 7 years ago

Interesting ideas, Jim. I think it's fair to say we have a difference of opinion here, but then there's a wide range of opinions to be considered as part of the Top 10 process, and I'll be very interested to see where it all goes.

As to your point about the developer effort needed to deploy some form of detective/reactive controls in applications, that's exactly where I'd see OWASP being able to help make it easier.

Obviously we already have the work done by the excellent OWASP AppSensor project, and with more attention being paid to the topic (perhaps due to inclusion in the Top 10 ;o) ), it should be possible to ease that initial effort for developers and make it easier for them to include this kind of control in their applications...
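
To make the "ease that initial effort" point tangible: a detection point, at its simplest, is just code that records security-relevant events and reacts once a threshold is crossed. The sketch below is purely illustrative and does not use the real AppSensor API; every class and method name here is hypothetical:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical AppSensor-style detection point: count suspicious events
// per user and trigger a response once a threshold is crossed.
public class DetectionPoint {

    private final Map<String, AtomicInteger> eventCounts = new ConcurrentHashMap<>();
    private final int threshold;

    public DetectionPoint(int threshold) {
        this.threshold = threshold;
    }

    // Record a suspicious event, e.g. a request parameter that fails
    // strict server-side validation in a way no browser would produce.
    public void record(String userId) {
        int count = eventCounts
                .computeIfAbsent(userId, k -> new AtomicInteger())
                .incrementAndGet();
        if (count >= threshold) {
            respond(userId);
        }
    }

    // Reactive control: in a real system this might lock the account,
    // require step-up authentication, or alert operations.
    private void respond(String userId) {
        System.out.println("ALERT: possible automated attack by " + userId);
    }
}
```

The mechanism itself is simple; as the next comment notes, the real effort lies in choosing and embedding the detection points throughout actual application code.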

jmanico commented 7 years ago

> ...it should be possible to ease that initial effort for developers and make it easier for them to include this kind of control in their applications...

That's the thing: it's not easy. AppSensor requires detection points in code, and I would much rather developers fix their XML parser and similar before embedding detection points into insecure code. WAFs require extensive maintenance, rule updating, and app-specific configuration. RASPs affect performance and are non-trivial to deploy. I feel it's more pressing to fix the basic technical issues in your code before you go down the rabbit hole of deploying detection technology.

So again, I think this is very valuable tech after the basics are covered.

It's totally OK to disagree. And I have no idea what the Top Ten process should be, or even what the Top Ten items are. I am looking forward to the change in project management here and to clarity over these items. I'm open to A7 even if I disagree with it. I am eager to see where the process goes.

Aloha, Jim


tghosth commented 7 years ago

This was discussed in the session. The outcome, based on @vanderaj's summary, was basically:

vanderaj commented 6 years ago

I'm pretty sure this issue is now closed as this is exactly what's happening. Please follow along in GitHub as we modify A7 / A10 and re-order based on data from Friday onwards.