
A student asked me how the OWASP Top 10 are ranked, and on which indicators: is it severity? Ease of exploit? Ease of implementing their countermeasures? After all, how severe each of these vulnerabilities is depends on the possible misuse case.

Furthermore, I would be interested in other Top 10 lists besides OWASP's, and ones that go beyond web vulnerabilities.

I would appreciate answers backed by references.

Phoenician-Eagle

3 Answers


As per the OWASP top ten page:

  • The OWASP Top Ten represents a broad consensus about what the most critical web application security flaws are. Project members include a variety of security experts from around the world who have shared their expertise to produce this list.

The updated version takes into account comments from industry and OWASP members in order to be as relevant as possible.

If you do want stats to help you assess a top ten relevant to your own situation, I presented to ISACA Scotland in October 2010 on the top seven issues small businesses could tackle.

As background to my talk, I used data from these three reports:

I hope that will be of some use to you.
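
As a rough illustration of how such report data can feed a custom top list, here is a minimal Python sketch that ranks weakness categories by how often they appear in incident records. The categories and counts are made-up placeholders, not figures from those reports.

```python
from collections import Counter

# Each entry stands for one reported incident's primary weakness class
# (hypothetical data, not taken from the reports mentioned above).
incidents = [
    "SQL Injection", "XSS", "SQL Injection", "Stolen Credentials",
    "XSS", "SQL Injection", "CSRF", "Misconfiguration", "XSS",
]

# Rank categories by raw frequency, most common first.
for rank, (category, count) in enumerate(Counter(incidents).most_common(), 1):
    print(f"#{rank}: {category} ({count} incidents)")
```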

Rory Alsop
  • Is there any vulnerability list that's just based on frequency of real-world reports? ...and would that even be useful? :-D – Graham Apr 18 '11 at 12:02
  • @Graham - The WHID and Verizon reports are exactly that. – Rory Alsop Apr 18 '11 at 12:32
  • @Paul - And now the 2011 Verizon DBIR is here: http://www.verizonbusiness.com/resources/reports/rp_data-breach-investigations-report-2011_en_xg.pdf and some analysis here: http://blog.7elements.co.uk/2011/04/2011-verizon-dbir-out-its-still-basics.html – Rory Alsop Apr 21 '11 at 10:31

The latest version, OWASP T10 2010, takes into account risk factors, including the potential damage. See here for further explanation.
For example, XSS is considered more common than SQL injection, but the damage from SQLi is potentially much more severe; hence SQLi is #1 and XSS is relegated to #2.
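
To make the factoring concrete, here is a minimal sketch of the likelihood × impact idea: three likelihood factors (exploitability, prevalence, detectability) on a 1–3 scale are averaged and then multiplied by technical impact, loosely following the T10 2010 rating scheme. The numeric values are illustrative assumptions, not OWASP's published ratings.

```python
# Sketch of likelihood x impact factoring in the style of the OWASP T10 2010
# risk rating: three likelihood factors on a 1 (low) .. 3 (high) scale are
# averaged, then multiplied by technical impact. The numbers below are
# illustrative assumptions, not OWASP's published ratings.

def risk_score(exploitability: int, prevalence: int, detectability: int,
               impact: int) -> float:
    likelihood = (exploitability + prevalence + detectability) / 3
    return likelihood * impact

# XSS is assumed more prevalent, but SQLi's higher impact puts it on top.
sqli = risk_score(exploitability=3, prevalence=2, detectability=2, impact=3)
xss = risk_score(exploitability=3, prevalence=3, detectability=3, impact=2)
print(f"SQLi: {sqli:.2f}, XSS: {xss:.2f}")  # SQLi: 7.00, XSS: 6.00
```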

Earlier versions of the OWASP T10 did not take that into account - there was no factoring involved, simply filtering MITRE data for relevant web app vulnerabilities.


In addition to OWASP's Top 10 (which is really a great place to start), you should look at the SANS Top 25 "Most Dangerous Software Errors" (not limited to web).
This is actually just the top 25 from MITRE's Common Weakness Enumeration (CWE), which is basically a complete list of all the types of common badness found in programmers' code.
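
A tiny sketch of the "prevalence and importance" prioritization quoted in the comments below might look like this. The CWE IDs are real; the 1–3 scores are made-up placeholders, not the actual Top 25 inputs.

```python
# Hypothetical "prevalence x importance" prioritization of CWE weaknesses,
# loosely modeled on the Top 25 methodology quoted in the comments below.
# The CWE IDs are real; the 1-3 scores are made-up placeholders.
weaknesses = {
    "CWE-89 (SQL Injection)": {"prevalence": 3, "importance": 3},
    "CWE-79 (Cross-Site Scripting)": {"prevalence": 3, "importance": 2},
    "CWE-120 (Classic Buffer Overflow)": {"prevalence": 2, "importance": 3},
}

ranked = sorted(weaknesses.items(),
                key=lambda kv: kv[1]["prevalence"] * kv[1]["importance"],
                reverse=True)
for rank, (name, factors) in enumerate(ranked, 1):
    score = factors["prevalence"] * factors["importance"]
    print(f"#{rank}: {name} (score {score})")
```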

AviD
  • As per OWASP, this is a result of their shift towards proper risk management, and not just "enumerating badness". I think it's definitely a step in the right direction, even though I'm not 100% with their rating methodology. – AviD Apr 20 '11 at 20:08
  • 1
    Re SANS, I don't know how they define their ratings, but quoting: `"It leverages experiences in the development of the SANS Top 20 attack vectors and MITRE's Common Weakness Enumeration (CWE) .... This year's Top 25 entries are prioritized using inputs from over 20 different organizations, who evaluated each weakness based on prevalence and importance"` – AviD Apr 20 '11 at 20:12
  • also excellent pointers! – Phoenician-Eagle Apr 20 '11 at 20:13

Check out the OWASP podcast episode 82 with Dave Wichers. He addresses this question.

getahobby