Monday, May 23, 2011

Risk analysis and things that go boom.

A recent paper with the sexy title “Understanding and Managing Risk in Security Systems for the DOE Nuclear Weapons Complex” (and subsequent coverage of it) wound some people up over its attack on probabilistic risk analysis (PRA).  You can download a free copy of the PDF at http://www.nap.edu/catalog.php?record_id=13108.  It is worth a read: it is only 30 pages, and the meat is really only a few of those.  Read the preface on page IX, then the content on pages 1-5, and you can skip the rest unless you really want the tedium of government documentary fluff.  (Note: this is the public, sanitized version- there is a longer, and understandably classified, version).
Here’s the quote that seems to trigger the reaction:
“The committee concluded that the solution to balancing cost, security, and operations at facilities in the nuclear weapons complex is not to assess security risks more quantitatively or more precisely.  This is primarily because there is no comprehensive analytical basis for defining the attack strategies that a malicious, creative, and deliberate adversary might employ or the probabilities associated with them.”
I don’t have a problem with this; I think it is dead on.  Part of my frustration with the risk analysis crowd is that many of them insist on using made-up or otherwise useless metrics for “calculating” their “probabilities”.  That isn’t the issue here, though- in this case, PRA fails for the reason stated above:
“…there is no comprehensive analytical basis for defining the attack strategies that a malicious, creative, and deliberate adversary might employ…”
This is especially true in the context of this document- physical threats to the US nuclear weapons arsenal.  The consequences are simply too high to do anything less than the absolute best job possible.  That is not true for what we deal with in information security, no matter what the cyberhypemeisters tell us.  Things like “acceptable level of compromise” aren’t acceptable when we’re talking about nuclear weapons.  Small incidents with nukes are not small.  (Tangent: say what you will about his politics, birth certificate, whatever- I am relieved to hear a president who doesn’t say “nuculer”).
Now, if you want to put together some good metrics, reliable and repeatable ones, and use them for predictive modeling in environments where some margin of error is acceptable (as in most of what we do in InfoSec), we can work on that.  Just don’t tell me that those good metrics are common in our field, and never forget the underlying truth that an attacker with adequate resources will ALWAYS defeat us- then we have something to work on.  Even the authors of this paper see the value in PRA, just not as an absolute.  Immediately following the above quote is this comment:
“However, using structured thinking processes and techniques to characterize security risk could improve NNSA’s understanding of security vulnerabilities and guide more effective resource allocation.”
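To be concrete about the kind of “structured thinking” quantitative tools produce, here is a minimal sketch of a Monte Carlo annualized-loss estimate in Python.  Every input here (incident rate, loss range) is frankly invented for illustration- which is exactly the point of the post: output like this is only as good as those inputs, and in our field the good, measured versions of these numbers are rare.

```python
import random

random.seed(1)

# All inputs below are INVENTED for illustration.  Real analysis needs
# measured, defensible values here, which is the hard part.
INCIDENT_RATE = 0.8                    # assumed incidents per year (Poisson rate)
LOSS_LOW, LOSS_HIGH = 10_000, 250_000  # assumed per-incident loss range, USD
TRIALS = 50_000

def incidents_in_year(rate):
    """Draw a Poisson count by summing exponential inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t >= 1.0:
            return n
        n += 1

def one_year_loss():
    """Total simulated loss for one year: sum a uniform loss per incident."""
    return sum(random.uniform(LOSS_LOW, LOSS_HIGH)
               for _ in range(incidents_in_year(INCIDENT_RATE)))

estimated_ale = sum(one_year_loss() for _ in range(TRIALS)) / TRIALS
print(f"Estimated annualized loss: ${estimated_ale:,.0f}")
# Analytically this converges toward rate * mean loss = 0.8 * 130,000 = 104,000.
```

The simulation itself is trivial; the model’s credibility lives entirely in the two assumed constants at the top, which is why “more quantitative” is no cure for not knowing the adversary.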
Brian Snow talked about this paper on an episode of Risky Business last month if you want to hear his perspectives (Brian was one of the authors of the paper).
My takeaway?  We have to determine whether the tools we use are truly appropriate for the task at hand- and whether we are using those tools properly.  With good metrics and measurements we can gain insight from risk analysis.  Conversely, with crap statistics, improperly applied, we can waste an astounding amount of time and money.  No hype, no drama, just a little common sense.
[Note: Some days you question the paths you have taken in life, and where they have led you.  One such day (night, actually) I had just returned to my room from Frankie’s Tiki Room in Las Vegas.  I was preparing for a brief slumber when I noticed the two documents on my desk- and asked myself how the [expletive] I got to a place in life where this makes sense (at least to me)].

[Photo: IMAG0225]

Jack