Sunday, May 29, 2011

Spaf’s Memorial Day thoughts on “CyberWar”

Take a few minutes and go read Gene Spafford’s “U.S. Memorial Day Thoughts on Cyber War”.  This is not the typical “OMFG THE SKY IS FALLING ONLY THE GOVERNMENT AND MILITARY CONTRACTORS CAN SAVE US FROM THINGS THEY CAN’T DEFEND THEMSELVES AGAINST!!!11!1” bullshit we regularly see, nor is it the oft-repeated flippant dismissal of the existence of whatever it is people mean by “cyber war”.  It is a reasoned and balanced view of the current situation, and a look at where we seem to be headed- from the perspective of Dr. Spafford.

His observations about the state of technology education may be the scariest thing about the situation, and if unchecked will likely be more devastating than any “attack” we may suffer.



Monday, May 23, 2011

Risk analysis and things that go boom.

A recent paper with the sexy title “Understanding and Managing Risk in Security Systems for the DOE Nuclear Weapons Complex” (and subsequent coverage of it) wound some people up over its attack on probabilistic risk analysis (PRA).  A free copy of the PDF is available for download.  It is worth a read; it is only 30 pages, and the meat is really only a few pages.  Read the preface on page IX, then the content from pages 1-5, and you can skip the rest unless you really want the tedium of government documentary fluff.  (Note: this is the public, sanitized version- there is a longer, and understandably classified, version).
Here’s the quote that seems to trigger the reaction:
“The committee concluded that the solution to balancing cost, security, and operations at  facilities in the nuclear weapons complex is not to assess security risks more quantitatively or more precisely.  This is primarily because there is no comprehensive analytical basis for defining the attack strategies that a malicious, creative, and deliberate adversary might employ or the probabilities associated with them.”
I don’t have a problem with this; I think it is dead on.  Part of my frustration with the risk analysis crowd is that many of them insist on using made-up or otherwise useless metrics for “calculating” their “probabilities”.  That isn’t the issue here, though- in this case, PRA fails for the reason stated above:
“…there is no comprehensive analytical basis for defining the attack strategies that a malicious, creative, and deliberate adversary might employ…”
This is especially true in the context of this document- physical threats to the US nuclear weapons arsenal.  The consequences are simply too high to do anything less than the absolute best job possible.  That is not true for what we deal with in information security, no matter what the cyberhypemeisters tell us.  Things like “acceptable level of compromise” aren’t acceptable when we’re talking about nuclear weapons.  Small incidents with nukes are not small.  (Tangent: say what you will about his politics, birth certificate, whatever- I am relieved to hear a president who doesn’t say “nuculer”).
Now, if you want to put together some good metrics, reliable and repeatable ones, and use them for predictive modeling in environments where some margin of error is acceptable (as in most of what we do in InfoSec), we can work on that.  Just don’t tell me that those good metrics are common in our field, and never forget the underlying truth that an attacker with adequate resources will ALWAYS defeat us- accept that, and we have something to work on.  Even the authors of this paper see the value in PRA, just not as an absolute.  Immediately following the above quote is this comment:
“However, using structured thinking processes and techniques to characterize security risk could improve NNSA’s understanding of security vulnerabilities and guide more effective resource allocation.”
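The objection in the paper is to the inputs, not to the arithmetic itself.  For anyone who hasn’t seen the mechanics, here is a toy sketch of the classic quantitative risk formula (ALE = ARO × SLE); every scenario and number below is invented for illustration, and the whole point is that the output is only as good as those made-up inputs:

```python
# Toy ALE (Annualized Loss Expectancy) calculation, the classic
# quantitative risk formula: ALE = ARO (annual rate of occurrence)
# x SLE (single loss expectancy).  All scenarios and dollar
# figures below are invented examples, not real data.

def ale(aro, sle):
    """Annualized loss expectancy for one threat scenario."""
    return aro * sle

scenarios = {
    "stolen laptop":      {"aro": 2.0, "sle": 25_000},
    "web app breach":     {"aro": 0.5, "sle": 200_000},
    "insider data theft": {"aro": 0.1, "sle": 1_000_000},
}

for name, s in scenarios.items():
    print(f"{name}: ${ale(s['aro'], s['sle']):,.0f}/yr")
```

The math is trivial; the hard (and, per the committee, impossible-to-complete) part is justifying the rate and loss numbers against a creative, deliberate adversary.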
Brian Snow talked about this paper on an episode of Risky Business last month if you want to hear his perspectives (Brian was one of the authors of the paper).
My takeaway?  We have to determine if the tools we use are truly appropriate for the task at hand- and if we are using those tools properly.  With good metrics and measurements we can gain insight from risk analysis.  Conversely, with crap statistics, improperly applied, we can waste an astounding amount of time and money.  No hype, no drama, just a little common sense.
[Note: Some days you question the paths you have taken in life, and where they have led you.  One such day (night, actually) I had just returned to my room from Frankie’s Tiki Room in Las Vegas.  I was preparing for a brief slumber when I noticed the two documents on my desk- and asked myself how the [expletive] I got to a place in life where this makes sense (at least to me)].



Friday, May 6, 2011

Astaro accepts offer from Sophos.

This is about my employer.  It is an unusually corporate market-y sounding kind of post for me.  Feel free to skip this one if you aren’t interested, I will not be offended.

It is kinda big news for some of us.  The nice folks who pay me to do all kinds of cool things, Astaro, have agreed to be purchased by Sophos.  There are a lot of questions that I have seen and heard, and some utter nonsense has been said.

BUT FIRST: I am just an employee of Astaro.  I am not a founder, owner, or senior management team member.  What follows are my personal observations and opinions.  I have no “inside knowledge” to share, and even if I did, I couldn’t.

The official press release and FAQ have the basic info.  There are several blog posts and articles about it.  My new colleague Graham Cluley over at Sophos’ Naked Security Blog did a good summary post with links to several other articles and posts.  Mike Rothman’s analysis over at Securosis is a good one.  I’m sure you can find more.  You can also find some that are off the mark.  If you read something that doesn’t make sense, please apply a little skepticism.  (If you read this blog, that should be easy for you).

I have received several questions, and I will UNOFFICIALLY answer them based on my understanding of the situation:

Q: What about the free home version of Astaro?

A: Don’t worry, it is not going away.  Astaro’s management team will continue to manage the Astaro line as part of Sophos.  The home version is important to them, and to most of us in the Astaro team.  It is a key part of Astaro’s success, and a key part of building the Astaro community.  It would be silly to discontinue it.

Q: You say that now, but will it change as the product evolves? 

A: I am sure it will, and have no idea what that will look like.  If I were psychic, I’d be a gambler, not a packet monkey.  But, see previous answer.

Q: What about X Open Source project?

A: Open Source provides great value to Astaro, and Astaro provides support back to Open Source projects.  That will continue.  And, any Open Source code will stay Open- as the licenses require.

Q: What about BSides and other community sponsorships and support?

A: Short term- nothing changes.  The awesome Astaro PR and Marketing team is committed to building communities.  It is a differentiator for Astaro, and it is the right thing to do.

Long term- it is a financial decision, as long as it makes sense, I expect it to continue.  And I expect that to make sense for a long time.

Now, a little strategy talk. 

This creates a combined company with a broad diversity of security and management products.  There is not a lot of overlap in product lines, so there is not much redundancy to reconcile, there will just be the challenges of integration where appropriate.  (On the Astaro gateway side, that’s pretty easy- it is a modular platform which has allowed adding and modifying components and features as the product and customer needs have evolved).

Some have said that the endpoint and network security channel partners are different, and the buyers are different, and this will cause difficulty for the combined company.  While that may be true in limited cases, most likely in larger environments, my experience brings me to quite the opposite conclusion.  I talk to our partners, as well as other VARs, MSPs and resellers regularly; most I speak with want a complete and diverse product line to offer their customers and prospects.  Likewise, the pressures of the economy and the never-ending push for increased efficiency are driving the consumer to look for efficiencies and cost savings.  This pressure on those in the IT trenches is why the UTM (Unified Threat Management) segment is gaining traction in ever larger environments- simplified, unified, cost-efficient products conserve scarce resources.  It only makes sense that a properly integrated, quality suite of products will be attractive to businesses.  And even in the cases where the desktop team doesn’t “play nice” with the firewall guys (or web filtering, or whatever), I have a couple of thoughts:

  • It is about the company’s best interests, the pressure is on, and cooperation is happening, or will happen.  With the current teams, or those who replace them.
  • More importantly, the budget authority is frequently above these levels, and good managers understand the value of efficiency.

Things will change. There will be opportunities, there will be missteps, and there will be successes. I believe this is a good move, but that is speculation: it is now up to us in the new, combined company to prove it.

And finally (for now): if you have questions, comments, or concerns- let us know.  If you do not know who to ask in the Astaro team, or at Sophos, ask me.

Drop a note to jdaniel at astaro dot com.  I am on the road a lot, especially for the next month, so my responses may not be immediate, but ask me, and I will answer as soon as I can, or I will connect you with the answer.



Wednesday, May 4, 2011

Verizon DBIR (or, I told you so)

Now that I’ve had over a week to read, re-read, digest, etc.

I told you so.  For all the scary, uber-sophisticated attacks we run off to conferences to see, and all the amazing feats of exploitation we hear about, real-world compromises are most often exploiting basic failures in security.  If you are a regular reader of the Verizon Data Breach Investigations Report you will know that the DBIR has again confirmed our failure to secure the basics.

That is a pretty gross oversimplification, but it is true.  This year’s report reflects a pretty significant shift from the enterprise to SMB, and has some interesting data.  One thing that many have latched on to is the rise in the number of breaches, but significant drop in the number of records breached in 2010; if Verizon’s numbers reflect the world at large we will see a stunning reversal in the 2011 data.  This anomaly doesn’t alter the value of DBIR data, but it highlights the difficulties in making pronouncements based on a single report.

Before we go on, some background information will be useful.  The Verizon report includes both Verizon and US Secret Service data, and while it represents hundreds of cases the experience is far from universal.  There is a lot of selection bias at play, and that narrows the scope of the results.  See my recent How to misinterpret the Verizon DBIR post for more thoughts on interpreting the report.

As far as the substance of the report, there are a handful of things I find insightful, or at least interesting:

  • The “internal threat” is real.  Just not a big deal compared to outsiders kicking our butts.  And the insider take tends to be smaller than the external attackers’ haul.
  • 83% of victims were targets of opportunity.  Too many people are still making it easy.
  • Speaking of… 92% of attacks were “not highly difficult” per Verizon.  Even if we argue about the ones Verizon labels “moderate”, there are still 43% of attacks in the “stupid easy” category.  (OK, technically speaking, Verizon refers to 6% with a difficulty of “none” and 37% “low”).
  • 89% subject to PCI-DSS had not even achieved compliance with this fraudulently imposed sub-minimum sub-standard.  (Sorry, I may have let a little editorializing slip in there).
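For the pedantic, the difficulty percentages above reconcile as simple arithmetic.  Note that the “moderate” and “high” figures in this sketch are inferred from the quoted totals (92% “not highly difficult”, 43% none-plus-low), not quoted directly from the report:

```python
# Reconciling the DBIR attack-difficulty percentages quoted above.
# "none" (6%) and "low" (37%) are quoted figures; "moderate" is
# inferred from the 92% "not highly difficult" total, and "high"
# is the remainder -- inferred, not quoted from the report.
none_pct, low_pct = 6, 37
not_highly_difficult = 92

stupid_easy = none_pct + low_pct                        # the 43% "stupid easy" bucket
moderate = not_highly_difficult - stupid_easy           # inferred
high = 100 - not_highly_difficult                       # inferred remainder

print(stupid_easy, moderate, high)
```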

Continuing trends included:

  • Organizations still do not know where their stuff is and how it can be accessed (the unknown unknowns).  This appears to be improving, but it is still a big problem.
  • Organizations do not log everything they should, but it is OK, because they don’t look at the logs anyway.  At least not until it is too late.
  • And how do they know it is too late?  Again, third parties are much more likely to discover a breach than the organizations themselves.

There are a lot of ATM, gas station, and other POS (Point of Sale) system attacks in this year’s report, largely split into two categories: physical compromise of ATM and gas station card readers, and exploitation of remote access deficiencies in POS systems.  While the remote access attacks fit in with our traditional idea of criminals attacking computer systems, the physical installation of skimmers on ATMs and gas pumps does not.  It is hard to look cyber while standing at the point of compromise, carrying hardware and wearing a toolbelt.

I like this report more each year.  BUT, there are things which make me nuts.  If I could have two things from the DBIR team they would be:

  • More raw data.  Give me the numbers.  I understand that too much detail could undermine the anonymization, but I want more raw data.  (Yes, I’m one of those folks who generally believes good data visualization means a readable font in the spreadsheet).
  • An adult version of the report.  Take out the redundant high-point popups, they’re a distraction to those of us who really read the report.  Pull out some of the infographics, too (see above). I’ll read the whole report, with highlighter and pen in hand, more than once, and decide what is important to me.

That’s enough from me- there are already more than enough summaries of the DBIR out there.



Monday, May 2, 2011

Cloud computing resources

No hype here.  No “cloud will change everything” nonsense (it won’t).  No “cloud is nothing new” nor “cloud is completely new” nonsense, either (cloud is perfect for a wedding- “Something Old, Something New, Something Borrowed, Something Blue, and a Silver Sixpence in Her Shoe.” But you’ll need more than sixpence).
If you’ve been keeping up with the smart cloud folks, you probably won’t find anything exciting here- but below are some good general resources.
Properly deployed for appropriate purposes, cloud computing can be fantastic.  I have moved most of my lab systems to a cloud environment and it has provided a huge improvement in my ability to test systems and deliver demonstrations.  My employer uses cloud systems to deliver content and services for partners and customers more effectively than we could with internal resources.  But, cloud computing is not for everyone, or for everything.  You just need to research, plan, and migrate wisely.
There are a handful of very good cloud computing security documents out there, here are ones I recommend (some are pretty big PDFs):
Start with the NIST definitions doc; it was only two pages, but has been bloated to seven without adding value.  Just read the last two pages and ignore the rest.  It is not “security specific”, but it sets a common terminology for the rest.
My new favorite cloud security reference is from the Australian Defence (yeah, they spell it funny over there) Signals Directorate; their Cloud Computing Security Considerations is a great resource, and a great conversation starter for those considering a move to cloud computing.  (It is 19 pages and an easy read, too).  If you read only one, read this.  And share it.
For more meaty discussions of cloud security, it is hard to beat the documents recommended for those preparing to take the Cloud Security Alliance CCSK (Certificate of Cloud Security Knowledge) exam:
CSA’s own “Security Guidance for Critical Areas of Focus in Cloud Computing V2.1” is not a light read, and is enterprise focused, but has a lot of good information.
The other study document is the ENISA “Cloud Computing Risk Assessment”.  It is also not a quick read, but has more small- to mid-sized business focus (reflecting its European origin).
Speaking of CCSK, it is an interesting certification.  I’ve recently passed the exam, and heartily recommend the study material- but the certification is probably of limited value to most people until “cloud” is better understood.  As you would expect, CSA has an enormous amount of information on their site, covering a myriad of cloud concepts.
A couple more references for those of you who want a broader understanding:
NIST also has a “Cloud Computing Reference Architecture” which needs some help in the area of readability, but is a good resource, especially for the discussion of cloud computing roles.
OpenCrowd’s Cloud Taxonomy is useful for help in categorizing cloud products and services and for understanding the categories.
This is by no means a complete, or even exhaustive list (although I do feel somewhat exhausted); it is just a pile of stuff that I hope will be helpful to those considering a move to cloud computing (or to those already in the clouds, but afraid of heights).