Monday, June 14, 2010

Security BSides Las Vegas announcements

The venue for Security BSides Las Vegas is phenomenal.  As great as that is, BSides is about content and community, and I’m happy to spill a few details about content.  The first few confirmed talks are great, and there are plenty more killer talks to be announced.  Here are a few teasers:

David Mortman has assembled an all-star panel including Marisa Fagan, Erin Jacobs, James Arlen, Dave Lewis, Leigh Honeywell, and Rafal Los for “Mentoring, Mentee-ing (Telamachusing? Manatee-ing?) In Information Security: A How-To Panel”: Come and learn how to get the most out of the Mentor/Protégé relationship from our panel of experts.

HD Moore will present “Fun with VxWorks”: this talk focuses on the VxWorks operating system, how it works, what devices use it, and how to compromise it.  The content will include background information on VxWorks itself, a checklist of common vulnerabilities, mappings from these vulnerabilities to shipped products, and a live demo of gaining access to a widely deployed commercial product.

Gene Kim will present “Mobilizing the PCI Resistance: Lessons Learned From Fighting Prior Wars (SOX-404)”: I have noticed that there is a growing wave of discontent from the information security and compliance movement around complying with the PCI DSS… My desired outcome is to find fellow travelers who also see the pile of dead bodies in PCI compliance efforts... and catalyze a similar movement to achieve the spirit and intent of PCI DSS.

Bruce Potter will bring us “How to Make Network Diagrams that Don't Suck”:  We've all been there.  You walk into a network blind and the first thing you ask for is a network diagram.  What gets handed to you has apparently fallen out of a bowl of ramen and onto the page.  Overlapping lines, big arrows, and host names in print so small that only insects can read them.

Egyp7 will deliver “Beyond r57”: PHP is an easy language to learn and is among the most popular in the web development world.  Because of this, many PHP applications are written by novice programmers with little knowledge of writing secure code.  Combine that fact with a few poor design decisions and you end up with vulnerabilities in PHP applications being published daily.
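The talk teaser above doesn’t include code, but the class of novice mistake behind shells like r57 is letting user input choose what code or file gets loaded.  Here’s a minimal sketch of the pattern (in Python as an illustrative stand-in for PHP; the page names and functions are hypothetical, not from the talk):

```python
# The classic "file inclusion" mistake: user input selects what gets loaded.
# PHP analogue of the unsafe version: include($_GET['page']);

PAGES = {"home": "Welcome!", "about": "About us."}  # hypothetical content

def render_unsafe(page_param: str) -> str:
    # Attacker controls the path: "../../etc/passwd" walks out of pages/
    with open("pages/" + page_param) as f:
        return f.read()

def render_safe(page_param: str) -> str:
    # Allowlist: input selects among known keys, never raw paths.
    if page_param not in PAGES:
        return "404"
    return PAGES[page_param]
```

The safe version rejects anything outside the allowlist, so a traversal string like `../../etc/passwd` simply returns the 404 page instead of reading an arbitrary file.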

And that is barely scratching the surface.  There will be plenty more, and there will be informal and impromptu talks, too.  And healthy conversations.  Maybe an argument.

And, in a first for BSides, we will be arranging a press area for BSides LV.  It will be the place to be, and we want to give those covering the event a place to hold interviews and get work done.



Wednesday, June 9, 2010

I just want to fix my car.

I'll close out my series of car rants with this one, on our ability to repair our cars.  This is not a new battle, but the front has moved into new territory.  The “Experimental Security Analysis of a Modern Automobile” paper touched on the subject briefly, pointing out that some of the “vulnerabilities” they reported could be addressed by locking down diagnostic and repair procedures.  They also stated that:

“...individuals desire and should be able to do certain things to tune their own car (but not others).”

Starts off well, then takes a dive.  So, who gets to decide what you can do to your car?  That is academic arrogance and a lack of perspective, folks.  Yes, if I want to use my car on public roads, I have obligations to my fellow drivers and to the law.  If I am on a racetrack, the obligations are to my fellow drivers and the rules of the sanctioning body.  In the fields of my farm, the regulators, manufacturers, and pointy-headed academics can [insert your own creative answer here] themselves.

And on the commercial side:

“Similarly, how could mechanics service and replace components in a locked-down automotive environment? Would they receive special capabilities? If so, which mechanics and why should they be trusted?”

Once again, a little historical perspective…

Manufacturers have built vehicles requiring special tools for many years, and have tried to limit access to these tools to limit independent shops’ and do-it-yourself mechanics’ ability to maintain and repair vehicles.  Manufacturers have tried to restrict access by only allowing sales of some tools to authorized dealers, and when they can’t get away with that, they resort to making tools available at excessively high prices.  Special fasteners are the most obvious example, but there are few parts of an automobile which haven’t seen bizarre adaptations which require either serious creativity, or special tools, to access or repair.

Tools like these are easy to reverse engineer and duplicate.

With these physical components, we do have the ability to look at them and improvise- and tool manufacturers can make their own versions of the tools like the ones shown above.  Unless they run into patent issues, of course.

Going beyond repairs, tuning used to be a lot more obvious, too.  Changing settings and swapping a few parts were commonplace tuning techniques.  Even the term “tune up” tells us something- we had to tune cars regularly; adjusting carburetors and points were routine service procedures.  One very common performance swap was replacing the carburetor; this was done not simply for performance, but for the ability to fine-tune aftermarket carburetors in a way we couldn’t tune factory systems.  For example, using the ubiquitous Holley carbs meant that with some skill and patience, and a couple of boxes of jets, we could precisely refine the fuel mixture fed to the engine.

Holley carburetor jet assortment.
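As a toy illustration of why swapping jets gives such fine control (my numbers, not the post’s; this assumes fuel flow scales roughly with orifice area, i.e. with diameter squared, which ignores real-world flow effects):

```python
# Toy calculation: relative fuel flow of a carburetor jet vs a 0.100" base
# jet, under the simplifying assumption that flow scales with orifice area.

def relative_flow(new_dia: float, base_dia: float = 0.100) -> float:
    """Flow relative to the base jet, area-scaling assumption (~ d^2)."""
    return (new_dia / base_dia) ** 2

for dia in (0.098, 0.100, 0.102, 0.104):
    change = relative_flow(dia) - 1.0
    print(f"{dia:.3f} in jet: {change:+.1%} fuel flow vs 0.100 in base")
```

Each few-thousandths step in jet diameter moves mixture by only a few percent, which is exactly the kind of precise refinement the paragraph above describes.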

We are now in a situation where many routine repairs require interaction with the computer systems of the automobile.  Even tasks like changing fluids or servicing brake pads can require use of the computer systems.  Depending on make and model, you may need one of these ~$18,000 (USD) systems to perform simple repairs:

Automotive diagnostic and repair computer

(Note: that’s a real system used by some European manufacturers; they really are about $18k, and are just feeble Windows 2000 laptops in a user-unfriendly form factor.)

Repair information is, and has been, a bigger problem.  Mechanical systems can be torn down and inspected, and independent publishers could (and still can) create repair manuals.  The diagnostics, and the underlying operational information, were always where we fought for information; the move to computerized systems has made this information both harder to find and more desperately needed.

We can’t just look at the problem and improvise; that’s why we need the manufacturers to cooperate in making information available- or, at the very least, we can’t allow them to block access to it.  This is not easy: there are standards, but there are also proprietary implementations- so we are back in the awkward Intellectual Property/software patents/reverse-engineering-breaks-DMCA world that is familiar to those of us in information security.

And it is more complicated than that- if you reverse-engineer proprietary software on your computer and alter its functionality, what are the consequences for society at large?  Altering automotive systems can have a profound impact on fuel economy, emissions, braking and other safety systems- that can have a real impact on society.  Or, at least have an impact on the car in front of you if you’ve screwed up your brakes.  Again, a little perspective: we’ve always been able to screw up our cars, we are just exposing new ways to do it. 

Let’s not ignore government’s role in this situation.  Much of the push to computerization of powertrain management systems was a reaction to ever-tightening emissions and fuel economy mandates.  It doesn’t stop with the design of the car, either; most automobiles have to undergo inspections, and many modifications to the fuel and emissions systems are likely to cause your vehicle to fail.

I do think the paper has highlighted a couple of real issues.  Implementing some basic safeguards- such as limiting the conditions under which certain commands can be executed, and limiting which systems can issue certain types of commands- should improve the security of automotive computer systems without compromising our ability to repair our vehicles.
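The gating idea can be sketched in a few lines (a hypothetical illustration, not the paper’s design; the command names, sender list, and speed check are all my own stand-ins):

```python
# Sketch of a diagnostic gateway: privileged commands are only forwarded
# when both the sender and the vehicle state permit them.

ALLOWED_SENDERS = {"diag_tool"}                      # who may issue commands
SAFE_ONLY_AT_STANDSTILL = {"brake_bleed", "reflash_ecu"}

def permit(command: str, sender: str, speed_kph: float) -> bool:
    # Reject commands from unauthorized modules outright.
    if sender not in ALLOWED_SENDERS:
        return False
    # Dangerous commands are only allowed while the vehicle is stopped.
    if command in SAFE_ONLY_AT_STANDSTILL and speed_kph > 0:
        return False
    return True
```

Under this scheme a legitimate shop tool can still bleed the brakes in the bay, but the same command is refused at highway speed or from, say, the radio module- which is the balance between security and repairability argued for above.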

If you are interested in this issue, check out H.R. 2057 (PDF), the proposed “Right to Repair” bill.  It looks like a good starting point, and it is described as

“A bill to protect the rights of consumers to diagnose, service, maintain, and repair their motor vehicles, and for other purposes.”

And we could all use a little protection.  Of course, we often want protection from the government, so protections mandated by the government will require a bit of scrutiny.



Monday, June 7, 2010

A bit of deep thought.

A couple of weeks ago Michal Zalewski wrote a guest post for Ryan Naraine over on the ZDNet Zero Day blog.  It stirred up some conversation, but I wasn’t going to comment on it until I hung out with the Pauldotcom crew for their 200th episode extravaganza and Hackers for Charity fundraiser.  The post and responses came up, and after a little deep (beer-induced) thought, I decided to share a few thoughts, and offer links to a variety of responses.

First, I almost skipped the post, the first sentence lost me:

“On the face of it, the field of information security appears to be a mature, well-defined, and an accomplished branch of computer science.”

Seriously?  Anyone who thinks that is clearly delusional.  But I know Michal is not- he is brilliant- and Ryan encouraged me to read the entire post.  So, I did.  Even though the rest of the first paragraph really isn’t much better:

“Resident experts eagerly assert the importance of their area of expertise by pointing to large sets of neatly cataloged security flaws, invariably attributed to security-illiterate developers; while their fellow theoreticians note how all these problems would have been prevented by adhering to this year’s hottest security methodology. A commercial industry thrives in the vicinity, offering various non-binding security assurances to everyone, from casual computer users to giant international corporations.”

I am not sure how someone Michal’s age attained that level of cynicism, but it is impressive.  He goes on to say we have had no successes in software security, elegantly defines the problems in a few ways, and then leaves us there.  Michal appears to be making the kind of assertions that triggered my last post; I think he could really use a bit of perspective.  But enough of that- if you are interested in a look at software security from a variety of perspectives, check out the following links.  Note: these are some seriously smart folks; it often takes me a couple of passes at some of the ideas to get them.

Michal’s original post is here.

Amrit Williams has a great response here.

Ivan Arce responded here.  Ivan is crazy smart, and this is a thorough response.  It may take a little digesting to grasp Ivan’s points.

David Mortman has a great follow up post here.

Michal has a follow up to his post on his blog, including some comments, and links to a few responses (including some of the above).

It is an interesting series of posts.  But remember, nothing you read in any of them changes the fact that tomorrow is “Patch Tuesday”, with all the baggage that brings.  So keep a little perspective as you read the installments of this little drama.



Tuesday, June 1, 2010

Time for a new mantra.

[NOTE: On re-reading this post before publishing, I realize it sounds pretty bitter in places.  It should.  But, I do want to make clear that I respect the vast majority of people who do the hard work, even when I disagree with some of what they say, or the way they say it.]
We need a new mantra in information security.  We've heard various forms of "think like an attacker" forever.  And it is absolutely true.  But seriously, enough.  Make the point to the new, the uninitiated, those outside our craft- but otherwise, stop it.  The choir knows the tune, and the chorus, and lyrics, and can do it in rounds while drunk.
Here's my proposal:
Run a [Optional: expletive of your choice] enterprise.
Or maybe just
Run a [Optional: expletive of your choice] network.
It doesn't need to be a big environment, but your MacBook, roommate's XP laptop, and a NAS server do not count.  You need to run a network, remediate problems, scramble and patch, screw up, get yelled at when things go down, and occasionally score victories.  You need to see things work, and see things fail.  If you are both good and lucky, you may get to see The Next Big Exploit in the wild, and watch it pass you by unscathed.
I am not saying to stop thinking like an attacker, but I am suggesting that if we accept that defenders should understand the attacker, then those who do attack research should experience the world from the other side.  A classic case of this is the "technology X is fundamentally broken" statements we hear year after year, con after con.  Many people don't understand why they are ignored by management and admins when they make these absolutely true statements.  I'll tell you why: because no matter what we're told about the failures of anti-virus, web filtering, IPS, or whatever, we've seen these technologies work.  Perfect, no.  Fewer helpdesk calls, yes.  That is success.  Limited success, sure.
I just want people to tell the truth, and offer solutions, even imperfect ones.  "Technology X does not work as well as you need it to, but you can minimize the pain by doing Y" will have people at your feet begging for more.
I am not even asking for researchers to "pity the poor admin", but should a little empathy develop, I'm good with that.
By the way, some of the criminals do get this.  When the new MSRT ships and your botnet starts evaporating, you learn a lesson.  Bonus points for retiring "Criminals don't play by the rules", that is the epitome of an NSR statement.  (NSR == No [stuff], Really?)
A little perspective goes a long way- which is a very good thing, because many in our business seem to have very little perspective.