Monday, December 29, 2014

“Is your computer working?”

As promised, that other hospital tech incident.  I was leaving a friend’s room right after the nursing shift changed and the new nurses were beginning their rounds.  As I was preparing to leave I heard the nurse outside my friend’s room call down the hall “Is your computer working?”.  I paused in saying my goodbyes and we listened to the nurse muttering and typing ever louder on the mobile cart keyboard.  Not good.  Especially since that computer stood between my friend (and every other patient) and their medications.  The nurse popped in, said they were having computer issues, and that she was going to pull his medications manually- the delay would only be a few more minutes.  And true to her word, his meds arrived only about 20 minutes late thanks to a manual backup routine for checking out medications.

As I left I saw that two of the cart computers were displaying “unable to authenticate” errors.  I don’t know what the problem was, and my friend never found out.  I guess he was too busy being seriously ill to diagnose authentication failures.

Not bad, eh?  There was a system failure, but backup procedures were in place to prevent serious problems.  High fives for all?

Not so fast.  That 20 minute delay doesn’t seem significant, unless of course you were the one waiting for medication.  Most critical meds would be administered intravenously so… wait, those are behind the same system.  But still, only a 20 minute delay… except the process had to be repeated for each patient until the error was resolved, and the manual paper records had to be transferred into the computers when they were restored- so at the end of their shift the nurses were further distracted from patient care to do data entry.

I’m not repeating these medical computer issues to throw stones at the medical profession, or at technologists working in healthcare- but to illustrate some fundamental issues with technology and security.

In the first tales of poor communication, there seemed to be a few symptoms and causes, but one crucial result.  Data input was inconsistent and maybe not as easy for medical professionals to use as it could have been.  Probably related: since there often wasn’t timely info available in the computer system, people relied on it less, and thus input data less frequently- a classic “chicken and egg” situation.  The critical end result was delayed patient information, but there was also the sadly familiar case of a system becoming a burden (and possibly even a liability) when it should have been an asset.  Usability, user buy-in, and management oversight all needed to improve to move this forward.  I’m sure that sounds familiar, although hopefully in different contexts.

Today’s tale is a bit different: it is about a failure to understand the consequences of operating on backup procedures.  “We have a plan for when things go wrong” is great and all, but if it doesn’t let people do their jobs in a reasonable manner without undue consequences your fail-safe is a failure.  Granted, these are extreme conditions; delayed email is not the same as delayed patient care, but there are still lessons to learn.

Oh, and you’ll note I didn’t mention compliance, that wasn’t an oversight.  I’m not an expert on healthcare compliance (unlike many who pontificate on it but can’t spell HIPAA) and I don’t want to blindly speculate on things like what perversions to pain management are imposed by the “war on drugs” and what that means for procedures for dispensing controlled substances.  If potential impact on patient care doesn’t get you thinking, I hope you aren’t working in healthcare.


Tuesday, December 16, 2014

About that Herbie Hancock book

The first Hancock story I mentioned last week is the opening story in his new book.  He tells the story better than I do.

I’m not far into the audiobook, but I wanted to hear a bit of it the other day between chapters of Kim Zetter’s new(ish) book on Stuxnet.  That one is good, too- Zetter balances making the story approachable to non-techies with detail enough to keep those with some knowledge of the events engaged.  Unfortunately, the audiobook version means I don’t have access to the extensive footnotes unless I buy a print copy, too- but I spend enough time on the road that the audiobook was the fastest way I would get to digest the book.

A note on the audio of these two books- the reader of Zetter’s “Countdown to Zero Day” speaks slowly and clearly, so slowly that I find the book much more listenable at 1.5x speed.  Herbie Hancock reads his own book and tells his own stories; his delivery is, not surprisingly, fantastic.

Yeah, I still owe you that other hospital story.  Remember, patience is a virtue.  It is not one of mine, but that’s another story.


Computers are efficient. And other lies.

Sometimes stuff gets put into perspective.  With force.

I was recently reminded of a few things which happened several months ago while I was visiting friends in hospitals (this happens more and more as you get old- or they are visiting you).

All events occurred at large, modern facilities- the kind with computers in every patient room plus roving computer carts, and all the patient info readily available to authorized personnel.  Of course, by “all” I mean “all information which has already been entered into the right systems”, which leads to my first observation.

Hanging out with my friend for an afternoon I got to overhear some of his conversations and frustrations with the medical staff.  It was a busy afternoon for him, no sooner had one team of specialists left him than another would wander in.  Each team came in with a handful of patient files, and checked up on him in the computer when they were talking to him.  And he invariably had to fill them in on some test result or comment from other specialists about his challenging situation- it was common enough that he kept a journal to make sure he could pass the latest info on to his caregivers.  Remember, computers everywhere, in rooms, staff stations, and mobile carts.  Oh, and paper files in a binder outside each patient’s room.  And that wasn’t enough to get info shared in a timely manner.  The computer systems were apparently less than efficient, so data input was tedious- thus forcing the reliance on paper, further slowing the timely input of data.  Somewhere the technology became a burden instead of an aid, and that compounded aggravation for the people who relied on the systems to do their jobs.  We’ve all seen poorly implemented technology like this, but seeing it in a hospital where a patient, your buddy, has to keep notes to make sure he bridges failures in communication with medical staff, that’s pretty terrifying.

Just as this was sinking in, one of the aides came in and took his vital signs- and scribbled them down on a scrap of paper to input somewhere else later.  This was not an anomaly; my buddy assured me that happened every time his vitals were taken throughout his stay: expensive machines display numbers, aides scribble them on scraps of paper for later input.  Damn, that’s the way to share important information in a timely manner.  And efficient, too.


That afternoon I wandered down to the waiting room a few times as doctors were examining him.  One time I overheard an interesting conversation: there was a pretty ugly technical problem and the person looking into it was the kind of network admin I want working in healthcare.  I’m sure he thought the waiting room was empty as he used the phone in the hall, so I got to overhear a pretty candid exchange.  He was investigating a connectivity problem with the wireless telemetry system, the system which monitors patients and reports their vitals and more to staff throughout the floor.  Wireless telemetry systems are generally used for patients who need continuous monitoring but are somewhat mobile, such as post-operative recovery and patients with self-administered pain management.  The telemetry wireless was down and patient data wasn’t filling the screens in the halls and nurses stations, and that threatens patient care.  The admin was polite, and chose his words carefully, but he was obviously livid.  It was clear he was a network guy, not a medical professional, but his primary concern was patient care (as you would hope in a hospital).  It sounded like poorly planned maintenance, with proper procedures not followed, had caused the outage.  Another pretty scary scenario given the systems affected.  As appalled as I was that this happened, I was impressed with the admin’s focused outrage.  “Not only can’t this be happening now, it can’t ever have happened, and it can’t ever happen again” was one comment he made over the phone, a line I’m not likely to forget soon.  At the end of the call he explained to whoever was on the other end of the line that the issue would be reported to senior management- not IT management, but senior medical management.
As bad as it was that this problem happened, I was glad to hear that a network admin had a direct path to report issues to the appropriate executives, even in a huge facility like [redacted].  That’s the way it should be; the head of medicine needs to know about preventable and unusual threats to patient care, regardless of the source.  Imagine what technology could accomplish without insular silos disconnecting technology from consequences- maybe my buddy could put away his notepad.

Another incident happened as I was leaving a different friend’s room at the end of visiting hours in another large, modern facility.  But that’s a story for another day, I’ll leave you to reflect on this little set of horrors until then.


Friday, December 12, 2014

The other Herbie Hancock story


As promised, the second lesson from Herbie Hancock’s interview a couple of weeks ago.

Hancock was asked about the ease of musical creation and experimentation with modern computers and electronics. Not surprisingly, he loves the lower barrier to entry and the ease of experimentation- especially compared to the amazing lengths required for electronic musical experimentation in his early days. Then he said something striking: he talked about having to learn all of the old ways, the basics, the fundamentals- and then having to unlearn them to get the most out of new musical technologies.

The foundation provided a deep understanding, but could also hold him back from fully utilizing the new tools; that applies to many advances in technology, from understanding point ignition and carburetors before tackling modern computer controlled ignition and fuel injection, to advances in networking, virtualization, and cloud technologies.

Mastery includes knowing not only what to learn, but what to unlearn, and when- and knowing how to unlearn without forgetting.

I’m pretty good at the unlearning part, the rest I’m still working on.


Thursday, December 11, 2014

Herbie Hancock Stories

Herbie Hancock, 2010 (photo by Guillaume Laurent)

After the horror of the faux country bubblegum abuse of “Crazy” I saw part of an interview with Herbie Hancock; it more than made up for the horror. Hancock has a new book out, “Possibilities”. I haven’t read it yet, but it is in my Audible queue for my next road trip. Based on the interview I heard, I’m really looking forward to hearing the book in his own voice.

Miles Davis

The first story came from the days when Hancock played with the great Miles Davis. During one show Herbie played an obviously wrong chord, and he was mortified at his mistake. Miles’ reaction was to pause very briefly, then play the “mistake” into the song until it was no longer a mistake, but part of the performance. And nothing was ever said about the mistake- because it was no longer a mistake. At face value, that is a great story about a gracious and talented musician. Beyond that, you can find a lot of inspiration and run with it as it moves you. It certainly can be applied to the mayhem of InfoSec in a few different ways.

There are a couple of quotes we often hear in InfoSec (and in the rest of life), both carry the same message, but come from two very different people.

In recent years, the more common quote comes from Mike Tyson:


Mike Tyson

“Everyone has a plan 'till they get punched in the mouth.”

The older quote, which I’ve heard attributed and misattributed to many people, is from Helmuth Karl Bernhard Graf von Moltke (Moltke the Elder), translated and paraphrased from the original German:

Helmuth Karl Bernhard von Moltke

“No plan survives contact with the enemy.”

As accurate (and quotable) as these quotes are, they are negative. I think Herbie Hancock’s story of Miles Davis dealing with the unexpected is a much better model for us and the challenges we face, no matter how idealistic that may be.

Tomorrow you can have the second story.


Wednesday, December 10, 2014

Manual labor and the horrors of television

Patsy Cline and Willie Nelson

Are you either of the people shown above?  If not, please don’t try to sing “Crazy”.

The past several weekends have involved a fair amount of manual labor, which has reminded me how happy I am that I don’t do that kind of thing for a living anymore. On one of my beer breaks I flipped on the TV to see what horrors it held for me, and I was rewarded with one horror, and a couple of great stories.

First, the horror: Someone who was neither Patsy Cline nor Willie Nelson was attempting to sing “Crazy” on what passes for country music TV. It was pathetic. (Patsy Cline made that song hers, but Willie wrote it and his take on it is authentic). There are some songs that simply shouldn’t be done by folks who aren’t up to the task; “Crazy” is one of them. Stick to that pitch and tempo corrected bubblegum country crap, don’t defile masterpieces.

You may be wondering about the InfoSec angle here- but there really isn’t one. Most of us who are in InfoSec did it very badly and passed it off as good enough for quite a while when we started out- and many of us still do. That’s the nature of what we do: we rarely have the luxury of delivering “masterpiece” quality work, we do the best we can in the situation; expecting perfection is naïve in our world. In InfoSec, even Patsy Cline would be reduced to singing “99 bottles” with some regularity- and as with pop music, in InfoSec we get what the market demands and what the market will pay for. By the very nature of what we do we are technicians, not artists. If I were deep I might reflect that this may be why so many in InfoSec have artistic outlets- but that’s a simple answer to the complexity of humanity.

Now, about the good stories… those are for tomorrow.