The American View: When Draconian Security Behaviour Expectations Backfire

The cybersecurity field sometimes gets a bad rap because it’s so easy for people outside of it to get crosswise with their security department. Our inflexible techno-wizards are always banging on about “compliance with official standards.” We glower at everyone we accuse of not measuring up. It never matters why, does it? If you miss one step in a hundred-step process, we security boffins lose our minds. Really makes it hard to empathize with our position, doesn’t it?

Sure, it does. I get it, and I think I know how and why we created this source of endless friction. Far too many security policies and performance standards are written to be executed under ideal conditions: that is, they assume that a worker is fully trained, aware of the requirement, properly equipped, and free from distraction every time they’re called on to “follow the process.” It should be obvious that complicated requirements are less challenging when everything is running smoothly. Easy, even. Nothing that any reasonable employee should complain about (let alone fail to consistently perform!). The thing is, “ideal conditions” don’t come around that often in the real world, do they?

Take, for example, last Tuesday evening. Exactly one week ago today, my wife had a heart attack right after dinnertime. Seriously. It started with strong pain in her chest that radiated down one arm, extreme nausea, and elevated blood pressure. All the classic symptoms of cardiac distress. We jumped in the car and raced to our nearest A&E, where the medics swiftly confirmed our suspicions: she needed immediate treatment.

Fortunately, we have good health insurance and live near a fantastic hospital. My wife was seen immediately. She received an EKG and blood tests within half an hour of being checked in (that’s very fast for an American “emergency room”). Even better, our local hospital is technologically advanced enough that it sent my wife emails announcing when each of her test results was posted to the facility’s “patient portal.”

This was the dumbest artist’s interpretation stock photo I could find for “patient portal.”

This put us in a bit of a bind: on the one hand, getting access to your medical records from your phone in near real time is amazing, especially for those of us who grew up in the era when all healthcare was recorded solely on paper forms. Super cool if you’re a nerd! On the other hand, gaining access to the hospital’s “patient portal” required setting up access credentials for a new service … on a phone … after midnight … while stressed completely out of our gourds. I’d consider those less-than-ideal conditions.

Obviously, I work in cybersecurity. My wife doesn’t, but she’s listened to me rant and rave about security topics for decades now and is overqualified to fill an entry-level security analyst position as a result. We both make it a point to follow cybersecurity best practices in our personal lives. For example, we make all of our system passwords long, complicated, nonsensical, and – most importantly – unique. We strive to never use the same password twice. Our habit is to craft a passphrase that maximizes entropy, like four or five unrelated words strung together. We want to make the bad guys work for it if they aim to hack us. It’s professional courtesy, donchaknow.
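If you’re curious what that habit looks like in practice, here’s a minimal sketch in Python. Everything in it is illustrative: the make_passphrase name is mine, and the ten-word list is just a stand-in for a proper large word list (such as the 7,776-word EFF diceware list).

```python
import secrets

# Hypothetical ten-word list for illustration only; a real setup would
# load a large word list (e.g. the EFF diceware list) from a file.
WORDS = ["correct", "horse", "battery", "staple", "gazebo",
         "walrus", "pamphlet", "trombone", "quilt", "meadow"]

def make_passphrase(n_words=5, sep="-"):
    """Join n unrelated words chosen with a cryptographically secure RNG.

    Entropy is roughly n_words * log2(len(word_list)): five words drawn
    from a 7,776-word list yields about 64.6 bits, while this toy
    10-word list gives only about 16.6 bits - hence the real list.
    """
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(make_passphrase())  # e.g. "quilt-walrus-meadow-gazebo-staple"
```

The design point is that the strength comes from the size of the word list and the number of words drawn, not from clever character substitutions that are hard to remember at two in the morning.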

The trouble with that standard is that it’s relatively simple to follow when you’re, say, wide awake. In your office. While not clenching your teeth from the pain of, say, experiencing a heart attack. In the moment, complying with that standard became … difficult. My wife couldn’t manage at all; her pain was severe enough that the A&E nurses gave her morphine, which pretty much shorted out her attention span. When her first “patient portal” emails came through, she handed me her phone and told me to deal with it on her behalf.

Bear in mind that it was two in the bloody morning by then and I was struggling to stay awake and focused. I went through all the setup steps to establish the new account, and when it came to creating a new, long, complex, unique password, I choked. Between fatigue and worry, I couldn’t think straight. Rather than fail the task, though, I fell back on the last compliant password that I’d created for a completely unrelated account earlier that day. I sacrificed the “unique” requirement to craft a password for this necessary, time-sensitive task that I’d still be able to remember the next day.

“Wait, did I type ‘ZFGWw9e07-@$#’ or ‘ZFGWw9eO7-@$#’? Dagnabit!”

Technically, this constituted a failure to fully perform the necessary security protocol governing accounts that hold sensitive information (PII, PHI, and financial data, specifically). Had this been a task performed for work, my choice would no doubt have been caught and flagged by the security boffins or the auditors (or both) for corrective action. Possibly for disciplinary action, too, since I knew the complete standard and didn’t obey it. I understand that. Hell, I teach that.

The thing is … Needs must when the devil drives. My B- performance felt like an acceptable compromise given the time sensitivity and my abbreviated personal risk assessment. Taking more time might have allowed me to fully meet the requirements; however, any delay in reviewing the data might have compromised our ability to make life-saving decisions while we had a viable window of opportunity. I chose to accept the slightly increased risk of having this account compromised because I wasn’t willing to gamble with my wife’s critical medical treatment.

To be clear, I’m not trying to justify my decision. Rather, I want to leverage this story to illustrate the sort of “make or break” decisions that workers regularly face under real-world conditions. Sometimes, the totality of circumstances makes full compliance with a difficult performance standard impractical … or even impossible. Process owners and process auditors both need to understand that processes designed for optimal conditions may need to be modified during high-stress events. No process design is ever so well-crafted that it covers every contingency.

Some standards must be utterly inflexible, like not allowing unauthorized users to access a protected information system. For example, it’s perfectly appropriate and correct for a hospital to demand that no infants may be removed from its neonatal ward by anyone other than the child’s authenticated parent or a badged member of the ward staff. That makes sense.

For example, even in ‘gun crazy’ Texas, you NEVER leave your firearms unsecured. There is no excuse to violate this rule EVER.

Most standards, though, must be modified when things are going completely to *&#%. A hospital’s infant check-out rule must pragmatically be violated if, say, the ward is on fire. If a firefighter is present to help carry a new-born to safety before the flames reach the creches, let her! Throwing a fit about “authorized baby-movement personnel” two years later during a process audit misses the point of having the standards in the first place: protect the children’s health and safety first; protect the hospital from legal threats a distant second.

I say all this not to justify relaxing or ignoring security standards during an emergency. Rather, I want to remind my colleagues in the cybersecurity world of two critical points: First, we must design processes and standards that allow some measure of flexibility under emergency conditions. Being overly rigid puts people in an impossible dilemma during a crisis: obey and get fired for causing harm, or disobey to prevent harm and get fired for wilful noncompliance.

Second, we must take the totality of circumstances into account before we bring the proverbial hammer down on a user who deviated from required standards during a crisis. Sometimes, our tendency to be draconian about standards is actively harmful to our institutional credibility. We become the users’ enemy rather than their trusted partner.

I changed one of those two duplicated passwords after the fact, BTW, as soon as I had time to think straight and set things right. Sometimes, that’s the most we can expect. Better that than not trying at all because you’re damned for a lack of realistic options.

Keil Hubert

POC is Keil Hubert, keil.hubert@gmail.com. Follow him on Twitter at @keilhubert. You can buy his books on IT leadership, IT interviewing, horrible bosses, and understanding workplace culture at the Amazon Kindle Store. Keil Hubert is the head of Security Training and Awareness for OCC, the world’s largest equity derivatives clearing organization, headquartered in Chicago, Illinois. Prior to joining OCC, Keil was a U.S. Army medical IT officer, a U.S.A.F. Cyberspace Operations officer, a small businessman, an author, and several varieties of commercial-sector IT consultant. Keil deconstructed a cybersecurity breach in his presentation at TEISS 2014, and has served as Business Reporter’s resident U.S. ’blogger since 2012. He is based out of Dallas, Texas.
