Most information security training is painfully boring. Business Technology’s resident US blogger Keil Hubert argues that it can be made exciting and effective by putting employees in the bad guys’ shoes and challenging them to exploit their own vulnerabilities.
I advanced the argument week-before-last that information is significantly more important than technology within the Information Technology career field. To be fair, we tech-heads keenly enjoy getting wrapped around the axle with thorny technology problems because – let’s face it – tech toys are a blast to play with. Getting a complex SAN or IPS fully operational can be a wicked challenge, and success rewards us with a flood of endorphins (and maybe even a performance bonus). More importantly, our bosses can actually see the end result of all our striving when we take them into the data centre. They may not understand what they’re looking at, but they can touch our newest creation and coo about all the blinking lights. That easily counts as a ‘win.’
Protecting vital information isn’t nearly as concrete a process as standing up a new hardware component. Nuanced information protection measures are hard to explain – and much harder to prove! – to a boss who doesn’t grasp the underlying principles or processes. It’s no wonder that IT leads shy away from pursuing abstract security measures. No one wants to look like a fool in front of his or her boss. Further, no one wants to get mired in the endless struggle to change people’s behaviour. Unfortunately, that’s a large part of our role as tech leaders, and we’re honour-bound (if not contract-bound) not to shirk our duty.
Unfortunately, changing people’s behaviour is darned difficult. That’s why so much information security training is just awful. It’s much easier to teach people how to memorize simple rule statements (e.g., ‘Don’t plug unapproved flash drives into company computers’) than it is to teach the fundamental principles that explain why such a rule is necessary. Getting meta with people – teaching them to see their world through a new metaphorical filter – is considerably more difficult. It’s not impossible, though …
One of the annual training programs that I was responsible for in my IT department was a recurring ‘operations security’ brief. In essence, OPSEC is the art of preventing unclassified indicators of friendly activities from falling into an adversary’s hands. For example, if a ship is about to leave from one port to another, an enemy spy could figure out that the move is about to happen by watching changes in the crew’s normal behaviour patterns.
Right before the sailors embark, for example, there might be a surge in spending on new uniforms or on dry cleaning. Or all the sailors from one ship might queue up to refill their allergy prescriptions at the chemist. By paying attention to these subtle changes in routine, an adversary could figure out that something is up. That’s why sailors (and soldiers, etc.) are reminded every year to be extremely careful in how they act in public, so as to mask any pre-deployment activities – thereby denying an adversary a chance to guess what’s about to happen.
The military is quite keen on OPSEC as a key sub-set of information security, because they’ve seen for years just how vulnerable friendly forces are to OPSEC disclosures. When I was in Korea, it was a running joke that the old lady who ran the snack bar outside the main gate always knew when a unit was about to leave on a field exercise … because she’d always be waiting at the training area with a box full of over-priced snacks when the first squaddies arrived.
I’m just as fond of teaching OPSEC to corporate employees as I ever was teaching it to squaddies, because good OPSEC coursework can permanently change how employees perceive the value of information security overall. A good OPSEC lesson can change a person’s life. It excites the hell out of me to teach it … but I always loathed sitting through other people’s classes because the content is usually boring as hell the way most people teach it.
Standard protocol is to drag a bunch of employees into a classroom. Fire up the projector. Show some PowerPoint slides that define some key terms. Maybe hold an impromptu quiz. Repeat until you’ve filled an hour, then wake up the audience and check their names off of a roster.
Repeat annually. Pointless. Useless. Boooooooooooorrrrrrrrrrrring.
That’s why I always preferred to teach the course my way. I’d change my presentation every time so that no one knew what was coming. Rather than rely on slides, I’d use a practical exercise to get people’s brains stimulated, and would then follow up the practical with a free-form discussion. In every class, at least one student (often more than one) would have an epiphany that stuck with them.
In one class, I broke the employees up into a half-dozen random pairs and assigned each pair a different department (e.g., HR, payroll, etc.) to raid. Then I gave them ten minutes to go swap out the trash and recycle bins in their target location with empty ones – and after they returned to our classroom, another five minutes to find something potentially valuable to an adversary in the purloined refuse. It never failed … teams found everything from travel plans, to personnel rosters, to bank account numbers, to systems passwords – all content that was required to be destroyed, but had been carelessly chucked in the bin.
In another class, I took everyone outside to the company picnic grounds and gave the group ten minutes to walk up and down the car park. They weren’t allowed to touch or enter any vehicle, but they could look at anything on or in the cars that was in plain view. When the crew came back with their notes, I gave them five minutes to figure out who the most valuable potential targets were for a robbery, or for tailing to a high-profile home or office.
Inevitably, the profusion of critical records plainly observable on car seats helped to betray which drivers had money (from exposed bank slips), which had valuable medications (from exposed prescriptions and pill bottles), and which had access to wealthy communities (from country club and gated community access stickers).
In yet another class, I challenged the security team to build a profile of all of my various online identities using nothing more than ten minutes with Google. Their lead tech was able to find my home address (from domain registration records), social media profiles, and two distinct account pseudonyms on discussion forums (one from cross-links, the other from distinct writing styles).
After all three of these challenges, I sat down with the employees and we discussed how the ‘critical indicators’ that they’d discovered could be used for evil purposes by a potential adversary. Then we compared how they lived their own lives to what they’d just done in the exercise. Did they throw valuable papers into the trash that really should be shredded or burned? Did they leave papers visible inside their car when they parked? Did they allow different online identities to link to one another such that they could be easily back-traced? The purpose of these chats wasn’t to scare people; it was to open their eyes to how trivial elements of information about them could be captured and capitalized on by a potential bad guy.
At least one employee always raised the argument that no one was ‘actively’ targeting them; they weren’t anyone special, and nothing special was going on at work. I countered that it was usually impossible to know when a potential bad guy was or was not targeting them. At any given moment, they could be actively or passively targeted for corporate espionage, for simple identity theft, for revenge by angry neighbours, for potential misbehaviour by law enforcement, or any number of potential adversaries. Operations security isn’t an active-selection, ‘use it when you need it’ skill – it’s a passive life skill. You apply it all the time as a matter of simple discipline because you have no idea when you’ll need to have used it. 
At this stage of the class, people were really getting into the subject. They could see how the abstract theory played out in their own lives. They could put themselves into an adversary’s position, looking at the world in terms of collecting valuable clues. They started to perceive just how vulnerable they’d been through inattention, carelessness, or overweening pride. Folks were hooked.
From there, we segued into a practical discussion about just how many indicators each employee was taking into the public sphere every time they ventured out of their homes. I’d let people draw their own conclusions based on what they’d just experienced during the practical exercise. Usually, I’d be able to sit back and let the employees carry the course along by arguing with one another, seminar style. People were often miffed when I had to cut them off and send them back to work.
Another great side effect of the hands-on training and personal revelation approach to understanding the core content was that employees were eager to share what they’d learned with their peers in other departments. Interactive training made for gripping (and often funny) stories that people felt compelled to share. Many times, I’d walk into a discussion where one of my employees was enthusiastically teaching a fellow from a different workgroup about what he’d just learned – good stories have a positive memetic contagion effect.
The best part of all was that I never had to wake anyone up at the end of a class. Everyone participated, most everyone learned something from the experience, and the core lessons from each exercise stuck with people for years afterwards.
Compare and contrast that with a traditional, dry, emotionally adrift, stultifying, slide-driven security awareness course. It doesn’t matter how high your employees might score on an end-of-course proficiency quiz if they haven’t actually internalized any of the concepts. Effective training programs need emotional involvement, personal resonance, and a spark of wonder to truly take root and stick with a student. For that to happen, you need to craft activities, revelations, and discussion content that speak directly to people’s lives – create stories so overflowing with meaning that the employee can’t help but share them with others.
That’s what you want out of a good cyber security training program: you want to change how your people perceive the world, so that they’re inclined to take prudent defensive measures instinctively when they encounter something squiffy. You can’t permanently change a person’s perspective by boring him half to death.
Speaking of building practical exercises, my forthcoming book on tech interviewing techniques includes multiple chapters focused on how to build challenges like this. More to follow.
This was much easier to teach to the hard-core gamers in the department. Compare, I explained, a defence against sneak attacks or critical hits that you have to activate with a deliberate action in order to function, versus a passive ability (like fortification) that’s always active against sneak attacks and critical hits regardless of whether you’re capable of responding or not.
Keil Hubert is a retired U.S. Air Force ‘Cyberspace Operations’ officer, with over ten years of military command experience. He currently consults on business, security and technology issues in Texas. He’s built dot-com start-ups for KPMG Consulting, created an in-house consulting practice for Yahoo!, and helped to launch four small businesses (including his own).
Keil’s experience creating and leading IT teams in the defense, healthcare, media, government and non-profit sectors has afforded him an eclectic perspective on the integration of business needs, technical services and creative employee development. This serves him well as Business Technology’s resident U.S. blogger.