Astronauts are a highly select group of engineers, scientists and pilots, and among the most rigorously trained of all professionals. For months, they run through scenario drills for every possible contingency. They are smart, proficient and motivated to solve any problem they encounter. And they should be. They are thousands of miles away from Earth and must rely on their sophisticated space capsule, their training, ground support and each other. It is safe to say they “want to be safe while completing the mission”.
The quote, “Houston, we’ve had a problem here” came from astronaut Jack Swigert when the crew of Apollo 13 announced to Houston Ground Control that an explosion had crippled their spacecraft. Something in the complex inner workings of the Apollo Service Module had gone wrong, and the three Apollo astronauts were in serious trouble. Astronauts rely on a sophisticated system of technology and machinery to keep them safe.
But astronauts are not the only ones who want to be safe. I believe nearly every individual inside our organizations today wants to be “cyber safe”. Only a very small fraction of employees actively seek to create a negative cyber incident. And yet an estimated 80% of all cyber breaches involve employee actions and mistakes.
If we start from the assumption that “employees want to be cyber safe, but they often just don’t know how”, then we are dealing with an organizational system issue, not a behavior problem. And a good place to look for potential risks is the cyber security culture.
Culture As A Business System
A bad system will defeat a good person every time. ~ W. Edwards Deming
The traditionally accepted definition of culture tends to focus on the habitual behaviors, shared beliefs and collective values among employees that result in either effective or ineffective behavior towards work, management, suppliers and customers. Employee behavior impacts business results.
However, the real issue is: what in the organizational system influences employees to behave in certain ways, and how are employee behaviors sustained and reinforced? Answers to these questions can give business leaders significant insight into the causal factors creating a specific corporate culture and the potential business risks inherent in the current culture.
At PYXIS Culture Technologies, we believe habitual employee behaviors, attitudes and beliefs are an outcome of corporate culture, and not the culture itself. By using a broader, enterprise definition of culture as a business system, it is possible to identify a network of organizational causal factors that directly and indirectly influence and sustain specific employee behavior. In simple terms, the system influences employee behavior, which in turn drives business results.
If there is a concern about cyber security effectiveness, then corrective insights can be gained by understanding and mapping the numerous organizational drivers and causal factors that make up the cyber security culture.
Cyber Security Culture
What are the cultural factors inside the company that influence cyber security? Our research with clients shows a number of company processes, policies, management capabilities and social network elements that are linked together into the cyber security culture system. Some of these drivers are healthy and promote positive cyber awareness and behaviors, while others foster poor or even risky employee behavior towards cyber security.
Using systems mapping and the PYXIS culture algorithm, we can build a visual map of a company’s cyber security culture using internal company data combined with expert assessments. Each culture driver can then be color coded based on whether it is an enabler or blocker of effective cyber security. Such a map gives the leadership team the ability to locate current and potential risks in the culture. With enough internal data on drivers and business performance it is even possible to link culture to business results.
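To make the idea concrete, here is a minimal, purely illustrative sketch of such a map in code. The driver names, scores, influence links and color thresholds below are invented for this example; they are not the PYXIS culture algorithm itself, only a simplified model of the enabler/blocker color coding described above.

```python
# Illustrative sketch of a cyber security culture map: each driver is a node,
# "influences" links are edges, and a score determines whether the driver is
# rendered as an enabler (green), a blocker (red), or mixed (orange).
# All names, scores, and thresholds are hypothetical examples.

from dataclasses import dataclass, field


@dataclass
class Driver:
    name: str
    score: float  # -1.0 (strong blocker) .. +1.0 (strong enabler)
    influences: list = field(default_factory=list)  # downstream driver names

    @property
    def color(self) -> str:
        # Arbitrary example thresholds for the color coding.
        if self.score >= 0.3:
            return "green"   # enabler of effective cyber security
        if self.score <= -0.3:
            return "red"     # blocker / risk factor
        return "orange"      # weak or mixed signal


# A tiny example network of culture drivers.
drivers = [
    Driver("Security training cadence", 0.6, ["Phishing awareness"]),
    Driver("Phishing awareness", 0.4, ["Incident reporting"]),
    Driver("Blame-oriented incident reviews", -0.7, ["Incident reporting"]),
    Driver("Incident reporting", -0.1, []),
]

for d in drivers:
    links = ", ".join(d.influences) or "(none)"
    print(f"{d.name:32s} {d.color:6s} -> {links}")
```

Running the sketch prints each driver with its color and the drivers it influences, which is the tabular equivalent of the visual map: blockers such as blame-oriented incident reviews show up in red, and the drivers they feed into inherit a mixed (orange) signal.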
Here is a model of a cyber security culture map adapted from our work with the CISO and cyber management team of a large European-based international retail bank. As you can see, several culture drivers show up as orange or red, indicating they act as barriers to effective cyber security.
Culture Change Is Really a System Change
Change the system to change the outcomes.
Fortunately for the astronauts of Apollo 13, with the help of Houston Ground Control staff they were able to contain the damage and land safely back on Earth. For the safety of future manned space flights, it was critical for NASA scientists and engineers to discover what in the design and construction of the spacecraft had contributed to the explosion. It turned out that the cause was not just one faulty part, but a series of small errors and miscommunications with equipment manufacturers several years before which, networked together in the Service Module oxygen tank system, resulted in the explosion.
Next time your organization reports a cyber security problem, look deeper to locate those factors in the cyber security culture system that may be the ultimate causal factors. A cyber failure is an excellent opportunity to discover the real causal factors.
For some time, I thought Apollo 13 was a failure. I was disappointed I didn't get to land on the moon. But actually, it turned out to be the best thing that could have happened. We learned from it. ~ Jim Lovell
For a discussion on cyber security culture inside your organization and how to map culture drivers, contact us here.