The Practical Aspect: The Human Elements of Risk

ISACA Journal, Volume 3
Authors: Vasant Raval, DBA, CISA, ACMA, and Rajesh Sharma, Ph.D., CMMI Lead Appraiser, ITIL Foundation, Six Sigma Black Belt
Date Published: 30 April 2020

In classical Greek mythology, Daedalus watched helplessly as Icarus, his son, fell to his death. Daedalus, having designed the Minotaur’s Labyrinth, had been imprisoned. To escape, he fitted himself and his son with wings of his own invention. During the escape, Icarus became intoxicated by this new power of flight and, despite Daedalus’s repeated warnings and his own lack of experience, took the risk of flying so high that the sun melted the wax holding his feathered wings.1 No matter how hard humans try, some risk scenarios remain unnoticed or unimagined by human insight.

While nature offers its own set of systems, humans design systems for their wants and needs, and those systems inherit subtle attributes of human nature, principally the way in which people perceive, assess and mitigate risk. Assuming Daedalus did not anticipate anyone flying high enough to approach the sun, there was no design error. Icarus, however, chose through his own behavior to fly that high and thus created an operational error.

Security breaches are on the rise. A 2015 PricewaterhouseCoopers (PwC) survey sponsored by the UK government revealed that the percentage of large organizations affected by breaches increased from 81 percent to 90 percent. The survey estimated an average of 117,339 incoming attacks daily, or 42.8 million annually.2 According to the Ponemon Institute 2018 Cost of a Data Breach Study, the root causes of data breaches were human error (27 percent), malicious or criminal attacks (48 percent) and system glitches (25 percent).3 All three causes essentially point to the human element of risk. While the scope here is limited to a single aspect, data loss, the findings reveal that a human hand lies behind every consequence.
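
For readers who want to confirm the scale of the PwC figures, a quick back-of-the-envelope check (a simple daily-to-annual conversion sketched in Python; it is not part of the survey itself) is shown below:

  # Back-of-the-envelope check of the PwC survey figures cited above:
  # roughly 117,339 incoming attacks per day scales to about 42.8 million per year.
  daily_attacks = 117_339
  annual_attacks = daily_attacks * 365
  print(f"{annual_attacks:,} attacks per year (~{annual_attacks / 1e6:.1f} million)")
  # -> 42,828,735 attacks per year (~42.8 million)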

But the difficulty is this: Knowledge of what these elements are and how to proactively address them is limited. The PwC survey showed that 72 percent of large organizations delivered staff awareness training, up from 68 percent the previous year.4 However, that training appears to have failed to change human behavior.

Origins

This view of human elements of risk, shown in figure 1, is drawn mainly from the book Thinking, Fast and Slow.5 The author asserts that fast thinking involves intuition and automatic, sometimes almost unconscious, mental activities of perception and memory. In contrast, slow thinking implies a more deliberate and effortful form of thinking. Originating in different regions of the brain, fast thinking is sometimes called the “hot system” or “System 1,” and slow thinking the “cool system” or “System 2.”6 System 1 is dominated by emotions, while System 2 emphasizes a cautious approach and reasoned answers. The author argues that the intuitive System 1 is “more influential than your experience tells you.”7 Both systems are geared to judgment and choice. Unfortunately, System 1 is not designed to incorporate the multidimensional, hard evidence offered by statistics; only System 2 can deal with such complex scenarios. Impulsive and intuitive, System 1 is where snap judgments are made using evidence that may be unreliable but can be retrieved easily. In System 1, associative memory continually constructs a coherent, but not necessarily truthful, interpretation of what is going on in the world. The illusory certainty of hindsight feeds overconfidence, a theme also explored in Nassim Taleb’s book The Black Swan.8

When System 2 is busy, System 1 takes over the task of judgment and choice. When risk-related decisions are made by System 1, chances are the answers are inadequate at best and may even be faulty. Take, for example, the spread of coronavirus. The media blitz, combined with information overload from social networks, floods the mind with readily available threat scenarios, while the actual probability of cases of such a virus may be unknown or low, say, in a small town in the US. But availability overrides probability when System 1 is in charge, leading to an unreliable assessment of risk. The bottom line is that System 2 should always be in charge of the human elements of risk.
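
To illustrate the gap between the two modes, the following is a minimal sketch in Python contrasting a System 2 estimate anchored on a base rate with a System 1 impression inflated by vivid, readily available coverage. The town size, incidence rate and perceived figure are all assumptions made purely for illustration, not data from the article.

  # A minimal sketch, with hypothetical numbers, of availability overriding probability.
  population_of_town = 25_000            # hypothetical small US town
  national_incidence_rate = 2 / 100_000  # hypothetical confirmed cases per person

  # System 2: expected local cases implied by the base rate
  expected_local_cases = population_of_town * national_incidence_rate
  print(f"Base-rate (System 2) estimate: {expected_local_cases:.1f} cases")  # 0.5

  # System 1: a vivid media blitz makes the threat feel far more prevalent than
  # the base rate supports; this perceived figure is notional, not computed.
  perceived_local_cases = 50
  print(f"Availability-driven (System 1) impression: ~{perceived_local_cases} cases")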

Because the same pool of mental energy powers all voluntary effort of System 2, that energy may be depleted at times, for example, at the end of a tiring day of audit work. If System 2 is too tired to handle any more tasks, due to what one researcher calls ego depletion,9 System 1 takes over. Ego-depleted people are much more likely to make intuitive errors, and this can happen during a risk assessment exercise. In Thinking, Fast and Slow, the author describes a study of eight parole judges who spent entire days reviewing parole applications. The proportion of approved cases spiked to approximately 65 percent immediately after a meal and then, over the following two hours or so, dropped gradually to approximately zero just before the next meal.10 The conclusion is clear: Tired and hungry judges resort to the easier, more defensible position of denying parole requests.

The same author identifies three additional factors likely to contribute to the risk in human judgment:11

  1. Optimistic bias in risk perception
  2. What you see is all there is (WYSIATI)
  3. Theory-induced bias

Optimistic Bias
It can be argued that decisions have two sides: risk and return. When people think about return, they tend to put away the thought of risk, and when evaluating risk in any decision, they tend to be more optimistic. This bias toward optimism in risk assessment causes people to expect success and to predict failures on the lighter side. They design for acceptable risk and generally remain optimistic when evaluating the downside of an initiative. Optimistic bias tends to overweight gains and underweight losses, distorting risk perception. “It is not going to happen here” is the syndrome that drives overconfidence and nonchalant acceptance of certain risk factors, intuitively and without enough rational thinking. People know more about benefits and less about risk.12

WYSIATI
Humans have a tendency to assume that the past predicts the future. WYSIATI causes a restrictive or constrained view of the present, masking potential new risk. The eyes that look at experience may be blinded, or at least not open enough to interpret what they see. In his 2001 letter to shareholders, Warren Buffett, chief executive officer (CEO) of Berkshire Hathaway, described how the company’s mistake of focusing on experience rather than exposure resulted in assuming a huge terrorism risk in its insurance business for which it received no premium.13 Experience can hinder, rather than help, proper identification of risk.

Theory-Induced Bias
It is quite likely that a manager evaluating risk is projecting from a well-accepted theory. But the theory itself may be faulty or incomplete. As stated in Thinking, Fast and Slow:

Once you have accepted a theory and used it as a tool in your thinking, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it.14

This is called theory-induced bias, and it can lead one to leave unchallenged anomalies that might otherwise be examined seriously.

While individual perception of risk and its mitigation may suffer from optimism, the situation is less clear in group settings, where much of the work in organizations gets accomplished. The group may defer to the loudest voice, to an authority’s opinion (tone from the top) or to the organization’s traditions. The risk scenarios of the Boeing 737 MAX were probably known among engineers, but they reported to business managers who worried more about time to market. As a result, it is likely that the engineers yielded to the optimistic bias of the business managers. The organization’s environment and culture, including the tone from the top, should nurture a climate that motivates risk-informed compliance with policies and practices.

 

Together, these origins of the human elements of risk suggest that the picture is complicated and that it is generally not possible to weed out all gaps in risk assessment. As long as humans are in charge of designing systems, there will be misses.

The human elements of risk are often discussed in the context of employees, the most common user group. However, risk may emanate from other stakeholders (e.g., customers, suppliers, end users) or from hackers, in both technical and socio-technical systems. In any case, risk should not be equated with committed “errors,” for there is also risk related to omissions. The focus should be on all types of consequences of risk, not just errors in a narrow sense of the term. Finally, although the focus here is on risk, the ultimate aim is risk mitigation, which is fundamental to governance.15

Cybersecurity Risk

Using figure 1 as a guide, it is important to reflect on the state of cybersecurity risk assessment and mitigation. High-profile cybersecurity breaches are reported in the media almost incessantly, leading to much greater availability of threat scenarios combined with little understanding of the probability of their occurrence in one’s own world. If System 1 takes over at this point, the results can be misleading. This exposes the inadequacy of current assurance methods, which could gain much from human reliability assessment and improved statistical methods of obtaining true assurance16 based on the reasoned approach of System 2. Using security breach statistics, researchers contend that half of significant security incidents are due to people and the unintentional mistakes and errors they make.17 Citing data, the researchers conclude that it is difficult to apply cybersecurity controls that address human behavior.18

Perhaps elaborate and credible frameworks emphasize technology too much and the human factor too little, presumably inviting theory-induced bias. Moreover, human behavior is not consistent (that is, it is not deterministic) and can be influenced by relationships, as in group settings in enterprises. Additionally, people naively assume that bad things happen only to other people.19 Research also suggests that people are willing to undertake risky practices,20 perhaps due to asymmetric risk perception (underweighting risk, overweighting gains) and optimistic bias (figure 1).

What Can Be Done?

Human behavior is difficult to change. Perhaps the prevailing approach to the human elements of risk is inappropriate. Effective human reliability assessment should complement sound technical analysis of physical systems with the development of an organizationwide safety culture and risk management. The human error assessment and reduction technique (HEART) is one validated error analysis technique that provides proactive quantification of human behavior,21 which may be helpful in effecting change. One researcher asserts that people instinctively resist being forced to do things differently.22 Appeals to fear may not work effectively; instead, it would help if barriers in their way were removed. The researcher suggests five ways to remove such barriers to change: reduce reactance (people’s desire to feel that they are in the driver’s seat), ease endowment (attachment to things we know or have used for a long time), shrink distance (keep incoming content close enough to people’s current perceptions), alleviate uncertainty (e.g., lower the barrier to trial and experimentation), and find corroborative evidence (hearing from more than one source).23
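
To make the idea of proactive quantification concrete, the following is a minimal sketch of a HEART-style calculation in Python. The general form, a nominal error probability for a generic task type scaled by each applicable error-producing condition, follows the published HEART formula; the task, multipliers and assessed proportions shown here are hypothetical values chosen only for illustration, not figures from the article or from a HEART table.

  # Minimal sketch of a HEART-style human error probability (HEP) calculation.
  # The nominal HEP, EPC multipliers and assessed proportions below are
  # hypothetical illustrations, not values drawn from the article.

  def heart_hep(nominal_hep, conditions):
      """Scale a nominal HEP by each error-producing condition (EPC).

      conditions: iterable of (epc_multiplier, assessed_proportion) pairs,
      where assessed_proportion (0..1) reflects the analyst's judgment of how
      strongly the condition applies to the task being assessed.
      """
      hep = nominal_hep
      for epc, proportion in conditions:
          hep *= (epc - 1.0) * proportion + 1.0
      return min(hep, 1.0)  # a probability cannot exceed 1

  # Example: a routine, well-practiced task (hypothetical nominal HEP of 0.02)
  # performed under time pressure (EPC x11, judged 40 percent applicable) by a
  # fatigued analyst (EPC x1.6, judged 50 percent applicable).
  hep = heart_hep(0.02, [(11, 0.4), (1.6, 0.5)])
  print(f"Assessed human error probability: {hep:.3f}")  # 0.130

Even this toy calculation makes the technique’s point: conditions such as time pressure and fatigue, the very System 1 triggers discussed earlier, can multiply an otherwise small error probability severalfold.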

While much of the risk of the human element in system design and operation can be mitigated, such risk cannot be avoided entirely. With the continuing explosive growth of the connected world, the human element will, if anything, be at the forefront of future risk scenarios. As more of the human role in systems is automated, through robotic process automation (RPA) for example, risk may decrease if the modified system is designed properly. Nevertheless, humans will remain the weakest link in the risk management chain.

Authors’ Note

Opinions expressed in this column are the authors’ own and not those of their employers.

Endnotes

1 Duffey, R. B.; J. W. Saull; Managing Risk: The Human Element, John Wiley & Sons, United Kingdom, 2008
2 PricewaterhouseCoopers UK, PwC 2015 Information Security Breaches Survey, United Kingdom, 2015, http://www.pwc.co.uk/assets/pdf/2015-isbs-technical-report-blue-03.pdf
3 Ponemon Institute, 2018 Cost of a Data Breach Study: Global Overview, USA, 2018, http://securityintelligence.com/series/ponemon-institute-cost-of-a-data-breach-2018/
4 Op cit PricewaterhouseCoopers UK
5 Kahneman, D.; Thinking, Fast and Slow, Farrar, Straus and Giroux, USA, 2011
6 Ibid.
7 Ibid., p. 13
8 Taleb, N. N.; The Black Swan, Random House, USA, 2010
9 Baumeister, R. F.; E. Bratslavsky; M. Muraven; D. M. Tice; “Ego Depletion: Is the Active Self a Limited Resource?” Journal of Personality and Social Psychology, vol. 74, iss. 5, 1998, p. 1252–1265
10 Op cit Kahneman, p. 43–44
11 Ibid.
12 Ibid.
13 Buffett, W.; “2001 Chairman’s Letter,” Berkshire Hathaway, 2001, http://www.berkshirehathaway.com/letters/2001pdf.pdf
14 Op cit Kahneman, p. 277
15 Raval, V.; Corporate Governance: A Pragmatic Guide for Auditors, Directors, Investors, and Accountants, CRC Press, Taylor and Francis Group, United Kingdom, 2020, Chapter 3: Risk and Governance
16 Evans, M.; L. A. Maglaras; Y. He; H. Janicke; “Human Behavior as an Aspect of Cybersecurity Assurance,” Security and Communication Networks, vol. 9, 2016, p. 4667–4679
17 Ibid., p. 4668
18 Ibid., p. 4670
19 Johnston, A.; M. Warkentin; “Fear Appeals and Information Security Behaviors: An Empirical Study,” MIS Quarterly, vol. 34, iss. 3, 2010
20 Op cit Evans et al., p. 4671
21 Lyons, M.; S. Adams; M. Woloshynowych; C. Vincent; “Human Reliability Analysis in Healthcare: A Review of Techniques,” International Journal of Risk & Safety in Medicine, vol. 16, iss. 4, 2004, p. 223–237. Also see figure 2 in op cit Evans et al.
22 Berger, J.; “How to Change Anyone’s Mind,” The Wall Street Journal, 21 February 2020, http://www.wsj.com/articles/how-to-change-anyones-mind-11582301073?mod=searchresults&page=1&pos=1
23 Ibid.

Vasant Raval, DBA, CISA, ACMA
Is professor emeritus of accountancy at Creighton University (Omaha, Nebraska, USA). He is the coauthor of two books on information systems and security, and his areas of teaching and research interest include information security and financial fraud. He recently published a book on corporate governance. He can be reached at vraval@creighton.edu.

Rajesh Sharma, Ph.D., CMMI Lead Appraiser, ITIL Foundation, Six Sigma Black Belt
Is a director of product and quality at Software Engineering Services. He has more than 19 years of experience in establishing and managing project management offices (PMOs), quality management offices (QMOs), metrics programs, process improvement, cybersecurity programs, and as a lead for independent verification and validation (IV&V) projects. As a QMO and IV&V lead, he has performed quality audits, process improvement and IV&V assessments. He can be reached at rajsharmane@gmail.com.