Better Cybersecurity Awareness Through Research
Author: Ranjit Bhaskar
Date Published: 18 May 2022

In the last few years, information security professionals have faced tremendous challenges. Just in 2021, there were more than two billion malware attacks and trillions of intrusion attempts.1 Ransomware attacks alone have increased by 151 percent compared with 2020.2 In fall of 2020, Cybersecurity Ventures estimated worldwide cybercrime costs would reach US$6 trillion annually by the end of 2021, ransomware damage costs would rise to US$20 billion, and an enterprise would fall victim to a ransomware attack every 11 seconds during the year.3 The European Union introduced 474 separate enforcement actions for EU General Data Protection Regulations (GDPR) violations, starting from the time enforcement of the GDPR began in 2018 through December 2020, with fines totaling US$312.4 million.4

Cybercrime is projected to worsen due to the rapid changes resulting from the COVID-19 pandemic. Thousands of organizations allowed employees to continue working from home throughout 2021, and there are indications that many will permit hybrid work indefinitely. Remote work opens a Pandora’s box of issues for organizations trying to maintain some semblance of security. Cybersecurity firm Malwarebytes reported in an August 2020 survey that remote workers caused security breaches in 20 percent of the organizations it surveyed.5 Although the need for security awareness training for remote employees is pressing, many organizations have been finding it more difficult to implement than providing established training in a centralized workplace.

Importance of Awareness and Training

One study conducted with participation from more than 5,000 organizations around the world found that organizations are becoming more aware of the role their employees play in information security incidents. Survey data collected as part of the study showed that 52 percent of organizations considered employees their biggest weakness in IT security, with employee actions putting the business and the organizational information security strategy at risk. Forty-three percent of the organizations polled considered deployment of more sophisticated software an effective way to safeguard themselves against evolving threats (figure 1). Offering staff training was the second most popular safeguard according to the survey, closely followed by increasing internal IT or IT security staff.6

Figure 1
Source: Adapted from Kaspersky, “The Human Factor in IT Security: How Employees Are Making Businesses Vulnerable From Within,” Kaspersky Daily, www.kaspersky.com/blog/the-human-factor-in-it-security/.

Similar to those findings, Verizon’s 2021 Data Breach Investigations Report states that nearly 85 percent of incidents and data breaches from 2020 were attributable to human error.7 Additional data from the field comes from the Willis Towers Watson Cyber Claims Analysis Report,8 which reveals that clients filed close to 1,200 data breach claims in nearly 50 countries from 2013 to December 2019. The report identifies human error, such as employees clicking on links in phishing emails or replying to spoofed emails, as the most common root cause of breaches (figure 2). The costliest events were typically those in which the threat actor impersonated a chief executive officer (CEO) or senior manager. The most frequently employed social engineering tactic was impersonation of a vendor or supplier. All of these incidents could easily have been prevented through employee education and training, the report concludes.9

Figure 2
Source: Adapted from Willis Towers Watson, Cyber Claims Analysis Report, United Kingdom, 2020, www.willistowerswatson.com/en-NZ/Insights/2020/07/cyber-claims-analysis-report. Copyright 2020 Willis Towers Watson. All rights reserved.

According to a white paper from Osterman Research, employees who received cybersecurity training demonstrated a significantly improved ability to recognize potential threats, earning the respect of their organization’s security teams.10 By applying a model that Osterman developed to data acquired through a survey of 230 individuals in North American organizations, the researchers concluded that smaller organizations could achieve a return on investment (ROI) of nearly 70 percent and larger organizations could achieve an ROI of 500 percent, on average, by implementing security awareness training.11

Deficiencies in Current Learning Techniques

The importance of cybersecurity awareness is underscored by reports of incidents attributed to careless human behavior and lack of training. Such incidents continue to rise at alarming rates despite commitments from small and large organizations alike to increase staffing in information security support groups and expand cybersecurity technology budgets. Many organizations either underestimate the effort needed to educate a workforce or do not realize that their current cybersecurity training approaches are ineffective.

KnowBe4’s 2021 State of Privacy and Security Awareness Report notes that a large percentage of surveyed employees did not feel confident that they could identify a social engineering attack, recognize the warning signs that their computers were infected with malware or describe to their senior management the security risk associated with employees working from home.12 Government, healthcare and education employees were the least aware of various social engineering threats.

Much of the current literature and research on improving cybersecurity awareness training focuses on how to develop an effective program or how to identify the components missing from a program. This is a great start, but it is not enough. For example, the 2021 SANS Security Awareness Report: Managing Human Cyber Risk13 identifies the needs to prioritize when building an effective program, such as having several full-time employees focused on changing behavior, providing job titles commensurate with responsibilities, ensuring leadership support, fostering collaboration among departments and engaging people with specialized communication skills so that organizational messaging strikes the right balance, neither too technical nor too lengthy.

Conspicuously missing from many current reports, including the ones already mentioned, are inquiries into whether both the training material content and its delivery are inherently flawed. Something is lacking in the current environment. Could it be related to the technique, or lack of it, in delivering cyberawareness material within organizations? Are some approaches more effective? Do people learn, absorb and remember better when material is presented a certain way?

For example, the SANS Security Awareness Maturity Model (figure 3) gives organizations the ability to compare and contrast the maturity level of their security awareness program and helps them focus on areas that need improvement.14 However, the model could be made even more helpful if it included references to how organizations can apply research on how humans learn and the most effective methods of content delivery. Organizations that do this should be rated higher on the maturity scale.

Figure 3
Source: Adapted from SANS, “Measuring Program Maturity,” www.sans.org/security-awareness-training/resources/maturity-model/.

Many organizations either underestimate the effort needed to educate a workforce or do not realize that their current cybersecurity training approaches are ineffective.

Models and frameworks are a great start, but developing a structured awareness program with tools for metrics monitoring only solves part of the puzzle. Organizations also need research-based information on how to create inspiring content, along with techniques for delivering it effectively.

Better Ways to Learn

A review of the literature on how humans learn and retain information reveals interesting techniques and practices that are applicable to cybersecurity awareness and training campaigns. Organizations looking to maximize return on investment in cyberawareness would do well to take a closer look at adopting some of the following approaches into their own learning systems.

Distributed vs. Massed Practice
There may be benefits to moving away from training assignments that offer a single, continuous training session. Offering an initial fact-sharing or concept-learning session followed by periodic reviews may be a better approach. Research indicates that providing the same information after the initial session in smaller chunks and at a carefully chosen frequency reinforces learning.15 Short, spaced-out study sessions lead to meaningful learning, whereas cram sessions often lead to nothing deeper than memorization.

In a 2019 interview, a researcher at Dartmouth College, Hanover, New Hampshire, USA, said that studying information or practicing a task just once is not good enough.16 For permanent learning, the timing of the review or practice of the information is critical. Distributed practice refers to studying the material to be learned at a specified time after the original learning event. Massed practice, on the other hand, refers to study sessions that happen right after the original learning event.

Research shows that distributed lessons improved elementary school children’s ability to generalize their learning17 and that when faced with unique situations, college students who participated in a spaced review following the primary lecture adapted better than students who just received a massed online review.18

Conspicuously missing from many current reports… are inquiries into whether both the training material content and its delivery are inherently flawed.

What makes the technique effective is the distributed review of the same information; it is not the same as spreading different chunks of material over several sessions or days. Organizations that understand the advantages of distributed practice over massed practice may choose to drop the idea of holding annual or quarterly cybersecurity awareness training events—that is, cram sessions—that take all day or multiple days, usually with exercise sessions on learning objectives at the end. A better approach may be to opt for short sessions that introduce a single cybersecurity concept, say, phishing, followed by defined, specific breaks of days or weeks before the same concept is revisited in the form of multiple follow-up sessions or a set of spaced-out exercises emailed to the participants. The examples in figure 4 illustrate the differences between the two approaches.

Figure 4
Source: Sjouwerman, S.; “Red Flags Warn of Social Engineering,” KnowBe4 Security Awareness Training Blog, 29 May 2021, http://blog.knowbe4.com/red-flags-warn-of-social-engineering. Reprinted with permission.

Massed practice, or the boot camp approach, may work to some extent for achieving a short-term goal such as passing an exam. However, for employees to achieve long-term retention of learned concepts—something that is of vital importance to organizations when it comes to cybersecurity—distributed practice is a superior method of learning.
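To make the scheduling difference concrete, the short sketch below shows one way a training team might lay out a distributed-practice calendar for a single topic. It is an illustration only; the topic name, session lengths and spacing intervals are assumptions chosen for the example, not recommendations drawn from the research cited above.

```python
from datetime import date, timedelta

def distributed_schedule(topic, start, gaps_in_days):
    """Return (date, label) pairs: one short intro session on the start
    date, followed by spaced reviews of the *same* material."""
    sessions = [(start, f"Intro: {topic} (15-20 min)")]
    current = start
    for i, gap in enumerate(gaps_in_days, start=1):
        current += timedelta(days=gap)
        sessions.append((current, f"Review {i}: {topic} exercise"))
    return sessions

if __name__ == "__main__":
    # Illustrative spacing only: one week, then two, then four.
    for day, label in distributed_schedule("Phishing", date(2022, 6, 1), [7, 14, 28]):
        print(day.isoformat(), "-", label)
```

A massed-practice equivalent would collapse all of these entries into a single day; the point of the distributed version is the deliberate gaps between reviews of the same concept.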

Reconsolidation
Tweaking a distributed practice approach to include memory reconsolidation can make training more effective. Distributed practice involves presenting the same information, but a small alteration of the facts or exercises in each follow-up reinforces the training. The key is to combine distributed practice with subtle changes in the follow-ups.

Making slight changes to the study material or task during practice sessions may help trainees master a skill much more quickly than they would without alteration. The results of a study by researchers at Johns Hopkins University, Baltimore, Maryland, USA, lend credence to the theory of reconsolidation by showing that motor skills are strengthened when existing memories are recalled and modified with new knowledge.19 The researchers found that gains in performance, such as speedier and more accurate task completion, nearly doubled in the experimental group given an altered second session compared with a group that repeated the same task without any change. The researchers concluded that a trainee learns more and learns faster by practicing a subtly altered version of a task than by practicing the same thing multiple times in a row. However, the changes in the training must be subtle; if the modification renders the task noticeably different, trainees do not realize the desired gain.

The science behind reconsolidation is still subject to debate, but results so far offer a glimpse of the possibilities for using it to strengthen learning. Consolidation refers to how the human brain learns new material. Retrieving that material after first exposure but before the learner has had a chance to forget it reactivates the learning process. It theoretically gives the learner an opportunity to weaken or strengthen memory retention. It appears possible to disrupt or impair retention by providing conflicting or incorrect information after an initial learning event or to strengthen it by providing correct information with slight updates. The updates might help to close gaps in the initial learning experience, thereby strengthening it.

For example, when using the distributed practice approach to teach employees about phishing, it might be beneficial to use the principles of reconsolidation in the follow-up sessions by subtly changing the phishing scenarios in the exercises. Each social engineering red flag (figure 5) in the set might be used to introduce a slight variation to the initial learning session to reinforce the main learning subject of phishing.

Figure 5
Source: Sjouwerman, S.; “Red Flags Warn of Social Engineering,” KnowBe4 Security Awareness Training Blog, 29 May 2021, http://blog.knowbe4.com/red-flags-warn-of-social-engineering. Reprinted with permission.
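As one way to operationalize reconsolidation in a phishing program, the sketch below varies a base exercise slightly in each follow-up session by emphasizing a different social engineering red flag. The scenario text and the red-flag list are hypothetical stand-ins, loosely inspired by the categories in figure 5, not content taken from the cited study or from KnowBe4's material.

```python
import random

# Hypothetical red flags for illustration only.
RED_FLAGS = [
    "sender address slightly misspells the company domain",
    "email includes an attachment the recipient never asked for",
    "message urgently asks to bypass the normal approval process",
    "link text does not match the hover-over URL",
]

BASE_SCENARIO = ("An email from 'accounts payable' asks you to confirm "
                 "an invoice before the end of the day.")

def follow_up_exercises(base, flags, sessions=3, seed=42):
    """Build follow-up exercises that each alter the base scenario subtly,
    keeping the core situation recognizable across sessions."""
    rng = random.Random(seed)
    chosen = rng.sample(flags, k=min(sessions, len(flags)))
    return [f"Session {i + 1}: {base} This time, notice that the {flag}."
            for i, flag in enumerate(chosen)]

if __name__ == "__main__":
    for exercise in follow_up_exercises(BASE_SCENARIO, RED_FLAGS):
        print(exercise)
```

The variation must stay subtle; swapping out the entire scenario would defeat the purpose described above.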

The Value of Case Studies

Nothing raises cybersecurity awareness more effectively than showing the aftermath of real breaches in targeted enterprises and highlighting how vulnerable all organizations are to cyberthreats. One well-known and effective training technique is the use of case studies in problem-based learning (PBL) scenarios. The results of multiple studies indicate that PBL has a significant positive effect on students’ skill development and knowledge retention.20 The US National Center for Case Study Teaching in Science polled more than a hundred university faculty members who had been trained to use case studies and found that more than 90 percent reported that students who were taught using sample cases learned new ways to think about issues and took an active part in the learning process.21

Key characteristics of a good discussion case include:

  • Being concise
  • Being somewhat controversial to grab attention, while maintaining balance and not getting carried away
  • Having memorable characters act out the case study with dialogue
  • Ensuring that the subject material is relevant to the learners
  • Presenting a dilemma to be solved
  • Being contemporary rather than purely historical
  • Using real rather than fabricated scenarios
  • Having clear learning objectives

Additional research expands on the qualities that make a good case study,22 including being pertinent to the class and learning objectives,23 being connected to theory and practice24, 25, 26 and telling a story containing some form of ambiguity.27

Information security officers struggling to sell internal leadership and stakeholders on abstract concepts such as segregation of duties (SoD), change management and other internal IT controls might find that case studies based on events in the news are effective tools to get their message across.

For example, a case study on the Bernard Madoff financial scam—complete with short news video clips and government press releases about IT personnel being charged with crimes—helped illustrate complex topics in an easy-to-understand format for the leadership of a not-for-profit and gained its buy-in for implementing organizationwide checks and balances (figure 6).

Figure 6

Learning From Incidents and Accidents
Industrial accidents have been studied and analyzed for many decades, and learning from the incidents that inevitably occur in large systems—especially those that pose a danger to human safety, such as systems handling chemicals and other hazardous materials—has always been a top priority. By establishing a framework for learning from incidents, an organization can reduce risk, minimize loss and, thus, become a more reliable organization over time. Learning from incidents and accidents fosters a culture of continuous organizational improvement that reduces incident severity and the risk of disaster. Organizations that do not learn from past errors are doomed to repeat them, for example:

  • The US National Aeronautics and Space Administration (NASA) lost two space shuttles, the Challenger in 1986 and the Columbia in 2003. The Columbia Accident Investigation Board noted that “[T]he causes of the institutional failure responsible for Challenger have not been fixed. Second, the Board strongly believes that if these persistent, systemic flaws are not resolved, the scene is set for another accident.”28
  • Failure to learn was among the causes of the Deepwater Horizon disaster. British Petroleum had experienced several major incidents before—specifically, the Grangemouth refinery incident in Scotland, the Texas City refinery explosion in the US state of Texas and the Prudhoe Bay leaks in the US state of Alaska.29

Incidents, however, do not always have to end in disaster.30 A system can be put in place to control their severity. It is important to recognize that, in most cases, a disaster results from a chain of events going undetected rather than from a standalone, spontaneous event. If an effective learning system could detect the incident, the chain of events could be broken and a disaster could be prevented.

By establishing a framework for learning from incidents, an organization can reduce risk and minimize loss and, thus, become a more reliable organization over time.

In addition to implementing systems that enable learning from incidents, technology organizations should investigate the use of mitigating systems, including a sort of kill switch that could potentially prevent an incident from becoming a disaster. One IT organization that suffered a series of ransomware events used data security software to study the data from the incidents and determined that all the attacks originated from end-user workstations and spread to network shares before showing up on higher-value assets.31 A mitigating system built on the data security platform was soon put in place so that, when ransomware activity was detected on an end user’s virtual local area network (VLAN), the system would automatically disable the account under which the questionable activity was operating and close ports to other parts of the network. Other examples of mitigating systems include automatic blocking of remote Internet Protocol (IP) addresses based on scanning activity detected by a security information and event management (SIEM) tool and automatic disabling of user accounts flagged by other security tools as exhibiting unusual activity.
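The pattern described in the preceding paragraph (detect suspicious activity close to the end user and contain it automatically before it reaches higher-value assets) can be sketched in a few lines of Python. The alert fields and containment functions below are placeholders for whatever the organization's directory service, network gear and security tooling actually expose; this is a hedged illustration of the kill-switch idea, not the API of any particular product.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    source: str     # e.g., "data-security-platform" or "siem"
    category: str   # e.g., "ransomware-behavior", "port-scan"
    username: str
    vlan: str
    remote_ip: str = ""

def disable_account(username: str) -> None:
    # Placeholder: call the directory service (e.g., LDAP/AD) here.
    print(f"[containment] disabling account {username}")

def close_ports(vlan: str) -> None:
    # Placeholder: push a firewall or switch ACL that isolates the VLAN.
    print(f"[containment] closing ports from {vlan} to other network segments")

def block_ip(remote_ip: str) -> None:
    # Placeholder: add the address to a perimeter block list.
    print(f"[containment] blocking remote IP {remote_ip}")

def handle_alert(alert: Alert) -> None:
    """Kill-switch logic: contain first, investigate afterward."""
    if alert.category == "ransomware-behavior" and alert.vlan.startswith("end-user"):
        disable_account(alert.username)
        close_ports(alert.vlan)
    elif alert.category == "port-scan" and alert.remote_ip:
        block_ip(alert.remote_ip)

if __name__ == "__main__":
    handle_alert(Alert("data-security-platform", "ransomware-behavior",
                       "jdoe", "end-user-vlan-12"))
```

In practice, such automation would be wired to the alerting mechanism of the data security platform or SIEM, and its actions logged so they can feed the incident learning system described earlier.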

In IT, especially cybersecurity, distilling the data collected from past incidents and accidents into actionable, effective training remains a challenge.

Organizations other than typical industrial factories have already adopted techniques to prevent incidents from descending into chaos and disaster. In the mid-1980s, researchers at the University of California, Berkeley, USA, began taking a closer look at why some organizations, despite their complex and risk-prone environments, continued to succeed in avoiding major safety incidents.32 The term high reliability organization (HRO) soon evolved to refer to this category of organizations. Researchers discovered that HROs use several tools and initiatives to learn from safety incidents, some of which could be adapted for use in healthcare, for example.33 The researchers conducted a systematic review to identify effective learning tools that multidisciplinary teams in healthcare could adapt and use following a patient safety incident. IT-reliant organizations are on the cusp of facing major disasters; the attempted poisoning of the city water supply in Oldsmar, Florida, USA,34 and the Colonial Pipeline ransomware event are early warning signs.35 If IT does not invest in the kinds of learning systems other industries have championed over the years, catastrophes will become inevitable.

Sometimes lessons are not learned. This can happen for various reasons, including information being handled within and between organizations in a way that discourages dissemination of lessons learned from emergencies; training and educational programs placing an unhealthy emphasis on what to learn rather than how to learn; and ingrained organizational cultures that prevent learning.36

Recommendations for avoiding these traps include creating an official policy for identifying and learning lessons, developing techniques to identify and learn lessons from mock exercises and major emergencies, and engaging in an ongoing process of cross-training so that diverse teams can fully develop a broad understanding of how other teams think and operate under pressure. Data from the incident learning system can be applied in future training sessions.37

Safety-conscious industrial organizations, often under the watchful eye of regulators, have long prioritized collecting data about accidents and feeding it into learning systems, with the results converted into simulations used in training. Although the typical IT organization may not face as many situations that could affect human safety as other industry sectors do, IT in general is known for having a culture of investigating failures and conducting root-cause analyses. However, in IT, and especially in cybersecurity, distilling the data collected from past incidents and accidents into actionable, effective training remains a challenge.

For example, one organization responded to the challenge by converting root-cause analysis of its actual cybersecurity incidents into animated training simulations that were then emailed out as quarterly or annual retrospectives (figure 7). IT personnel received a more detailed tear-down of the incidents using the same animation medium.

Figure 7

Conclusion

Despite investing in training, technology and technical know-how and expanding their budgets to support ever-increasing information security operational costs, organizations are still falling victim to cyberattacks every day. These attacks show no sign of slowing down. Check Point Research reported 900 weekly attacks per organization in 2021, a 50 percent jump compared with 2020.38

Organization leaders need to realize that cybersecurity awareness is not just about training nontechnical employees to spot phishing and online scams and then arming them with better security practices. Cybersecurity awareness also plays a vital role in changing an organization’s culture for the better: correcting risky behaviors such as leaky change management practices, arming people with the data to make better decisions, and providing the confidence and means to challenge unethical behavior, such as a senior manager falsifying disaster-recovery test results. The server left unpatched, the files left unprotected and the critical security feature that took nine months to roll out all reflect organizational cultures in dire need of reform.

To create lasting change, organizations must not only build the components of an effective cyberawareness program but also improve the quality of their content and the mechanisms for its delivery. Informed by research on how humans learn, organizations can adopt the most effective techniques to aid the developers of their training systems. With the right tools, developers can tailor content that improves employees’ learning speed and information retention and helps employees adapt quickly to changing environments and situations, even during times of heavy workload and high pressure.

Author’s Note

The information and views expressed in this article are those of the author and do not constitute any official position, policy or pronouncement of his employer.

Endnotes

1 SonicWall, 2021 SonicWall Cyber Threat Report, USA, 2021, http://www.sonicwall.com/resources/white-papers/2021-sonicwall-cyber-threat-report/
2 Ibid.
3 Morgan, S.; “Cybercrime to Cost the World $10.5 Trillion Annually by 2025,” Cybercrime Magazine, 13 November 2020, http://cybersecurityventures.com/hackerpocalypse-original-cybercrime-report-2016/
4 KnowBe4, 2021 State of Privacy and Security Awareness Report, USA, 2021, http://www.knowbe4.com/hubfs/2021-State-of-Privacy-Security-Awareness-Report-Research_EN-US.pdf
5 Malwarebytes, Enduring From Home: COVID-19’s Impact on Business Security, USA, 2020, http://www.malwarebytes.com/resources/files/2020/08/malwarebytes_enduringfromhome_report_final.pdf
6 Kaspersky, “The Human Factor in IT Security: How Employees Are Making Businesses Vulnerable From Within,” Kaspersky Daily, http://www.kaspersky.com/blog/the-human-factor-in-it-security/
7 Verizon, 2021 Data Breach Investigations Report, USA, 2021, http://enterprise.verizon.com/content/verizon-enterprise/us/en/index/resources/reports/2021-data-breach-investigations-report.pdf
8 Willis Towers Watson, Cyber Claims Analysis Report, United Kingdom, 2020, http://www.wtwco.com/en-NZ/Insights/2020/07/cyber-claims-analysis-report
9 Ibid.
10 Osterman Research, Inc., The ROI of Security Awareness Training, USA, August 2019, http://www.mimecast.com/resources/analyst-reports/osterman-research---the-roi-of-security-awareness-training/
11 Ibid.
12 Op cit KnowBe4
13 DeBeaubien; L. Spitzner; H. Xu; N. Zhang; 2021 SANS Security Awareness Report: Managing Human Cyber Risk, USA, 2021, http://www.sans.org/security-awareness-training/resources/reports/sareport-2021/
14 SANS, “Measuring Program Maturity,” http://www.sans.org/security-awareness-training/resources/maturity-model/
15 Carpenter, K.; N. J. Cepeda; D. Rohrer; H. K. Kang; H. Pashler; “Using Spacing to Enhance Diverse Forms of Learning: Review of Recent Research and Implications for Instruction,” Educational Psychology Review, vol. 24, iss. 3, http://www.jstor.org/stable/43546797
16 Francisco, A.; “Ask the Cognitive Scientist: Distributed Practice,” Digital Promise, 8 May 2019, http://digitalpromise.org/2019/05/08/ask-the-cognitive-scientist-distributed-practice/
17 Vlach, H.; C. Sandhofer; “Distributing Learning Over Time: The Spacing Effect in Children’s Acquisition and Generalization of Science Concepts,” Child Development, 22 May 2012, http://ncbi.nlm.nih.gov/pmc/articles/PMC3399982/
18 Kapler; T. Weston; M. Wiseheart; “Spacing in a Simulated Undergraduate Classroom: Long-Term Benefits for Factual and Higher-Level Learning,” Learning and Instruction, April 2015, http://www.sciencedirect.com/science/article/abs/pii/S0959475214001042?via%3Dihub
19 Wymbs, N.; A. Bastian; P. Celnik; “Motor Skills Are Strengthened Through Reconsolidation,” Current Biology, 8 February 2016, http://www.sciencedirect.com/science/article/pii/S0960982215015146
20 Herreid; “Using Case Studies to Teach Science,” American Institute of Biological Sciences, 2005, http://files.eric.ed.gov/fulltext/ED501359.pdf
21 Herreid; “Case Studies in Science–A Novel Method of Science Education,” Journal of College Science Teaching, February 1994, http://eric.ed.gov/?id=EJ487069
22 Anderson; “Teaching Developmental Theory With Interrupted Video Case Studies,” Journal of the Scholarship of Teaching and Learning, December 2019, http://scholarworks.iu.edu/journals/index.php/josotl/article/view/25385/3711
23 McFarlane, D.; “Guidelines for Using Case Studies in the Teaching-Learning Process,” College Quarterly, Winter 2015, http://files.eric.ed.gov/fulltext/EJ1070008.pdf
24 Anderson, B.; S. Bradshaw; J. Banning; Using Interrupted Video Case Studies to Teach Developmental Theory: A Pilot Study, Gauisus, 2016, http://sotl.illinoisstate.edu/downloads/gauisus/AndersonVolume4.pdf
25 Penn; C. Currie; K. Hoad; F. O’Brien; “The Use of Case Studies in OR Teaching,” Higher Education Pedagogies, 8 March 2016, www.tandfonline.com/doi/full/10.1080/23752696.2015.1134201
26 Prud’homme-Généreux, A.; “Case Study: Formulating Questions That Address Student Misconceptions in a Case Study,” Journal of College Science Teaching, March 2017, http://eric.ed.gov/?id=EJ1136640
27 Boston University Center for Teaching and Learning, Massachusetts, USA, “Using Case Studies to Teach,” http://www.bu.edu/ctl/teaching-resources/using-case-studies-to-teach/
28 Columbia Accident Investigation Board, Report Volume I, USA, August 2003, http://s3.amazonaws.com/akamai.netstorage/nasa-global/CAIB/CAIB_lowres_full.pdf
29 Dechy, N.; J. Rousseau; F. Jeffroy; “Learning Lessons From Accidents With a Human and Organisational Factors Perspective: Deficiencies and Failures of Operating Experience Feedback Systems,” EUROSAFE Forum 2011, researchgate.net/publication/233997934
30 Cooke; T. Rohleder; “Learning From Incidents: From Normal Accidents to High Reliability,” System Dynamics Review, September 2006, http://onlinelibrary.wiley.com/doi/10.1002/sdr.338
31 Varonis, Varonis Case Study: City of San Diego, USA, http://info.varonis.com/hubfs/docs/case_studies/en/Varonis_Case_Study_San_Diego.pdf
32 Roberts, K. H.; “HRO Has Prominent History,” Anesthesia Patient Safety Foundation Newsletter, vol. 18, iss. 1, Spring 2003, http://www.apsf.org/article/hro-has-prominent-history/
33 Serou; L. Sahota; A. Husband; S. Forrest; Slight; S. Slight; “Learning From Safety Incidents in High-Reliability Organizations: A Systematic Review of Learning Tools That Could Be Adapted and Used in Healthcare,” International Journal for Quality in Health Care, 17 March 2021, http://academic.oup.com/intqhc/article/33/1/mzab046/6174559
34 Staff, “‘A Matter of National Security’: FBI, Secret Service Investigate After Hacker Tried to Poison a Florida City’s Water With Lye,” USA Today, 9 February 2021, http://www.usatoday.com/story/news/nation/2021/02/09/oldsmar-florida-water-hacker-lye-sodium-hydroxide/4444387001/
35 Turton, W.; K. Mehrotra; “Hackers Breached Colonial Pipeline Using Compromised Password,” Bloomberg, 4 June 2021, http://www.bloomberg.com/news/articles/2021-06-04/hackers-breached-colonial-pipeline-using-compromised-password
36 Coles, E.; “Learning the Lessons From Major Incidents: A Short Review of the Literature,” Emergency Planning College, June 2014
37 Ibid.
38 Check Point Research, “Check Point Research: Cyber Attacks Increased 50 Percent Year Over Year,” 2022, http://blog.checkpoint.com/2022/01/10/check-point-research-cyber-attacks-increased-50-year-over-year/

Ranjit Bhaskar, CISA, CISM, CISSP

Is a senior security architect at Texas Windstorm Insurance Association (TWIA). Bhaskar has 25 years of experience in enterprise architecture and is the author of the op-ed, “A Cybersecurity Culture Score.” He can be reached via LinkedIn at http://www.linkedin.com/in/ranjit-bhaskar-467877218.