
Cybersecurity and Cognitive Bias


In today's digital age, every company has an obligation to protect not only itself, but also its employees. That is why, in 2021, spending on information security and risk management totaled an estimated $155 billion.

Despite these efforts, the number of cyberattacks continued to rise throughout the year. In fact, data compromises related to cyberattacks increased by 27% compared to 2020. A report by IBM and the Ponemon Institute found that the average cost of a data breach among the companies surveyed reached $4.24 million per incident in 2021, the highest figure in the report's 17-year history. So why do the billions of dollars invested in cybersecurity still produce results that fall short? Because 85% of all data breaches are caused by an otherwise unpredictable variable: human error.

How does this happen? Many factors may come into play, such as age, a fast-paced work environment, or improper training. However, one common trap that all humans are susceptible to on a psychological level is Cognitive Bias.

Cognitive Bias: What Is It?

Our brains are extremely powerful, yet we are all subject to limitations. Cognitive Bias is defined as “a systematic error in thinking that occurs when people are processing and interpreting information in the world around them.” Essentially, it is a set of shortcuts our brains take in reasoning and judgment in an attempt to simplify information processing. These shortcuts are flaws in our thinking that can lead us to draw inaccurate conclusions.

The concept of cognitive bias was first introduced by Amos Tversky and Daniel Kahneman in the early 1970s. Since then, researchers have described many types of biases that can affect our decision-making more than we realize. When our brains succumb to these biases in situations pertaining to cybersecurity, the results can be catastrophic. A lapse in judgment by just one employee can put an entire organization at risk of a security breach. Let us look at some examples.

Types of Cognitive Biases

Consider the following biases that an employee may fall into while on the job.

Affect Heuristic:

The Affect Heuristic is a mental shortcut heavily influenced by one's current emotional state. When someone has a good feeling about a situation and does not perceive an imminent threat, they can wrongly assume there is none; decision-making is skewed by the emotion felt in the moment. So, what if a malicious attacker were to masquerade as a friendly, bubbly HR representative, able to get a laugh out of everyone they talk to? If an employee were to drop their defenses and immediately assume there was no danger, they might give up sensitive company information without even realizing it.

Anchoring:

Anchoring is a pervasive bias in which people accept the first piece of information they receive as truth while arriving at a decision. An employee can easily fall into this thinking if an attacker presents “evidence” that creates a sense of legitimacy. So, what if an attacker were to call claiming to be from IT and name-drop a well-known manager? Would the name drop be enough for the employee in question to bypass authentication protocols while on the phone? If so, it would certainly be a disastrous decision.

Bounded Rationality:

This bias describes how people satisfice rather than optimize in decision-making: they make “good enough” choices for the sake of simplicity. An employee could make poor decisions because they feel rushed by a time constraint, much like a diner who orders something they don't want because they feel pressured by the waiter. So, what if an attacker pretended to be a disheveled intern trying to pass a security checkpoint, claiming to have forgotten their ID because they are running late for their “onboarding training”? Hopefully, this would not be enough to simply let them pass through unverified.

Herd Behavior:

The “Bandwagon Effect” may be the term most people know for Herd Behavior: humans subconsciously mimic the actions of a wider group. An employee may give in to a compromising situation if they are led to believe everyone else is participating as well. So, what if an attacker sent a phishing email about a “company-wide” HR survey? Would the idea that everyone else is participating sway an employee into thinking it is okay to take it themselves?

There are plenty more Cognitive Biases that we could cover. However, do you see how easy it can be for an employee to succumb to these thought processes? Often, it happens without them even realizing it. All it takes is one minor lapse in judgment to cause a wide array of problems and security breaches.

How to Protect Yourself

Is there a way for a corporation to protect itself from vulnerabilities that stem from cognitive bias? There are steps it can take. Recognizing cognitive bias as a real threat to the workplace is imperative; it is not something to be taken lightly. In fact, the idea that “this could never happen to me” is itself a cognitive bias, known as the bias blind spot. No matter how much work is put into a corporation's cybersecurity initiatives, it can all be derailed by human error.

The truth is, humans are the weakest link when it comes to cybersecurity. Whether security professionals or less technical users, we all deal with these kinds of biases on a daily basis. Therefore, corporations should always keep human behavior at the center of their security initiatives.

Corporations do well to address these vulnerabilities by educating their employees. All staff should know the common tactics cybercriminals use, such as vishing and phishing. Security awareness training programs are an excellent way to accomplish this.
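As a simple illustration of the kind of heuristic such training teaches, the sketch below flags two common phishing red flags in an email: a reply-to domain that differs from the sender's, and urgency language pressuring the reader to act fast. The keyword list and rules are simplified assumptions for illustration, not a production filter.

```python
# Illustrative phishing "red flag" checker. The keyword list and the two
# rules are simplified teaching assumptions, not a real spam filter.
URGENCY_WORDS = {"urgent", "immediately", "verify", "suspended", "act now"}

def phishing_red_flags(sender: str, reply_to: str,
                       subject: str, body: str) -> list[str]:
    flags = []
    # Red flag 1: the reply-to domain does not match the sender domain.
    if sender.split("@")[-1].lower() != reply_to.split("@")[-1].lower():
        flags.append("reply-to domain mismatch")
    # Red flag 2: urgency language designed to short-circuit judgment.
    text = (subject + " " + body).lower()
    if any(word in text for word in URGENCY_WORDS):
        flags.append("urgency language")
    return flags
```

For example, an email from `hr@company.com` with a reply-to of `hr@c0mpany-survey.net` and the subject “Urgent: company-wide HR survey” would trip both checks, which is exactly the pause-and-verify reflex awareness training aims to build.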

SECOM simulates attacks and provides training to strengthen security policy. This yields detailed analytics of user behavior and, ultimately, greater protection of corporate assets. Proper use of this training helps all employees maintain a security-first mentality on the job. With that mentality at the forefront, it becomes much harder to fall into biased thinking patterns, and employees are better equipped to identify and prevent future security threats.
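To give a sense of what “analytics of user behavior” can mean in practice, here is a minimal sketch of how a simulated phishing campaign might be scored. The field names and sample data are hypothetical, not any vendor's actual reporting schema: the click rate tracks susceptibility, while the report rate tracks awareness.

```python
# Hypothetical phishing-simulation results; user names and field names
# are invented for illustration, not a real product's schema.
results = [
    {"user": "alice", "clicked": False, "reported": True},
    {"user": "bob",   "clicked": True,  "reported": False},
    {"user": "carol", "clicked": False, "reported": False},
    {"user": "dave",  "clicked": True,  "reported": True},
]

def campaign_metrics(results):
    n = len(results)
    clicked = sum(r["clicked"] for r in results)
    reported = sum(r["reported"] for r in results)
    # A falling click rate and a rising report rate over successive
    # campaigns would indicate the training is taking hold.
    return {"click_rate": clicked / n, "report_rate": reported / n}
```

Tracking these two numbers across repeated campaigns is one straightforward way to measure whether a security-first mentality is actually developing.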

Social-Engineer provides program expertise in the areas of vishing, phishing, and risk assessments. These fully managed programs are highly customizable and designed to achieve client-based outcomes. For more information or a quote, visit their website at www.social-engineer.com.

Sources:
https://www.csoonline.com/article/3645091/cybersecurity-spending-trends-for-2022-investing-in-the-future.html
https://www.idtheftcenter.org/post/identity-theft-resource-center-to-share-latest-data-breach-analysis-with-u-s-senate-commerce-committee-number-of-data-breaches-in-2021-surpasses-all-of-2020/
https://newsroom.ibm.com/2021-07-28-IBM-Report-Cost-of-a-Data-Breach-Hits-Record-High-During-Pandemic
https://www.tessian.com/research/the-psychology-of-human-error/
https://www.verywellmind.com/what-is-a-cognitive-bias-2794963
https://effectiviology.com/bias-blind-spot/#Examples_of_the_bias_blind_spot
https://www.ey.com/en_ca/cybersecurity/your-employees-are-the-weakest-link-in-your-cybersecurity-chain

