Information Overload - Know These 12 Cognitive Biases


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own “subjective reality” from their perception of the input. An individual’s construction of reality, not the objective input, may dictate their behavior in the world.

Four problems that give rise to biases are:

  • Information Overload
  • Lack Of Meaning
  • The Need To Act Fast
  • How To Know What Needs To Be Remembered For Later

In this article, we will look into the first type of Cognitive Biases i.e., “Too Much Information” or “Information Overload”. This type of Cognitive bias is often a result of your brain’s attempt to simplify information processing — we receive roughly 11 million bits of information per second, but we can only process about 40 bits of information per second.
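
As a back-of-the-envelope check on those figures (both are popular estimates rather than precise measurements), the gap works out to a compression ratio of roughly 275,000 to 1:

```python
# Rough arithmetic behind the figures above; both numbers are
# popular estimates, not precise measurements.
sensory_input_bps = 11_000_000  # bits per second reaching our senses
conscious_bps = 40              # bits per second we consciously process

ratio = sensory_input_bps // conscious_bps
print(f"roughly 1 in every {ratio:,} bits gets conscious attention")
# → roughly 1 in every 275,000 bits gets conscious attention
```

Whatever the exact numbers, the point stands: the brain must discard almost everything, and biases are the side effects of that filtering.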

Decision-Making Biases That Try To Address The Too-Much-Information Problem

Following are the cognitive biases that happen because individuals are dealing with information overload:

1. We notice things that are already primed in memory or repeated often

This is the simple rule that our brains are more likely to notice things that are related to stuff that’s recently been loaded in memory. Examples of these biases are:

A. Availability Heuristic

The Availability Heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method, or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Subsequently, under the availability heuristic, people tend to heavily weigh their judgments toward more recent information, making new opinions biased toward that latest news. 

Example: Politics is a prime example of the availability heuristic in action. For instance, politicians usually stick to a couple of key areas and nail home their points. Usually, these points will appeal to the masses, whether it’s immigration, healthcare, or schools.

We’ve seen it time and time again. Politicians promise the public they can fix a problem. They get elected and fail to fix it. However, the next candidate comes along and promises they can. They get elected and equally fail to fix the problem – thereby creating a vicious cycle.

What happens is voters will tend to forget about the unfulfilled promises made by the incumbent. Instead, they hear about the promises from the new candidate, which takes prominence.

Effects: The effects of the Availability Heuristic can be seen at the individual or group level:

  • Individual Effects: The availability heuristic can lead to bad decision-making because memories that are easily recalled are frequently insufficient for figuring out how likely things are to happen again in the future. Ultimately, this leaves the decision-maker with low-quality information to form the basis of their decision.
  • Group Effects: Exploring the availability heuristic leads to troubling conclusions across many different academic and professional areas. If each one of us analyzes information in a way that prioritizes memorability and nearness over accuracy, then the model of a rational, logical chooser, which is predominant in economics as well as many other fields, can be flawed at times. The implications of the availability heuristic suggest that many academics, policy-makers, business leaders, and media figures have to revisit their basic assumptions about how people think and act in order to improve the quality and accuracy of their work.

Why Does It Happen?

A heuristic is a ‘rule-of-thumb’, or a mental shortcut, that helps guide our decisions. When we make a decision, the availability heuristic makes our choice easier. However, the availability heuristic challenges our ability to accurately judge the probability of certain events, as our memories may not be realistic models for forecasting future outcomes.

For example, if you were about to board a plane, how would you go about calculating the probability that you would crash? Many different factors could impact the safety of your flight, and trying to calculate them all would be very difficult. Provided you didn’t google the relevant statistics, your brain may do something else to satisfy your curiosity. Many of us do this on an everyday basis.

  • Your brain uses shortcuts: Your brain could use a common mental shortcut by drawing upon the information that most easily comes to mind. Perhaps you had just read a news article about a massive plane crash in a nearby country. The memorable headline, paired with the image of a wrecked plane wreathed in flames, left an easily recalled impression, which causes you to wildly overrate the chance that you’ll be involved in a similar crash. This is the availability heuristic at work.
  • Certain memories are recalled more easily than others: The availability heuristic exists because some memories and facts are retrieved spontaneously, whereas others take effort and reflection to recall. Memories are recalled automatically for two main reasons: they appear to happen often, or they leave a lasting imprint on our minds. Those that appear to happen often generally coincide with other shortcuts we use to comprehend our world.

How To Avoid It?

Now that you’re aware of the availability heuristic, what can you do about it? It’s much easier to make well-informed decisions when you’re aware of your cognitive biases.

Here’s how to overcome the availability heuristic and make more educated decisions.

  • Avoid making impulse decisions or judgments: When you’re about to make a decision on the fly, take a moment to think about it. What’s informing your decision? Where’s your judgment of the situation coming from?
  • Clear out your echo chambers: In order to make more informed decisions, it’s important to seek out information sources that don’t necessarily line up with your personal beliefs.
  • Watch overall trends and patterns: Recent events can skew your perception of reality. But if you take a look at long-term trends and patterns, they will likely tell a different story.
  • Consider overall statistics: If you know several people who are left-handed, it doesn’t mean that the majority of people across the world are also left-handed.
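
The left-handedness point can be turned into a toy calculation (the acquaintances below are invented for illustration; the roughly 10% base rate is a commonly cited population figure):

```python
# Hypothetical sketch: the handful of people who come to mind easily
# versus the population base rate (~10% of people are left-handed).
population_rate = 0.10

recalled_friends = ["Ana", "Ben", "Chloe", "Dev", "Eli"]  # invented names
left_handed = {"Ana", "Chloe", "Dev"}  # 3 of these 5 happen to be left-handed

# Estimate built only from easily recalled examples (the availability heuristic)
availability_estimate = sum(
    friend in left_handed for friend in recalled_friends
) / len(recalled_friends)

print(f"estimate from recalled examples: {availability_estimate:.0%}")  # → 60%
print(f"population base rate:            {population_rate:.0%}")        # → 10%
```

A small, memorable sample can be wildly unrepresentative; checking the overall statistic is what corrects the estimate.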

B. Attentional Bias

The attentional bias involves the tendency to pay attention to some things while simultaneously ignoring others. This impacts not only the things that we perceive in the environment but the decisions that we make based upon our perceptions.

Example: When hungry, you may find yourself inordinately distracted by food-related words or images, and you may have a hard time thinking of anything other than food. Likewise, individuals with anxiety have a strong attentional bias towards threatening or emotionally negative information (e.g., images of violence, death). In other words, they find such information incredibly distracting and might have a hard time switching their attention away from threatening or negative imagery, in part due to deficits in executive functioning.


Effects: The effects of Attentional Bias can be:

  • Individual effects: Our attention is a finite resource – there are limits to how much we can attend to at any given time. In order to make rational decisions, ideally, we would want to consider all of our options and examine them each in turn. When attentional bias shows up, however, we end up directing a much larger share of our focus toward a single option or stimulus, and this comes at the expense of others. It can also make it more difficult for us to let go of distracting or unhelpful thoughts, causing us instead to fixate on (and overthink) certain things.
  • Systemic effects: Attentional bias carries implications for many institutions. One important example pertains to law enforcement. One study demonstrated that police officers who were experiencing high levels of anxiety were more likely to shoot at suspects during a training exercise, suggesting that anxiety biased the officers to narrowly focus on threat-related information. Attentional bias is also highly relevant to racial profiling and prejudice in policing.

Why Does It Happen?

Attentional bias is just a consequence of our limited cognitive abilities as humans. Since we have a finite capacity for attention, we can only focus on a small number of things at a time, as much as we try to convince ourselves otherwise. There are various evolutionary and cognitive explanations for why certain things consistently bias our attention.

  • Biased attention carries evolutionary advantages: Some experts believe that this tendency might have an evolutionary basis. In order to ensure survival, our ancestors were more likely to survive if they paid greater attention to risky things in the environment and ignored things that did not pose a threat. Importantly, attentional biases that proved to be an advantage in the ancient past may not be advantageous today. Our environment has changed profoundly: for most people, food is available in abundance, and we no longer have to worry about guarding the village from sabertooth tigers. But our brains retain the hardwiring that benefited our ancestors, even if it is no longer appropriate.
  • We attend to information consistent with our schemas: Our brains rely on many shortcuts and rules of thumb to speed up processing and help us navigate the world. Schemas, or frameworks that help us organize and sort information, are one type of shortcut. We have schemas for virtually everything we encounter in our day-to-day life, from people we meet to situations we encounter. As an example, your schema for your friend might include information such as “tall,” “plays hockey,” and “hates spicy food.” The majority of the time, schemas are useful tools that our brains use to sort through the massive amount of information it must process every day. However, they can also facilitate attentional bias: people are more likely to attend to information that matches up with their existing schemas, and to ignore information that does not. 

How To Avoid It?

It is difficult to completely avoid attentional bias. Often, the influence of this type of bias on our thinking is at such a deep, automatic level that we are not aware it is happening.

  • Feedback and practice: In some cases, it appears that it is possible to reduce the effects of attentional biases through training. For instance, depressed participants can be trained to focus more on positive stimuli. However, in this context, study participants were not merely practicing on their own; instead, they were receiving feedback from the researchers that reinforced focus on positive stimuli and discouraged focus on the negative. To apply this in the real world, if there is a specific type of attentional bias one is looking to avoid, it might help to enlist a friend or family member who can point out moments you fall into biased thinking, and offer reminders to zoom out.
  • Plan around bias pitfalls: For some types of attentional bias, it is often possible to plan in a way that minimizes the risk of that bias arising. Scheduling your food shopping for a time when you are not likely to be hungry—after dinner, for example—will likely reduce attentional bias for unhealthy items, making it easier to avoid them.
  • Try some mindfulness exercises: In recent years, mindfulness meditation has often been prescribed as a tool to boost attention and improve productivity. As much as it has become a buzzword, there is actually empirical evidence to support the effectiveness of mindfulness practice—including as a tool to reduce attentional bias.

C. Illusory Truth Effect

The illusory truth effect, also known as the illusion of truth, describes how, when we hear the same false information repeated again and again, we often come to believe it is true. Troublingly, this even happens when people should know better—that is, when people initially know that the misinformation is false.

Example: In the wake of the COVID-19 pandemic, the search for effective treatments and preventative measures has been at the front of everyone’s mind, politicians and citizens alike. Given the considerable political benefits that would come with finding a cure, it’s not surprising that elected officials have been especially keen to talk up promising new drugs. But the strategy adopted by Donald Trump and his campaign—to repeatedly tout the benefits of a specific drug, hydroxychloroquine, before they were clinically proven—looks like an attempt to capitalize on the illusory truth effect. For months, Trump sang the praises of hydroxychloroquine, prompting tens of thousands of patients to request prescriptions from their doctors. Even now, with clinical trials showing that the drug is not effective to treat COVID-19, the belief that it works is still widespread.

Effects: The effects of the Illusory Truth Effect can be:

  • Individual effects: We all like to think of ourselves as being impervious to misinformation, but even the most well-informed individuals are still prone to the illusory truth effect. We may be skeptical of a false claim the first time it floats through our social media, but the more we are exposed to it, the more we start to feel like it’s true—and our pre-existing knowledge can’t prevent this.
  • Systemic effects: In the age of social media, it’s incredibly easy for misinformation to spread quickly to huge numbers of people. The evidence suggests that global politics have already been strongly influenced by online propaganda campaigns, run by bad actors who understand that all they need to do to help a lie gain traction is to repeat it again and again. While it may sound overly dramatic, this is a threat to the integrity of democracy itself, and to the cohesion of our societies. Now more than ever, it is important to be aware of the fact that the way we assess the accuracy of information is biased.

Why Does It Happen?

To conserve our limited mental energy, we rely on countless shortcuts, known as heuristics, to make sense of the world, and this can often lead us to make errors in our judgment. There are a few fundamental heuristics and biases that underlie the illusory truth effect.

  • We are often cognitively lazy:  There are two thinking systems in our brains. System 1 is fast and automatic, working without our awareness; meanwhile, System 2 handles deeper, more effortful processing, and is under our conscious control. System 2, since it’s doing the harder work, drains more of our cognitive resources; it’s effortful and straining to engage, which we don’t like. So, wherever possible, we prefer to rely on System 1 (even if we don’t realize that’s what we’re doing). This preference for easy processing (also known as processing fluency) is more deeply rooted than many of us realize. 
  • Familiarity makes processing easy: What does processing fluency have to do with the illusory truth effect? The answer lies with familiarity. When we’re repeatedly exposed to the same information—even if it’s meaningless, or if we aren’t consciously aware that we’ve seen it before—it gradually becomes easier for us to process. And as we’ve seen, the less effort we have to expend to process something, the more positively we feel about that thing. This gives rise to the mere exposure effect, which describes how people feel more positively about things they’ve encountered before, even very briefly.

How To Avoid It?

The illusory truth effect is tricky to avoid. Because it is driven by System 1, our unconscious and automatic processing system, we usually don’t realize when we’ve fallen prey to it.

Critical thinking is not a foolproof solution to this problem, but it’s the best thing you can do to avoid falling for the illusory truth effect. With such massive amounts of information filtering past our eyeballs every day, it’s easy to let suspicious claims slide and just move on to the next tweet, or the next status update. But by neglecting to think critically the first time we encounter a false statement, we make ourselves more susceptible to the illusory truth effect.

D. Mere Exposure Effect

The mere exposure effect describes our tendency to develop preferences for things simply because we are familiar with them. For this reason, it is also known as the familiarity principle.

Example: A mere exposure effect example is when you hear a song on the radio for the first time, and you hate it. But then after you have heard it many times, you begin to like it. Because you become increasingly aware of the tune, lyrics, etc., you begin to believe you are fond of the song, despite your initial aversion. After a while, you might even prefer the song to others you liked initially.


Effects: The effects of the Mere Exposure Effect can be:

  • Individual effects: The mere exposure effect can result in suboptimal decision-making. Good decisions are made by evaluating all possible courses of action based on their effectiveness, not their familiarity. When deciding between alternatives, we shouldn’t be choosing the familiar option; we should be choosing the best option, because sometimes the best option is not the most familiar one. Sometimes the most effective course of action is the one that is unfamiliar to us. Moreover, sticking with what we know limits our exposure to new things, ideas, and viewpoints.
  • Systemic effects: When this effect is expanded in a societal or institutional setting, the consequences can be more serious. A company that favors its current business model simply because management has grown familiar with it may miss out on necessary organizational and technological changes that require venturing into uncharted waters. This cognitive bias may also help create social norms and reinforce social stereotypes. 

Why Does It Happen?

There are two main reasons why we experience the mere exposure effect:

  • It reduces uncertainty: We are less uncertain about something when we are familiar with it. We are programmed by evolution to be careful around new things because they could pose a danger to us. As we see something repeatedly without noticing bad consequences, we are led to believe it is safe. 
  • It makes understanding and interpreting easier:  In what’s known as “perceptual fluency,” we are better able to understand and interpret things we have already seen before. 

How To Avoid It?

Recognize diversity and new experience: While it is inevitable that we develop an attachment to things we often see, the mere exposure effect may eventually counteract itself. Research has shown that sustained repeated exposure may limit and then detract from our attraction to a stimulus as it loses novelty. We can start to avoid something if we are exposed to it too much. A more proactive strategy might be to recognize the value of diversity and new experiences. By seeking out unfamiliar and different experiences, we might limit how often we are exposed to any one stimulus.

2. Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things

Our brains tend to boost the importance of things that are unusual or surprising. Alternatively, we tend to skip over information that we think is ordinary or expected. Examples of these biases are:

A. Von Restorff Effect

The Von Restorff effect, also known as the “Isolation Effect“, predicts that when multiple homogeneous stimuli are presented, the stimulus that differs from the rest is more likely to be remembered. The theory was coined by German psychiatrist and pediatrician Hedwig von Restorff (1906–1962), who, in her 1933 study, found that when participants were presented with a list of categorically similar items containing one distinctive, isolated item, memory for that isolated item was improved.

Example: Bolded text, italic text, and text in different colors and fonts stand out. If certain messages need to reach customers, the best way to make that happen is to recruit the Von Restorff Effect and isolate those messages away from the rest of the text. 


Why Does It Happen?

There are different theories proposed to explain the increased performance of isolated items. 

  • The total-time hypothesis suggests that isolated items are rehearsed for a longer time in working memory compared to non-isolated items. 
  • Another approach offers that subjects could consider the isolated items to be in their own special category in a free-recall task, making them easier to recollect. 
  • A separate explanation is based upon the analysis of the deep processing of similarities and differences among the items. 
  • Debate surrounds whether perceptual salience and differential attention are necessary to produce this effect. 
  • Modern theory holds that the contextual incongruity of the isolate is what leads to the differential attention to this item. 
  • Empirical data has shown a strong relationship between the von Restorff effect and measures of event-related potential in the brain. Specifically, evidence has shown that exposure to novel or isolated items on a list for free recall generates an ERP with a larger amplitude and this amplitude in turn predicts a higher likelihood of future recall and faster recognition of the items.

B. Negativity Bias

The negativity bias, also known as the negativity effect, is the notion that even when of equal intensity, things of a more negative nature (e.g. unpleasant thoughts, emotions, or social interactions; harmful/traumatic events) have a greater effect on one’s psychological state and processes than neutral or positive things. In other words, something very positive will generally have less of an impact on a person’s behavior and cognition than something equally emotional but negative.

Example: You argued with your significant other, and afterward, you find yourself focusing on all of your partner’s flaws. Instead of acknowledging their good points, you ruminate over all of their imperfections. Even the most trivial of faults are amplified, while positive characteristics are overlooked. 

Effects: While we may no longer need to be on constant high alert as our early ancestors were, the negativity bias still has a starring role in how our brains operate. Research has shown that this bias can have a wide variety of effects on how people think, respond, and feel.

Some of the everyday areas where you might feel the results of this bias include your relationships, decision-making, and the way you perceive people.

Why Does It Happen?

Our tendency to pay more attention to bad things and overlook good things is likely a result of evolution. Earlier in human history, paying attention to bad, dangerous, and negative threats in the world was a matter of life and death. Those who were more attuned to danger and who paid more attention to the bad things around them were more likely to survive.

This meant they were also more likely to hand down the genes that made them more attentive to danger.

How To Avoid It?

These are some of the ways negativity bias can be avoided:

  • Be poised to gently recognize what is happening when negative patterns start to get activated and practice doing something each and every time—even something very small—to break the pattern. If you are inclined to overanalyze parts of conversations that you assume are negative, figure out a hobby or habit that keeps you from overanalyzing, like reading, going for a run, cleaning your house up, or creating a music playlist that makes you feel happy.
  • Notice your negative self-dialogue and substitute positive approaches. “You idiot!” becomes, “I wish I had made a different choice, but I will remember how I wish I had acted and apply it to future situations.”
  • Another tactic that might feel strange at first, but can help you approach your mean inner voice with kindness, is talking to yourself as you would to a friend. When negative thoughts intrude, ask yourself, “Are you ok? What’s wrong? Why are you so angry? Are you feeling hurt?” The idea is to good-naturedly interrupt yourself whenever you start to trash-talk yourself. It’s kind of like The Golden Rule: “Do unto others as you would have them do unto you,” except it involves treating yourself with the same kindness and compassion that you show the people you love.
  • Perhaps most important is to cultivate a gentle, curious, and patient attitude with yourself. Learn to celebrate small victories over negativity and self-recrimination while understanding that you may have days of backsliding. It’s all a natural part of the learning and growth process.

C. Publication Bias

Publication bias is a type of bias that occurs in published academic research. It occurs when the outcome of an experiment or research study influences the decision whether to publish or otherwise distribute it. Publishing only results that show a significant finding disturbs the balance of findings and inserts bias in favor of positive results. The study of publication bias is an important topic in metascience.

Example: One commonly cited instance of publication bias is the refusal by The Journal of Personality and Social Psychology (the original publisher of Bem’s article) to publish attempted replications of Bem’s work that claimed evidence for precognition.

Effects: Scientific journals are much more likely to accept for publication a study that reports positive findings than a study with negative findings. Such behavior creates false impressions in the literature and may cause long-term consequences for the entire scientific community. Also, if negative results were not so difficult to publish, other scientists would not unnecessarily waste their time and financial resources re-running the same experiments.
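
The distortion this causes can be seen in a small Monte Carlo sketch (all numbers here are invented for illustration): simulate many studies of a small true effect, “publish” only the statistically significant ones, and compare the published average with the truth.

```python
import random
import statistics

random.seed(42)

def run_study(true_effect, n=30):
    """Simulate one study of n observations (unit variance) and test the
    sample mean against zero with a two-sided z-test at p < 0.05."""
    sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.fmean(sample)
    z = mean / (1.0 / n ** 0.5)  # standard error of the mean is 1/sqrt(n)
    return mean, abs(z) > 1.96

all_effects, published = [], []
for _ in range(10_000):
    effect, significant = run_study(true_effect=0.1)
    all_effects.append(effect)
    if significant:  # journals accept only "significant" findings
        published.append(effect)

print("true effect:              0.10")
print(f"average over all studies: {statistics.fmean(all_effects):+.2f}")
print(f"average over published:   {statistics.fmean(published):+.2f}")
```

In this setup the average over all studies sits close to the true effect, while the significance-filtered “literature” overstates it severalfold, which is exactly the false impression described above.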

Why Does It Happen?

The following factors make a paper with a positive result more likely to enter the literature, while negative-result papers are suppressed:

  • The studies conducted in a field have small sample sizes.
  • The effect sizes in a field tend to be smaller.
  • There is both a greater number and lesser preselection of tested relationships.
  • There is greater flexibility in designs, definitions, outcomes, and analytical modes.
  • There are prejudices (financial interest, political, or otherwise).
  • The scientific field is hot and there are more scientific teams pursuing publication.

How To Avoid It?

Journals may reduce publication bias by publishing high-quality studies regardless of novelty or unexciting results, and by publishing protocols or full-study data sets. No single step can fully overcome the complex dynamics involved in publication bias; a multipronged approach is required from researchers, patients, journal editors, peer reviewers, research sponsors, research ethics committees, and regulatory and legislative authorities.

D. Omission Bias

Omission bias is the tendency to favor an act of omission (inaction) over one of commission (action). It can occur due to several processes, including psychological inertia, the perception of transaction costs, and a tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions). It is controversial as to whether omission bias is a cognitive bias or is often rational.

Example: Some parents choose not to have their children vaccinated against pertussis (also known as ‘whooping cough’) because of “fears that reaction to the vaccine itself may lead to death or serious injury”. Medical data shows these risks to be negligible. In 1970s Britain, a decline in pertussis vaccinations resulted in a major increase in cases and pertussis-related deaths. Researchers have therefore used the real-life example of the pertussis vaccine to examine these decisions with historical relevance.


Effects: Generally, most people want to do good and avoid causing harm in their everyday lives. We like to feel altruistic and compassionate. Although there is often a gray area, we try to listen to our internal barometer of morality and act accordingly. Yet, sometimes the moral judgments we make are grounded in biased thinking. The omission bias causes us to view actions as worse than omissions (cases where someone fails to take action) in situations where they both have adverse consequences and similar intentions.

Why Does It Happen?

There are frequent situations in which actions are more harmful than omissions. In those cases, our judgment is unbiased and our moral compass points in the right direction. So what offsets our moral compasses and why?

  • We overgeneralize: There are many cases where our judgment that actions are worse than inactions is correct. This becomes a heuristic, or a cognitive ‘short-cut’, that we use to assess the morality of others and guide our own actions. It is when we are confronted with scenarios in which the outcome and the intent of harmful actions and inactions are the same, but we continue to treat them differently, that this heuristic becomes overgeneralized and detrimental. Overgeneralizing a heuristic can be likened to the “inappropriate transfer of mathematical rules”, like using the Pythagorean theorem to determine the length of a rectangle.
  • We are averse to loss: Another explanation for the omission bias is that we weight losses more than gains of the same amount, otherwise known as loss aversion. If we fail to act and it results in a bad outcome, we can think of it as a missed opportunity for gain. If we act, and it results in a bad outcome, we think of this as a loss.

How To Avoid It?

Sometimes this overgeneralization occurs because we don’t even realize that we are using a heuristic to assess morality. This prevents us from thinking critically about the situations in which it may be incorrectly applied and results in biased thinking. So, a good place to start is reflecting on how we revere omissions over actions in our everyday lives. Think about the cases where this heuristic is grounded and think about the cases where it might not fit. Moving forward, we can try and think about the consequences of our inactions, rather than thinking of our inactions as inconsequential.

3. We notice when something has changed

We generally tend to weigh the significance of a new value by the direction of the change (positive or negative) rather than re-evaluating the new value as if it had been presented alone. This also applies when we compare two similar things. Examples of these biases are:

A. Contrast Effect

A contrast effect is an enhancement or diminishment, relative to normal, of perception, cognition, or related performance as a result of successive (immediately previous) or simultaneous exposure to a stimulus of lesser or greater value in the same dimension. (Here, normal perception, cognition, or performance is that which would be obtained in the absence of the comparison stimulus—i.e., one based on all previous experience.)

There are two types of contrast effect:

  • Positive Contrast Effect: something is viewed as better than it would usually be when being compared to things that are worse. 
  • Negative Contrast Effect: something is viewed as being worse than it would usually be when compared to something better.

Effects: This type of bias reveals a problem: someone always ends up at the bottom when employees are compared to each other instead of measured against a company standard. The problem is usually not the employee but the standard set by the manager. The impacts of this bias include:

  • Losing Talent. Employees performing at an acceptable standard are being told they are not. This can result in the employee feeling undervalued and leaving the organization.
  • Eliminating Teamwork or a Collaborative Culture. When the team learns their manager is comparing them against each other, it fosters a negative workplace culture by pitting employees against one another. In addition, this may increase interpersonal conflicts, which waste company time and reduce the productivity needed for business success.
  • Introducing Flawed Data. Contrast effect bias can create a false impression that more people need to be hired or the current workforce is not skilled enough to meet company goals.

How To Avoid It?

The contrast effect can be minimized by following these steps:

  • Increase the distance between the options. Increasing the distance between the entities that you’re evaluating, in terms of factors such as time and space, can reduce the degree to which you experience a contrast effect between them.
  • Add more options. Throwing a variety of additional options into the mix can sometimes reduce the degree to which you notice the contrast between the initial options that you were presented with, because it makes it more difficult to compare them.
  • Explain why the comparison is irrelevant. Explaining to yourself why the comparison that you’re presented with is irrelevant, for example by focusing on the absolute price of a product rather than on its relative price, can help reduce the likelihood that you will experience the contrast effect.

B. Focusing Effect

The focusing effect is a cognitive bias that causes us to attribute too much weight to events of the past and translate them into future expectations. Knowing how to handle the focusing effect can improve our decision-making process not just in our designs but throughout life.

Example: You give your boss a gift on their birthday; an hour later, they call you into their office and promote you. Connecting the gift to the promotion is a poor inference: bosses generally don’t promote people for a present.


Effects: In the commercial arena, the focusing effect is often exploited as a selling technique. People are more likely to accept the merit of a product when that merit rests on only a few well-chosen factors, so marketing strategies can be developed accordingly. Consumers look for products they believe will better their lives in some way, and are receptive to being told about the positive, glamorized aspects of a product, event, or service. Focusing on only a few key components of the product being sold, concentrating on its most widely recognizable or most distinctive features, is an effective way of turning the focusing effect into a marketing tool.

How To Avoid It?

It can be difficult to recognize when we’re falling victim to the focusing effect, but it’s most likely to occur when you’re feeling particularly obstinate about a course of action.

The easiest way to handle it is to start asking yourself questions. “What happened last time I did that?” “Is it reasonable to assume that that single action caused that outcome?” “If not, what else might have contributed to that outcome?” “How many other factors were in play?”

Once you can put the action into perspective alongside a series of other actions, you can likely develop a sense of perspective overall. That doesn’t mean you have to repeat the action, but you may be able to answer the question, “What’s the worst that can happen?”, and find that the potential consequences aren’t as dramatic as you first thought.

C. Framing Effect

The Framing Effect can be described as a cognitive bias wherein an individual’s choice from a set of options is influenced more by the presentation than the substance of the pertinent information. Here people decide on options based on whether the options are presented with positive or negative connotations; e.g. as a loss or as a gain. People tend to avoid risks when a positive frame is presented but seek risks when a negative frame is presented. Gain and loss are defined in the scenario as descriptions of outcomes (e.g., lives lost or saved, disease patients treated and not treated, etc.).

Example: At a shopping centre, customers are offered two yogurt pots. One says “10 percent fat” and the other says “90 percent fat-free”. The framing effect leads us to pick the second option, because it seems healthier, even though both labels describe exactly the same fat content.
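The two labels in this example are arithmetically equivalent, which a short sketch makes explicit (the variable names here are ours, for illustration only):

```python
# Two frames for the same yogurt pot, which is 10% fat by content.
fat_fraction = 0.10

negative_frame = f"{fat_fraction:.0%} fat"           # "10% fat"
positive_frame = f"{1 - fat_fraction:.0%} fat-free"  # "90% fat-free"

# Same product, same number; only the frame differs.
print(negative_frame, "describes the same pot as", positive_frame)
```

Neither frame carries more information than the other; the difference lies entirely in whether the loss (fat) or the gain (fat-free content) is emphasized.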


Effects: The effects of the Framing Effect can be

  • Individual effects: Decisions based on the framing effect are made by focusing on the way the information is presented instead of the information itself. Such decisions may be sub-optimal, as poor information or lesser options can be framed in a positive light. This may make them more attractive than options or information that are objectively better but cast in a less favourable light.
  • Systemic effects: The framing effect can have considerable influence on public opinion. Public affairs and other events that draw attention from the public can be interpreted very differently based on how they are framed. Sometimes, issues or positions that benefit the majority of people can be seen unfavorably because of negative framing. Likewise, policy stances and behaviour that do not further the public good may become popular because their positive attributes are effectively emphasized.

Why Does It Happen?

Our choices are influenced by the way options are framed through different wordings, reference points, and emphasis. The most common framing draws attention to either the positive gain or negative loss associated with an option. We are susceptible to this sort of framing because we tend to avoid loss.

  • We avoid loss: According to one theory, a loss is perceived as more significant, and therefore more worthy of avoiding, than an equivalent gain. A sure gain is preferred to a probable one, and a probable loss is preferred to a sure loss. Because we want to avoid sure losses, we look for options and information with certain gain. The way something is framed can influence our certainty that it will bring either gain or loss. This is why we find it attractive when the positive features of an option are highlighted instead of the negative ones.
  • Our brain uses shortcuts: Processing and evaluating information takes time and energy. To make this process more efficient, our mind often uses shortcuts, or “heuristics.” The availability and affect heuristics may contribute to the framing effect. The availability heuristic is our tendency to use information that comes to mind quickly and easily when making decisions about the future. 
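The loss-aversion point above can be made concrete with a small numerical sketch. The value function and parameters (α ≈ 0.88, λ ≈ 2.25) are the commonly cited estimates from Kahneman and Tversky’s prospect theory, which this article does not quote directly, so treat the exact numbers as illustrative:

```python
# Illustrative prospect-theory-style value function: losses loom
# larger than equivalent gains. Parameters are commonly cited
# estimates, used here only to show the asymmetry.

def subjective_value(x, alpha=0.88, loss_aversion=2.25):
    """Perceived value of a monetary gain (x >= 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** alpha)

gain = subjective_value(100)    # felt value of winning $100
loss = subjective_value(-100)   # felt value of losing $100
print(gain, loss)               # the loss feels over twice as big
```

Because losing $100 “hurts” more than winning $100 pleases, a frame that presents the same outcome as avoiding a loss is more persuasive than one presenting it as securing a gain.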

How To Avoid It?

There are a few strategies for reducing the framing effect. Research has shown that people who are more “involved” in an issue are less likely to suffer from framing effects surrounding it. Involvement can be thought of as how invested you are in an issue. 

  • We should think through our choices concerning an issue and try to become more informed on it. 
  • A more specific strategy that falls in line with this more general approach is to provide rationales for our choices. If we really think through why we selected an option or relied on certain information, we might realize that the way in which it was presented influenced our decision too much.

D. Distinction Bias

Distinction bias describes how, in decision-making, we tend to overvalue the differences between two options when we examine them together. Conversely, we consider these differences to be less important when we evaluate the options separately.

Example: When asked if they would like an apple, a person may say “Yes”. An apple is placed before them; they eat it and are happy. But suppose two apples were placed on the table: the one they would have happily eaten, and another that looks slightly fresher. The person will choose the fresher apple, eat it, and be happy, but if asked, “would you have enjoyed eating that other apple?”, they would likely say “No”, even though in the alternate, no-choice scenario they were perfectly happy with that apple.

Effects: The effects of Distinction Bias can be:

  • Individual effects: When we directly compare our options, we become hypersensitive to any differences that exist between them. This makes the differences seem glaringly obvious and causes us to view them as more important than they actually are. 
  • Systemic effects: The negative implications of distinction bias at the individual level can add up to challenges on a larger scale. Take, for example, agents, who make decisions on behalf of other people, referred to as principals. Agents include people like parents, lawyers, and policymakers, while their respective principals include children, clients, and constituents. It has been found that when agents engage in joint evaluation, they tend to select options with outcomes that principals rate as less favorable. 

Why Does It Happen?

The theory behind distinction bias was developed quite recently, so not much research has been conducted on the matter. Part of what gives rise to this bias is the disconnect between our prediction of which option will lead to the most favorable outcome and our experience of choosing that option. Furthermore, this bias is driven by the fact that, when we directly compare two options, we tend to focus on specific details, instead of judging each option holistically. 

How To Avoid It?

The key to avoiding distinction bias is to stop comparing options side-by-side. Joint evaluation, or examining options simultaneously, causes us to view our options as more dissimilar. Separate evaluation, or examining each option on its own, allows us to view each option as its independent unit. By eliminating this side-by-side comparison, we’re less likely to overvalue the differences between our options. 

Furthermore, separate evaluation allows us to form a holistic opinion of each option. Recognizing the pros and cons of our options on their own allows us to make a decision that is in line with our goals and values and respects any constraints we may be subject to.
