When Negativity Bias does not apply
The fight or flight response is an energy-consuming bodily state. So, dismissing a casual scream before the body spends its resources on fight or flight makes sense when it comes to “instincts.” However, the picture changes when we are sitting at home on our screens. We may not have to go into fight or flight as often, so a bias toward negative information may not be biologically costly. This is still conjecture; future research will tell us more.
What's wrong with psychology research?
The analogy opens researchers to a powerful bias–the Faulty Generalization Heuristic. It occurs when people try to simplify a complex world by incorrectly grouping behavioral patterns and people together, leading to bad decisions, bad science, and misplaced effort to improve mental health. Medical analogs for mental health may not always be helpful.
Traditional techniques like factor analysis and modern clustering algorithms help humans reduce biases in categorization, but if the resulting categories don’t make much sense, we may ignore them. The lack of usefulness of personality tests like the famous HEXACO, OCEAN, NEO-FFI, and MBTI is well explained by faulty generalizations that don’t account for specific behaviors and individual differences.
The pandemic mental health dilemma
People develop Cognitive Dissonance when it comes to exercise; dissonance occurs when two simultaneous, conflicting thoughts cause anxiety. When it comes to mental health or exercise, people hold two views – “I should do something” AND “I can’t do anything.” They reach an approach-approach conflict: they want to approach good mental health AND approach physical activity. But the conflict is simultaneously an avoidance-avoidance conflict; they want to avoid the stress of working on mental health and the burden of making lifestyle changes.
The conflict creates additional anxiety and stress that prevents them from acting. They fall back on Status Quo Bias, where they continue not to exercise if they weren’t exercising and continue thinking about improving mental health if that’s what they were doing. These issues pile up and amplify the cognitive dissonance and continue perpetuating the cycle of inaction.
How your gut can be a reliable tool for decision-making
We often let our emotions guide decisions instead of logic or rationality. That is the Affect Heuristic. But we also incorrectly assume that emotional decisions must be bad and try to override them with a wide range of seemingly logical but faulty thought processes. Sometimes this creates a decision-making conflict (cognitive dissonance), or dissatisfaction with decisions, because emotions and logic lead to different choices.
Turns out, thinking with your heart and using “gutshots” can be reliable ways to maintain consistency over time. This has strong implications. Consistency is a highly valued skill, and those who use the Affect Heuristic based on positive emotions may be the most consistent decision-makers. It explains why an expert’s instinctive approval is reliable.
Why announcing half-baked beliefs in public can be bad for us
If people endorse certain beliefs or make commitments in public, they may find it more difficult to change their point of view or go back on their word. This is the manifestation of commitment bias, which keeps us motivated to uphold commitments we make in public. This bias explains why people stick to their ideas and choices against their will, even when they may no longer be their preferred ideas or choices. A likely reason people do this is to show that their thoughts have conviction. Sometimes, people may even feel that changing their thoughts in public might make them look uninformed or insincere, which could cost them their social capital.
Commitment Bias can be dangerous if one irresponsibly makes remarks in speeches or on the internet. It is easier to change one’s opinion when no one knows about it than to publicly admit one was wrong.
How Confirmation Bias has an evil twin, and how they collaborate.
We also have a bias against disconfirming evidence (BADE), which is the opposite of confirmation bias. It describes how people are reluctant to change their point of view when presented with disconfirming evidence. It explains why people continue to commit to their beliefs by rationalizing in other ways, upholding their beliefs by dismissing contradicting evidence. It is one of the key biases that make people resistant to change. In a study conducted on the German population, researchers found that those who believe in COVID conspiracies exhibit a bias against disconfirming evidence.
Why a long list is better than a top 10 list
Chances are people will remember 5 out of 10 items from a 10-item list, but they might remember 15 out of a 50-item list. If a list’s goal is to educate a reader with maximum to-dos or tips, longer lists can outperform shorter lists by a big margin. Using the list length effect may be a reliable way to educate readers better. This effect can be amplified with a process called chunking, where grouping similar items together makes them easier to remember.
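The arithmetic behind this claim can be made explicit. A minimal sketch – the recall rates (50% for the short list, 30% for the long one) are illustrative assumptions, not empirical values:

```python
# List length effect, in numbers. Readers retain a smaller FRACTION
# of a long list, but a larger absolute NUMBER of items.
def expected_recall(list_length: int, recall_rate: float) -> float:
    """Expected number of items a reader retains from a list."""
    return list_length * recall_rate

short_list = expected_recall(10, 0.50)  # about 5 items retained
long_list = expected_recall(50, 0.30)   # about 15 items retained
print(short_list, long_list)
```

Even though the long list’s per-item recall rate is lower, the absolute takeaway triples – which is the whole argument for longer lists.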
How we decide to take risks after getting infected or vaccinated.
Perceived immunity becomes a decision criterion to engage in risky behaviors that may lead to problems like infecting others, catching different diseases, splurging money, and motivating others to take risks. Perceived immunity creates a spectrum of decision criteria, ranging from risk-averse behavior to risk-taking behavior. Wearing a mask and using disinfectants can only partly counter the other risk-taking behaviors.
Deception tactics in poor science reporting
Such seemingly superficial evidence-based statements are often used to make others perceive opinions as facts. This explains why reporters use them to push propaganda or loosely justify some other point they are making. Here, words like “Harvard” or “decades of research” become a weak substitute for logical or evidence-based arguments. They act as a heuristic for the reader or listener to believe the reporter’s scientific claim without going through the science. Since everyday conversation is mostly understood through context, such fact-less, shortened statements become a tool for deception, benign or malicious.
Why do humans like to outsource decision-making?
We may not realize how often we trust decisions that have been, at least in some part, already made for us before we even begin evaluating. We don’t have to choose from a huge list of TV shows; we just have to scroll through the algorithm’s few options. The success of recommendation systems is a testament to how much we like outsourcing our decision-making, and two heuristics explain why we like this outsourcing system:
Attribute Substitution: Instead of basing decisions on a complex process of reading reviews and descriptions, we base our decision on something far simpler – a friend’s recommendation or an algorithm that limits our options and simplifies the task of sifting through too many choices.
Authority Heuristic: We tend to blindly follow those in positions of authority because it is easier, and we rationalize it with statements like, “If it is good enough according to the expert, I’ll be ok with it.”
Why comparing two products at a time is better than showcasing one.
Product comparisons are here to stay, if only on third-party sites. Showcasing a single product at a time might no longer be an effective sales strategy because the internet is full of comparisons, and people are drawn to them.
Why one should not focus too much while reading medical reports!
It’s called inattentional blindness, which is our tendency to miss something obviously odd or unexpected when focusing on something specific. For example, if we focus too much on comparing a new phone’s processing power, we might miss an obvious fact of the phone being too heavy or bulky. This cognitive bias explains diagnostic errors that arise while reading detailed medical reports–fixating on certain numbers can blind one to other problematic numbers.
The best way to overcome inattentional blindness is to alternate focus between the big picture and the details. Letting the mind casually wander and paying attention to random things could also help us notice glaring problems in our assessment. Not focusing lets the brain’s general-purpose attention randomly latch on to things that don’t seem to make much sense.
Is our brain truly efficient? And is it a good reference to build better machines?
We’ve evolved biases and heuristics to help us cut through noise and improve the signal-to-noise ratio of the information we process. Heuristics are shortcuts that help us select relevant information or simplify mental processing. They make us efficient and have worked well for millennia. We then “learn” new things that make us specialists, and this is, in turn, reflected in the brain’s biology.
Depending on the problem we are solving with machines and the signal-to-noise ratio in the environment, we must pick a generalist vs. a specialist approach. For that, we can take inspiration from the whole brain or from a specific learning process we acquire. The brain is a good model to inspire machines, but that inspiration works only when we understand the signal-to-noise ratio of the machine’s environment.
What the pandemic is teaching us about biases
This bias can be particularly damaging during a pandemic because public health is a Non-Zero-Sum Game – either everyone wins, or everyone loses. Helping your team members first – whether it’s those who speak your language, share your skin color, or share your professional status – is what binds the team.
The higher the diversity in those who get vaccinated, the more “teams” will get on board. The problem crops up when someone else’s “team” might need attention first. For example, those who run stores and gas stations might not have the time to take leave for a vaccine, and this portion of the population might remain under-vaccinated.
Lesson: Make a conscious effort to change your team’s definition and identity to include more of the world.
How a rich society doesn't always add up to a happy society
People rush to use their resources to hoard limited supplies, and it becomes a Zero-Sum Game where one person’s gain is another’s loss. In times of limited supply, having the power to purchase more than necessary to feel secure and in control can create a stronger disadvantage for someone with less purchasing power.
The Happy People Happy Choices tendency describes how happier people make more optimistic and positive choices than those who are unhappy. Even with high purchasing power, a negative mood can facilitate poorer decisions, higher risks, and impulsive investments – lowering one’s purchasing power in the future.
Research shows that the true root of frequent happiness is holistic well-being, including individual and community well-being. For a society to flourish, it’s necessary that people are happier, the economy is booming, and the environment is stable.
Why skeptics disobey doctors
Skeptics might not like getting recommendations from doctors because they may interpret recommendations as being “told what to do.” This may threaten their sense of control over their health and well-being. To regain that control, they may choose alternative medicine simply because it is their own choice. Since most conventional health recommendations are evidence-based, the likely candidate for regaining control is alternative medicine.
Rule of Consistency: People feel the need to portray a consistent image of themselves and engage in activities that uphold that image. Skeptics may wish to uphold their image as a skeptic because it is a part of their social and individual identity. They may partake in many activities like rebelling against vaccines, rejecting doctor’s recommendations, rejecting prescription medication, etc.
Why mental models don’t always help
Let us explain: a mental model acts as a heuristic. If you can’t evaluate something with logic or experience, a mental model can give you an answer. However, getting an answer doesn’t always mean it’s relevant or useful.
We may use the wrong mental model because the Representativeness Heuristic can highlight the wrong approach to solving problems. The representativeness heuristic makes us look for similarities between what is analyzed and how it is analyzed. If the mental model feels appropriate, we choose it. If compound interest is about growth, can we really apply it to personal growth? Only up to a limit. Skill development has diminishing returns where the amount of effort doesn’t justify the amount of improvement. This is where the model breaks down.
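The breakdown can be sketched numerically. This is a toy contrast, not a model from the text: the 5% growth rate and the saturating skill curve below are illustrative assumptions.

```python
import math

# Toy contrast: compound growth vs. diminishing returns.
def compound(principal: float, rate: float, periods: int) -> float:
    """Compound interest: each period multiplies the running total."""
    return principal * (1 + rate) ** periods

def skill(hours: float, ceiling: float = 100.0, k: float = 0.01) -> float:
    """Saturating skill curve: gains shrink as you approach a ceiling."""
    return ceiling * (1 - math.exp(-k * hours))

# Each decade of compounding adds MORE than the one before it...
early_money = compound(100, 0.05, 10) - 100
late_money = compound(100, 0.05, 20) - compound(100, 0.05, 10)

# ...while each block of practice hours adds LESS than the one before it.
early_skill = skill(100) - skill(0)
late_skill = skill(500) - skill(400)
print(early_money < late_money, early_skill > late_skill)
```

Money compounds without limit; skill saturates. Applying the compound-interest model to skill development works early on and then breaks down, exactly as the representativeness-driven analogy would hide.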
How messiness promotes a clear mind!
The first is personalization of choice, wherein the outcomes we are nudging people toward are personalized. The second is personalization of delivery, which means personalizing which nudge is used based on individual decision-making styles.
Those designing these types of interventions in any capacity should be wary of ethical and privacy concerns to protect individuals. Though it can be difficult to know what data is necessary to build a personalized choice or delivery nudge, choice architects should make concerted efforts to collect and use only data that is (foreseeably) relevant.
How messiness promotes a clear mind!
It seems logical, right? So, what explains this conventional wisdom? The Representativeness Heuristic makes our brain think that a clean desk and a clear mind go better together than a messy desk and a clear mind. We believe that the desk’s characteristics should represent the mind’s characteristics based on clarity or chaos. Effectively, we falsely believe clarity belongs to clarity. Interesting, right?
How we make decisions in the supermarket!
The Availability Heuristic explains this strategy. Memorable brands are easy to remember and remain available in memory to populate the “index” of possible purchases. Past experiences, whether good, bad, or neutral, affect ranking within that index. Companies that want to ensure a particular product enters this index can create more memorable products through attractive designs, interesting shelf placements, and positive emotional tones.
How to counter persuade someone to come out of pseudoscience
Two biases explain how the Latitude of Acceptance (LoA) fails: the Semmelweis Effect and Confirmation Bias. When a “persuader” talks to someone within their LoA, it facilitates Confirmation Bias. If the conversational content is outside the LoA, it enables the Semmelweis Effect, where a person reflexively rejects content that disagrees with their preconceived notions and beliefs.
The trick to persuasion is to avoid reinforcing Confirmation Bias while not evoking the Semmelweis Effect. Evoking either bias reinforces prior beliefs. To counter this, incrementally convert “confirming” information into “emotionally compatible” information and move toward the facts in small steps.
How we can combat biases through observation
Social Projection: A biased belief that we are more like others in attitudes and preferences than we are.
Anchoring: Our evaluations are based on some arbitrary reference point, even if that reference point deceives us.
Representativeness: Believing something is more likely to be true in a situation because that something fits well with the situation.
In fact, observation works better than practicing and reading instructions on how to counter biases. Attentional Bias (we remember or prefer emotional and self-relevant information better) explains the power of observation. Observation allows us to understand others’ decision-making in a context empathetically, and it informs us of others’ mistakes and adjustments that we can implement. Learning via observation feels more real and familiar than learning abstract rules.
Is there any “good” information hidden in the side effects?
Questions like “is the vaccine even working?” are hard to answer because they involve analyzing biochemical changes in the immune system. The Attribute Substitution Heuristic explains an odd phenomenon–knowing that there are observable side effects can indicate that the vaccine is doing its "job." In attribute substitution, people avoid basing their evaluation on a complex attribute (like biochemical changes) and instead substitute it with a simple attribute (observing side effects) for easier/faster evaluations. So, people may think a vaccine is working if they can see or feel the side effects.
Most of us think only of the passive form: indecision and inaction that feeds stress and further inaction. This type of procrastination can be attributed to the Current Momentum Bias, wherein humans choose a pleasurable activity now and push less pleasant activities off to their later selves.
On the other hand, active procrastination involves leveraging time pressure to one's advantage. The best active procrastinators make intentional decisions to delay action, keep deadlines in mind, work best under pressure, and–importantly–believe they produce their best work under pressure.
In a research setting, active procrastination yielded superior results over passive in both self-reported and expert-rated creativity. In the real world, people classified as active procrastinators exhibited better GPAs, higher life satisfaction, more purposeful use of time, and higher self-efficacy.
Evidently, purposeful delay can optimize outcomes for some people – are you one of them?
Are we really that irrational if our irrationality is predictable?
Let’s explain a fascinating part of this–predicting choice reversals.
Choice Reversal, aka the Preference Reversal phenomenon, occurs when we end up choosing something that is the opposite of our original or logical choice.
Here are three common reasons for it:
Framing Effect: If we prefer choice B between A and B, changing B’s wording to a highly negative tone or changing A to a highly positive tone can make us choose A.
Decoy Effect: Between two extreme options, if a third is added between the extremes but closer to one in pricing and offer, we end up preferring the closer extreme.
Social Proof: If we learn our preference is not popular, the strength of popularity for an alternative can dominate and make us choose the popular option at the last moment.
Hope and Optimism Bias
Humans are predisposed to Optimism Bias, where they believe good things are more likely to happen to them even when good and bad outcomes are equally likely. This bias ends up cultivating irrational hope, which is countered by point #4. Other research suggests unhealthy optimism sets people up for disappointment, which, in turn, lowers well-being.
#Bottomline: Learning to be hopeful during uncertainty is a valuable motivation but can also backfire.
How boredom sparks creativity
Boredom may be a blessing in disguise because it promotes creativity. A lack of stimulation activates a brain network (the default mode network) that sparks random connections between different brain areas, which in turn act as a catalyst for unexpected ideas. This is usually when we experience "aha moments," with old ideas getting incubated and reprocessed subconsciously.
When we think about objects and ideas, we typically fall for Functional Fixedness–a notion that each object has one dedicated function (e.g., a bottle is used only for storing liquid). Creativity involves breaking functional fixedness. Boredom explains creativity by re-combining and re-interpreting objects and ideas to create Functional Flexibility. If you have a decision block, try getting bored. Do something that doesn’t make sense and doesn’t stimulate your mind.
Since humans are generally averse to boredom, we, unfortunately, fail to realize our creative potential. Being fully immersed in a task – the opposite of boredom – may raise productivity but probably reduces creativity because high concentration suppresses the default mode network’s activity.
Why AI and marketers cannot predict buying decisions
Base Rate Neglect can explain choosing the wrong approach. 1 in 10,000 is the base rate of conversions without extra effort, and AI can change that to 1.5 in 10,000. That is a 50% relative improvement, but a tiny absolute gain – and the absolute gain shrinks further if the base rate is lower. To make matters worse, the Law of Diminishing Returns may apply, where additional effort yields smaller and smaller gains.
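To make the arithmetic concrete – the conversion rates come from the text above, while the visitor count is a hypothetical assumption for illustration:

```python
# Base rate neglect in conversion numbers.
base_rate = 1 / 10_000      # conversions without extra effort
ai_rate = 1.5 / 10_000      # conversions with AI targeting

relative_lift = (ai_rate - base_rate) / base_rate  # ~0.5, i.e. "50% better"
absolute_lift = ai_rate - base_rate                # only 0.00005

visitors = 100_000  # hypothetical traffic
extra_conversions = visitors * absolute_lift       # roughly 5 extra conversions
print(relative_lift, extra_conversions)
```

A headline “50% improvement” buys only a handful of extra conversions here; with a lower base rate, the same relative lift buys even fewer.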
Should AI understand language?
The Inconsistency Bias explains how we might prefer a human’s text to an AI’s even though either could be written brilliantly. We fail to use the same decision criterion – the quality of the text – in similar contexts, such as “wanting a good text,” because AI deals with language differently.
Psychiatric medicine
Psychological Distance is a cognitive separation between oneself and others, situations, or times and describes how close or far something feels. The mind can feel intimately close, but the brain can feel distant because we often observe our thoughts, not neurons. Higher psychological distance weakens emotional attachments, making the brain a distant unemotional part of the self. This makes it easier to blame distant biochemical problems when compared to emotionally closer thoughts.
Why are pretty people better?
The Halo Effect can be a powerful tool to influence others. When we pass judgment about someone or something based on just one unrelated trait of that person or object, we demonstrate the Halo Effect. It explains why we fuss over actors, celebrity authors, and politicians. We may assume a good-looking actor must be socially responsible too. If a CEO is exceptionally well-dressed, we may assume the CEO is exceptionally smart. While some correlation exists, we judge one trait based on another trait. Because the Halo Effect is about passing quick judgment, take some additional time to make objective observations before jumping to such conclusions.
The matter of Brain Training
Every few months, a new study provides evidence to either support the efficacy of brain training or explain why brain training fails. Either way, plenty of expert opinions exist, and most opinions, even contradictory ones, are replete with supporting research. What happens when it comes to personal experiences? Does brain training make one FEEL more intelligent or even BECOME more intelligent?
The Emotional Reasoning thought pattern explains many personal experiences. Some people equate feelings and facts–i.e., If it feels true, it must be true. The feeling may come from the achievement in completing brain training tasks, not from gaining IQ points.
Mental Models – making sense of the world around us.
Attribute Substitution is one barrier to implementing the correct mental model; it describes a tendency to replace a complex decision process with a simpler one. We may inaccurately use simpler mental models like Survivorship Bias to reject good advice when Regression to the Mean explains why the advice was unrelated to the outcome. One falls back on attribute substitution because regression to the mean is much harder to understand.
Self-control in this context might be defined as the ability to do what’s in your best interests long-term versus doing what's most enjoyable now. It's closely related to the heuristic Hyperbolic Discounting, wherein humans tend to make choices that are inconsistent over time – immediate choices that their future selves would likely not make, despite knowing the same information.
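Hyperbolic Discounting has a standard formalization – Mazur’s V = A / (1 + kD), where A is the reward amount, D the delay, and k a discount rate. A minimal sketch (the dollar amounts, delays, and k = 0.1 per day are illustrative assumptions, not figures from the text) shows the signature preference reversal:

```python
# Hyperbolic discounting (Mazur): subjective value V = A / (1 + k * D).
def discounted_value(amount: float, delay_days: float, k: float = 0.1) -> float:
    """Subjective value of a reward received after a delay."""
    return amount / (1 + k * delay_days)

# Choosing today: $50 now beats $100 in 30 days (50.0 vs ~25.0)...
assert discounted_value(50, 0) > discounted_value(100, 30)

# ...but push the SAME choice 60 days out, and $100 in 90 days beats
# $50 in 60 days (~10.0 vs ~7.1). That flip is the time inconsistency.
assert discounted_value(50, 60) < discounted_value(100, 90)
```

The curve discounts near-term delays steeply and far-off delays gently, which is why our immediate choices contradict the choices our future selves would make with the same information.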
As it turns out, focusing on proactively reducing temptation rather than reactively overpowering it is your best bet. Take the cookies, for instance – situational strategies to remove temptation might include, well, not bringing them home in the first place. But since that's not always an option, psychological strategies have also been shown to be effective, including distraction and 'reappraising' how you think about the cookies. So next time, snack strategically!
How our brain judges pricing.
One key bias called the Scarcity Heuristic explains the complexities involved while bargaining for a product or finalizing the "best" offer. The scarcity heuristic describes the human tendency to want what is rare or in short supply.
That is why messages like “Hurry up, last 5 slots remaining” and “40% discount valid for 6 minutes only!” work. When the scarcity heuristic kicks in, our brain may become a poor judge of the right price to pay for a product. This heuristic also explains hoarding behavior.
How to overcome biases in decision making.
One way to explain the failure of a self-awareness campaign is the Availability Heuristic: we tend to base decisions on information that is immediately available to us. If an awareness campaign focuses on only some biases, the Availability Heuristic will help us identify just those select few biases and possibly apply them irrationally to every problem.
In short, all problems will look like nails if we possess a hammer. A way to overcome this situation would be to ask, “What information do I need right now?” and look at biases that act on that information. That changes the focus to what’s relevant as opposed to what’s available.
Our predisposition to "Google it!"
When patients need to make a buying decision, it appears vital to account for the worry caused by uncertainty. Marketers can optimize their messaging to address that uncertainty so that an intervention returns control to a patient.
#cognitivebiases
Read the full article: https://lnkd.in/dKgY2-F
Better than average effect.
The Better Than Average Effect is a tendency to feel one is better at something than most other people. Similarly, the Worse Than Average Effect is a tendency to feel one is worse at something than most other people. Both lead to an overestimation or underestimation that affects self-confidence. Improving the quality of measurement used to assess ability can counter these effects.
In his new book, psychologist Adam Grant investigates the struggle to change one's mind and how we can get better at it. He encourages "anchor[ing] your sense of self in flexibility, rather than consistency.” Flexibility, he argues, is far more productive than the traditional advice to "see both sides" of an argument, which Grant–who used to subscribe to this counsel but managed to change his mind–believes perpetuates polarization.
This practice leaves us prone to the Binary Bias, where people oversimplify a complex spectrum of opinions into just two categories, with a preexisting favor for one side and disapproval for the other. Grant suggests that instead of just thinking through the "other side," we try to find a third and fourth angle–a place your opinion could shift toward more realistically.
Why less is more!
Too many options can lead to poor decision-making. Choice Overload Bias explains this behavior. Our brains tend to simplify and reduce the effort needed to make decisions. A wide variety raises customers’ expectations to hope for the best option to fit their needs, but those expectations can be unrealistic and can’t be fulfilled, leading to disappointment. So, sometimes it’s good to have just a few options!
Read on for full article: https://lnkd.in/ePQppQ8
#cognitivebiases
#choiceoverload
What is Newton’s flaming laser sword?
At any given point in time, there are things we can know and things we can’t know. Science closes knowledge gaps and opens new gaps to look forward to as time goes by.
Science is more accessible today. It’s unlikely that a practical question has no answer. There is someone somewhere who can shed some insight. Newton’s flaming laser sword can explain why people may give up while exploring hard-to-answer questions.
Can we judge if a machine is biased?
The Attribute Substitution Bias – the tendency to base decisions on an easy judgment instead of a complex one – can shed some light on this. De-biasing the raw data for an AI can be difficult, so we substitute the complex task of de-biasing it with an easier approach of feeding the AI more data, hoping the AI de-biases it. This is reinforced by our tendency to believe computers are logical and precise.
Newristics
#cognitivebiases
To read more: https://lnkd.in/gBccJ5n
1) Rest/sleep on it to improve motivation, attention, and control
2) Take your time. We are more likely to make risky choices, urged by the Pressured Time Effect
3) Gather the facts rather than relying on biases, hunches, and emotions
4) Keep an open mind for new information to avoid falling victim to Confirmation Bias and Status Quo Bias
5) Create simple rules for making decisions, especially in emotional or rushed situations
Subtle changes in wordings affect persuasion.
Framing Effect: Our choices are influenced by the way they are framed through wording and setting; whether a message has a positive or a negative tone affects its adoption.
Social Proof: People are likely to indulge in a particular behavior if others indulge or endorse it.
The Framing Effect explains how saying “9 out of 10 patients felt better within 30 minutes” sells better than saying “1 out of 10 patients didn’t feel better within 30 minutes.” Combine this with Social Proof by suggesting it is popular among similar people to make a more persuasive case.
Read on to see how the wording of #vaccination messages will influence consumer choices.
https://lnkd.in/g6GHHNT
Newristics #cognitivebiases
#behavioralscience
#vaccinationmessaging
Time on screens has little impact on kids’ social skills, study suggests.
The stories we hear create an Illusory Truth Effect – repeated exposure to an idea or opinion increases its perceived truth. Researchers say this happens because frequent exposure makes things easier to digest and believe. When we hear an incorrect explanation for something very often, it becomes intuitive regardless of the evidence. Stories can be unreliable because we sometimes confuse correlation with causation.
Your Brain Makes You a Different Person Every Day
Psychologists define personality as a relatively stable and enduring pattern of behaviors and tendencies. But our brains continuously change a little to accommodate every cumulative experience. So, at what point were we different people?
When we think retrospectively, the End-Of-History Illusion explains perceived change. People believe their past experiences changed them dramatically, converging on a current, stable personality that won’t change in the future. People of all ages can fall for this illusion of stability, which becomes our retrospective identity – unstable yesterday, stable today, unknown tomorrow.
Yes, you’re irrational, and YES, that’s okay!
“Being manipulated OR seeing a manipulated reality” is a false dichotomy.
A false dichotomy is an irrational but simplified way to conceptualize complex issues; it persists because two opposite ideas are easy to differentiate from each other.
They have unique characteristics that create boundaries – much like black or white thinking. Perhaps we should assume irrationality is the norm.
#heuristics
#cognitivescience
#irrationality
Simple, behavioral science interventions offer viable solutions. As in other areas, reminders and opt-out strategies encourage participation. The conveying of Authority, with messages from highly credentialed public figures like Dr. Anthony Fauci, can be used in tandem with Social Proof strategies, showing influential people receiving the vaccine.
Reasons for vaccine opposition vary widely across people and population segments. Rather than contradicting the beliefs of reluctant others, the best strategy is to approach these conversations with empathy and include personal testimonies and anecdotes rather than just factual data supporting vaccination.
We know from behavioral science that small, non-intrusive nudges offer a much more viable path to increasing uptake than mandates. Crafting these nudges will require an empathetic lens.