[Image: brain with puzzle pieces representing cognitive biases]

Why Your Brain Lies to You: Cognitive Biases and Mental Health

Ever wonder why you think or act the way you do? A big part of therapy involves what I call “getting a PhD in YOU”!

In this post, we will get you started by exploring how cognitive biases shape your thoughts, feelings, and actions, drawing on Kathy Brodie’s careful examination of this important topic on Early Years TV.

To dive further into this fascinating subject, visit Early Years TV.

Cognitive Biases: Complete Guide to Thinking Errors

Research reveals that most decision-making occurs unconsciously through mental shortcuts, yet people remain largely unaware of how systematic thinking patterns called cognitive biases influence their choices in relationships, career, finance, and health.

Key Takeaways:

  • Cognitive biases can distort our perceptions and judgments, leading to irrational decisions and behavior, which is why understanding them is crucial.
  • What are cognitive biases? Systematic thinking errors that affect decisions and judgments, including confirmation bias (seeking supporting evidence), anchoring bias (over-relying on first information), and loss aversion (feeling losses more strongly than gains).
  • How do they affect daily life? Biases influence relationship conflicts through attribution errors, workplace decisions via the halo effect, investment choices through overconfidence, and health behaviors via present bias and probability neglect.
  • Can you recognize them in yourself? Watch for warning signs like time pressure, emotional arousal, and high-stakes situations that increase bias susceptibility – use decision journals and seek diverse perspectives to catch thinking errors.
  • What strategies help overcome them? Slow down important decisions with cooling-off periods, seek contradictory evidence and diverse viewpoints, use data and systematic checklists, and create accountability systems for better choices.
  • When should you trust your gut? Mental shortcuts work well for low-stakes, familiar, or time-sensitive decisions where you have relevant experience, but use systematic analysis for high-stakes, unfamiliar, or complex situations.

    Why Cognitive Biases Are Part of Being Human

    Everyone makes thinking errors – it’s human nature. From the CEO making a multimillion-pound investment decision to a parent choosing their child’s school, we all rely on mental shortcuts that sometimes lead us astray. These systematic patterns of thinking, known as cognitive biases, evolved to help our ancestors survive in a dangerous world, but they can sabotage our decisions in modern life.

    Understanding cognitive biases isn’t about becoming perfectly rational – that’s impossible. Instead, it’s about recognizing when your brain’s autopilot might be steering you wrong and knowing how to course-correct when the stakes are high. Whether you’re trying to make better personal decisions, improve your professional judgment, or simply understand why people behave the way they do, this guide will transform how you think about thinking itself.

    What Are Cognitive Biases?

    Cognitive biases are systematic errors in thinking that affect our decisions and judgments. They represent predictable patterns where our brains deviate from rationality, logic, or optimal decision-making. Unlike random mistakes, these biases follow consistent patterns that researchers can study and predict.

    Many cognitive biases are ingrained in human nature, and recognizing them can lead to better decision-making and improved mental well-being. Challenging them can also deepen your understanding of yourself and your interactions with others, allowing for more constructive communication and relationships.

    The term “cognitive bias” was coined by psychologists Amos Tversky and Daniel Kahneman in the 1970s through their groundbreaking research on human judgment and decision-making. Their work revealed that humans don’t think like the rational actors that traditional economics assumed – instead, we use mental shortcuts and make predictable errors that can be mapped and understood.

    The 25 Most Common Cognitive Biases

    Memory & Recall Biases

    1. Availability Heuristic

    The availability heuristic causes us to judge the probability of events based on how easily we can recall examples. Our brains use memory accessibility as a shortcut for frequency or likelihood, but this creates systematic errors because memorable events aren’t always the most common ones.

    Media coverage heavily influences availability bias. Terrorist attacks receive extensive coverage, making terrorism feel more threatening than statistically more dangerous activities like driving. Similarly, shark attacks get more media attention than dog bites, even though dogs injure far more people annually. The vivid, emotional nature of these stories makes them more memorable and thus more “available” when we assess risk.

    This bias affects personal experiences too. If you recently experienced a car breakdown, you’ll overestimate the likelihood of mechanical problems when buying a car. If a friend recently got divorced, marriage might feel riskier than it statistically is. The recency and emotional impact of events increase their availability in memory.

    Recognition strategies: Before making probability judgments, actively seek base rate information rather than relying on examples that come to mind. Ask yourself: “Am I remembering this because it’s common or because it’s memorable?” When assessing risks, look for statistical data rather than trusting your gut reaction based on available examples.

    2. Recency Bias

    Recency bias gives disproportionate weight to recent events over longer-term patterns. Our brains naturally prioritize recent information because it seems most relevant, but this can lead to poor decisions when recent events aren’t representative of overall trends.

    Investment decisions suffer heavily from recency bias. After a market crash, investors avoid stocks even when historical data shows recovery patterns. Conversely, during bull markets, recent gains make continued growth seem inevitable. Portfolio rebalancing based on recent performance often leads to buying high and selling low – the opposite of good investment strategy.

    Job interviews demonstrate recency bias clearly. Interviewers remember the last few candidates more vividly than earlier ones, potentially affecting hiring decisions. Similarly, performance reviews often overweight recent performance compared to consistent patterns throughout the review period.

    Recognition strategies: When making decisions based on trends, actively review longer time periods rather than just recent events. Create written records of patterns over time to counteract memory’s focus on recent experiences. In investment decisions, establish rules-based approaches that prevent reactive changes based on short-term market movements.

    3. Rosy Retrospection

    Rosy retrospection makes past experiences seem more positive than they actually were. We tend to remember the good parts of experiences while forgetting difficulties, disappointments, and negative emotions. This bias affects everything from relationship decisions to career choices.

    Former romantic relationships often seem better in memory than they were in reality. The pain of breakups fades faster than positive memories, sometimes leading people to reconnect with incompatible partners. Similarly, previous jobs often seem more appealing when current work feels challenging, even though you left those positions for good reasons. 

    This bias also affects major life decisions. Parents forget the sleepless nights and stress of early childhood, focusing on cute moments when deciding whether to have another child. Travelers remember vacation highlights while forgetting delays, disappointments, and uncomfortable moments.

    Recognition strategies: Keep written records of experiences, including both positive and negative aspects. When reminiscing about past relationships, jobs, or experiences, actively recall why they ended or why you made changes. Before making decisions based on past experiences, seek objective feedback from others who witnessed those situations.

    4. Peak-End Rule

    The peak-end rule means we judge experiences largely based on their most intense moment and how they ended, rather than the overall experience. This bias significantly affects customer satisfaction, relationship evaluation, and decision-making about future experiences.

    Medical procedures demonstrate this bias clearly. Patients who experience gradual improvement in pain rate procedures more positively than those whose pain decreases quickly then plateaus, even when the second group experiences less total pain. The ending matters more than the overall experience.

    Restaurants, hotels, and service businesses exploit this bias by ensuring positive endings – complimentary desserts, smooth checkouts, or friendly farewells can override earlier service problems. Similarly, presentations that end strongly leave better impressions than those with strong middles but weak conclusions.

    Recognition strategies: When evaluating experiences, systematically review the entire duration rather than focusing on highlights and endings. For future planning, consider the overall quality and duration of experiences, not just peak moments. In customer service situations, recognize that endings disproportionately affect satisfaction.
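    The peak-end idea can be expressed as a toy model. The function and pain scores below are purely illustrative, not taken from Kahneman’s actual studies: remembered quality is approximated as the average of the most intense moment and the final moment, with duration ignored entirely.

```python
# Toy peak-end model (illustrative): the remembered rating is roughly the
# average of the worst moment and the final moment, ignoring duration.
def remembered_pain(pain_per_minute):
    return (max(pain_per_minute) + pain_per_minute[-1]) / 2

short_procedure = [8, 7]           # intense but brief, ends at 7
longer_procedure = [8, 7, 5, 3]    # same start plus milder minutes, ends at 3

print(remembered_pain(short_procedure))    # 7.5
print(remembered_pain(longer_procedure))   # 5.5: more total pain, yet remembered as better
```

    Note that the longer procedure involves more total pain (23 units versus 15) but is remembered more favorably because it ends gently.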

    5. Serial Position Effect

    The serial position effect makes us remember items at the beginning and end of lists better than those in the middle. This affects everything from grocery shopping to candidate evaluation in interviews.

    In presentations, audiences remember opening and closing points more clearly than middle content. This is why effective speakers put their most important messages at the beginning and end. Similarly, in job interviews, candidates interviewed first or last often have advantages over those interviewed in the middle of the day.

    Shopping lists demonstrate this bias – people frequently remember the first few items they needed and whatever they added last, but forget middle items. This leads to incomplete grocery trips and repeated store visits.

    Recognition strategies: In important communications, place key messages at the beginning and end. When evaluating multiple options presented in sequence, take notes throughout rather than relying on memory. For important decisions involving sequences (like candidate interviews), implement structured evaluation methods that counteract memory limitations.

    Social & Authority Biases

    6. Confirmation Bias

    Confirmation bias represents perhaps the most pervasive and influential thinking error. We actively seek information that confirms our existing beliefs while avoiding, dismissing, or misinterpreting contradictory evidence. This bias affects everything from political opinions to relationship decisions to professional judgments.

    Social media platforms amplify confirmation bias through algorithmic filtering. Facebook, Twitter, and news websites show content that aligns with your previous interactions, creating echo chambers that reinforce existing views. People increasingly consume news from sources that confirm their political leanings, leading to polarization and decreased exposure to alternative perspectives.

    In relationships, confirmation bias makes us notice behaviors that support our existing views of partners while overlooking contradictory evidence. If you believe your partner is inconsiderate, you’ll notice instances of thoughtlessness while minimizing examples of consideration. This creates self-fulfilling prophecies that can damage otherwise healthy relationships.

    Professional settings aren’t immune. Doctors may seek symptoms that confirm their initial diagnosis while overlooking contradictory signs. Managers may interpret employee behavior through the lens of their initial impressions, affecting performance evaluations and development opportunities.

    Recognition strategies: Actively seek out opposing viewpoints and contradictory evidence. Before making important decisions, ask yourself: “What evidence would change my mind?” Create diverse information sources rather than consuming content that only reinforces your views. In professional contexts, implement devil’s advocate processes that systematically challenge prevailing assumptions.

    7. Authority Bias

    Authority bias leads us to defer to experts and authority figures, even outside their areas of expertise. While respecting genuine expertise makes sense, this bias can cause us to accept information uncritically when someone has authority in an unrelated domain.

    Celebrity endorsements exploit authority bias. We trust actors’ opinions about cars, athletes’ views on nutrition, and politicians’ recommendations about products they may know little about. The person’s authority in entertainment, sports, or politics creates a halo effect that extends to unrelated domains.

    Medical settings show both the value and danger of authority bias. Patients appropriately defer to doctors’ medical expertise, but this can prevent them from asking important questions or seeking second opinions. The Stanford Prison Experiment and Milgram’s obedience studies revealed how far authority bias can lead people astray when authority figures abuse their position.

    Professional hierarchies create authority bias in workplaces. Junior employees may defer to senior managers’ opinions about technical matters outside the managers’ expertise. This can stifle innovation and lead to poor decisions when authority doesn’t align with relevant knowledge.

    Recognition strategies: Distinguish between relevant expertise and general authority. Before accepting advice or information, ask yourself whether the person has specific expertise in the relevant domain. Seek second opinions on important decisions, especially when authority figures make claims outside their area of expertise. Create environments where questioning authority is safe and encouraged.

    8. Social Proof

    Social proof makes us look to others’ behavior to guide our own decisions, especially in uncertain situations. This bias serves as a useful shortcut for navigating unfamiliar social situations, but it can also lead to conformity when independent judgment would be better.

    Restaurants use social proof by maintaining reservations lists and showing busy dining rooms. Empty restaurants feel less appealing than full ones, even when the food quality is identical. Online retailers leverage social proof through customer reviews, “people also bought” suggestions, and showing how many others viewed or purchased items.

    Investment bubbles demonstrate social proof’s dangerous potential. When everyone seems to be buying stocks, real estate, or cryptocurrency, FOMO (fear of missing out) drives more purchases even when prices become irrational. The 2008 housing crisis partly resulted from social proof – everyone was buying houses, so it seemed like a safe investment.

    Emergency situations reveal both social proof’s benefits and dangers. In fires or evacuations, following others can lead to safety or to disaster, depending on whether the crowd knows the best escape route. The bystander effect occurs partly through social proof – if no one else is helping, the situation must not require intervention.

    Recognition strategies: In important decisions, gather information independently before observing others’ choices. Ask yourself whether the people you’re following have better information or different priorities than you do. In group situations, be willing to take independent action when your judgment differs from the crowd’s behavior.

    9. Halo Effect

    The halo effect occurs when one positive trait influences our perception of all other qualities. A single impressive characteristic creates a “halo” that makes everything else about a person, product, or company seem better.

    Hiring decisions suffer heavily from halo effect. An impressive educational background might make an interviewer rate all other qualifications higher, even when the education isn’t directly relevant. Physical attractiveness creates halos that affect perceptions of competence, intelligence, and character – research shows attractive people receive lighter criminal sentences and higher performance ratings.

    Brand halos affect consumer decisions. Apple’s reputation for design excellence makes people assume their products are superior in all areas, even when competitors match or exceed specific features. Similarly, prestigious company names create halos that make job candidates seem more qualified regardless of their individual performance.

    Investment decisions show halo effects when successful companies in one area expand into unrelated businesses. Investors may assume that excellence in technology translates to excellence in retail, or that successful domestic operations guarantee international success.

    Recognition strategies: Evaluate different qualities independently rather than letting one impressive trait color your entire assessment. Use structured evaluation methods that consider each relevant factor separately. When making important decisions about people, products, or investments, actively look for evidence about specific qualities rather than relying on general impressions.

    10. Fundamental Attribution Error

    The fundamental attribution error causes us to judge others by their actions while judging ourselves by our intentions. When others make mistakes, we blame their character or abilities. When we make mistakes, we point to circumstances and external factors.

    Road rage often stems from attribution errors. When another driver cuts you off, you assume they’re selfish or reckless. When you cut someone off, it’s because you’re late for an important meeting or didn’t see them. The behavior is identical, but we attribute different causes based on whose perspective we have.

    Workplace conflicts frequently involve attribution errors. When colleagues miss deadlines, we assume they’re disorganized or uncommitted. When we miss deadlines, it’s because of unexpected complications or competing priorities. These different attributions create resentment and prevent effective problem-solving.

    Customer service interactions reveal attribution patterns. Angry customers attribute poor service to incompetent or uncaring employees. Employees attribute service problems to impossible demands, inadequate resources, or unreasonable customers. Both perspectives contain truth, but the attribution error prevents empathy and collaborative solutions.

    Recognition strategies: When judging others’ behavior, actively consider situational factors that might explain their actions. Before attributing behavior to character flaws, ask what circumstances might lead you to act similarly. In conflicts, focus on understanding the other person’s perspective rather than defending your own intentions.

    Probability & Risk Biases

    11. Loss Aversion

    Loss aversion makes losses feel approximately twice as painful as equivalent gains feel good. This asymmetry profoundly affects decision-making, leading to risk-averse behavior that can prevent both potential losses and potential gains.

    Investment behavior demonstrates loss aversion clearly. Investors hold losing stocks too long, hoping to break even, while selling winning stocks too quickly to lock in gains. This “disposition effect” reduces returns because investors realize losses slowly and gains quickly – the opposite of optimal investment strategy.

    Career decisions show loss aversion when people stay in unsatisfying jobs rather than risk uncertainty. The potential loss of current income and benefits feels more significant than the potential gains from career change, even when analysis suggests change would improve long-term prospects.

    Negotiation situations reveal loss aversion through the “endowment effect.” Once we mentally own something, giving it up feels like a loss. Home sellers often refuse reasonable offers below their original price, even in declining markets, because selling feels like taking a loss rather than making a gain.

    Recognition strategies: When evaluating decisions, actively compare potential gains against potential losses using objective measures rather than emotional reactions. Consider the full range of possible outcomes, not just the risk of loss. Frame decisions in terms of gains when possible, and use time limits to prevent indefinite holding patterns based on loss aversion.
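    The asymmetry can be sketched numerically. Assuming a loss-aversion coefficient of 2 (a round number for illustration; prospect theory research estimates it at roughly 2 to 2.25), a perfectly fair coin-flip bet ends up “feeling” like a bad deal:

```python
# Prospect-theory-style sketch: losses weigh about twice as much as gains.
# The coefficient of 2 is an illustrative round number, not a measured value.
def subjective_value(outcome, loss_aversion=2.0):
    if outcome >= 0:
        return outcome
    return loss_aversion * outcome  # losses loom larger

# A fair coin flip: win 100 or lose 100. Objectively worth 0 on average.
expected_money = 0.5 * 100 + 0.5 * (-100)
expected_feeling = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)

print(expected_money)    # 0.0
print(expected_feeling)  # -50.0: a fair bet that "feels" like a bad deal
```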

    12. Gambler’s Fallacy

    The gambler’s fallacy makes us believe that past events affect future probabilities in independent events. After observing a streak of one outcome, we expect the opposite outcome to become more likely, even when each event is independent.

    Casino gambling demonstrates this bias perfectly. After five red spins on a roulette wheel, gamblers believe black is “due,” even though each spin remains an independent 50/50 probability (ignoring the house edge). Lottery players avoid recently drawn numbers, thinking they’re less likely to appear again.

    Investment decisions suffer from gambler’s fallacy when investors believe that falling stocks are “due” for recovery or that rising stocks must fall soon. This leads to poorly timed purchases and sales based on patterns that don’t actually predict future performance.

    Sports betting shows gambler’s fallacy when bettors expect “hot streaks” to end or believe teams are “due” for wins after losing streaks. Professional sports outcomes have some predictive factors, but random variation often gets misinterpreted as meaningful patterns.

    Recognition strategies: Understand the difference between independent events and those with genuine predictive relationships. Before making decisions based on streaks or patterns, ask whether the underlying probabilities have actually changed. Use data analysis rather than pattern recognition to guide decisions about truly random events.
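    The independence point is easy to check with a quick simulation (a hypothetical sketch, not from the post): after a run of five heads in fair coin flips, the next flip still comes up heads about half the time.

```python
import random

random.seed(42)

# 200,000 fair coin flips; look at what comes right after five heads in a row.
flips = [random.choice("HT") for _ in range(200_000)]
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if flips[i:i + 5] == ["H"] * 5]

p_heads_next = after_streak.count("H") / len(after_streak)
print(round(p_heads_next, 2))  # stays near 0.5: the streak changes nothing
```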

    13. Probability Neglect

    Probability neglect causes us to ignore actual odds when emotions run high. We focus on the magnitude of potential outcomes rather than their likelihood, leading to irrational fear of low-probability events and insufficient concern about high-probability risks.

    Air travel anxiety demonstrates probability neglect clearly. Despite commercial aviation being extraordinarily safe, many people fear flying more than driving, which is statistically far more dangerous. The catastrophic, vivid nature of plane crashes makes them feel more likely than statistics suggest.

    Insurance decisions show probability neglect when people buy coverage for dramatic but unlikely events while skipping protection for common risks. Extended warranties on electronics feel valuable because device failure would be frustrating, even though the probability of failure is low and the coverage cost often exceeds expected benefits.

    Medical screening decisions involve probability neglect when patients avoid or seek tests based on worst-case scenarios rather than actual risk levels. Cancer screening anxiety can lead to excessive testing in low-risk populations or avoidance of beneficial screening in high-risk groups.

    Recognition strategies: When facing decisions involving risk, actively research actual probabilities rather than relying on emotional responses. Compare the likelihood of feared outcomes against everyday risks you accept without concern. Focus on expected value calculations that multiply probability by magnitude of outcomes.
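    An expected-value comparison like the one suggested above takes only a few lines. The warranty price, failure rate, and repair cost below are hypothetical numbers chosen for illustration:

```python
# Hypothetical figures for illustration: a 40-pound extended warranty on a
# device with a 5% failure chance and a 150-pound average repair bill.
warranty_cost = 40.0
p_failure = 0.05
repair_cost = 150.0

expected_repair_bill = p_failure * repair_cost  # about 7.50
print(warranty_cost > expected_repair_bill)  # True: the cover costs more than the risk it removes
```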

    14. Sunk Cost Fallacy

    The sunk cost fallacy makes us continue investing in failing endeavors because we’ve already invested time, money, or effort. We irrationally consider past investments when making future decisions, even though those costs can’t be recovered regardless of future choices.

    Business decisions frequently fall prey to sunk cost fallacy. Companies continue funding failing projects because they’ve already invested millions, even when analysis shows the projects won’t succeed. The Concorde supersonic jet program continued partly due to massive sunk investments, despite clear evidence that it would never be profitable.

    Relationship decisions show sunk cost thinking when people stay in unfulfilling partnerships because they’ve already invested years together. The time and effort already spent can’t be recovered, but continuing an incompatible relationship prevents both partners from finding better matches.

    Career choices involve sunk costs when people stay in unsuitable fields because of educational investments or years of experience. A lawyer who discovers they hate legal work might continue practicing because law school was expensive, even though career change could improve long-term satisfaction.

    Recognition strategies: When evaluating whether to continue investments, focus exclusively on future costs and benefits rather than past expenditures. Ask yourself: “If I were starting fresh today, would I begin this project/relationship/investment?” Make decisions based on future prospects, not past commitments.

    15. Optimism Bias

    Optimism bias makes us overestimate positive outcomes and underestimate negative ones. While moderate optimism benefits mental health and motivation, excessive optimism leads to inadequate preparation and unrealistic expectations.

    Entrepreneurship demonstrates both optimism bias’s benefits and dangers. Entrepreneurs consistently overestimate their chances of success – research shows 80% of entrepreneurs believe their businesses will succeed, while actual success rates are much lower. This optimism provides motivation to start businesses, driving innovation and economic growth, but it also leads to inadequate planning and insufficient capital reserves.

    Project planning suffers from optimism bias through the “planning fallacy.” We systematically underestimate the time, cost, and effort required for projects while overestimating benefits. Home renovations typically cost twice initial estimates and take twice as long as planned.

    Health behaviors show optimism bias when people underestimate their personal risk for diseases while accurately assessing general population risks. Smokers know that cigarettes cause cancer but believe they’re personally less likely to develop smoking-related illnesses.

    Recognition strategies: Use reference class forecasting – look at how similar projects, businesses, or situations have performed historically rather than focusing on your specific circumstances. Build buffers into plans that account for typical optimism bias. Seek outside perspectives from people who don’t share your emotional investment in the outcome.
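    Reference class forecasting reduces to simple arithmetic. The project figures below are hypothetical: scale your own “inside view” estimate by how much similar past projects actually overran their estimates.

```python
# Reference class forecasting sketch with hypothetical project data:
# scale your own estimate by the overrun ratio of comparable past projects.
my_estimate_weeks = 6
past_estimates = [4, 8, 5, 10]   # weeks similar projects were estimated at
past_actuals = [8, 14, 9, 21]    # weeks they actually took

overrun_ratio = sum(past_actuals) / sum(past_estimates)
adjusted_weeks = my_estimate_weeks * overrun_ratio

print(round(adjusted_weeks, 1))  # 11.6: nearly double the inside-view estimate
```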

    Belief & Reasoning Biases

    16. Anchoring Bias

    Anchoring bias occurs when the first piece of information we receive heavily influences our subsequent judgments, even when that initial information is irrelevant or random. Our minds use this first “anchor” as a reference point, then adjust from there, but the adjustments are typically insufficient.

    Real estate negotiations demonstrate anchoring bias powerfully. Listing prices serve as anchors that influence both buyers and sellers, even when they bear little relationship to actual market value. Studies show that higher listing prices lead to higher final sale prices, even when properties are otherwise identical. Professional appraisers, who should be immune to such influences, still show anchoring effects based on listing prices.

    Salary negotiations reveal anchoring’s importance. The first number mentioned – whether by the employer or candidate – significantly influences the final agreement. Job candidates who anchor high typically receive higher offers than those who let employers anchor with lower initial offers.

    Even random numbers create anchoring effects. In famous experiments, people spun a wheel of fortune before estimating various quantities. Those who spun higher numbers gave higher estimates for everything from the percentage of African countries in the United Nations to the temperature in San Francisco.

    Recognition strategies: Before important negotiations, research fair values independently to establish your own anchors rather than being influenced by others’ initial offers. When making estimates or judgments, actively consider whether you’re being influenced by irrelevant information. In group decisions, encourage multiple people to provide initial estimates before discussing them.

    17. Framing Effect

    The framing effect demonstrates how the presentation of information affects our decisions, even when the underlying facts remain identical. The same choice can seem appealing or unappealing depending on whether it’s framed in terms of gains or losses, successes or failures.

    Medical decisions show dramatic framing effects. Cancer treatments described as having “90% survival rates” seem more appealing than those with “10% mortality rates,” even though the statistics are identical. Patients choose surgery more often when told “90 out of 100 patients survive” than when told “10 out of 100 patients die.”

    Investment choices reveal framing bias when financial products are described as “95% safe” versus “5% risk of loss.” The same investment appears more attractive when framed positively. Marketing teams exploit this by emphasizing benefits over costs, gains over risks.

    Consumer decisions involve framing when products are described as “95% fat-free” rather than “contains 5% fat.” The positive framing makes identical products seem healthier and more appealing.

    Recognition strategies: When facing important decisions, deliberately reframe options in different ways to see if your preferences change. Look for the underlying facts behind marketing language. Before making choices, write down the key information in neutral terms rather than accepting others’ framing.

    18. Representativeness Heuristic

    The representativeness heuristic leads us to judge probability based on similarity to mental stereotypes or prototypes. We assess how closely something matches our mental model of a category, but this can lead us to ignore important statistical information like base rates.

    Profiling situations demonstrate this bias clearly. When describing someone as “shy, withdrawn, and detail-oriented,” people often assume they’re more likely to be a librarian than a salesperson. However, there are far more salespeople than librarians in the population, making the salesperson identification statistically more probable despite the personality description fitting librarian stereotypes.

    Investment decisions suffer when investors assume that successful companies must continue succeeding because they “look like” winners. Small, rapidly growing companies seem more representative of future success than large, established firms, even though historical data shows mixed results for different investment strategies.

    Sports predictions reveal representativeness bias when commentators expect players to continue “hot streaks” because recent performance seems representative of their current skill level. Random variation in performance gets misinterpreted as meaningful patterns.

    Recognition strategies: Before making judgments based on similarity, actively consider base rate information – how common is each possibility in the general population? Look for actual data rather than relying on how well something fits your mental stereotypes. Remember that vivid, specific details can make unlikely scenarios seem more probable than they actually are.

    19. Base Rate Neglect

    Base rate neglect occurs when we ignore background probability information in favor of specific details. We focus on individual characteristics while overlooking how common or rare something is in the general population.

    Medical diagnosis demonstrates this bias when doctors focus on symptoms that match rare diseases while ignoring how uncommon those conditions actually are. A patient with fatigue might have symptoms consistent with a rare autoimmune condition, but fatigue is much more commonly caused by stress, poor sleep, or minor infections.

    Criminal profiling can involve base rate neglect when investigators focus on details that match psychological profiles while ignoring how rare certain types of crimes actually are. The specific details seem compelling, but most crimes are committed by ordinary people rather than individuals matching dramatic psychological profiles.

    Academic predictions show base rate neglect when college admissions officers focus on compelling personal stories while ignoring statistical predictors of success. A student with an inspiring background might seem likely to succeed, but test scores and grades remain better predictors of academic performance.

    Recognition strategies: Before making probability judgments, research the baseline frequency of different outcomes. Ask yourself: “How often does this actually happen?” Combine specific information with general statistical patterns rather than focusing only on individual details.
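The same arithmetic applies to the medical example. With hypothetical numbers (a condition affecting 1 in 1,000 people, and a test that catches 99% of true cases but falsely flags 5% of healthy people), Bayes' rule shows how a low base rate swamps even an accurate-looking test:

```python
# Hypothetical screening test for a rare condition (illustrative numbers only).
prevalence = 0.001      # 1 in 1,000 people have the condition (the base rate)
sensitivity = 0.99      # P(positive test | condition)
false_positive = 0.05   # P(positive test | no condition)

true_positives = prevalence * sensitivity            # 0.00099
false_positives = (1 - prevalence) * false_positive  # 0.04995

p_condition = true_positives / (true_positives + false_positives)
print(f"P(condition | positive test) = {p_condition:.1%}")  # roughly 2%
```

Despite the test's 99% sensitivity, a positive result here implies only about a 2% chance of actually having the condition, because false positives among the 999 healthy people vastly outnumber the single true case.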

    20. Conjunction Fallacy

    The conjunction fallacy makes us believe that specific conditions are more probable than general ones when the specific scenario seems more representative or compelling. We incorrectly assume that more details increase probability, when additional conditions actually make outcomes less likely.

    The famous “Linda problem” illustrates this bias. When told that Linda is concerned with social justice and participated in anti-nuclear demonstrations, people rate “Linda is a bank teller and feminist” as more probable than “Linda is a bank teller.” The additional detail about feminism seems to fit Linda’s description, but it actually reduces mathematical probability because it adds another condition that must be met.

    Business planning shows conjunction fallacy when entrepreneurs develop elaborate scenarios involving multiple positive developments. A plan requiring a great product AND successful marketing AND favorable economic conditions AND competitive advantages seems compelling because it addresses many success factors, but each additional requirement reduces overall probability.

    Weather predictions demonstrate this bias when detailed forecasts seem more likely than general ones. “Rain in the afternoon with temperatures around 20°C and light winds” might seem more probable than simply “rain,” even though the specific scenario requires more conditions to align.

    Recognition strategies: When evaluating complex scenarios, break them down into component parts and consider whether each additional detail increases or decreases overall probability. Remember that more specific descriptions are generally less likely than general ones, even when they seem more realistic or compelling.
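The business-planning version of the fallacy is easy to check with arithmetic. Assuming, purely for illustration, that each of four success factors independently has an 80% chance of going right, the probability that all four go right is well below any single one:

```python
# Each added condition in a plan lowers the joint probability (illustrative numbers).
factors = {
    "great product": 0.8,
    "successful marketing": 0.8,
    "favourable economy": 0.8,
    "competitive advantage": 0.8,
}

p_all = 1.0
for name, p in factors.items():
    p_all *= p  # assumes the factors are independent

print(f"P(every factor succeeds) = {p_all:.2%}")  # 40.96%
```

Four individually likely conditions combine into a plan that fails more often than it succeeds, which is exactly why "more detailed" scenarios feel more plausible while being less probable.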

    Self-Serving & Ego Biases

    21. Overconfidence Effect

    The overconfidence effect makes us overestimate our knowledge, abilities, and chances of success. This bias appears in three forms: overestimation (thinking we’re better than we are), overplacement (thinking we’re better than others), and overprecision (being too certain about our beliefs).

    Investment behavior demonstrates overconfidence clearly. Individual investors trade too frequently, convinced they can beat the market despite evidence that most active trading reduces returns. Men show greater overconfidence in trading than women, leading to more frequent trades and lower net returns. Professional fund managers also exhibit overconfidence, with most failing to beat market indices over time.

    Entrepreneurship involves healthy overconfidence that motivates people to start businesses despite low success rates. However, excessive overconfidence leads to inadequate planning, insufficient capital reserves, and poor risk management. Entrepreneurs consistently overestimate their chances of success while underestimating time to profitability and funding requirements.

    Driving provides a classic overconfidence example – most people rate themselves as above-average drivers, which is statistically impossible. This overconfidence contributes to risky driving behaviors and insufficient safety precautions.

    Recognition strategies: Before making important decisions, actively seek out information that might challenge your assumptions. Track your prediction accuracy over time to calibrate your confidence levels. Seek feedback from others who can provide objective assessments of your abilities and plans.
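One possible way to implement the "track your prediction accuracy" suggestion is a decision journal scored with the Brier score: the mean squared gap between your stated confidence and what actually happened, where lower is better and 0.0 is perfect. The journal entries below are made-up examples:

```python
# Score a small, made-up prediction journal with the Brier score.
# Each entry: (stated confidence that the event would happen, actual outcome 1/0).
journal = [
    (0.9, 1),  # "90% sure the project ships on time" - it did
    (0.8, 0),  # "80% sure the stock rises" - it fell
    (0.7, 1),
    (0.6, 1),
    (0.9, 0),  # another confident miss
]

brier = sum((confidence - outcome) ** 2 for confidence, outcome in journal) / len(journal)
print(f"Brier score: {brier:.3f}")  # 0.342 for these entries
```

A pattern of high-confidence misses (like the 0.9 and 0.8 entries that failed) is the overconfidence signature this kind of tracking makes visible.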

    22. Hindsight Bias

    Hindsight bias makes past events seem more predictable than they actually were, earning it the nickname “the knew-it-all-along effect.” After learning outcomes, we unconsciously revise our memory of what we expected, making it seem like we anticipated results that actually surprised us.

    Investment decisions suffer from hindsight bias when market movements seem obvious in retrospect. After a crash, investors claim they “saw it coming,” even though they didn’t sell beforehand. This prevents learning from mistakes because people convince themselves they had better information than they actually possessed.

    Performance evaluations involve hindsight bias when managers remember predicting employee outcomes that weren’t actually foreseeable. An employee’s success or failure seems obvious after the fact, even when their performance was uncertain during the evaluation period.

    Medical diagnosis shows hindsight bias when doctors claim they anticipated complications that weren’t clearly predictable. This can lead to unfair malpractice judgments and prevent learning from genuinely unpredictable outcomes.

    Recognition strategies: Keep written records of your predictions and reasoning before outcomes are known. When reviewing past decisions, actively try to remember what information was available at the time rather than incorporating knowledge gained afterward. Focus on decision quality based on available information rather than final outcomes.

    23. Self-Serving Bias

    Self-serving bias leads us to attribute successes to our abilities while blaming failures on external factors. This protects self-esteem but prevents accurate self-assessment and learning from mistakes.

    Academic performance demonstrates this bias when students credit good grades to their intelligence and hard work while blaming poor grades on unfair tests, bad teachers, or insufficient time. This prevents students from identifying areas where they need to improve study strategies or knowledge gaps.

    Sports show self-serving bias when athletes attribute wins to skill and training while blaming losses on referee calls, weather conditions, or luck. While external factors certainly influence outcomes, this bias prevents athletes from identifying weaknesses in their performance.

    Business results involve self-serving bias when managers take credit for successful projects while blaming failures on market conditions, inadequate resources, or team problems. This prevents organizational learning and improvement.

    Recognition strategies: Before attributing outcomes to specific causes, actively consider both internal and external factors that contributed to results. Seek feedback from others who can provide more objective assessments of your role in successes and failures. Focus on learning opportunities rather than protecting self-image.

    24. Dunning-Kruger Effect

    The Dunning-Kruger effect occurs when people with low ability in a domain greatly overestimate their competence, while people with high skill often underestimate themselves. Ironically, the skills needed to perform well are often the same skills needed to recognize good performance, creating a double burden for those with limited ability.

    Professional development demonstrates this bias when new employees feel confident about skills they haven’t yet mastered. Beginning teachers, doctors, or managers often express more confidence in their abilities than experienced professionals who understand the complexity of their roles.

    Social media discussions reveal Dunning-Kruger effects when people with minimal knowledge about complex topics (climate science, economics, medicine) express strong confidence in their opinions, while those with deep knowledge hedge with qualifiers such as “might be” or “in many cases” rather than speaking in absolutes. The vast amount of information available online can create an illusion of expertise without actual understanding.

    Consumer decisions show this bias when people feel confident making complex choices (investment strategies, medical treatments, technical purchases) without sufficient knowledge to evaluate options effectively.

    Recognition strategies: When entering new domains, assume you know less than you think and actively seek education and feedback. Be suspicious of your own confidence in areas where you lack extensive experience. Regularly test your knowledge against objective standards or expert feedback.

    25. Planning Fallacy

    The planning fallacy causes us to underestimate the time, cost, and effort required for projects while overestimating benefits. This bias affects everything from home renovations to major infrastructure projects.

    Home improvement projects typically cost about twice their initial estimates and take twice as long as planned. Homeowners focus on best-case scenarios while failing to account for the complications, delays, and scope creep that commonly occur during renovations.

    Software development suffers from planning fallacy when programmers consistently underestimate coding time. Projects that seem straightforward become complex as edge cases, integration issues, and changing requirements emerge during development.

    Academic projects demonstrate this bias when students underestimate research and writing time. The planning fallacy leads to rushed final products and unnecessary stress that could be avoided with more realistic time estimates.

    Recognition strategies: Use reference class forecasting – look at how long similar projects actually took rather than focusing on your specific situation. Add buffers to plans that account for typical underestimation. Break large projects into smaller components that are easier to estimate accurately.
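Reference class forecasting can be sketched in a few lines. The historical overrun ratios below are invented; the idea is simply to scale your naive estimate by how badly similar past projects actually overran:

```python
import statistics

# Invented history: actual duration / estimated duration for similar past projects.
overrun_ratios = [1.4, 2.1, 1.8, 1.6, 2.4]

naive_estimate_days = 10
adjusted = naive_estimate_days * statistics.median(overrun_ratios)

print(f"Naive estimate: {naive_estimate_days} days; "
      f"reference-class estimate: {adjusted:.0f} days")  # 18 days
```

Using the median of past overruns instead of your gut feeling builds the typical underestimation directly into the plan.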

    How Cognitive Biases Might Impact Mental Health

    Understanding cognitive biases helps you recognize when your thoughts are being shaped by mental shortcuts rather than present-moment reality. These biases can quietly fuel anxiety, self-criticism, conflict in relationships, and rigid thinking patterns. When you learn to spot them, you gain space to respond more intentionally instead of reacting on autopilot.

    Put another way, you pause to be present with your thoughts and to examine the “evidence” supporting your beliefs. That pause disrupts the automatic, autopilot process of cognitive bias.

    By recognizing cognitive biases, you can transform your thought patterns and responses.

    Awareness alone can be regulating — especially when you realize that many distressing thoughts are not personal failures, but predictable brain patterns. For example, the negativity bias can skew how you interpret experiences and emotions, often making things feel worse than they are. You can explore this further in my short article, Overcoming the Negativity Bias, a deeper dive into how this bias impacts emotional well-being and ways to balance it out.

    If you prefer video content over reading, you will find numerous videos on my YouTube channel that will guide your efforts by teaching you how to pause and make the “u-turn” inward.

    For additional strategies on recognizing and shifting cognitive biases, visit Early Years TV’s article Overcoming Cognitive Biases.

    Kathy Brodie

    © 2026 Kathy Brodie

    Kathy Brodie is an Early Years Professional, Trainer and Author of multiple books on Early Years Education and Child Development. She is the founder of Early Years TV and the Early Years Summit.


    Early Years TV, Cognitive Biases: Complete Guide to Thinking Errors. Available at: https://www.earlyyears.tv/cognitive-biases-complete-guide/ (Accessed: 4 February 2026).
