Identifying and addressing logical fallacies in your clients is just as important as working with their cognitive distortions. Both are hard work, but they will make you an effective coach. I’ll never forget the message I got from a client about three years into my coaching career.

“I’ve been thinking,” she wrote, “and I don’t think this approach will work for me. I saw Dr. [Instagram influencer] talking about how carbs spike insulin and cause fat storage, and since I’m insulin resistant, I probably need to go keto. Plus, my friend lost 30 pounds doing it, so it clearly works.”

Now, if you’ve been coaching for any length of time, you’ve gotten some version of this message. A client comes to you having been convinced by someone online, someone with credentials or a transformation story, that they need to do something extreme, something that contradicts everything you’ve been working on together.

And if you’re early in your coaching career, you might have responded the way I did back then. I sent her a bunch of research papers. I explained the nuance of insulin’s role in metabolism. I tried to win the argument with evidence and logic.

It didn’t work.

She went keto anyway, lasted about three weeks, binged hard when she couldn’t sustain it, and ghosted me entirely.

It took me a long time to understand what actually happened in that exchange. The problem wasn’t that she didn’t have enough information. The problem was that her reasoning itself was flawed. She’d assembled what looked like a logical argument, but it was built on multiple logical fallacies stacked on top of each other.

Appeal to authority. Post hoc reasoning. Bandwagon effect. All wrapped up in what seemed like a sensible conclusion.

And I missed it entirely because I was arguing against her conclusion instead of addressing the faulty reasoning that produced it.

This is what I want to teach you today. Identifying and addressing logical fallacies in your clients is one of the most important skills you can develop, and one of the least discussed in our field. Because while cognitive distortions get a lot of attention in coaching conversations, and rightly so, logical fallacies are their quieter, more insidious cousin. They don’t announce themselves with emotional language. They don’t feel like errors. They don’t trigger your scepticism alarm. They masquerade as rational thinking. They sound reasonable, even when they’re completely wrong.

And if you can’t spot them and address them, you’ll watch clients make poor decisions while believing they’re being perfectly logical. You’ll see them sabotage their progress toward better body composition, increased strength, improved aerobic fitness, better sleep, lower stress, and ultimately, the health and longevity they’re seeking.

Let’s start with understanding what we’re actually dealing with.

TL;DR

Identifying and addressing logical fallacies in your clients is as crucial as working with cognitive distortions, yet far less discussed. While cognitive distortions are emotionally charged patterns that feel true, logical fallacies masquerade as rational arguments. Your client believes they’ve reasoned correctly when the reasoning itself is flawed. 

These errors stem from evolutionary heuristics that once kept ancestors alive but now mislead us in a world of infinite information and anonymous authorities. 

The most common fallacies fall into clear categories: mistaking correlation for causation, substituting credentials for critical thinking, confusing parts with wholes, cherry-picking evidence, following the crowd, using emotional appeals, creating false choices, and attacking people rather than arguments. 

These aren’t abstract errors; they directly sabotage strength gains, body composition, cardiovascular fitness, sleep, and long-term health by causing program hopping, supplement waste, training inconsistency, and diet cycling. 

The solution isn’t lecturing about logic but using a framework of listening, validating concerns, asking Socratic questions, expanding perspectives, and reframing around evidence and personal fit. When you teach clients to recognise their own fallacies, you’re not just helping them achieve fitness goals, you’re teaching them to think clearly, author their own lives, and develop the metacognitive skills that constitute human flourishing itself.

What Are Logical Fallacies? 

A logical fallacy is an error in reasoning structure. It’s a flaw in the way an argument is constructed, independent of whether the conclusion happens to be true or false.

What makes this tricky is that a logically fallacious argument can still arrive at a true conclusion. And a logically valid argument can arrive at a false conclusion if the premises are wrong. Logic is about the relationship between premises and conclusions, not about truth itself.

Let me give you an example. “All effective coaches have visible abs. I have visible abs. Therefore, I’m an effective coach.” That’s a logical fallacy (affirming the consequent, if you want to get technical), even if the person making that argument might actually be an effective coach. The conclusion could be true even though the reasoning is flawed.

This is why identifying and addressing logical fallacies in your clients is so critical to their success. They feel convincing. They sound like reasoning. Your client isn’t saying “I feel like I should do this.” They’re saying, “Based on this logic, I should do this.” And if you don’t understand the structure of logical fallacies, you can’t help them see where their reasoning broke down. You also need a strong theory of mind for your clients to really tackle these.

Now, how do logical fallacies differ from cognitive distortions? Cognitive distortions are automatic, habitual thought patterns that are usually emotionally charged. “I’m a failure.” “I always mess up.” “Nothing ever works for me.” These feel true in the moment, and they shape how someone experiences reality.

Logical fallacies, by contrast, are presented as rational arguments. They’re the reasons your client gives for their beliefs or decisions. “I should do keto because Dr. X recommends it.” “Carbs must be the problem because I gained weight after eating them.” “Everyone’s doing intermittent fasting, so it must work.”

From a philosophical standpoint, we’re talking about the difference between epistemology (how we know what we know) and psychology (how we experience what we experience). Cognitive distortions are psychological; they’re about subjective experience. Logical fallacies are epistemological; they’re about the structure of reasoning itself.

In practice, they often work together. A cognitive distortion creates an emotional conclusion, and then your client uses a logical fallacy to rationalise it. Or a logical fallacy leads to a false belief, which then triggers cognitive distortions.

For example, someone catastrophises one instance of weight gain after eating carbs (cognitive distortion). Then they use post hoc reasoning to conclude that carbs cause weight gain (logical fallacy). Now they believe carbs are the enemy, and every time they eat carbs, they spiral into all-or-nothing thinking (back to cognitive distortions).

This creates what Aaron Beck called the cognitive triad: negative views of self (“I can’t control myself around carbs”), world (“carbs are dangerous”), and future (“I’ll never achieve my body composition goals”). The fallacy reinforces the distortion; the distortion makes them vulnerable to more fallacies. It’s a feedback loop I call the Fallacy Flywheel, where one error in reasoning sets the wheel spinning, and each revolution makes it spin faster.

From Daniel Kahneman’s System 1 and System 2 perspective, cognitive distortions are largely System 1 processes. They’re fast, automatic, and emotional. Logical fallacies can appear in either system. Sometimes they’re System 1: quick heuristics that feel like logic but aren’t. Sometimes they’re System 2: deliberate reasoning that happens to be structured incorrectly.

This is why logical fallacies are particularly insidious. When someone is in the grip of a cognitive distortion, they usually recognise, at some level, that they’re being emotional. “I know I’m catastrophising, but…” But when someone is using a logical fallacy, they believe they’re being rational. There’s no “but.” They think they’ve reasoned their way to the truth. The person who thinks they’re being logical is often the most dangerous to themselves.

Your job as a world-class coach is to help them see that the reasoning itself is flawed, and to do it in a way that doesn’t make them feel stupid or attacked. Because the moment they get defensive, the conversation is over.

The Evolutionary Roots of Logical Fallacies

Before we dive into specific fallacies and how to handle them, I want you to understand why these thinking errors are so common. Because they’re not evidence of stupidity. They’re evidence of humanity.

Many logical fallacies are rooted in heuristics, which are mental shortcuts that evolved to help our ancestors make quick decisions with limited information. In the ancestral environment, these shortcuts were often adaptive. They kept you alive.

Take the bandwagon fallacy (following the crowd because everyone else is doing it). In a small tribal group, if everyone’s eating certain berries and not dying, those berries are probably safe. The heuristic “do what others are doing” was a fast, effective way to navigate uncertainty. You didn’t need to personally test every food source. You could learn from the collective experience of the tribe.

But this wasn’t just about social learning; it was about coalitional psychology. Humans evolved in groups where group membership was survival. Doing what the tribe does signalled loyalty, maintained social bonds, and ensured you weren’t ostracised. Being cast out meant death.

In the modern world, this same adaptive mechanism leads to people jumping on every diet trend because “everyone’s doing it,” even though the people you’re observing online have completely different genetics, goals, and circumstances than you do. When your client insists on trying keto because their entire small group gym is doing it, they’re not just following dietary advice; they’re signalling membership in a coalition. This is why evidence often fails to change minds: you’re not just arguing with a belief, you’re threatening tribal identity.

Or consider the appeal to authority. In ancestral environments, expertise was real and observable. The person who’d been hunting for decades knew more about tracking animals than the teenager. Listening to experienced elders was adaptive. This connects to what evolutionary psychologists call prestige bias, where we evolved to learn from prestigious individuals because they had demonstrated competence in ways we could directly observe.

But today, we apply this heuristic to Instagram doctors who’ve never met us, don’t know our medical history, and are selling supplements. Prestige has been hijacked by follower counts and physiques. The appeal to authority fallacy exploits this evolved bias, activating ancient learning mechanisms in an environment where apparent prestige bears no relationship to actual expertise.

We’re running stone-age software in a digital world. The heuristics that kept us alive in small tribes with immediate, physical threats now lead us astray in environments with infinite information, anonymous authorities, and abstract health goals that play out over decades.

The negativity bias also has evolutionary roots, as our brains evolved to weight negative information more heavily than positive information because the cost of missing a threat was death, while the cost of missing an opportunity was just a missed meal.

Our ancestors faced asymmetric costs. A false negative (missing a predator) meant death. A false positive (seeing a predator that wasn’t there) meant wasted energy. So we evolved to err on the side of caution, producing both negativity bias and vulnerability to fear-based appeals.

This creates a vulnerability to certain logical fallacies, particularly those involving fear or threat. When someone uses slippery slope reasoning (“If I eat one cookie, I’ll binge all weekend and ruin everything”), they’re tapping into this threat-detection system. The brain is predicting catastrophe from a small trigger, which would have been adaptive if the small trigger was the rustle of a predator in the bushes. But it’s maladaptive when the trigger is a cookie.

When an influencer says “seed oils are toxic and will destroy your health,” they’re triggering error management bias. Better to avoid than risk, even if evidence is weak. This is the same system that kept your ancestors alive, now exploited to sell you tallow.

Understanding this evolutionary context helps you approach logical fallacies with compassion rather than judgment. Your client isn’t being irrational; they’re being human. Their brain is doing what brains do, what brains evolved to do. Your job is to help them override the autopilot with deliberate reasoning when the stakes warrant it.

How Fallacious Thinking Sabotages Physiological Outcomes

Something that most coaches miss when identifying and addressing logical fallacies in their clients is that these aren’t just abstract thinking errors. They directly impair the physiological outcomes your clients are seeking.

Post hoc reasoning about training leads to program hopping. Your client does a new workout, feels sore, and concludes it must be working. Two weeks later, they see no visible changes, so they conclude it’s not working and jump to a different program. This cycle repeats endlessly. The result is that they never accumulate sufficient volume for muscular adaptation. No progressive overload. No strength gains. Minimal muscle growth. Their reasoning error becomes a physiological limitation.

Appeal to authority manifests as wasted money and misplaced faith in supplements and ergogenic aids. Your client follows an influencer who recommends five supplements, pre-workout formulas, and specialised training gear. They invest hundreds of euro monthly while neglecting the fundamentals: consistent sleep, stress management, and proper nutrition. The result is elevated cortisol from chronic stress that they’re not addressing. Disrupted sleep architecture, reduced REM sleep, and more nighttime awakenings. Impaired glucose metabolism. And their body composition and health suffer because of how they allocated their efforts based on fallacious reasoning.

All-or-nothing thinking produces training inconsistency. Your client misses Monday’s workout and concludes the week is ruined, so why bother with Wednesday or Friday? This pattern repeats across months. The result is reduced training frequency, which means insufficient stimulus for aerobic or muscular adaptations. Their VO2 max plateaus or declines. Their resting heart rate remains elevated. They never develop the cardiovascular fitness they’re capable of. They don’t build the level of strength or muscle they wanted. The fallacy becomes a ceiling on their physical capacity.

Bandwagon behaviour around extreme diets triggers a binge-restriction cycle. Your client goes keto because everyone’s doing it, despite it being unsuitable for their preferences and lifestyle. They white-knuckle it for three weeks, then binge hard. This pattern elevates cortisol chronically, disrupts sleep (particularly the deep sleep stages critical for recovery), impairs glucose metabolism through repeated metabolic whiplash, and often results in losing muscle mass during restriction and gaining primarily fat during binges. Their body composition deteriorates while they believe they’re being “logical” by following what works for others.

The hasty generalisation that “breakfast doesn’t work for me” based on one bad experience might lead to chronic undereating early in the day, resulting in afternoon energy crashes, poor workout performance, elevated evening cortisol, late-night overeating, and disrupted sleep-wake cycles. A single logical error cascades into multiple physiological dysfunctions.

These aren’t hypothetical. I’ve watched clients sabotage their cardio improvements, their strength progressions, their body composition changes, their sleep quality, their stress resilience, and ultimately their trajectory toward health and longevity. And it’s not through lack of effort, but rather through fallacious reasoning that misdirected that effort.

The fallacies we’re about to explore aren’t just thinking errors. They’re physiological sabotage. When you become skilled at identifying and addressing logical fallacies in your clients, you’re not just improving their cognition, you’re removing barriers to the physical changes they’re working toward.

The Major Categories of Logical Fallacies You’ll Encounter

Logicians have identified dozens of fallacies, but in health and fitness coaching, you’ll see the same ones over and over. I’ve organised them into categories based on the type of reasoning error they represent.

Fallacies of Causation: Mistaking Correlation for Cause

These are perhaps the most common fallacies in health coaching because our bodies are complex systems where multiple variables change simultaneously. Your client sees two things happen in sequence and assumes one caused the other.

Post hoc ergo propter hoc (“after this, therefore because of this”) is the classic example.

“I ate carbs at dinner and gained two pounds overnight, so carbs make me gain fat.”

The reasoning is: A happened, then B happened, therefore A caused B. But temporal sequence doesn’t establish causation. Your client is ignoring about a dozen confounding variables: water retention from increased glycogen storage, sodium content of the meal, where they are in their menstrual cycle, digestive contents, normal daily fluctuations, stress, sleep quality, and more.

This fallacy is everywhere in health and fitness. “I started taking this supplement and my energy improved, so the supplement works.” Maybe. Or maybe you also started sleeping better, reduced your stress, changed your diet, and began exercising consistently. Any of those could be the actual cause.

From a scientific reasoning perspective, establishing causation requires more than correlation. It requires temporality (cause precedes effect), plausibility (there’s a mechanism by which A could cause B), consistency (the relationship holds across different contexts), strength of association, and ideally, experimental manipulation where you control for confounding variables.

But your client isn’t thinking like a scientist. They’re using a heuristic: things that happen together are related. And in many contexts, that heuristic works well enough. But in the complex system of human metabolism and physiology, it leads to wildly incorrect conclusions.

Your intervention here is to help them see the confounding variables and to introduce the concept of controlled experimentation. “Let’s test this properly. If carbs cause fat gain for you, we should see consistent weight increases every time you eat carbs in controlled amounts, while holding other variables constant. Want to try that and see what the data actually shows?”
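
If you want to show a client what those confounders look like in practice, a toy simulation makes the point. This is a minimal sketch with purely illustrative numbers, not a physiological model: true weight stays completely flat while daily water and food fluctuations move the scale, so any single overnight change is noise, while weekly averages stay honest:

```python
import random

random.seed(42)

# Illustrative only: simulate 14 daily weigh-ins for a person whose true
# weight never changes, but whose scale reading swings with water retention
# (glycogen, sodium, digestive contents, normal daily fluctuation).
TRUE_WEIGHT_LB = 160.0

def daily_scale_weight():
    water_swing = random.uniform(-1.5, 2.0)  # lb of non-fat fluctuation
    return TRUE_WEIGHT_LB + water_swing

weights = [daily_scale_weight() for _ in range(14)]

# The biggest single overnight "gain" looks alarming...
biggest_jump = max(abs(b - a) for a, b in zip(weights, weights[1:]))

# ...but the weekly averages barely move, because nothing real changed.
week1_avg = sum(weights[:7]) / 7
week2_avg = sum(weights[7:]) / 7

print(f"Largest overnight change: {biggest_jump:.1f} lb")
print(f"Week 1 average: {week1_avg:.1f} lb, week 2 average: {week2_avg:.1f} lb")
```

Walking a client through something like this shifts the conversation from “carbs made me gain two pounds” to “what would real fat gain actually look like in the data?”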

This is engineering root cause analysis applied to coaching. When engineers troubleshoot, they don’t just fix symptoms, they use the Five Whys to find root causes. When a client says “keto didn’t work,” you could ask: Why didn’t it work? “I couldn’t sustain it.” Why? “Too restrictive.” Why did you choose it? “Dr. X recommended it.” Why did that matter? “He has credentials.” Why did credentials override fit? “I didn’t know how else to evaluate it.” Now you’ve found the root: lack of evaluative framework. Teaching critical thinking fixes the root problem, not just the symptom.

Cum hoc ergo propter hoc (“with this, therefore because of this”) is the close cousin of post hoc.

“People who eat breakfast tend to be leaner, so eating breakfast causes fat loss.”

Here, the client is observing a correlation (breakfast eaters are leaner) and inferring causation (breakfast causes leanness). But correlation can arise from many relationships:

  • A causes B (breakfast causes leanness)
  • B causes A (being lean influences breakfast habits)
  • C causes both A and B (conscientiousness causes both breakfast eating and overall healthy habits)
  • It’s coincidence (no causal relationship)

The breakfast research is actually a perfect example of this. Early observational studies showed breakfast eaters were leaner, and everyone concluded breakfast was important for weight management. But when researchers actually did controlled experiments where they randomly assigned people to eat or skip breakfast, it turned out breakfast had minimal effect on weight. The real variable was that people who ate breakfast tended to be more health-conscious overall.

Your client isn’t making up the correlation. The correlation is real. But they’re jumping to a causal conclusion that isn’t supported by the evidence.

This relates to what economists call endogeneity: variables related in complex, bidirectional ways that make simple causal interpretations impossible. For example, the economy affects unemployment, but unemployment also affects the economy. Health-conscious behaviour affects breakfast eating, but it also affects a hundred other variables.

This also connects to systems thinking and feedback loops. Your client’s body is not a simple input-output machine. It’s a complex adaptive system with reinforcing and balancing feedback loops. Post hoc reasoning about supplements might create a reinforcing loop: buy supplements → placebo effect → feel better → confirms reasoning → buy more supplements. But there’s also a balancing loop: spend money → financial stress → worse outcomes → disconfirms reasoning → reduce purchases. The system has its own intelligence that simple linear reasoning misses.

Your job is to help clients think in terms of systems, not simple chains. “Yes, breakfast eaters tend to be leaner. But that might be because the kind of person who prioritises breakfast also prioritises sleep, stress management, and overall nutrition. Breakfast might just be a marker of health consciousness, not a cause of leanness.”
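
You can make the “C causes both A and B” pattern concrete with a toy simulation. This is a minimal sketch with made-up numbers: a hidden health-consciousness trait raises both the odds of eating breakfast and leanness, while breakfast itself has zero causal effect, yet the correlation still shows up in the data:

```python
import random

random.seed(0)

# Illustrative only: a hidden trait (health-consciousness) drives BOTH
# breakfast eating and leanness. Breakfast itself does nothing causally here.
people = []
for _ in range(1000):
    conscientious = random.random() < 0.5
    # Conscientious people are more likely to eat breakfast...
    eats_breakfast = random.random() < (0.8 if conscientious else 0.3)
    # ...and are leaner overall, for reasons unrelated to breakfast.
    body_fat = random.gauss(22 if conscientious else 30, 3)
    people.append((eats_breakfast, body_fat))

eaters = [fat for eats, fat in people if eats]
skippers = [fat for eats, fat in people if not eats]
avg_eaters = sum(eaters) / len(eaters)
avg_skippers = sum(skippers) / len(skippers)

# Breakfast eaters come out leaner, despite breakfast causing nothing.
print(f"Breakfast eaters average {avg_eaters:.1f}% body fat")
print(f"Breakfast skippers average {avg_skippers:.1f}% body fat")
```

The correlation in the output is real, just like in the observational breakfast studies; the causal story behind it is entirely the hidden variable.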

Fallacies of Authority: Mistaking Credentials for Truth

These fallacies are rampant in the age of social media fitness. Everyone has a platform, credentials are easy to display (and fake), and appearance is often mistaken for expertise.

Appeal to authority is the most straightforward.

“Dr. X on Instagram says I need to cut all seed oils, so I should do that.”

The client is outsourcing their critical thinking to someone with perceived expertise. And look, expertise matters. I’m not arguing for ignoring experts. But expertise in one domain doesn’t transfer to all domains, credentials don’t guarantee correctness, and even legitimate experts can be wrong, biased, or giving advice that’s inappropriate for a specific individual.

From an epistemological standpoint, we need to understand different types of authority. There’s epistemic authority (someone who genuinely knows more than you about a topic) and deontic authority (someone with social or institutional power to make decisions). The problem is when we confuse the two, or when we treat epistemic authority as absolute rather than provisional.

A PhD in biochemistry doesn’t automatically make someone an expert in practical nutrition coaching. A champion bodybuilder might know their own body incredibly well, but have no idea what works for an overweight beginner. An Instagram doctor with a million followers might be repeating conclusions from low-quality studies while ignoring the broader evidence base.

This connects to the principal-agent problem. Instagram influencers (agents) have different incentives than their audience (principals). The influencer is maximising engagement and sales, not client outcomes. When you’re identifying and addressing logical fallacies in your clients, you need to help them see these misaligned incentives. The person giving advice might not be serving their best interests, even if that person has impressive credentials.

Your job is to help clients evaluate authority critically:

  • What’s this person’s actual expertise? Are they trained in this specific area?
  • What’s their evidence? Are they citing quality research or cherry-picking studies?
  • Do they acknowledge uncertainty and nuance, or do they speak in absolutes?
  • Does their advice fit your specific context and goals?
  • Do other experts in the field agree, or is this a fringe position?
  • What are their incentives? What are they selling?

Notice that you’re not dismissing the expert. You’re teaching the client to think critically about expertise itself.

This is also where the Stoic concept of the dichotomy of control becomes powerful. Epictetus distinguished between what’s “up to us” (our judgments, our reasoning) and what’s “not up to us” (others’ opinions, outcomes, our genetics). When your client uses appeal to authority (“Dr. X says…”), they’re surrendering what’s up to them (their judgment) to what’s not up to them (someone else’s opinion). Part of identifying and addressing logical fallacies in your clients is helping them reclaim their judgment, their agency. As Epictetus wrote: “What upsets people is not things themselves but their judgments about things.” We need clarity in judgment, and agency over that judgment.

Appeal to accomplishment is a variant where the authority comes from achievement rather than credentials.

“This competitor won their pro card doing this protocol, so it must be the best approach.”

The accomplishment is real. The person achieved something impressive. But that doesn’t mean their specific approach is optimal, necessary, or applicable to your client. There’s a concept in research called survivorship bias, where we only see the people who succeeded, not the countless people who tried the same approach and failed.

Maybe the competitor won despite their protocol, not because of it. Maybe they have genetic gifts that allowed them to succeed with a suboptimal approach. Maybe they’re lying about what they actually did (this is more common than anyone wants to admit). Maybe their protocol would work for a tiny subset of people, but fails for most.

This fallacy is particularly powerful in fitness culture because we can see the results. The person looks the way your client wants to look. So their approach must be the right one. But correlation is not causation, and survival is not evidence.

From a statistical perspective, this is sampling bias. You’re drawing conclusions from a non-representative sample (successful people) and ignoring all the failures (people who tried and failed).

Your intervention: “Yes, this approach worked for them. The question is whether it will work for you, given your different genetics, recovery capacity, lifestyle, and preferences. Let’s design something based on what the broader evidence shows works for most people, customised to your specific context.”

Appeal to celebrity or appearance takes this a step further.

“This influencer has amazing abs, so I should follow their diet advice.”

Physical appearance is treated as evidence of expertise. But appearance can result from genetics, photo editing, performance-enhancing drugs, disordered eating, or legitimate expertise. You can’t tell which by looking.

Moreover, even if their appearance is the result of their approach, that doesn’t mean the approach is optimal, sustainable, or appropriate for someone else. The incentives of social media (engagement, attention, product sales) often favour extreme approaches and dramatic transformations over sustainable, evidence-based coaching.

Social media is performance. What you see is a carefully curated highlight reel, not reality. The person with the perfect physique might be miserable, might have achieved it through unsustainable means, might be selling you something that doesn’t actually work. This is Goffman’s dramaturgical theory in action. 

Your job is to help clients separate appearance from expertise, and to question the incentive structures behind online advice. “That person looks great. But what do we actually know about how they got there, whether it’s sustainable for them, whether it would work for you, and whether they’re being honest about their methods?”

There’s a historical parallel here worth noting. In 1847, Ignaz Semmelweis discovered that handwashing prevented childbed fever. But doctors rejected his findings for decades. Why? Because accepting them would mean admitting they’d been killing patients. So they used multiple fallacies: appeal to tradition (“we’ve never washed hands”), appeal to authority (“prestigious doctors don’t wash hands”), and ad hominem attacks (Semmelweis was eventually committed to an asylum). Your clients do the same; they reject evidence that threatens their identity or contradicts their invested beliefs. Understanding this resistance is crucial when identifying and addressing logical fallacies in your clients.

Fallacies of Composition: Mistaking Part for Whole

These fallacies involve errors in how we think about relationships between parts and wholes, between individual cases and general rules.

Hasty generalisation draws broad conclusions from limited data.

“I tried eating breakfast once and felt terrible, so breakfast doesn’t work for me.”

One data point becomes a universal rule. From a statistical perspective, this is a sample size of n=1, which is nowhere near sufficient to draw reliable conclusions. You need multiple trials across different conditions to establish a pattern.

But beyond sample size, there’s the question of confounding variables. What did they eat for breakfast? How much? What time? After how much sleep? With what stress level? Were they adapted to eating breakfast, or was their body unaccustomed to morning food?

The human brain loves to find patterns, and it’s surprisingly good at seeing patterns even in random noise. This is called apophenia, and it’s adaptive in many contexts (better to see a pattern that isn’t there than to miss one that is). But in the complex system of nutrition and physiology, it leads to false conclusions based on insufficient evidence.

Your intervention is to slow down the jump from data to conclusion. “You tried breakfast once under specific conditions and had a specific experience. That’s one data point. Let’s gather more. What if we try different types of breakfast, at different timings, over several days? Then we can see if there’s actually a pattern or if that one experience was influenced by other factors.”

You’re teaching them to think like a scientist: hypothesis, multiple experiments, data collection, pattern recognition, tentative conclusion. Not a hypothesis, one trial, and an absolute conclusion.
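
A toy simulation can make the n=1 problem vivid. This is a minimal sketch with made-up numbers: a genuinely positive average effect that is noisy from day to day, so any single trial can easily look negative:

```python
import random

random.seed(1)

# Illustrative only: a client's "energy score" after breakfast is genuinely
# positive on average (+1.0), but noisy day to day (sd = 3.0). One trial
# can easily come out negative; averaging many trials reveals the pattern.
TRUE_EFFECT = 1.0

def one_trial():
    return random.gauss(TRUE_EFFECT, 3.0)

single = one_trial()
many = [one_trial() for _ in range(30)]
avg = sum(many) / len(many)

print(f"One trial: {single:+.1f}")
print(f"Average of 30 trials: {avg:+.1f} (close to the true +1.0)")
```

One bad morning is a roll of the dice; thirty mornings is a pattern.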

Composition fallacy assumes what’s true for parts is true for the whole.

“Protein is good for building muscle, so I should eat as much protein as possible.”

Yes, adequate protein is necessary for muscle growth. But more is not always better. At some point, additional protein provides no additional benefit and might even cause problems (displacement of other nutrients, digestive issues, kidney stress in vulnerable populations, unnecessary expense).

This is a failure to understand dose-response relationships and the concept of diminishing returns. In economics, this is called the law of diminishing marginal utility. The first unit of something provides a lot of value, but each additional unit provides less value than the one before, until eventually additional units provide no value or even negative value.

From a physiological perspective, your body can only synthesise muscle at a certain rate, which is limited by factors beyond protein availability (training stimulus, recovery, hormones, genetics). Once you’ve met your protein needs, eating more doesn’t overcome those other limiting factors. Your client might be optimising one variable (protein) while the real constraint is sleep quality, training volume, or stress management.

Your job is to help clients think in terms of optimal ranges rather than “more is better.” “Protein is important, and there’s a range that supports your goals. More than that range doesn’t give you additional benefit. Let’s find your optimal intake based on your body weight, training stimulus, and overall calorie intake.”

Division fallacy is the opposite: assuming what’s true for the whole is true for the parts.

“This diet works for fat loss overall, so every single meal needs to fit the diet perfectly.”

The diet might work on average across a week or month, but that doesn’t mean every individual meal must be perfect. You could have high-calorie days and low-calorie days and still achieve the same average: five days at 1,800 calories and two days at 2,500 still average out to 2,000 a day. You could get most of your protein at dinner and still hit your daily total.

This fallacy often underlies all-or-nothing thinking. The client believes that because overall consistency matters, every single moment must be perfectly consistent. But that’s not how systems work. Overall patterns can be robust to individual variations.

From a systems theory perspective, this is about understanding emergence. The properties of a system often emerge from the interactions of components over time, not from the properties of any single component. Your health emerges from patterns of eating, moving, sleeping, and managing stress across weeks and months, not from the perfection of any single meal or workout.

Your intervention: “The plan works because of what happens on average over time, not because every meal is perfect. Missing your protein target at lunch doesn’t matter if you hit it at dinner. Having a high-calorie day doesn’t matter if your weekly average is on target. Let’s focus on the overall pattern, not perfect execution of every moment.”

This connects to what James Clear, borrowing from Dave Brailsford’s work with British Cycling, calls “the aggregation of marginal gains”, where small, consistent actions compound into remarkable results. But the division fallacy causes clients to miss this by demanding perfection in each part rather than consistency in the whole.

Fallacies of Evidence: Cherry-Picking and Confirmation Bias

These fallacies involve the selective use of evidence to support a predetermined conclusion.

Cherry-picking is selecting only the evidence that supports your position while ignoring evidence that doesn’t.

“I read a study showing keto improved insulin sensitivity, so keto is the answer for insulin resistance.”

There might be a study showing that. But what about the dozens of other studies looking at different dietary approaches for insulin resistance? What about the studies showing that keto didn’t outperform other diets? What about the individual variation in response to keto? You can find a study to support almost any position in nutrition. Studies vary in quality, population, methods, and funding sources. Cherry-picking studies is easy. Evaluating the totality of evidence is hard.

This is where we get into evidence hierarchies. A single observational study is weaker evidence than a randomised controlled trial. A single RCT is weaker than a meta-analysis of multiple RCTs. And even strong evidence for population averages might not apply to a specific individual.

Your job is to help clients think in terms of weight of evidence, not existence of evidence. “Yes, there’s evidence that keto can help with insulin resistance for some people. There’s also evidence that Mediterranean diets, low-fat diets, and calorie restriction in general improve insulin sensitivity. The question isn’t whether there’s a study supporting keto, it’s what the broader evidence shows, and what’s most likely to work for you specifically, given your preferences and lifestyle.”

Confirmation bias takes this a step further. It’s the tendency to seek out, interpret, and remember information that confirms existing beliefs while ignoring or dismissing information that contradicts them.

“I believe carbs are the problem, so I notice every time I eat carbs and feel bad, but I don’t notice the times I eat carbs and feel fine.”

This isn’t conscious dishonesty. It’s a cognitive bias that operates largely outside awareness. Once someone has formed a belief, their brain becomes a filter that amplifies confirming evidence and minimises disconfirming evidence.

This relates to the concept of predictive processing. Your brain is constantly making predictions about what will happen, and it pays more attention to prediction errors (when reality doesn’t match expectation) than to confirmations (when reality matches expectation). But the predictions themselves are shaped by existing beliefs, creating a self-reinforcing loop.

If you believe carbs are bad, you predict you’ll feel bad after eating carbs. When you do feel bad (whether from carbs or from stress, poor sleep, or anything else), it confirms your prediction and the memory is encoded strongly. When you don’t feel bad, it’s not salient, so it doesn’t update your belief. Your brain is literally filtering reality through your beliefs.

This connects to what Jonathan Haidt calls the rider and the elephant. The elephant (emotion, intuition, heuristics) is strong and fast. The rider (rational thought) is weak and slow. The elephant decides where to go, and the rider’s job is to justify the elephant’s decision after the fact. Confirmation bias is often the rider justifying where the elephant already wanted to go. When identifying and addressing logical fallacies in your clients, you’re not just correcting logic, you’re helping the rider steer the elephant.

Your intervention is to help clients test their beliefs systematically rather than relying on their biased memory. “You believe carbs make you feel bad. Let’s track this properly. For two weeks, record what you eat and how you feel, without trying to avoid carbs. Then we’ll look at the data together and see if there’s actually a consistent pattern or if your belief is based on selective memory.”

You’re introducing the scientific method as a tool for overcoming bias. This is what elite athletes do with film study; they review game footage to spot technical errors invisible in real-time. You’re teaching clients to study their “reasoning film,” reviewing decisions to spot logical errors invisible in the moment.

Fallacies of Popularity: Following the Crowd

These fallacies mistake consensus, popularity, or tradition for truth.

Bandwagon fallacy (argumentum ad populum) argues that something is true or good because many people believe it or do it.

“Everyone’s doing intermittent fasting, so it must work.”

Popularity is not evidence of effectiveness. At various points in history, consensus held that the sun revolved around the earth, that bloodletting cured disease, and that smoking was harmless. Consensus can be wrong, especially when that consensus is driven by trends, marketing, and social pressure rather than evidence. The crowd is often wrong but never uncertain.

Even intelligent people fall victim to this fallacy when they conflate consensus with truth. You’ll hear people say “the science is settled” or “there’s scientific consensus” as if agreement among scientists makes something true. But science doesn’t work by consensus, it works by falsification. A scientific claim gains credibility not because many people agree with it, but because it has survived repeated attempts to prove it wrong. Karl Popper argued that scientific theories are never “proven true”, they’re simply not yet disproven. The moment we treat consensus as the arbiter of truth rather than as a provisional state of current evidence, we’ve stopped doing science and started doing sociology.

This matters in coaching because clients often justify decisions by pointing to what “most experts agree on” or what’s “widely accepted in the field.” But scientific consensus has been catastrophically wrong many times, from ulcers being caused by stress (it turned out to be the bacterium Helicobacter pylori) to dietary fat causing heart disease (turned out to be far more complex). What matters isn’t how many people believe something, but whether the evidence supporting it is robust and whether it applies to the specific individual in front of you.

From an evolutionary perspective, as I mentioned earlier, following the crowd was often adaptive in ancestral environments. Social learning is efficient. But the mechanism that evolved to help you learn from your tribe now gets hijacked by influencers with millions of followers, none of whom you actually know. The ancient heuristic “do what successful tribe members do” made sense when you could observe those people directly and verify their success. Now it’s being triggered by carefully curated Instagram posts from strangers whose actual lives, genetics, and methods you know nothing about.

From a sociological perspective, this is social proof and conformity pressure in action. We’re social animals, and we look to others for cues about what’s normal, safe, or desirable. But in modern contexts, this leads to herd behaviour that’s disconnected from actual outcomes. The mechanism worked when “everyone” meant your 50-person tribe. It breaks down when “everyone” means an algorithmically selected feed of people who look like they’re thriving but might be miserable, lying, or genetically exceptional.

This is compounded by homophily, the tendency for people to cluster with similar others, creating echo chambers where bandwagon fallacies go unchallenged. Everyone in the keto Facebook group does keto, not because it’s optimal, but because of selection effects. The group has pre-filtered for people who already believe in keto. Your client sees universal agreement and concludes it must be true, missing that they’re in a self-reinforcing bubble. It’s survivorship bias in social form; they’re not seeing all the people who tried keto and quit, or who never joined the group because it didn’t work for them.

Your intervention is to help clients distinguish between popularity and appropriateness. “Intermittent fasting works for some people. The question isn’t whether it’s popular or whether there’s consensus about it, it’s whether it fits your lifestyle, preferences, and goals. Does the evidence suggest it would work for someone with your biology, your schedule, your food preferences, your relationship with hunger? Let’s evaluate it based on fit for you specifically, not whether it’s trending or what percentage of your social circle is doing it.”

Appeal to tradition argues that something is good or true because it’s been done for a long time.

“Humans have been eating three meals a day for generations, so that’s the natural way to eat.”

Actually, meal frequency has varied enormously across cultures and time periods, but that’s beside the point. The fallacy is assuming that longevity of a practice automatically establishes its optimality for your specific context and goals.

Now, let me be clear: traditions aren’t worthless. There’s wisdom in what Nassim Taleb calls the “Lindy effect”. This is the idea that the longer something has survived, the longer it’s likely to continue surviving. Things that have stood the test of time have often done so because they solved real problems in ways that alternatives didn’t. Traditional practices have been stress-tested across generations, and that deserves respect.

G.K. Chesterton captured this beautifully with the principle now known as “Chesterton’s Fence”: before you remove a fence, you should understand why it was put there in the first place. If you see a fence across a road and don’t know why it’s there, the wise thing to do is not to tear it down until you’ve figured out its purpose. The person who built it wasn’t necessarily stupid, they likely had a reason you haven’t yet discovered.

Similarly, a line often attributed to the composer Gustav Mahler makes the point: “Tradition is not the worship of ashes, but the preservation of fire.” The point isn’t to mindlessly maintain old practices, it’s to understand what animated them, what problem they solved, what “fire” they were trying to keep alive. Then you can ask whether that fire still needs tending, or whether modern conditions have made it obsolete.

Applied to health coaching: three meals a day might have emerged because it matched agricultural work schedules, social gathering patterns, and insulin response in populations eating mostly whole foods. That doesn’t make it optimal for a sedentary office worker eating a modern diet, but it also doesn’t make it arbitrary. There might be wisdom encoded in the tradition that we’d be foolish to dismiss without understanding.

Traditions persist for many reasons: cultural inertia, convenience, social bonding, religious significance, etc. But some persist because they genuinely worked; they solved coordination problems, they matched human physiology reasonably well, they created sustainable patterns. The challenge is distinguishing between traditions that encode hard-won wisdom and traditions that are merely historical accidents or adaptations to constraints that no longer exist.

However, many traditional practices were adaptations to specific environmental constraints. Fasting traditions in hot climates or during food scarcity made sense. Feast days after harvests made sense. But these were solutions to problems we may not face anymore. Other traditions were arbitrary cultural developments that spread through imitation rather than because they solved problems better than alternatives.

Your job is to separate tradition from evidence, while respecting that long-standing practices might contain wisdom you haven’t yet recognised. “Yes, three meals a day is traditional in many cultures, and that tradition probably emerged for reasons like social coordination, matching energy needs to work patterns, or cultural bonding around shared meals. Those reasons might still be valid for you. But tradition alone doesn’t tell us if it’s optimal for your specific goals, your metabolism, your schedule. Let’s look at what meal frequency actually does physiologically, what the tradition was trying to accomplish, and what would work best for your life. We can honour the wisdom in the tradition while adapting it to your context.”

The key is avoiding both extremes: mindless adherence to tradition (appeal to tradition fallacy) and thoughtless rejection of tradition (assuming everything old is obsolete). The wise approach is to ask: What problem was this tradition solving? Does that problem still exist for my client? If so, is this still the best solution, or do we have better options now? This is Chesterton’s fence applied to coaching: understand before you dismantle.

Fallacies of Emotion: Confusing Feeling with Reasoning

These fallacies use emotional appeals to bypass rational evaluation.

Appeal to fear uses the threat of negative consequences to persuade, often exaggerating the risk.

“Seed oils are toxic and will destroy your health.”

Notice the catastrophic language. “Toxic.” “Destroy.” This isn’t a measured assessment of risk, it’s an emotional appeal designed to trigger your threat-detection system.

Now, there might be legitimate questions about seed oil consumption in high amounts, about processing methods, about omega-6 to omega-3 ratios. But those nuanced conversations get drowned out by fear-mongering that treats seed oils like poison.

From a risk assessment perspective, dose matters. Context matters. The difference between “this substance can cause harm at high doses” and “this substance is toxic” is enormous. Water can kill you if you drink enough of it. That doesn’t make water toxic in any meaningful sense.

The appeal to fear works because our brains are wired to overweight threats. Negativity bias again. A potential threat captures attention and drives behaviour in ways that a potential benefit doesn’t. This is error management theory playing out: better to avoid a non-threat than risk a real threat, even when the probability of actual harm is low.

Your intervention is to restore proportion and perspective. “The evidence on seed oils is mixed and highly dose-dependent. Some concerns are legitimate. But the catastrophic framing isn’t supported by the research. Let’s look at what the evidence actually shows about realistic amounts of seed oil in the context of an overall diet.”

Appeal to nature argues that natural is inherently better than artificial or processed.

“Processed foods are bad because they’re not natural. We should only eat natural foods.”

This fallacy assumes a moral quality to “natural” that isn’t justified. Nature has given us poisonous mushrooms, cancer, and countless diseases. Processing has given us refrigeration, pasteurisation, and the ability to preserve foods safely. Natural is not synonymous with good, and processed is not synonymous with bad.

From an evolutionary perspective, yes, our bodies evolved eating whole foods in their natural state. But our bodies also evolved with high infant mortality, short lifespans, and constant parasitic infections. Evolution optimised for reproduction, not for longevity or quality of life.

Moreover, everything we eat has been modified from its “natural” state through selective breeding over thousands of years. Broccoli, cabbage, and kale are all the same species, cultivated into different forms. Bananas and corn look nothing like their wild ancestors. “Natural” is already a construct.

Your job is to help clients think in terms of what actually matters: nutrient density, satiety, sustainability, enjoyment, and how foods fit into the overall diet. “Some processed foods are nutrient-poor and hyper-palatable in ways that promote overconsumption. Others are convenient, affordable, and perfectly healthy. Let’s evaluate foods based on their actual nutritional properties and how they serve your goals, not whether they’re ‘natural’.”

Fallacies of False Choice: Limiting the Options

These fallacies artificially constrain the option space to make one choice seem inevitable.

False dilemma presents only two options when more exist.

“Either I’m perfect with my diet or I’ve failed.”

This is the logical structure underlying all-or-nothing thinking. The client has artificially reduced the option space to two extremes, perfect adherence or total failure, ignoring the entire spectrum of possibilities in between.

From a logic perspective, this is a failure of exhaustive reasoning. When analysing options, you need to consider all possibilities, not just the extremes. But our brains naturally think in binaries (safe/dangerous, friend/foe) because binary categorisation is fast and was often good enough in ancestral environments.

In modern health contexts, this binary thinking is catastrophically inappropriate. Health is not binary. Adherence is not binary. Progress is not binary. There are infinite gradations between perfect and failure, and most of life is lived in those gradations.

Your intervention is to expand the option space. “You’re presenting two options: perfect adherence or failure. But what about following your plan 80% of the time? What about having flexibility on weekends while being consistent on weekdays? What about adjusting your plan to fit your life rather than forcing your life to fit the plan? All of those are options that aren’t on your current menu.”

Perfectionist fallacy is a variant where the client rejects any solution that isn’t perfect.

“If I can’t do the full hour workout, there’s no point doing anything.”

The client is treating partial solutions as worthless. But partial solutions often compound over time. A 20-minute workout is not as good as a 60-minute workout, but it’s infinitely better than zero minutes. And if the choice is between 20 minutes consistently or 60 minutes sporadically (followed by guilt and avoidance), 20 minutes wins.

This connects to what we discussed earlier with the aggregation of marginal gains. Small, consistent actions compound into remarkable results. But the perfectionist fallacy dismisses those small actions as not worth doing.

From a behavioural economics perspective, this is also about loss aversion and mental accounting. The client is framing the 20-minute workout as a loss (40 minutes less than planned) rather than a gain (20 minutes more than nothing). The frame determines whether they take action or not.

Your job is to reframe partial solutions as valuable. “A 20-minute workout isn’t a failure. It’s 20 minutes of stimulus, which is enough to maintain fitness, improve mood, and build the habit of showing up. Perfect is the enemy of good. Consistent imperfection beats sporadic perfection every time.”

This has direct implications for health outcomes. The client who rejects partial solutions ends up doing nothing, which means no improvement in cardiovascular fitness, no strength gains, elevated stress levels, poor sleep quality, and a trajectory away from health and longevity. The fallacy doesn’t just limit their thinking, it actually limits their lifespan.

Fallacies of Irrelevance: Attacking the Person, Not the Argument

These fallacies shift attention from the argument to something irrelevant.

Ad hominem attacks the person making the argument rather than the argument itself.

“You can’t give me nutrition advice, you’re not even lean.”

The client is dismissing your expertise based on your appearance rather than evaluating the quality of your advice. This is a personal attack dressed up as reasoning.

From a logical perspective, the validity of an argument is completely independent of who’s making it. A good argument is good regardless of whether it comes from someone with abs or someone without. A bad argument is bad even if it comes from someone who looks like a fitness model.

This fallacy often masks something deeper: insecurity, resistance to change, or discomfort with authority. The client doesn’t want to do what you’re suggesting, so they find a reason to dismiss you rather than engage with the substance.

This is also related to the halo effect, the cognitive bias where we assume that people who are good at one thing (looking fit) must be good at everything related (coaching, nutrition science). The inverse is the horn effect, where one negative attribute contaminates our view of everything else.

Your response depends on context. Sometimes you address it directly: “My job isn’t to have the physique you want. My job is to help you build it. Those are different skill sets. What specifically concerns you about this approach?” Sometimes you redirect: “Let’s set aside my appearance and focus on the evidence. Here’s why this approach is supported by research and is likely to work for your goals.”

Tu quoque (“you too”) is a specific type of ad hominem that points out hypocrisy.

“You tell me to track my food, but I know you don’t track yours.”

Even if the accusation is true, even if you don’t track your own food, that doesn’t make your advice wrong. The validity of “tracking can be useful for some people in some contexts” is independent of whether you personally track.

This fallacy is particularly common when people feel defensive about a recommendation that requires effort or change. “You’re asking me to do something you don’t do, so you must be wrong” is easier than “this might be a good idea, but I don’t want to do it.”

Your response: “You’re right, I don’t track my food anymore because I’ve internalised portion awareness and know my eating patterns well enough that tracking doesn’t add value for me. But when I was learning those skills, tracking was incredibly valuable. And given where you are in your journey, it would give you data and awareness you don’t currently have. The question isn’t whether I track now, it’s whether tracking would help you achieve your goals.”

Genetic fallacy judges something based on its origin rather than its current merit.

“Intermittent fasting was popularised by bodybuilders, so it’s only for bodybuilders.”

Or: “That advice comes from a supplement company, so it must be wrong.”

The origin of an idea doesn’t determine its validity. Ideas can outlive their origins and find applications beyond their initial context. And yes, supplement companies often make dubious claims, but that doesn’t mean every piece of advice from someone affiliated with supplements is wrong.

From a philosophical perspective, this is about separating the genealogy of an idea (where it came from) from its justification (whether it’s true or useful). Both matter, but they’re different questions.

Your job is to evaluate ideas on their merits. “Yes, intermittent fasting was popularised in bodybuilding circles. But that doesn’t mean it only works for bodybuilders. The question is whether the approach has merit for your goals and whether it fits your lifestyle. Let’s evaluate it based on that, not based on where the idea came from.”


The Neuroscience and Psychology of Why Fallacies Persist

At this point, you might be wondering: if these fallacies are so obviously flawed, why do people keep using them? Why is identifying and addressing logical fallacies in your clients so challenging?

The answer lies in how our brains actually work versus how we think they work.

We like to believe we’re rational beings who occasionally make emotional decisions. But the neuroscience suggests we’re emotional beings who occasionally engage in rational thought. The prefrontal cortex, the part of the brain responsible for logical reasoning, is a late evolutionary development. It’s powerful but slow and metabolically expensive.

The limbic system, responsible for emotions and quick heuristic judgments, is older, faster, and runs on autopilot most of the time. This is Kahneman’s System 1 again: fast, automatic, pattern-matching, good enough.

Logical reasoning, System 2, has to be actively engaged. And we only engage it when we have the cognitive resources available (not stressed, not depleted, not overwhelmed) and when we have motivation to think carefully (the stakes are high, we’re interested, or someone is pushing us to examine our reasoning).

Most of the time, we’re running on System 1. And System 1 is vulnerable to all these fallacies because they’re based on heuristics that usually work well enough.

From a neurological perspective, engaging System 2 to override System 1 requires what’s called cognitive control or executive function. This is mediated by the prefrontal cortex, particularly the dorsolateral prefrontal cortex. But cognitive control is a limited resource. It can be depleted by stress, lack of sleep, decision fatigue, and emotional arousal.

This is why your clients are more vulnerable to logical fallacies when they’re stressed, tired, or emotional. Their capacity for careful reasoning is reduced, so they fall back on heuristics that feel like logic but aren’t. You can predict when a client will be most susceptible to fallacious reasoning (e.g. Friday evening after a stressful work week, during periods of sleep deprivation, when facing major life transitions). Their ventromedial prefrontal cortex, involved in value-based decision making, is depleted. They shift to habitual, heuristic-based choices.

There’s also the concept of motivated reasoning to take into account. We don’t reason objectively to discover truth. We reason to support conclusions we want to reach, often for emotional or social reasons. This is why confirmation bias is so hard to overcome. The brain isn’t a neutral computer processing information. It’s an advocate building a case for what we already believe.

Jonathan Haidt’s metaphor of the rider and the elephant, which we met earlier, captures this: the elephant decides where to go, and the weaker, slower rider justifies the decision after the fact. Logical fallacies are often the rider’s justifications for where the elephant already wanted to go.

There’s also neural entrenchment to consider. Each time a client rehearses fallacious reasoning, they strengthen those neural pathways. This is Hebbian learning: neurons that fire together wire together. This explains why fallacies are so hard to dislodge, as they’re carved into brain structure through repetition. When your client has been telling themselves “carbs are bad” for five years, that belief has strong neural representation. Challenging it isn’t just a logical exercise, it’s asking the brain to rewire itself.

And there’s a neurochemical dimension: the brain’s reward system craves certainty. Dopamine is released when we resolve uncertainty or confirm predictions. Logical fallacies often provide false certainty (“keto is THE answer”). When you challenge them, you create uncertainty, which is dopaminergically aversive. This triggers approach-avoidance conflict in the nucleus accumbens. Your client literally experiences your correction as uncomfortable at a neurochemical level. They resist not because they’re stubborn, but because you’re triggering an aversive brain state.

What this means for coaching is that you can’t just correct logical fallacies with logic. You have to address the motivation underneath. Why does your client want to believe that carbs are the problem? What emotional need is that belief serving? What social identity is it tied to? What does it protect them from having to confront?

Some beliefs are what I call “load-bearing pillars”, where if removed, the whole structure collapses. “Carbs are the enemy” might support “keto is my identity,” “my tribe is keto people,” “my Instagram content is keto-focused,” “my sense of control comes from restricting carbs.” Challenging the fallacy threatens the entire edifice. You’re not just correcting a thinking error, you’re threatening their identity, their community, their sense of control, and their social capital.

Only once you understand the psychological function of the fallacy can you effectively address it. This is where identifying and addressing logical fallacies in your clients becomes as much art as science.

How to Address Logical Fallacies in Coaching Conversations

Alright, so you’ve spotted a logical fallacy. Your client has just told you they need to go keto because Dr. Instagram said so, and their friend lost weight doing it, and anyway, everyone knows carbs are bad.

That’s at least three fallacies stacked together: appeal to authority, post hoc reasoning, and bandwagon. What do you do?

Here’s what you don’t do: you don’t say “that’s a logical fallacy” and expect them to immediately change their mind. That’s condescending, it triggers defensiveness, and it turns the conversation into a debate rather than a collaboration.

Nobody likes the “well, actually” guy who goes through life pointing out everyone’s reasoning errors like some modern-day Underground Man, Dostoevsky’s narrator in Notes from Underground. He was so consumed with rational analysis and exposing others’ contradictions that he became paralysed, bitter, and incapable of genuine human connection. He was technically correct about many things, but his compulsive need to correct others made him insufferable and ineffective. He confused being right with being helpful.

Don’t be that coach. Your job isn’t to win arguments about logic. It’s to help people change.

Instead, here’s the framework I use:

1. Listen First, Diagnose Second

Let them fully express their reasoning before you respond. People need to feel heard before they can hear you. And as they talk, you’re listening for the structure of the fallacy, not just the conclusion.

“Tell me more about why you think keto is the answer for you.”

As they explain, they’ll reveal the fallacies themselves. “Well, Dr. X says it’s best for insulin resistance, and my friend lost 30 pounds, and it seems like everyone’s doing it now.”

Now you know what you’re working with.

2. Validate the Underlying Concern

There’s usually a legitimate concern underneath the fallacious reasoning. Find it and validate it.

“I hear that you’re worried about insulin resistance, and you want an approach that’s been proven to work. Those are completely reasonable concerns.”

You’re not validating the fallacy. You’re validating the need the fallacy is trying to meet. This lowers defensiveness and builds alliance.

This connects to Carl Rogers’ person-centred therapy and the concept of unconditional positive regard. You’re separating the person from their reasoning. You respect them even when their logic is flawed. This creates the psychological safety necessary for them to examine their thinking without feeling attacked.

3. Gently Question the Reasoning

Now you can start asking questions that reveal the flaw without directly attacking it. This is the Socratic method in action.

“You mentioned Dr. X recommends keto. What does Dr. X know about your specific situation? Have they worked with people with your history, your preferences, and your constraints?”

“Your friend had great results with keto. Do you know what her life was like? Her genetics? Whether she’s been able to maintain those results?”

“You said everyone’s doing keto. But does everyone mean it’s right for everyone? Or just that it’s popular right now?”

These questions invite reflection. You’re not telling them they’re wrong. You’re creating space for them to see the gaps in their reasoning.

This is also cognitive defusion from Acceptance and Commitment Therapy (ACT). You’re helping the client create distance from their thoughts so they can evaluate them. “I’m having the thought that carbs are bad” versus “Carbs are bad.” When you help a client spot a logical fallacy, you’re facilitating defusion. You’re helping them see that their reasoning is a construct they’ve assembled, not reality itself. This builds psychological flexibility: the ability to choose actions aligned with their values rather than dictated by fusion with their thoughts.

4. Offer Alternative Explanations

Once you’ve created some doubt about the initial reasoning, offer alternative ways to think about the issue.

“There are multiple dietary approaches that improve insulin sensitivity: keto, low-fat, Mediterranean, and calorie restriction in general. The question isn’t which one is ‘best’ in some abstract sense. It’s which one you can actually stick to long-term.”

“Your friend’s success with keto might have been because keto created a calorie deficit and gave her a clear structure. But there are other ways to create deficits and structure that might fit your life better.”

“Keto is popular right now partly because of good marketing. But ten years ago, low-fat was popular. Before that, low-carb in a different form. Popularity is about trends, not about what works for you specifically.”

You’re expanding the option space and introducing nuance.

This is where William James’s pragmatism becomes useful. The pragmatists argued that truth is what works; ideas are tools, to be judged by their consequences. When your client says “Dr. X recommends keto” (appeal to authority), the pragmatist asks: “Does it work for you?” Not “Is Dr. X credible?” but “Does this idea, applied to your life, produce the results you want?” This is instrumentalism in action. Beliefs are judged by their cash value in experience. Bad reasoning produces bad results; that’s how you know it’s bad.

5. Reframe Around Evidence and Fit

Now you can guide the conversation back to what actually matters: evidence for the population and fit for the individual.

“Here’s what the research shows: for people with insulin resistance, various dietary approaches can work. Keto works for some people. Mediterranean works for others. What matters most is finding an approach you can sustain that creates the outcomes you want.”

“Given your lifestyle, your preferences, and your history, let’s think about what would actually work for you. You hate the idea of giving up fruit. You love your morning oats. Keto would require you to eliminate both of those. Is that sustainable for you?”

You’re helping them evaluate the decision based on evidence and personal fit rather than fallacious reasoning.

6. Collaborative Decision-Making

Finally, you want to end up in a place where the client feels ownership over the decision, not like you imposed it.

“What if we tried an approach that improves insulin sensitivity without requiring you to eliminate foods you love? We could focus on consistent meal timing, protein at every meal, plenty of fibre, and a moderate calorie deficit. That gives you the outcomes you want with better sustainability. How does that sound?”

Notice that you didn’t say “keto is wrong.” You said “here’s another approach that might serve you better.” You’ve addressed the fallacies without making the client feel stupid for having used them.

This entire framework can be remembered as: Listen → Identify → Validate → Question → Expand → Reframe → Empower

Advanced Technique: Name and Normalise the Fallacy 

Once you have a trusting relationship with a client, and once they’ve shown openness to examining their thinking, you can sometimes name the fallacy explicitly.

“I notice you’re putting a lot of weight on Dr. X’s opinion. That’s called appeal to authority, and it’s really common. We all do it. I catch myself doing it too. The thing is, even experts can be wrong, or may be giving advice that’s not specific to your situation. What do you think about evaluating the advice itself rather than just who’s giving it?”

You’re naming it, normalising it (“we all do it”), and inviting reflection rather than imposing correction.

But timing is everything. This only works if:

  • You have established trust
  • The client is in a receptive state (not stressed or defensive)
  • You’ve already demonstrated that you’re on their side
  • You frame it as a thinking pattern, not a personal failing

If those conditions aren’t met, stick with the questioning and reframing approach.

This also connects to what Albert Ellis called “disputing irrational beliefs” in Rational Emotive Behaviour Therapy (REBT). Ellis identified irrational beliefs that parallel fallacies:

  • Demandingness (“I must be perfect”) ↔ Perfectionist fallacy
  • Awfulising (“eating carbs is terrible”) ↔ Appeal to fear/catastrophising
  • Low frustration tolerance (“I can’t handle tracking”) ↔ False dilemma

When you help a client recognise a fallacy, you’re doing disputation work and challenging the irrational belief structure that maintains dysfunction.

Teaching Clients to Recognise Their Own Fallacies

The ultimate goal isn’t just to correct fallacies when you see them. It’s to help clients develop the skill of recognising and correcting their own fallacies. Successfully identifying and addressing logical fallacies in your clients means teaching them to do it themselves.

This is metacognition: thinking about thinking. And it’s one of the most valuable skills you can teach.

Here’s how to build it:

Create a Shared Vocabulary

Over time, give names to the patterns you see. Not in a lecturing way, but in a collaborative “we’re in this together” way.

“I noticed something interesting. You said everyone’s doing intermittent fasting, so it must work. That’s what’s often called the bandwagon fallacy. Just because something is popular doesn’t mean it’s right for you. I catch myself doing this too with training methods. Have you noticed other times when you make decisions based on what’s popular rather than what fits you?”

Now you’ve introduced the concept, normalised it, and invited self-reflection.

Model Critical Thinking

Show your own reasoning process out loud. Let them see how you evaluate claims, how you distinguish between evidence and hype, how you catch your own thinking errors.

“When I first heard about [new diet trend], I found myself getting excited because everyone was talking about it. Then I caught myself: that’s bandwagon thinking. So I stepped back and asked, what’s the actual evidence? What’s the mechanism? Would this work for my clients? Turns out the evidence was pretty weak, and it wouldn’t fit most people’s lifestyles.”

You’re demonstrating the skill you want them to develop.

Assign Reflection Exercises

For clients who are engaged with this work, give them thinking homework.

“This week, I want you to notice when you make a decision about food or exercise. Then write down why you made that decision. What was your reasoning? Then ask yourself: is this based on evidence, or is it based on what I’ve heard other people say? Is this based on what works for me, or on what’s popular right now?”

This builds the habit of examining their own reasoning.

You might call this a “Fallacy Audit”: the client keeps a log for one week of every fitness or nutrition decision and the reasoning behind it, and then the two of you code the entries for fallacies together. This builds metacognitive awareness in a concrete, practical way.

Celebrate When They Catch Themselves

When a client comes to you and says, “I was about to try this new supplement because everyone’s using it, but then I realised that’s bandwagon thinking. So I wanted to ask you what the evidence actually shows,” celebrate that.

“That’s exactly the kind of critical thinking that’s going to serve you for the rest of your life. You caught yourself, you questioned your own reasoning, and you sought out better information. That’s hugely beneficial.”

You’re reinforcing the meta-skill.

Use the “Steel Man” Technique

Before rejecting an influencer’s claim, ask your client to make the strongest possible version of that argument. Then evaluate THAT. This teaches intellectual honesty and reduces straw-manning. It also helps them engage with ideas more carefully rather than dismissing them reflexively.

Create an Evidence Hierarchy Checklist

Give clients a simple decision tree:

  1. Is this based on what’s popular? (Bandwagon: proceed with caution)
  2. Is this based on one person’s credentials? (Authority: verify independently)
  3. Is this based on one person’s results? (Anecdote: insufficient evidence)
  4. Is this based on my own single experience? (Hasty generalisation: need more data)
  5. Is this based on systematic evidence AND fit for me? (Proceed with confidence)

This is choice architecture applied to reasoning: structuring the decision environment to make good thinking easier.

Beyond Goals: Logical Clarity as Constituent of the Good Life

There’s a deeper reason to care about identifying and addressing logical fallacies in your clients beyond achieving a physique goal or improving their VO2 max. Clear thinking is not merely instrumental to eudaimonia; it’s constitutive of it.

Aristotle argued that human excellence (arete) involves the actualisation of our distinctive capacity: rational thought. When we reason well, we’re not just achieving external goals; we’re fulfilling our nature as rational beings. The person who learns to recognise fallacies isn’t just getting fitter or stronger, they’re becoming more fully human.

From Aristotle’s perspective, phronesis (practical wisdom) is a virtue: an excellence of character developed through habituation. Each time a client catches themselves in bandwagon thinking, they’re not just correcting an error, they’re practising the virtue of intellectual autonomy. They’re building character.

From an existentialist lens, particularly Sartre’s, we are “condemned to be free” and thus radically responsible for creating meaning through our choices. But meaningful choice requires seeing reality clearly. When your client makes decisions based on bandwagon thinking or appeal to authority, they’re surrendering their freedom to the crowd or to false prophets. They’re living in what Sartre called “bad faith”: letting others define their values and abdicating responsibility for their choices.

Teaching clients to think clearly is teaching them to author their own lives. To examine the values underlying their goals. To ask not just “will keto work?” but “what kind of life am I trying to build, and does this serve it?”

This connects to Viktor Frankl’s logotherapy: meaning comes from taking responsibility for our choices in the face of suffering and limitation. The client who learns to think past fallacies can move from “I failed because carbs are bad” (external locus, victim mentality) to “I chose an unsustainable approach, and I can choose better” (internal locus, agent mentality). That shift from passive victim to active author is where meaning lives.

The bandwagon fallacy exemplifies precisely what Nietzsche called the “herd mentality.” The person who does keto because everyone’s doing it is what Nietzsche called “the last man”: comfortable, conformist, and incapable of creating their own values. Teaching clients to question fallacies is teaching them to become “value creators” rather than “value inheritors.” This is Nietzschean self-overcoming: transcending the easy comfort of following the herd to embrace the harder path of thinking for yourself.

The Stoic concept of the dichotomy of control is also relevant here. Epictetus distinguished between what’s “up to us” (our judgments, our reasoning) and what’s “not up to us” (others’ opinions, outcomes, our genetics). When your client uses post hoc reasoning about weight fluctuations, they’re disturbing themselves over what’s not up to them (normal physiological variation) rather than focusing on what is (their consistent actions over time).

As Epictetus wrote: “What upsets people is not things themselves but their judgments about things.” Teaching fallacy recognition is teaching the dichotomy of control. You are helping clients reclaim their judgment from external circumstances and focus their energy where they have actual agency.

When you help someone develop the skill of recognising and correcting logical fallacies, you’re not just helping them achieve fitness outcomes. You’re giving them tools for human flourishing. You’re helping them become autonomous, discerning, capable agents who can navigate complexity without surrendering to manipulation or their own biases.

That transformation ripples out into every area of life.

The Broader Impact: Critical Thinking as Life Skill

When you become skilled at identifying and addressing logical fallacies in your clients, you’re giving them something that extends far beyond nutrition and fitness.

You’re teaching them to think.

The client who learns to question appeal to authority in fitness advice starts questioning it in politics, in marketing, and in medical decisions. The client who learns to avoid post hoc reasoning about carbs starts avoiding it in understanding their relationships, their career, and their finances.

Critical thinking is domain-general. Once you develop it in one area, it transfers.

This is what Socrates was doing in ancient Athens. He wasn’t teaching people what to think. He was teaching them how to think, by relentlessly questioning their reasoning until they saw the flaws themselves.

The Socratic method is uncomfortable because it reveals that we don’t actually know what we think we know. Our beliefs, examined closely, often rest on fallacies. But that discomfort is the price of wisdom.

As a coach, you’re in a position to be a modern Socratic teacher. Not by lecturing about logic, but by asking questions that reveal thinking errors and by modelling better reasoning.

This is world-class coaching. Not just giving people meal plans and workout programs, but helping them develop the cognitive skills to navigate a world full of false claims, manipulative marketing, and their own biased thinking.

When your client can spot a logical fallacy in a supplement ad, when they can evaluate diet trends based on evidence rather than popularity, when they can distinguish between what works on average and what works for them specifically, they’ve developed a skill that will serve them for the rest of their life.

That’s the real transformation.

And it has implications for health that go far beyond what happens in the gym. The client who develops clear thinking experiences lower chronic stress (because they’re not constantly second-guessing themselves or jumping between approaches), better sleep quality (because they’re not ruminating about contradictory advice), more consistent training (because they’re not program-hopping), better body composition outcomes (because they stick with sustainable approaches), improved cardiovascular fitness (because consistency compounds), and enhanced longevity (because they make better long-term decisions about health behaviours).

The fallacies we correct today determine the decades of healthspan tomorrow.

The Paradox of Rationality

Before I close, I want to address a paradox that might be bothering you.

If I’ve just spent this entire article teaching you about logical fallacies and critical thinking, am I suggesting that we should all be perfectly rational beings, always reasoning carefully, never using heuristics?

No. Absolutely not.

Perfect rationality is impossible, metabolically expensive, and often counterproductive. Heuristics exist for a reason. They’re fast, efficient, and usually good enough. The person who analyses every single decision with perfect logic would be paralysed, unable to function in a complex world that requires quick judgments.

From a behavioural economics perspective, we’re “boundedly rational.” We do the best we can with limited information, limited time, and limited cognitive resources. And that’s fine. That’s human.

The goal isn’t to eliminate heuristics or to reason perfectly about everything. The goal is to recognise when the stakes are high enough that careful reasoning is worth the effort, and to have the skills to engage in that reasoning when it matters.

Choosing what to eat for lunch? Heuristics are fine. “I usually feel good when I eat this, so I’ll eat it again.” That’s technically post hoc reasoning, but who cares? It’s lunch.

Deciding whether to make a major dietary change based on advice from social media? That’s worth slowing down and examining the logic. That’s worth engaging System 2.

The skill is knowing when to trust your intuition and when to question it. When to use heuristics and when to override them with careful reasoning.

This is what psychologists call adaptive decision-making: matching your decision process to the demands of the situation.

And this is what you’re teaching your clients. Not to be robots who analyse everything, but to be thoughtful humans who know when to think carefully and how to do it when it matters.

This paradox also connects to what I call “The Reasoning Debt”. Reasoning debt accumulates when you take shortcuts (fallacies) instead of thinking carefully. Eventually, you have to pay it back with interest (failed diets, wasted money, lost trust in yourself). The key is knowing when to invest in good reasoning upfront versus when shortcuts are acceptable.

Putting It All Together: The Framework for Identifying and Addressing Logical Fallacies

Let me give you a simple framework you can use whenever you encounter logical fallacies in coaching:

Listen → Identify → Validate → Question → Expand → Reframe → Empower

Listen: Let them fully express their reasoning without interrupting.

Identify: Recognise which fallacy or fallacies are at play, and understand what psychological function they might be serving.

Validate: Find and acknowledge the legitimate concern underneath the faulty reasoning.

Question: Ask Socratic questions that reveal the gaps in the logic without attacking the person.

Expand: Introduce alternative explanations, additional evidence, or broader context.

Reframe: Guide the conversation back to what actually matters: evidence for the population and fit for the individual.

Empower: Help them make a decision they own, based on better reasoning.

This framework works because it addresses both the logic and the psychology. You’re correcting the thinking error while respecting the person and their autonomy. You’re creating the conditions for them to see the flaw themselves rather than having it imposed from the outside.

And you’re not just identifying and addressing logical fallacies in your clients for this one conversation. You’re teaching them a process they can use for the rest of their lives.

As you do this work, keep these principles in mind:

  • The crowd is often wrong but never uncertain
  • Credentials are not a substitute for thinking
  • Popularity is evidence of marketing, not truth
  • Your friend’s success is data, not destiny
  • Perfect logic can reach false conclusions; flawed logic can stumble on truth. Both are problems
  • Heuristics are good servants but terrible masters

And remember: when a client comes to you with a decision based on fallacious reasoning, they’re not stupid. They’re human. Your job is to help them be more human in the Aristotelian sense: actualising their capacity for reason while acknowledging the emotional, social, and biological factors that make pure rationality impossible.

Become a student of reasoning. Not just what people think, but how they think. Not just the conclusions they reach, but the logic (or lack of logic) that got them there. Study the common logical fallacies. Practice spotting them in real time. Learn to question gently and reframe effectively. Build the skill of helping people think more clearly without making them feel stupid.

Because this is what separates adequate coaches from world-class ones.

Anyone can Google meal plans and workout programs. But helping someone develop the cognitive skills to navigate a world full of false promises and misleading claims? That’s rare. That’s valuable. That’s the work that creates lasting change.

When you help a client recognise that their reasoning is flawed, when you teach them to evaluate claims critically, when you model careful thinking and give them the tools to do it themselves, you’re not just helping them achieve their fitness goals or improve their body composition or increase their VO2 max.

You’re helping them become more autonomous, more discerning, more capable human beings.

And that transformation ripples out into every area of their life.

The client who learns to question fallacies in fitness advice becomes the person who questions fallacies in political rhetoric, in marketing claims, in relationship dynamics. They become harder to manipulate, more capable of trusting their own judgment, and more effective at making decisions that serve their actual values.

They sleep better because they’re not ruminating about conflicting advice. They manage stress better because they’re not constantly second-guessing themselves. They build muscle and lose fat more effectively because they stick with approaches long enough for them to work. They improve their cardiovascular fitness because they train consistently rather than program-hopping. They extend their healthspan because they make better long-term decisions about their health behaviours.

All of this stems from clearer thinking.

That’s the kind of coaching that changes lives.

That’s the standard I’m inviting you to reach for.

Because the world doesn’t need more meal plans. It needs more critical thinkers. And you, as a coach, are in a unique position to create them.

One conversation at a time. One fallacy addressed at a time. One client empowered to think clearly at a time.

That’s world-class coaching.

When you master the art and science of identifying and addressing logical fallacies in your clients, you’re not just improving their outcomes in the gym or on the scale. You’re giving them a compass for navigating reality. You’re helping them build lives of greater autonomy, meaning, and flourishing.

And in doing so, you’re fulfilling the highest calling of coaching: not just changing bodies, but developing humans.

Having said all of that, you do still need a working model of physiology, nutrition and training to actually get results. So, for those of you ready to take the next step in professional development, we also offer advanced courses. Our Nutrition Coach Certification is designed to help you guide clients through sustainable, evidence-based nutrition change with confidence, while our Exercise Program Design Course focuses on building effective, individualised training plans that actually work in the real world. Beyond that, we’ve created specialised courses so you can grow in the exact areas that matter most for your journey as a coach.

If you want to keep sharpening your coaching craft, we’ve built a free Content Hub filled with resources just for coaches. Inside, you’ll find the Coaches Corner, which has a collection of tools, frameworks, and real-world insights you can start using right away. We also share regular tips and strategies on Instagram and YouTube, so you’ve always got fresh ideas and practical examples at your fingertips. And if you want everything delivered straight to you, the easiest way is to subscribe to our newsletter so you never miss new material.

References and Further Reading

Braun LT, Schmidmaier R. Dealing with cognitive dissonance: an approach. Med Educ. 2019;53(12):1167-1168. doi:10.1111/medu.13955 https://pubmed.ncbi.nlm.nih.gov/31532838/

Rollwage M, Loosen A, Hauser TU, Moran R, Dolan RJ, Fleming SM. Confidence drives a neural confirmation bias. Nat Commun. 2020;11(1):2634. Published 2020 May 26. doi:10.1038/s41467-020-16278-6 https://pubmed.ncbi.nlm.nih.gov/32457308/

Elston DM. Confirmation bias in medical decision-making. J Am Acad Dermatol. 2020;82(3):572. doi:10.1016/j.jaad.2019.06.1286 https://pubmed.ncbi.nlm.nih.gov/31279036/

O’Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. J R Coll Physicians Edinb. 2018;48(3):225-232. doi:10.4997/JRCPE.2018.306 https://pubmed.ncbi.nlm.nih.gov/30191910/

Kunda Z. The case for motivated reasoning. Psychol Bull. 1990;108(3):480-498. doi:10.1037/0033-2909.108.3.480 https://pubmed.ncbi.nlm.nih.gov/2270237/

Klein J, McColl G. Cognitive dissonance: how self-protective distortions can undermine clinical judgement. Med Educ. 2019;53(12):1178-1186. doi:10.1111/medu.13938 https://pubmed.ncbi.nlm.nih.gov/31397007/

Béchard B, Kimmerle J, Lawarée J, Bédard PO, Straus SE, Ouimet M. The Impact of Information Presentation and Cognitive Dissonance on Processing Systematic Review Summaries: A Randomized Controlled Trial on Bicycle Helmet Legislation. Int J Environ Res Public Health. 2022;19(10):6234. Published 2022 May 20. doi:10.3390/ijerph19106234 https://pubmed.ncbi.nlm.nih.gov/35627776/

Norris CJ. The negativity bias, revisited: Evidence from neuroscience measures and an individual differences approach. Soc Neurosci. 2021;16(1):68-82. doi:10.1080/17470919.2019.1696225 https://pubmed.ncbi.nlm.nih.gov/31750790/

Vaish A, Grossmann T, Woodward A. Not all emotions are created equal: the negativity bias in social-emotional development. Psychol Bull. 2008;134(3):383-403. doi:10.1037/0033-2909.134.3.383 https://pubmed.ncbi.nlm.nih.gov/18444702/

Friedman NP, Robbins TW. The role of prefrontal cortex in cognitive control and executive function. Neuropsychopharmacology. 2022;47(1):72-89. doi:10.1038/s41386-021-01132-0 https://pubmed.ncbi.nlm.nih.gov/34408280/

Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. Med Decis Making. 2015;35(4):539-557. doi:10.1177/0272989X14547740 https://pubmed.ncbi.nlm.nih.gov/25145577/

Rivas SF, Saiz C, Ossa C. Metacognitive Strategies and Development of Critical Thinking in Higher Education. Front Psychol. 2022;13:913219. Published 2022 Jun 15. doi:10.3389/fpsyg.2022.913219 https://pubmed.ncbi.nlm.nih.gov/35783800/

Author

  • Paddy Farrell

    Hey, I'm Paddy!

    I am a coach who loves to help people master their health and fitness. I am a personal trainer, strength and conditioning coach, and I have a degree in Biochemistry and Biomolecular Science. I have been coaching people for over 10 years now.

    When I grew up, you couldn't find great health and fitness information, and you still can't really. So my content aims to solve that!

    I enjoy training in the gym, doing martial arts, hiking in the mountains (around Europe, mainly), drawing and coding. I am also an avid reader of philosophy, history, and science. When I am not in the mountains, exercising or reading, you will likely find me in a museum.
