In discussions about GMOs, both proponents and opponents commit logical fallacies and accuse the other side of committing them. Sometimes arguments seem compelling even though they are based on faulty logic. In this post, you can find some common and not-so-common logical fallacies, conveniently listed in alphabetical order.
Most of this post was written by Brian Dunning of the excellent podcast Skeptoid, who has generously given us permission to use his work here with some modification. The content can be found in its original form at Skeptoid, in the episodes A Magical Journey through the Land of Logical Fallacies – Part 1, A Magical Journey through the Land of Logical Fallacies – Part 2, and Some New Logical Fallacies.
What is a fallacy, and why are fallacies used?
First, what is a logical fallacy? The Wikipedia definition is as good as any:
In logic and rhetoric, a fallacy is a misconception resulting from incorrect reasoning in argumentation. By accident or design, fallacies may exploit emotional triggers in the listener or interlocutor (e.g. appeal to emotion), or take advantage of social relationships between people (e.g. argument from authority). Fallacious arguments are often structured using rhetorical patterns that obscure the logical argument, making fallacies more difficult to diagnose. Also, the components of the fallacy may be spread out over separate arguments.
A shorter definition for logical fallacy is provided by Brian: “the use of rhetoric as a substitute for good evidence.”
Scientific arguments are won or lost by the scientific method. Either the data supports a claim or it does not. Sometimes, people who don’t have data to support their arguments will deliberately employ logical fallacies in an attempt to convince people that their claim is correct. Fallacies can also be accidentally employed when anyone mistakes compelling rhetoric for a sound argument.
Many of the fallacies listed here can be part of a legitimate discussion. The problem comes when we connect one of the fallacies to an unrelated claim. For example, stating a fact about a person is simply a fact. Only when we use that fact in an attempt to support or take down a claim does it become a fallacy.
This list contains many fallacies. Some are “traditional” fallacies and some are new arrivals. Surely you have seen or even used some of them. Are we missing any fallacies? Let us know in the comments, and we’ll add them to the list. Know of any interesting examples of fallacies being used to discuss agriculture or biotechnology? Please share!
The take home message in this post is that, if you’re going to have a debate, stick with valid arguments. In a scientific debate, particularly, stick with the data. Don’t get caught using fallacies. Hopefully, familiarity with these devices will help you to identify them in conversation. And, when you point them out, you will strip your opponent of the tools on which he depends the most.
Ad Hominem

From the Latin for “to the person”, an ad hominem is an attack against the arguer rather than the argument. This doesn’t mean that you simply call the person a jerk; rather, it means that you use some weakness or characteristic of the arguer to imply a weakness of the argument.
Starling: “I think Volvos are fine automobiles.”
Bombo: “Of course you’d say that; you’re from Sweden.”
Starling’s Swedish heritage has nothing to do with the quality of Volvo automobiles, so Bombo’s reply is an attempt to change the subject and an avoidance of the issue at hand. Bombo is trying to imply that Starling’s Swedish heritage biases, and thus invalidates, his statement. In fact, one thing has nothing to do with the other. Ad hominem arguments try to point out fault with the arguer, instead of with the argument.
Now, there are cases where it might be appropriate to consider the source of the information. If the authors of a study on plant biology are all physicists, or the author of a book about agriculture is actually a businessman, we might wonder if the person is familiar enough with the subject to design a valid study or to include all the relevant information in a book. If an article is written by or funded by persons who work for an organization with a known agenda we might have concerns about bias. We can then evaluate the work with a skeptical eye. Considering the source isn’t an ad hominem unless you throw out everything by the person simply because they are who they are.
Anecdotal Evidence

One of the most common ways to support a claim is through the fallacious misuse of anecdotal evidence. Anecdotal evidence is information that cannot be tested scientifically, or that could be tested scientifically but has not been. In practice this usually refers to personal testimonials and verbal reports. Anecdotal evidence often sounds compelling because it can be more personal and captivating than cold, uninteresting factual evidence.
Many people believe that their own experience trumps scientific evidence, and that merely relating that experience is sufficient to prove a given claim.
Starling: “Every scientific test of magical energy bracelets shows that they have no effect whatsoever.”
Bombo: “But they work for me, therefore I know for a fact they’re valid and that science is wrong.”
Is Bombo’s analysis of his own experience wrong? If it disagrees with well-performed controlled testing, then yes, he probably is. Personal experiences are uncontrolled and subject to outside influences, biases, preconceived notions, and random variance. Relating an anecdotal experience proves nothing.
Bombo: “My cousin’s friend took zinc pills and it cured her cold.”
Starling: “Perhaps the cold just went away by itself.”
Perhaps there is something to zinc pills, but without a randomized controlled study, we can’t know for sure. Anecdotal evidence is great for suggesting new directions in research, but by itself it is not evidence.
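The zinc example can be made concrete with a toy simulation (the recovery rate here is a made-up figure, chosen only for illustration): if most colds resolve on their own within a week, then most people who take a completely inert remedy will still "get better after taking it".

```python
import random

def simulate_colds(n_people=10_000, p_recover=0.7, seed=1):
    """Fraction of remedy-takers whose cold resolves anyway.

    Every cold in this toy model resolves on its own with
    probability p_recover (an assumed figure for illustration);
    the remedy itself does nothing at all.
    """
    rng = random.Random(seed)
    recovered = sum(rng.random() < p_recover for _ in range(n_people))
    return recovered / n_people

# Roughly 70% of takers of a useless remedy still recover, and many
# will credit the remedy -- which is exactly why controls matter.
print(f"{simulate_colds():.0%} recovered anyway")
```

A randomized controlled trial works precisely by measuring this baseline recovery rate in an untreated group and asking whether the treated group does any better.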
Anecdotal evidence is not completely useless. You could say “We saw the Bigfoot corpse at this location”, and if that information helps with the recovery of an actual body, then the anecdotal evidence was of tremendous value. But, note that it’s the Bigfoot corpse itself that comprises scientific evidence, not the story of where it was seen.
When anecdotes are presented as evidence or in place of evidence, you have very good reason to be skeptical.
Appeal to Authority
This type of argument refers to a special authoritative source as validation for the claim being made. Every time you see an advertisement featuring someone wearing a white lab coat, or telling you what 4 out of 5 dentists surveyed said, you’re seeing an appeal to authority.
“Acupuncture is based on centuries-old Chinese knowledge.”
“A growing number of scientists say that evolution is too improbable.”
These statements are true. They become a problem only when we use them to make a claim. For example:
“Acupuncture is based on centuries-old Chinese knowledge, therefore we know it works.”
“A growing number of scientists say that evolution is too improbable, therefore we need to question evolution.”
An appeal to authority is the opposite of an ad hominem attack, because here we refer to some positive characteristic of the source, such as its perceived authority, as support for the argument. But a good authority supports a position because that position has been shown to be justified or evidenced, not the other way around. If scientists support Theory X, it should be because the evidence supports Theory X, not merely because they believe it.
We often see people appealing to authority when they say things like:
“This article in a peer-reviewed scientific journal says that people are getting fatter.”
“This PhD scientist says that people are getting fatter.”
Being peer-reviewed or having a PhD is not the end-all-be-all. The more important question is whether a particular claim fits within the established body of literature for that subject. If it doesn’t fit, then more research is needed before we can come to any conclusions.
Similarly, if a person has an advanced degree, that does not automatically mean that anything they say is correct. No good scientist attaches significance to their own authority. Theory X needs to stand on its own; an appeal to authority does not provide any useful support.
Appeal to Dead Puppies
Sometimes tugging at the heartstrings with a tragic tale is enough to quash dissent. Who wants to take the side of whatever malevolent force might be associated with death and suffering?
Starling: “Thank you, door-to-door solicitor, but I choose not to purchase your magazine subscription.”
Bombo: “But then I’ll be forced to turn to drugs and gangs.”
The Appeal to Dead Puppies draws a pathetic, poignant picture in order to play on your emotions. Recognize it when you hear it, and keep your emotions separate from the facts.
Appeal to Hitler
This one is inspired by Godwin’s Law, in which Mike Godwin stated “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.” Ever since, such arguments have become known as the reductio ad Hitlerum, or the Appeal to Hitler. It’s a garden variety “guilt by association” charge, saying you’re wrong because Hitler may have thought or done something similar.
Bombo: “You think illegal aliens should be deported? Sounds exactly like how the Nazis got started.”
Starling gives the common reply:
Starling: “The Nazis also owned dogs and played with their children.”
For good measure, Bombo comes back with a “straw man on a slippery slope” argument:
Bombo: “Are you saying everything about the Nazis was perfect?”
Appeal to Ignorance
Argumentum ad ignorantiam considers ignorance of something to be evidence that it does not exist. If I do not understand the mechanism of the Big Bang, that proves that there is no knowledge that supports it as a possibility and it therefore did not happen. Anything that is insufficiently explained or insufficiently understood is thus impossible.
Starling: “It is amazing that life arose through the fortuitous formation of amino acids in the primordial goo.”
Bombo: “A little too amazing. I can’t imagine how such a thing could happen; creationism is the only possibility.”
Using the absence of evidence as evidence of absence is a common appeal to ignorance. People who believe the Phoenix Lights could not have been simple flares generally don’t understand, or won’t listen to, the thorough evidence for that explanation. Their glib layman’s understanding of what a flare might look like is inconsistent with their interpretation of the photographs, so they use an appeal to ignorance as proof that flares were not the cause.
Appeal to Lack of Authority
Authority has a reputation for being corrupt and inflexible, and this stereotype has been leveraged by some who assert that their own lack of authority somehow makes them a better authority.
Starling might say of the 9/11 attacks: “Every reputable structural engineer understands how fire caused the Twin Towers to collapse.”
Bombo can reply: “I’m not an expert in engineering or anything, I’m just a regular guy asking questions.”
Starling: “We should listen to what the people who know what they’re talking about have to say.”
Bombo: “Someone needs to stand up to these experts.”
The idea that not knowing what you’re talking about somehow makes you heroic or more reliable is incorrect. More likely, your lack of expertise simply makes you wrong.
Appeal to Quantum Physics
This is a form of special pleading, a scientific-sounding way of claiming that the way your magical product or service works is beyond the customer’s understanding; in this case, based on quantum physics. That sounds impressive, and who’s qualified to argue? Certainly not the average layperson.
Bombo: “Quantum physics explains why pressure points on the sole of your foot correspond with other parts of your anatomy.”
Here’s a tip. If you see or hear the phrase “quantum physics” mentioned in a context that is anything other than a scientific discussion of subatomic theory, raise your red flag. Someone is probably trying to hoodwink you by namedropping a science that they probably understand no better than your cat does.
Argument from Anomaly
This one is big with ghost hunters and UFO enthusiasts. Anything that’s anomalous, or otherwise not immediately, absolutely, positively, specifically identifiable, automatically becomes evidence of the paranormal claim.
Starling: “We found a cold spot in the room with no apparent source.”
Bombo: “That must be a ghost.”
Since the anomaly is, well, an anomaly, that means (by definition) that you can’t prove it was anything other than a ghost or a UFO or a leprechaun or whatever they want to say, not without well designed experiments. Since the skeptic can’t prove otherwise, the Argument from Anomaly is a perfect way to prove the existence of ghosts. Or, nearly perfect, I should say, because it’s not.
Of course, argument from anomaly doesn’t just work for UFOs and ghosts.
Starling: “This researcher found hamsters that have strange pouches of hair growing in their mouths.”
Bombo: “It must be due to GMOs.”
Bandwagon Fallacy

Also known as argumentum ad populum (appeal to the masses) or argument by consensus, the bandwagon fallacy states that if everyone else is doing it, so should you. If most people believe something or act a certain way, it must be correct.
“Everyone knows that O.J. Simpson was guilty; so he should be in jail.”
“Over 700 scientists have signed Dissent from Darwin, so you should reconsider your belief in evolution.”
The bandwagon fallacy can also be used in reverse: If very few people believe something, then it can’t be true.
Starling: “Firefly was a really cool show.”
Bombo: “Are you kidding? Almost nobody watched it.”
Consider how many supernatural beliefs are firmly held by a majority of the world’s population, and the lameness of the bandwagon fallacy comes into pretty sharp focus. The majority might sometimes be right, but they’re hardly reliable.
Better Journal Fallacy
It’s common for purveyors of woo to trot out some worthless, credulous magazine that promotes their belief, and refer to it as a peer-reviewed scientific journal:
Starling: “If telekinesis was real, you’d think there would be an article about it in the American Journal of Psychiatry.”
Bombo: “That rag is part of the establishment conspiracy to suppress psi research. You need to turn to a reputable source like the Journal of the American Society for Psychical Research. It’s peer-reviewed.”
And so it is, but its reviewers are people who have failed to establish credibility for themselves, and so have the journals themselves. There are actually metrics for these things.
The productivity and impact of individual researchers can be described by their Hirsch index (or h-index), which attempts to measure the number and quality of citations of their publications and research. The number of citations a study receives is important because citations indicate that other researchers in the field consider the study a good source. A journal’s reputation can likewise be gauged by its impact factor, which measures the average number of citations its recent articles receive. Although these indexes are not perfect, you need not ever lose a “my peer-reviewed scientific journal is better than yours” debate. Look up impact factors in the Thomson Reuters Journal Citation Reports through sciencewatch.com.
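As a rough sketch of how the h-index works (simplified; real bibliometric databases additionally handle self-citations and field normalization), it is the largest number h such that the author has at least h papers cited at least h times each:

```python
def h_index(citations):
    """Hirsch index: the largest h such that at least h of the
    given papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th best paper still has >= rank citations
        else:
            break
    return h

# A researcher whose papers were cited [10, 8, 5, 4, 3] times has
# h = 4: four papers with at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```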
Chemical Fallacy

Want to terrify people and frighten them away from some product or technology that you don’t like? Mention chemicals. Chemical farming, chemical medicines, chemical toxins. As scary as the word is, it’s almost meaningless, because everything is a chemical. Even happy flowers and kittens consist entirely of chemicals. It’s a weasel word, nothing more, and its use often indicates that its user was unable to find a cogent argument.
Cherry Picking Fallacy
This fallacy is related to the appeal to authority fallacy. Often we read blog posts and articles about a press release, report, or, less frequently, a peer-reviewed article, that go on to state that this individual report or study proves some broad point. The problem here is that there may be many other reports and studies that disprove that point. Occasionally, there is a major change in scientific understanding, but those are rare. Focusing on one report or study while ignoring the rest is a fallacy.
Confusion of Correlation and Causation
Closely related to post hoc, but a little bit different, is the confusion of correlation and causation. Post hoc assumptions do not necessarily include any correlation between the two observations. When there is a correlation, but still no valid causation, we have a more convincing confusion.
Starling: “Chinese people eat a lot of rice.”
Bombo: “Therefore the consumption of rice must cause black hair.”
Due to the nature of Chinese agriculture, there is indeed a worldwide correlation between rice consumption and hair color. This is a perfect example of how causation can be invalidly inferred from a simple correlation.
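The broader point is easy to demonstrate: any two quantities that both happen to trend over time will correlate strongly, with no causal link between them at all. Here is a minimal sketch with made-up numbers (the series names are purely illustrative):

```python
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(0)
years = range(50)
# Two unrelated, noisy quantities that both drift upward over time.
rice_consumption = [100 + 2 * t + rng.gauss(0, 5) for t in years]
dark_haired_pop = [500 + 3 * t + rng.gauss(0, 8) for t in years]

r = pearson_r(rice_consumption, dark_haired_pop)
print(f"correlation r = {r:.2f}")  # strongly positive, yet neither causes the other
```

The correlation is driven entirely by the shared time trend, a common confounder, which is exactly why correlation alone cannot establish causation.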
Excluded Middle

The excluded middle assumes that only one of two ridiculous extremes is possible, when in fact a much more moderate middle-of-the-road result is more likely and desirable. An example of an excluded middle would be an argument that either every possible creation story should be taught in schools, or none of them. These two possibilities sound frightening, and may persuade people to choose the lesser of two evils and allow religious creation stories to be taught alongside science. In fact, the much more reasonable excluded middle, which is to teach science in science classes and religion in religion classes, is not offered.
The excluded middle is formally called reductio ad absurdum, reduction to the absurd. Bertrand Russell famously illustrated how an absurd premise can be fallaciously used to support an argument:
Starling says: “Given that 1 = 0, prove that you are the Pope.”
Bombo replies: “Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope.”
Just keep in mind that if your opponent is presuming extremes that are absurd, he is excluding the less absurd middle. Don’t fall for it.
Fallacy of the Consequent
Drawing invalid subset relationships in the wrong direction is called the fallacy of the consequent. Cancers are all considered diseases, but not all diseases are cancers. Stating that if you have a disease it must be cancer is a fallacy of the consequent.
Listen to how Bombo blames Starling’s failure to heal upon his failure to take one particular treatment, without regard for whether that treatment is a valid one for Starling’s particular condition:
Starling: “I am dying of bubonic plague.”
Bombo: “You did not drink enough wheatgrass juice.”
Even assuming that wheatgrass juice were a suitable treatment for anything, it would still not be a suitable treatment for everything, so Bombo’s suggestion that Starling’s illness is a consequence of his failure to drink wheatgrass juice is a fallacy of the consequent.
Loaded Question

A loaded question is also known as the fallacy of multiple questions rolled into one, or plurium interrogationum. If I want to force you to answer one question in a certain way, I can roll that question up with another that offers you two choices, both of which require my desired answer to the first question. For example:
“Is this the first time you’ve killed anyone?”
“Have you always doubted the truth of the Bible?”
“Is it nice to never have to hassle with taking a shower?”
Any answer given forces you to give me the answer I was looking for: That you have killed someone, that you doubt the truth of the Bible, or that you don’t shower or bathe. Loaded questions should not be tolerated and certainly should never be answered.
Michael Jordan Fallacy
This one can be used to impugn the motives of anyone in the world, in an effort to prove they are driven by greed and don’t care about anyone else’s problems:
Bombo: “Just think if Michael Jordan had used all his talents and wealth to feed third world children, rather than to play a sport.”
Of course, you can say this about anyone, famous or not:
Bombo: “If your doctor really cared about people’s health, he’d sell everything he owned and become a charitable frontier doctor in Africa.”
In fact, for charitable efforts to exist, we need the Michael Jordans of the world playing basketball. Regular non-charitable activities, like your doctor’s business office, are what drives the economic machine that funds charity work. The world’s largest giver, the Bill & Melinda Gates Foundation, would not exist had a certain young man put his talents toward the Peace Corps instead of founding a profitable software giant.
Non-Sequitur

From the Latin for “It does not follow”, a non-sequitur is an obvious and stupid attempt to justify one claim using an irrelevant premise. Non-sequiturs work by starting with a reasonable sounding premise that it’s hoped you will agree with, and attaching it (like a rider to a bill in Congress) to a conclusion that has nothing to do with it. The sentence is phrased in such a way to make it sound like you have to accept both or neither:
“Corporations are evil, thus acupuncture is good.”
“The government is evil, thus UFOs are alien spacecraft.”
“Allah is great, thus all Christians should be killed.”
When we do science, it takes more than simply connecting two phrases with the word “thus” to draw a valid relationship. Thus, non-sequiturs are not valid devices to prove a point scientifically.
Observational Selection

Observational selection is the process of keeping the sample of data that agrees with your premise, and ignoring the sample of data that does not. Observational selection is the fallacy behind such phenomena as the Bible Code, psychic readings, the Global Consciousness Project, and faith healing. Observational selection is also a tool used by pollsters to produce desired survey results, by surveying only people who are predisposed to answer the poll the way the pollster wants.
Bombo: “The face of Satan is clearly visible in the smoke billowing from the World Trade Center.”
Starling: “And in one of the other 950,000 frames of film, the smoke looks like J. Edgar Hoover; in another, it looks like a Windows XP icon; and in another it looks like a map of Paris.”
Remember that one out of every million samples of anything is an incredible one-in-a-million rarity. This is a mere inevitability, but if observational selection compels you to ignore the other 999,999 samples, you’re very easily impressed.
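The arithmetic behind this inevitability is simple: the chance of seeing at least one hit across n independent looks is 1 − (1 − p)^n. A quick sketch, using the 950,000-frame figure from the example above:

```python
def p_at_least_one(p_single, n_trials):
    """Probability of at least one success in n_trials independent
    tries, each with per-try probability p_single."""
    return 1 - (1 - p_single) ** n_trials

# A pattern with a one-in-a-million chance per frame, scanned across
# 950,000 frames of film, is more likely than not to turn up somewhere.
print(f"{p_at_least_one(1e-6, 950_000):.0%}")  # about 61%
```

So finding one eerie frame in nearly a million is not remarkable at all; it would be remarkable not to.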
Poisoning the Well
When you preface your comments by casually slipping in a derogatory adjective about your opponent or his position, you’re doing what’s called poisoning the well. A familiar example is the way Intelligent Design advocates poison the well by referring to evolution as Darwinism, as if it’s about devotion to one particular researcher. Or:
“And now, let’s hear the same old arguments about why we should believe UFOs come from outer space.”
“Celebrity television psychic Sylvia Browne tells us in her new book.”
If you listen to Skeptoid, you know that Brian poisons the well all the time. It’s one of his favorite devices. But he does it obviously, for the entertainment value, and not as a serious attempt at argument.
Post Hoc Ergo Propter Hoc

Post hoc ergo propter hoc means “after this, therefore because of this”. This fallacy is similar to the confusion of correlation and causation. Post hoc arguments are often the parents of superstition.
“When I wear my lucky shirt, I do much better on tests.”
“The incidence of allergies has risen after the introduction of GMOs into the food supply. Therefore, GMOs have caused the increase in allergies.”
Many things happen all the time. Choosing two practically at random does not make for a strong argument.
Proof by Lack of Evidence
This one is big in the conspiracy theory world: The lack of evidence that would support their conspiracy theory is due to the evil coverup. Thus, the lack of evidence for the conspiracy is, in and of itself, evidence of the conspiracy.
Bombo: “The passengers on Flight 93 were taken off the plane and executed by the government.”
Starling: “But there’s no evidence of that.”
Bombo: “Exactly. That’s how we know it for a fact.”
There are certainly things in the world that are true but for which no evidence exists, but these are in the minority. If you want to be right more often than not, stick with what we can actually learn. If instead your standard is that anything that can’t be disproven must therefore be true, like Russell’s Teapot, you’re one step away from delusional paranoia.
Proof by Mommy Instinct
Made famous by anti-vaccine activist Jenny McCarthy, this one asserts that nobody understands health issues better than a mom. Mothers obviously have experience with childbirth and with raising children, but is there any reason to suspect they understand internal medicine (for example) better than educated doctors, many of whom are also mothers? Not so far as I am able to divine.
Remember that Mommy Instincts are no different than anecdotal experiences. They are driven by perception and presumption, not by science.
Proof by Verbosity
The practice of burying you with so much information and misinformation that you cannot possibly respond to it all is called proof by verbosity, or argumentum verbosium. To win a debate, I need not have any support for my position if I can simply throw so many things at you that you can’t respond to all of them.
This is the favorite device of conspiracy theorists. The sheer volume of random tidbits that they throw out there gives the impression of their position having been thoroughly researched and well supported by many pillars of evidence. Any given tidbit is probably a red herring, but since there are so many of them, it would be hopeless (and fruitless) to respond intelligently to each and every one of them. Thus the argument appears to be impregnable and bulletproof. It may not be possible to construct a cogent argument using proof by verbosity, but it is very easy to construct an irrefutable argument.
Proof by Victimization
Beware of claims from those lording their victimization over you. They may well have been victimized by something, be it an illness, a scam, even their own flawed interpretation of an experience. And in many cases, such a tragedy does give the victim insight that others wouldn’t have. But it doesn’t mean that person necessarily understands what happened or why it happened, and should not be taken as proof that they do.
Bombo: “My neighbor’s wifi network gave me chronic fatigue.”
Starling: “But that’s been disproven every time it’s been tested.”
Bombo: “You don’t know what you’re talking about; it didn’t happen to you.”
Victimization does not anoint anyone with unassailable authority on their particular subject.
Red Herring

A red herring is a diversion inserted into an argument to distract attention away from the real point. Supposedly, dragging a smelly herring across the track of a hunted fox would save him from the dogs by diverting their attention away from the real quarry. Red herrings are a favorite device of those who argue conspiracy theories:
Starling: “Man landed on the moon in 1969.”
Bombo: “But don’t you think it’s strange that Werner von Braun went rock hunting in Antarctica only a few years before?”
Starling: “9/11 was perpetrated by Islamic terrorists.”
Bombo: “But don’t you think it’s strange that Dick Cheney had business contacts in the middle east?”
Red herrings are fallacious because they do not address the point under discussion, they merely distract from it; but in doing so, they give the impression that the true cause lies elsewhere. The wrongful use of red herrings as a substitute for evidence is rampant, absolutely rampant, in conspiracy theory arguments.
Slippery Slope

A slippery slope argument presumes that some change will inevitably result in extreme exaggerated consequences. If I give you a cookie now, you’ll expect a cookie every five minutes, so I shouldn’t give you a cookie.
Starling: “It should be illegal to sell alternative therapies that don’t work.”
Bombo: “If that happened, any minority group could make it illegal to sell anything they don’t happen to like.”
No matter what Starling suggests, multiplying it by ten or a hundred is probably a poor proposition. Bombo can use a slippery slope argument to exaggerate any suggestion Starling makes into a recipe for disaster.
The slippery slope is probably the most common subset of the larger fallacy, argument from adverse consequences, which is the practice of inventing almost any dire consequences to your opponent’s argument:
Starling: “They should remove ‘Under God’ from the Pledge of Allegiance.”
Bombo: “If that happened, all hell would break loose. Students would have sex in the hallways, school shootings would skyrocket, and we would become a nation of Satan worshippers.”
Special Pleading

An argument by special pleading states that the justification for some claim is on a higher level of knowledge than your opponent can comprehend, and thus he is not qualified to argue against it. The most common case of special pleading refers to God’s will, stating that we are not qualified to understand his reasons for doing whatever he does. Special pleadings grant a sort of get-out-of-jail-free exemption to whatever higher power lies behind a claim:
Starling: “Homeopathy should be tested with clinical trials.”
Bombo: “Clinical trials are not adequate to test the true nature of homeopathy.”
No matter what Starling says, Bombo can claim that there is knowledge outside of Starling’s experience or at a level that Starling cannot comprehend, and the argument is therefore ended. Bombo might also point out that Starling lacks some professional qualification to discuss the topic, thus placing the topic out of Starling’s reach.
Bombo: “You’re not a trained homeopath, so you shouldn’t be expected to understand it.”
A special pleading makes no attempt to address the opponent’s point, it is just another diversionary tactic.
Statistics of Small Numbers
You really have to take a statistics class to understand statistics, and I think the part that would surprise most people is the stuff about sample sizes. Given a population of a certain size, how many people do you have to survey before your results are meaningful? I took half of a statistics class once and learned just enough to realize that practically every online poll you see on the web, or survey you hear on the news or read about in the newspaper, is mathematically worthless.
But it extends much deeper than surveys. Drawing conclusions from data sets that are too small to be meaningful is common in pseudoscience. Listen to Bombo make a couple of bad conclusions from invalid sample sizes:
“I just threw double sixes. These dice are hot.”
“My neighbor’s a Mormon and he drinks wine, so I guess most Mormons don’t really follow the no-alcohol tradition.”
“I went to a chiropractor and I feel better, so chiropractic does work after all.”
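The intuition about sample sizes can be made concrete with the standard worst-case margin-of-error formula for a simple random sample at 95% confidence (and note that self-selected online polls are not random samples, so even this flatters them):

```python
import math

def margin_of_error(sample_size, z=1.96, p=0.5):
    """Worst-case margin of error for a simple random sample.

    Uses the normal approximation at 95% confidence (z = 1.96)
    and maximum variance (p = 0.5).
    """
    return z * math.sqrt(p * (1 - p) / sample_size)

# One neighbor, one dice roll, or one chiropractor visit tells you
# almost nothing; it takes roughly a thousand respondents to get
# within a few percentage points.
for n in (2, 30, 1000):
    print(f"n = {n:5d}: ±{margin_of_error(n):.0%}")
```

A sample of 2 carries a margin of error near ±70 percentage points, which is why conclusions drawn from a neighbor or a single visit are mathematically worthless.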
Straw Man Argument
This fallacy is the most common and also one of the easiest to spot. This is where you state your position, and your opponent replies not to what you said, but to an exaggerated and distorted caricature of what you said that’s obviously harder to defend.
Starling says: “People who commit minor offenses should be let out of jail sooner.”
Bombo replies: “Emptying out all the jails would create havoc in society.”
Well, maybe Bombo’s right, but that’s not relevant, because “emptying the jails” is not what Starling advocated. In fact Bombo did not refute Starling’s point at all — he invented a different point that was easier to argue against. He created a straw man — one of those dummies stuffed with straw that soldiers use for bayonet practice. It’s too weak to fight back. And Bombo can then take satisfaction in having made a point that no reasonable person would argue with, and he appears to have successfully defeated Starling’s argument, when in fact he dodged it.
Weasel Words
Giving a controversial concept like creationism a new, more palatable name like Intelligent Design is what’s called the use of weasel words. Calling 9/11 conspiracies “9/11 Truth” is a weasel word; the movement is more interested in unlikely conspiracies than in truth, yet it takes a name claiming that truth is what it’s all about.
Weasel words are a favorite of politicians. Witness the names of government programs that mean essentially the opposite of what they’re named: the Patriot Act, No Child Left Behind, Affirmative Action. By the way certain programs are named, it sounds like it would virtually be criminal to disagree with them.
Weasel words can also refer to sneaky wording in a sentence, like “It has been determined”, or “It is obvious that”, suggesting that some claim has support without actually indicating anything about the nature of such support.
Really an appeal to authority – and if so, is it logically fallacious to do this? (assuming that there is an article in a peer-reviewed scientific journal that says this)
Most arguments about the safety of GM foods come down to “these articles in peer-reviewed scientific journals say GM foods pose no more danger than non-GM foods”.
I dunno maybe I’m reading it wrong – that just seemed out of place.
I think Brian meant that simply saying that something was peer reviewed doesn’t mean it’s true. We’ve talked about that idea a lot here. One study can be terribly flawed, and it’s only when we look at the body of literature that we can make conclusions. Still, your point is noted and I’ve removed that example. Thanks for the keen eye 🙂
Is removing an author’s original text not some sort of fallacy? Yikes!
Why not ask the author to explain – or just give your take as you so nicely did. Removing it? This isn’t a breeding population where you rogue out the uglies.
A very good working list of logical fallacies, but a few too many examples involve those “other” people. Seems like it would have a much more powerful effect on the biofortified group if there were more examples taken from the scientific and research worlds.
With so many examples from the “alternatives” world there might be a tendency to focus on how silly “they” are in contrast to “our right thinking science views.”
btw, I have been a bit slow in responding in the last “ethics” post because I suffered a major attack from Nigerian scam artists yesterday and have spent all my discretionary time fending off that nuisance. Though I have solved the major problem, I am a bit exhausted mentally and emotionally.
But really the basic fallacies list is good, I have printed out a copy to keep it at hand to refer to.
Matthew, I don’t think removing the text is a fallacy, but I will explain. I have permission from Brian to edit his posts in order to create this post. I’ve already made quite a few changes prior to posting. I had hesitated about the example in question for the very reason I explained above. I could have just removed it and deleted Ewan’s comment, but I think that would have been dishonest since I’d already posted. I hope that’s ok.
Duncan, I agree! Brian’s podcast covers everything from conspiracy theories to pseudoscience to actual science, as you can tell from the examples. My original intent was to take out all of the non-science examples and replace them with examples from agriculture, food, and medicine. Unfortunately, that was taking much longer than I’d wanted and with a ton of experiments coming up at work that I just have to do as quickly as possible, it was either post as is or wait for months. I hope you understand.
Appeal to Dead Puppies example:
“If the world does not accept and encourage GMO crops, billions of people are going to starve to death.”
The logical fallacy here is the tendency to think of GMO crops as the “only” or “main” way to prevent a starving world, when it is possible that better storage of water, better food storage during productive years, easier access to money for peasant farmers, better protection from exploitative landlords, etc. might be as or more effective.
I actually considered using that as an example, with Indian farmer suicides as a second example, but then I had to go change the media on my cells, etc, etc, etc. Grad school is busy 🙁
The fallacy that you’re referring to is exactly what I was talking about in my post Why I’m not pro-GMO. I do think there are people who hold this false belief (or who have done such a good job of promoting the idea that I can’t help but believe them), such as Dennis Avery, but I think those are a very small minority of all the people who might consider themselves pro-GMO, whatever that means.
I think there are also people who say things like this for two other reasons. First, there’s the obvious spin. Of course Bio (the biotechnology industry group) is going to tell us that we need GMOs or horrible things will happen. Second, there’s a reaction to the opposite fallacy “only organic will save the world”. Faced with enough anti-technology anti-science speak, one can feel compelled to slap the table in front of one and say “look, we need technology and science or we’ll never be able to feed everyone.” I’ve said it before myself. I didn’t mean that without (fill in the blank – precision breeding, genetic engineering, fertilizer – whatever was the topic of discussion that the person I was talking to was dismissing as garbage or worse) the world would end, but that we needed to at least consider using all the available tools especially when people’s lives are at stake.
Both of these together (“we need GMOs” and “we don’t need any technology”) make a fallacy of the excluded middle. In reality, we need a careful combination of technology and traditional methods, as I discuss in Toward a better agriculture… for everyone.
Then I would recommend you use a “line out” edit so people can see the changes you have made – unless these articles are all working drafts? Making changes prior to posting is indeed editing; making changes after publishing is “retracting” – again, unless you consider this a working draft. Calling your retracting of that statement editing is indeed a fallacy. I’m not trying to pick on you for this. I like your writing very much, but this is a slippery slope you are on.
One of the prime misconceptions about the word fallacy is that it means “false” as in, a false fact. Matthew above said that “calling your retracting of that statement is indeed a fallacy” when the appropriate word is “false”, not “fallacy.” Fallacy refers to the logic, false refers to the truth value. Slippery slope, now that’s a fallacy. 🙂
I’ve missed you Karl (no longer on twitter). And appreciate your English lessons. Wish I could go back and delete and change 😉
But aren’t you editor of this site? What about the line-out tool for changes to blog posts? Do you think it’s appropriate? For one, it allows commenters who came in late to understand the context of the discussion.
I changed it again; I hope this better meets your expectations. No, all of these posts aren’t working drafts, but posts and pages are updated when appropriate, and changes are indicated with a note or comment. For example, the sugar beet post I wrote recently was updated to include additional information and to remove incorrect information, but the changes were noted. You could call it a retraction if you wish, but I think it’s a correction, and as long as changes post-publishing are noted I don’t see anything wrong with it.
I was going to check what the text that was removed said, but I saw Anastasia editing it at the same time so I figured some change was going to be made. I think that lined-out text is appropriate for claims that turn out to be false and more controversial stuff, but I often find a couple of small errors in my own posts and change them without note, and I don’t see a problem with that. It’s the big stuff that matters, and I think this was much ado about nothing. Personally, I think the argument-from-authority example needs a better name, because I don’t think that citing just one paper as the end-all-be-all for a claim is necessarily a fallacy of appeal to authority, but there is an error in logic involved. Something more like a fallacy of argument by cherry-picking or unconfirmed science or something like that…
It’s interesting how, in a discussion involving logical fallacies, the activists always think you’re talking about them… Gotta wonder why that is.
I’m really glad you are ok. Those scams are terrible.
You always think you won’t fall for it but sometimes the scams are really good. In the last few months, we had a visiting scientist who, after her time here, visited Europe for a conference. Everyone in my lab got a weird email saying that all her money and credit cards were stolen and that she needed a few hundred dollars wired to her to cover her hotel for a few days until things got straightened out and she’d pay us right back. We were all ready to wire the money, thinking about our poor colleague stranded there (appeal to puppies, right?)! Thankfully we talked about it and replied with a message asking something personal only she would know before sending any money. Turned out her email password had been compromised and she was fine. It’s terrible. I bet lots of people fall for that one.
That’s a good idea. A fallacy by cherry-picking is different from an appeal to authority. If I have time later, I’ll create the new fallacy on the list and move the relevant text that I added to the AtA, as well as add a little more information (perhaps links to some of our previous discussions about peer review and why we can’t look at just one paper).
The following two statements are not logical fallacies.
“A growing number of scientists say that evolution is too improbable.”
“Wired Magazine says that Skeptoid is an awesome podcast.”
They would only be logical fallacies if you stated something along the lines of “because Wired Magazine says that Skeptoid is an awesome podcast, it must actually be awesome.” It’s not a fallacy to just repeat the opinion of experts. It’s only a fallacy if you assume because an expert holds an opinion it *must* be true.
Ad hominem arguments are also not necessarily logical fallacies. For example, you could argue that a person isn’t appropriate to be the teacher of your children because they were once convicted of pedophilia. This would be a non-fallacious ad hominem.
Ad hominem arguments are central to an understanding of ‘the debate’ over agro biotech.
The most strident opponents of GE crops proceed from an inherent ethical narcissism. They are personally enthralled with the notion that they are Truly Concerned About The Most Important Things. These are things like biodiversity, environmental health, etc. To prove their concern for such things, they oppose GE crops.
That is, literally, their internal logic. They ‘ad-hominem’ themselves first. Then they ascribe the same to you. Once you admit to liking GE crops, or at least being neutral about them, there’s only one remaining question that’s relevant: what has corrupted you? Corporate money? Callous disregard of Mother Nature? Something must account for your ethical inferiority.
Meanwhile, they can add opposing you to the list of things that prove Their Beneficent Qualities.
If you find yourself debating with someone who takes that approach, you will be wasting your time. They can only be persuaded if they first relinquish what makes them feel good about themselves, and that’s not very likely.
To learn more about those “absorbed in the endless struggle
To think well of themselves” (T.S. Eliot) and narcissistic personality disorder, visit: http://www.narcissism101.com/
Firstly please understand that I am describing the technical nature of logical fallacies, and not their bearing on the issue of GMOs, where it appears I hold the same opinion as you.
The examples you describe are not ad hominem logical fallacies, although they are both ad hominem arguments and probably logical fallacies.
Argumentum ad hominem is when you use the personal traits of the opponent to attack their argument. Doing this is always an ad hominem but not always a logical fallacy. The classic example is to say that a person’s legal testimony lacks credibility because they have been convicted of fraud in the past. Ad hominem, but not a logical fallacy.
You commit an ad hominem logical fallacy when you draw an unsubstantiated conclusion from the personal trait. “This person has blond hair, therefore they are stupid.”
However in the examples you gave the logical fallacy is not in the conclusions drawn from the personal trait. In fact the logical fallacy is either “begging the question,” or just a lie (or potentially no logical fallacy at all).
Take this example you gave: “You have been corrupted by money and that’s why you support GMOs.” If the premise given is correct, then this obviously isn’t a logical fallacy. The debate comes not from the ad hominem, but from whether that statement is actually true or not. If you’re assuming it is true without evidence, that’s a separate logical fallacy – begging the question – or else it’s just a lie. Lies aren’t really logical fallacies; they’re just falsehoods.
This post is obviously pedantic regarding the taxonomy of poor arguments, rather than any attempt to critique GMOs.
Your points are good ones. Please note how with just a few substitutions, the same statements can be made by those who question the wisdom of promoting gmo crops. The reality of conflicts is that whichever side one most identifies with there is a strong tendency to view one’s own side to be the more intelligent, humane and ethical.
My modified version of your piece:
[Ad hominem arguments are central to an understanding of ‘the debate’ over agro biotech.
The most strident “proponents” of GE crops proceed from an inherent ethical narcissism. They are personally enthralled with the notion that they are Truly Concerned About The Most Important Things. These are things like biodiversity, environmental health, world hunger, etc. To prove their concern for such things, they “promote” GE crops.
That is, literally, their internal logic. They ‘ad-hominem’ themselves first. Then they ascribe the same to you. Once you admit to “disliking” GE crops, or at least being neutral about them, there’s only one remaining question that’s relevant: what has corrupted you? Lack of intelligence? Callous disregard of the need to increase world food production? etc. Something must account for your ethical inferiority.
Meanwhile, they can add opposing you to the list of things that prove Their Beneficent Qualities.
If you find yourself debating with someone who takes that approach, you will be wasting your time. They can only be persuaded if they first relinquish what makes them feel good about themselves, and that’s not very likely.]
Hopefully we can all learn a little about not letting our narcissism control our intelligence.
Eric nicely uses fallacies, “The most strident opponents of GE crops proceed from an inherent ethical narcissism. They are personally enthralled with the notion that they are Truly Concerned About The Most Important Things. These are things like biodiversity, environmental health, etc. To prove their concern for such things, they oppose GE crops.
That is, literally, their internal logic.”
Oh, he qualifies it by saying “most strident”, but that hardly changes anything. Many, if not most, people reading it will probably associate it with all opponents of GMOs. How does he know that it is because of “narcissism”, ethical or otherwise?
What do you call a fallacy that is applied to only one group? I am against GMOs for generalized food production, though not for medical research. Many of the fallacies mentioned in the article have been used against me and my “narcissism” by pro-GMO advocates.
But the implication is that only anti-GMO people can use such fallacies. And then, what if GMOs turn out to be bad?
Not to put too fine a point on it, I actually do think we’re dealing with ad hominems. But with a twist. Allow me to slightly rephrase the example given above:
Starling: ”I think GMOs are safe and nutritious.”
Bombo: ”Of course you’d say that; you obviously don’t care about the environment.”
The twist is adding a ‘straw man’ in at the end. Starling might actually be a highly accomplished environmentalist (whatever that means). Bombo then gets to attack the callous attitudes, etc. imputed to Starling, who is by then a straw man incarnate.
Starling might respond by defending his environmental credentials, but by then, Bombo has Starling trying to prove a series of negatives: no connection to multinationals, no conflicts of interest in the food industry, no motives to undermine efforts to etc., and so forth, all of which are actually irrelevant to the food value of GMOs. But the anti-GE person will have accomplished the goal of changing the topic from science to one involving personal moral worth.
You are of course correct that ethical narcissism could arise just as easily on one side of ‘the debate’ as on the other. I have in fact met three ethical narcissists on the ‘pro-GE’ side of the debate in the last dozen years or so, and they’re actually rather embarrassing and uncomfortable.
I put ‘pro-GE’ in quotes because, aside from those three narcissists, I have not met anyone who is ‘pro-GE’. Seriously.
I have, on the other hand, met countless individuals who are staunchly in favor of using the best technology available, who have found, on the basis of evidence, that GE is the best available solution for certain challenges in food production.
That being the case, I’d say that ‘the debate’ over GE can reasonably be described as being between the narcissists on one side, and the pragmatists on the other.
Thank you. Your objective assessment is totally convincing and certainly not self-serving on your part in the slightest.
(Why is my tongue firmly placed in my cheek?)
As for my part, I will certainly admit that I regularly fall into narcissism and other cognitive sins; my only redeeming value in this regard is that I am well aware of this. I literally feel sorry for anyone who does not understand that everyone makes significant perceptual and cognitive errors daily. This is literally part of being human, and any scientist who is not continually aware of this is using the title of scientist or researcher delusionally.
Anyway, on a different note, I was very happy to see you making a reference to T.S. Eliot with a link.
Eric, “I have, on the other hand, met countless individuals who are staunchly in favor of using the best technology available, who have found, on the basis of evidence, that GE is the best available solution for certain challenges in food production.”
The appeal to numbers and anecdotal evidence are two of the fallacies mentioned above. There are at least equally serious people who do not consider GMOs the solution, but when you add “certain challenges”, it is difficult to argue against because I don’t know which ones you mean.
One of the serious opponents worth reading is the Indian writer Vandana Shiva. It is easy to find others with a simple search.
Your definition of fallacy isn’t quite right, but you do offer a nice description of what people do when they are arguing fallaciously. A fallacy is an argument in which the truth of the grounds does not imply the truth of the claim being made. All arguments, even good ones, are rhetorical. Also, an argument can be fallacious but the claim can still be true. So, attacking an opponent’s argument as fallacious to prove they are wrong about the claim they make is also a fallacy. At best, the charge that an opponent’s argument is fallacious casts doubt on his or her claim, but it does not prove the opponent is actually wrong.
bernarda, your trust in Vandana Shiva is very telling.
Most of us do not spend our time looking at complicated scientific, economic, or sociological data. We leave that to experts in whom we place our trust. It takes some judgement to decide whom to trust. But once you trust someone who tells lies, it doesn’t require logical fallacies to reach wrong conclusions.
I first encountered Ms. Shiva on a televised panel, CSPAN. See my comment under Ethics of Labeling, September 9, 2010 at 3:03 pm. To save you the trouble of looking up that post, let’s just say that I trusted her, then found out that she had been lying, and no longer trust her. Neither should you or anyone else.
The reason ad-hominem attacks are important is that they are an attempt to undermine inappropriate trust.
Charles, I cannot comment on the Shiva case you mention as I don’t know the details. But I know that George Bush Sr., when he was VP or Pres was given a guided tour by Monsanto and he told them that one of his goals was to reduce government regulation to make it easier for them to market their products. There is also the revolving door of employees between groups like Monsanto and government agencies like the FDA.
Therefore we get the FDA policy of “generally recognized as safe”. Given the situation, ad hominem arguments against Monsanto and the FDA can be considered legitimate. There is reason to suspect the honesty of the research and conclusions of the two organizations, and others.
You might look up the film “The World According to Monsanto”, which can be found on YouTube under the title “Controlling our food”. It was produced by the Franco-German cultural channel ARTE, so it is less likely to have an axe to grind.
Vandana Shiva a “serious” critic? It’s her considered opinion that modern science is like a dragon orbiting the earth. She thinks that malnutrition in India would end if people would eat the appropriate wild herbs. She can’t tell the difference between rice plants and weeds. But then, she’s a physicist.
Maybe you should have mentioned GE-critic Jeffrey Smith instead. I mean, the man can levitate. He learned it from the Maharishi Mahesh Yogi.
Interesting lines of reasoning, bernarda.
George Bush motivated to reduce regulatory burden on GE crops… What does that prove? Oh, yes; Bush = Bad, Bad = reduced regulatory burden.
Guided tour by Monsanto… Monsanto = Bad, Bush = Bad, tour = cronyism
Monsanto/FDA ‘revolving door’… Monsanto = Bad, Former Monsanto employee = Bad, Former Monsanto employee at FDA = Bad.
Former Monsanto employee at FDA = Bad, generally recognized as safe = Bad, collusion.
Reason to Expect Dishonesty = Bad, Bad = Bush, FDA, GRAS, employment with Monsanto at any time = all Bad all over
GE crops = Bad, German and French opposition to GE crops = Good, Movie against GE crops = Good, Germans and French movies = Good.
There is an incipient paranoia, sometimes manifesting as overt conspiracy theory, which also underlies anti-GE activism.
Compare your experiences with anti-GE activists with the Symptoms of Paranoid Personality Disorder found at
http://psychcentral.com/disorders/sx37.htm You’ll either be amazed or surprised.
Nicely reasoned. In many ways, well-phrased fallacies appear merely to be nuance without a closer look.
Great review of fallacies. I especially like the addition of the ‘appeal to quantum physics’ fallacy. I hear that one all of the time.
The only thing I would object to is that you seem to be applying an ad hominem when you say under ‘weasel words’ that 9/11 truthers ‘clearly have nothing to do with truth’. If I were making a page like this, I would use this sentence of yours as an example of an ad hominem argument. After providing all of these fallacies for your readers to make them more reasoned, I wouldn’t expect you to attack an argument by criticizing the intentions of those making the argument. Furthermore, even if you disagree with the arguments made by that group and have good reason to criticize the intentions of some of the group’s members, surely you realize that broad statements cannot be made about the intentions of all members of a group. There are plenty of people, myself included, who are very interested in the truth of 9/11 who do not have an agenda or ideology and simply have a problem with believing accounts that seemingly require the suspending of physical laws or do not adequately address other available data. I find it utterly discouraging and bewildering to repeatedly observe that even rational individuals seem unwilling or unable to maintain a reasoned discussion on this topic, regardless of what ‘side’ they are on in this discussion.
On another note, I’m confused about why you say you are ‘starting’ with the straw man when it comes at the end of the article.
I have also commented on Vandana Shiva under Ethics of Labeling, and I just wrote a piece in French which I might translate for this blog. I have seen and heard her performing… To put it mildly, Vandana Shiva is an inexhaustible source of examples for fallacies.
There is also a lot to be said about “The World According to Monsanto”. The first fallacy is perhaps: Monsanto has been bad in the past (Agent Orange, etc.), so it can only be bad today, and it will be bad forever.
But I will not dwell on this further. However, as a Frenchman who watches ARTE, particularly its airings on current topical issues, I can tell you that the channel is by no means a source that can be blindly trusted.
bernarda, this is not about whether you trust Monsanto or FDA less than I trust Vandana Shiva. We both want information we can trust. One way to tell if you can trust information is to check it out. Or you can reason it out. Shiva was telling farmers that Monsanto Bt cotton contained the sterile seed gene when it did not. She was saying to American audiences that farmers were borrowing heavily to buy the GMO cotton seeds at a time when they were not being sold. She lied on American television and did it so smoothly and persuasively that she seemed like a saintly woman, and she fooled many people, including me. But only once.
Not only does Dr. Shiva say things that aren’t true, but she also says things that are shockingly extreme. She believes, for example, that the government of a famine stricken country should turn away food aid that might contain GMOs.
I think it would be progress if you stopped considering this person someone to admire.
There are people opposing GMOs who are honest. I would mention Jane Rissler, who knows her science and points out problems. That’s a good thing, because every time a real problem is anticipated, it can be prevented or at least managed.
Eric, you again use two fallacies. The first is the straw man with your “good” this and “bad” that. The second might fall in different categories: non sequitur, red herring, correlation and causation, and even ad hominem.
“Compare your experiences with anti-GE activists with the Symptoms of Paranoid Personality Disorder found at…” Why not compare pro-GMO advocates to Dr. Strangelove?
As to Bush, he and others deliberately made regulations less strict for GMOs. The revolving door breeds mistrust because a regulator goes to work for those he was supposed to regulate, and maybe back again. Others go from their company to regulating the company they just worked for, and maybe back again. Not to mention the financial interest they might retain in the company.
Charles, perhaps you are right, I first heard Shiva on a BBC 4 radio program called the Reith Lectures and I don’t remember her lying. But suppose she did, that doesn’t necessarily make her wrong on other things.
Andre, you don’t need to translate for me, I speak and read French fluently. You misrepresent the documentary, as the reference to Monsanto’s past is part of a short history of how agriculture has changed. Also, I don’t simply take everything produced by any one source as always true. People, including researchers and documentary producers, can make mistakes or even have biases.
bernarda wrote “Charles, perhaps you are right, I first heard Shiva on a BBC 4 radio program called the Reith Lectures and I don’t remember her lying. But suppose she did, that doesn’t necessarily make her wrong on other things.”
Of course that doesn’t necessarily make her wrong on other things. It makes her someone you shouldn’t trust. It should make you very queasy about calling her “one of the serious opponents who is worth reading”.
Being wrong is different from lying. Years ago I wrote an essay about genetic engineering in agriculture and posted it on the web. A few people contact me from time to time and tell me I’ve made a few errors. I always correct them. Ms. Shiva, by contrast, makes the calculation that some arguments are too useful to abandon just because they are wrong. Being a GMO opponent is her business, just like selling seeds and herbicides is Monsanto’s business.
Thanks everyone for your input on this post. A few changes have been made to reflect your input.
I would disagree. When you say “A fallacy is an argument in which the truth of the grounds does not imply the truth of the claim being made,” that is actually a matter of whether an argument is valid or strong. A fallacy has to meet certain conditions to count as a fallacy; for example, it usually must be generally understood as a bad argument. Why are all arguments rhetorical?
Um, Republicans are always for deregulation; how does that imply anything about Monsanto? Furthermore, referring to the FDA thing, you forget there are plenty of independent scientists in the relevant fields who think GMOs are safe (I think it’s even a consensus of independent scientists).