Tips for Writing: The Baloney Detection Kit

Perhaps you are familiar with Carl Sagan and his 1995 book, “The Demon-Haunted World: Science as a Candle in the Dark,” and perhaps not. Written not for a scientific audience but for a broad general readership, the book describes how standard, rigorous scientific training develops a set of skills and “tools” that together serve as a “Baloney Detection Kit”: an understanding of common fallacies, combined with the habit of investigating claims in a rigorous and reproducible manner. Sagan writes:

The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

Carl Sagan

As with any framework for approaching a problem, the kit can be misapplied (or outright abused as an instrument of manipulation), and it can easily slide into confirmation bias or other inappropriate use. Sagan acknowledges as much:

Like all tools, the baloney detection kit can be misused, applied out of context, or even employed as a rote alternative to thinking. But applied judiciously, it can make all the difference in the world — not least in evaluating our own arguments before we present them to others.

Carl Sagan

These tools, then, can strengthen your own scientific writing, presentations, and other communications: they help you separate objective information from “fluff,” and they help keep your work clear, concise, defensible, and effective.

Nine Tools of Investigation

  1. Wherever possible there must be independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

Common Fallacies

  1. ad hominem — Latin for “to the man,” attacking the arguer and not the argument
    Example: The Reverend Dr. Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously.
  2. argument from authority
    Example: President Richard Nixon should be re-elected because he has a secret plan to end the war in Southeast Asia — but because it was secret, there was no way for the electorate to evaluate it on its merits; the argument amounted to trusting him because he was President: a mistake, as it turned out.
  3. argument from adverse consequences
    Example: The defendant in a widely publicized murder trial must be found guilty; otherwise, it will be an encouragement for other people to openly murder others.
  4. appeal to ignorance — the claim that whatever has not been proved false must be true, and vice versa
    Example: There is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist — and there is intelligent life elsewhere in the Universe. Or: There may be seventy kazillion other worlds, but not one is known to have the moral advancement of the Earth, so we’re still central to the Universe. This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.
  5. special pleading, often to rescue a proposition in deep rhetorical trouble
    Example: A rule states, “All members must attend meetings every Saturday.” One of the members often skips these meetings and, when confronted, says, “Well, I should be exempt from this rule because I have a very busy schedule.” Special pleading attempts to preserve a particular belief or rule by arbitrarily excluding certain cases from it without good reason.
  6. begging the question, also called assuming the answer
    Example: The stock market fell yesterday because of a technical adjustment and profit-taking by investors — but is there any independent evidence for the causal role of “adjustment” and profit-taking; have we learned anything at all from this purported explanation?
  7. observational selection, also called the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses
    Example: A state boasts of the Presidents it has produced, but is silent on its serial killers.
  8. statistics of small numbers — a close relative of observational selection
    Example: “I’ve thrown three sevens in a row. Tonight I can’t lose.” (The sketch after this list quantifies how often such streaks occur by chance alone.)
  9. misunderstanding of the nature of statistics
    Example: President Dwight Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence. (The sketch after this list shows why that is exactly what one should expect.)
  10. inconsistency
    Example: Prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they’re not “proved.” Or: Consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past.
  11. non sequitur — Latin for “It doesn’t follow”
    Example: A friend says “Cats are animals. Dogs are animals. Therefore, all cats are dogs.” Often those falling into the non sequitur fallacy have simply failed to recognize alternative possibilities.
  12. post hoc, ergo propter hoc — Latin for “It happened after, so it was caused by”
    Example: Before Smash Mouth released “All Star”, there were no social media companies.
  13. meaningless question
    Example: What happens when an irresistible force meets an immovable object? But if there is such a thing as an irresistible force there can be no immovable objects, and vice versa.
  14. excluded middle, or false dichotomy — considering only the two extremes in a continuum of intermediate possibilities
    Example: “If you’re not part of the solution, you’re part of the problem.”
  15. short-term vs. long-term — a subset of the excluded middle, but so important I’ve pulled it out for special attention
    Example: Why explore space or pursue fundamental science when we have so huge a budget deficit?
  16. slippery slope, related to excluded middle
    Example: “If we start a recycling program, then more government officials will be employed to manage it. Then, taxes will have to be raised to pay their salaries; and once taxes are raised, local businesses will move away to avoid the higher taxes. This will lead to job losses and eventually a ghost town. So, starting a recycling program will destroy our town.”
  17. confusion of correlation and causation
    Example: A survey shows that college graduates are more likely than those with less education to drive Hondas; therefore education makes people drive Hondas.
  18. straw man — caricaturing a position to make it easier to attack
    Example: Scientists suppose that living things simply fell together by chance — a formulation that willfully ignores the central Darwinian insight, that Nature ratchets up by saving what works and discarding what doesn’t. Or — this is also a short-term/long-term fallacy — environmentalists care more for snail darters and spotted owls than they do for people.
  19. suppressed evidence, or half-truths
    Example: A spokesperson for the pharmaceutical company argues that clinical trials show that their new drug significantly reduces symptoms in patients with chronic migraines; therefore the drug is safe and effective for treating chronic migraines. However, the spokesperson fails to mention that the drug also has a high incidence of severe side effects including liver damage and cardiovascular issues.
  20. weasel words
    Example: A politician is asked about their stance on a controversial issue, and they respond with: “Many people say that this issue is of great concern and should be looked at very closely. There’s a lot of evidence to suggest that action may be necessary, and we are committed to exploring all avenues to ensure the best possible outcomes for everyone involved.” Vague phrasing (e.g., “many people say”), lack of specificity, and a false impression of consensus can all be used to obscure the truth, avoid commitment, or inflate the importance of or consensus around an issue without providing substantive evidence or clear statements.
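
Two of the fallacies above (8 and 9) are quantitative, and a few lines of code make them concrete. What follows is a minimal, purely illustrative Python sketch, not from Sagan’s book; the dice game and the “IQ-like” score scaled to mean 100 are hypothetical stand-ins, and it uses only the standard library:

    import random

    random.seed(1)  # fixed seed so the illustration is reproducible

    # Fallacy 8, statistics of small numbers: a seven with two fair dice
    # has probability 6/36, so three sevens in a row is (1/6)^3, about 1 in 216.
    def roll_seven():
        return random.randint(1, 6) + random.randint(1, 6) == 7

    nights = 100_000
    streaks = sum(
        all(roll_seven() for _ in range(3))  # first three throws all sevens?
        for _ in range(nights)
    )
    print(f"Three sevens in a row on {streaks / nights:.2%} of nights")
    # ~0.46%: rare on any given night, yet a regular player will see it,
    # and it predicts nothing about the throws that follow.

    # Fallacy 9, misunderstanding of statistics: for a roughly symmetric
    # measure, about half the population falling below the average is
    # expected, not alarming.
    scores = [random.gauss(100, 15) for _ in range(1_000_000)]
    mean = sum(scores) / len(scores)
    below = sum(s < mean for s in scores)
    print(f"{below / len(scores):.1%} of simulated scores fall below the mean")

The exact numbers are beside the point; the habit is. When a claim has a numerical quantity attached to it (tool 6, “Quantify”), a quick back-of-the-envelope calculation or simulation is often enough to tell a genuine anomaly from ordinary chance.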