David L. Martin

in praise of science and technology

Archive for the month “June, 2016”

What a Jerk He Is! – Fundamental Attribution Error

Back in the 1960s, researchers set up an experiment in which subjects read essays expressing opinions about Fidel Castro.  They were then asked to rate the attitudes of the writers toward Castro (for or against).  When the subjects believed the writers had taken their positions based on their own views, naturally the subjects tended to rate the pro-Castro writers as being genuinely pro-Castro.  But here’s the thing – when the subjects believed the writers had written their essays for or against Castro based on nothing more than a coin toss, they STILL tended to rate the writers of pro-Castro essays as being genuinely pro-Castro.  In other words, they tended to believe that factors intrinsic to the writers themselves were responsible for the writing, not something external like a coin toss.


This bias seems to reflect a general human tendency to believe that we are completely in control of our own behavior.  We like to believe that we all have free will.  This is of course a fallacy.  We can completely control neither our thoughts nor our actions.  Try not to think about a pink elephant for the next minute.  We carry baggage, we have wants and needs, we can be manipulated.  That doesn’t mean we shouldn’t be held responsible for our actions.  It just means that when looking at someone else’s behavior, we tend to think that nothing in their past or present is affecting them.


A good example of this is when we catch a television show or movie at a point where one of the characters is doing something really atrocious.  Most people have a gut reaction to this.  We tend to immediately view the character with tremendous animosity.  How could they do such a thing?  But when we watch the show from the beginning, we usually see that there are external factors affecting them.  They may be desperately trying to save a loved one.  They may have been brainwashed.  We realize that we may have been unfair in our assessment.  But chances are, we will repeat the same mistake in the future.  It’s a bias that most of us suffer.


Interestingly, we tend to be much more charitable when we can put a label on someone.  If they are “mentally ill” or “incompetent,” we tend to accept that they are driven by forces beyond their control.  If they have a “syndrome,” like post-traumatic stress disorder, or post-partum depression, suddenly we are quick to excuse almost any transgression.  This is a special case of the false dichotomy fallacy I have discussed before.  All of us are driven by forces beyond our control.  It is all a matter of degree.  But if we can put an extreme label on someone, suddenly they become an exception to our natural tendency to pin responsibility on them.  If not, we tend to judge them harshly.


Naturally, it takes effort to overcome this sort of thing, and the first step is realizing it’s there.  Like many biases, it’s often subconscious.  Conquering it means bringing it to the surface where it can be thoroughly examined and dealt with.  The next time you are tempted to judge someone else’s actions or statements harshly, step back and ask yourself some tough questions.  Do you really understand what they’ve gone through?  If not, perhaps you should reconsider.




Cognitive Biases

Many logical fallacies are due to what are called cognitive biases.  These are human tendencies, psychological factors, that lead to invalid conclusions.  Actually, the term cognitive bias is itself somewhat misleading, because it implies that these are factors that come into play when we are trying to be rational.  That’s true, but they also come into play when we aren’t.  They are there, all the time, fogging and coloring our conclusions, our perspectives, and even our perceptions.


Our understanding of cognitive biases comes from years of often ingenious psychological research.  In many cases these biases are quite unintentional.  They are there, often subconsciously, whether we like it or not.  So we need to be aware of them and find ways to counteract them.


There are 6 common cognitive biases that I will discuss in detail.  Don’t be intimidated by the fancy language.  As always, memorizing a bunch of names is not what’s important.  What’s important is becoming aware of these tendencies and dealing with them.  So here they are.

1. Fundamental attribution error – This is also called correspondence bias.  It refers to our tendency to attribute someone else’s behavior to things intrinsic to THEM, rather than external factors that might be affecting them.  We prefer to believe that they are completely in control of their thoughts and their actions, that everything they do is freely chosen.  If we experienced everything they experienced, of course we might believe differently.  But our initial tendency is to attribute everything they do to things intrinsic to them.


2. Confirmation bias – I have discussed this one before in the context of cherry picking – our tendency to pick and choose the premises that already favor our preformed conclusion.  Confirmation bias is strong, and psychological research suggests that we are all guilty of it.  We tend to seek out sources of information and opinion that reinforce what we already believe.  And we are pretty damn good at ignoring or minimizing those that don’t.


3. Self-serving bias – This one is pretty self-explanatory.  It’s our tendency to favor a position that boosts our own self-esteem.  We like people to tell us what we want to hear.  The problem is, what we want to hear may not be what’s good for us.  Or it may simply be untrue.  This bias opens the door for hucksters and manipulators who want to sell us something, whether it’s a product, a theology, or a political position.  Getting past the self-serving bias is tough – but then growth is often a painful process.  It can also be very rewarding.


4. Belief bias – Belief bias is an interesting phenomenon and very pervasive.  It is our tendency to accept an argument as valid even if the conclusion doesn’t follow logically from the premises, simply because we already agree with the conclusion or find it reasonable.  And conversely, if we already believe something is false, we tend to reject sound arguments in its favor.  To this day there are people who believe that the earth is flat.  They come up with ingenious explanations for the evidence to support this belief.  These explanations can be extremely convoluted, and ultimately, untestable.  And therein lies the problem.  When nothing, no evidence and no argument, can shake your conclusion, an alarm bell should go off in your head.
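The distinction between a conclusion we happen to like and an argument that is actually valid can be made mechanical.  Here is a minimal sketch (the function names are my own, not standard terminology): an argument form is valid only if no assignment of truth values makes every premise true while the conclusion is false.  A conclusion can be true, and congenial, while the argument for it is still invalid.

```python
from itertools import product

# An argument form is VALID only if no assignment of truth values makes
# all premises true and the conclusion false.  Brute-force over all cases.
def is_valid(premises, conclusion, n_vars):
    for values in product([True, False], repeat=n_vars):
        if all(p(*values) for p in premises) and not conclusion(*values):
            return False  # found a counterexample
    return True

implies = lambda a, b: (not a) or b

# "If it rained (r), the street is wet (w).  The street is wet.  So it rained."
# This form is the fallacy of affirming the consequent.
premises = [lambda r, w: implies(r, w), lambda r, w: w]
conclusion = lambda r, w: r
print(is_valid(premises, conclusion, 2))   # False: r=False, w=True is a counterexample

# Modus ponens, by contrast, is valid:
premises2 = [lambda r, w: implies(r, w), lambda r, w: r]
conclusion2 = lambda r, w: w
print(is_valid(premises2, conclusion2, 2))  # True
```

The point is that validity is a property of the argument’s form, checkable without any opinion about rain, streets, or flat earths.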


5. Framing – Framing affects how we interpret everything in the world, starting with our very perceptions.  Every piece of information we receive is filtered, if only by our own perceptual processes.  We never “capture” the world with 100% accuracy.  And just as where we happen to be standing affects our perspective, how the information we gather is framed can have an enormous effect on what conclusions we draw.  A lot of information often comes to us in the form of words.  Words turn into thoughts, and what words are chosen is critically important.  We tend to think within a certain box, and it can be difficult to step out of it.


6. Hindsight bias – This is best expressed as the I-knew-it-all-along effect.  It simply means that once the results are in, we tend to think that we could have predicted them easily, whether we could have or not.  Human beings want control, which is what prediction gives us.  If we can’t have actual control, we will often look for the illusion of control.  And so we often have a distorted view of what we were thinking and saying before the event in question.  Again, manipulators can take advantage of this tendency, by claiming after the fact that they predicted what happened, hoping that we don’t remember too well what they actually said or wrote beforehand, and thus gaining our support.  Since we want the control that seems to go with that, we tend to go along.


There are many other forms of cognitive bias, but these are a good start.  There are mountains of psychological research on these, which give us tremendous insights into our own ability to delude ourselves, to rationalize, to make ourselves vulnerable to clever manipulators.  And make no mistake – Many of the manipulators are well aware of this body of research, and are happy to use this knowledge against us.  We need the tools of critical thinking to defend ourselves.

Are You With Us or Against Us? – False Dichotomy

From time immemorial, human beings have put things in categories.  Language itself is a process of categorizing things, and since much of our thought process is verbal, our thoughts tend to split things into categories too, even when they fall along a continuum.  The tendency to put things into conceptual boxes is called typological thinking.  Up, down, left, right, good, bad.


Logic itself, at least the kind of logic most of us use, is typological.  True and false.  In fact, there are other varieties of logic.  For example, in what is called many-valued logic, there is not simply true and false, but a range of “truth values.”  One example of this is probabilistic logic, in which the truth of a proposition is not an all or nothing, but falls somewhere along a range of certainty.
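To make the contrast with two-valued logic concrete, here is a minimal sketch of probabilistic truth values.  The combination rules below assume the propositions are independent – that is my simplifying assumption for illustration, not a claim from the text.

```python
# Minimal sketch of probabilistic logic: truth is a degree in [0, 1],
# not a binary value.  Combination rules assume independent propositions.

def p_not(p: float) -> float:
    return 1.0 - p

def p_and(p: float, q: float) -> float:
    # probability both hold, assuming independence
    return p * q

def p_or(p: float, q: float) -> float:
    # inclusion-exclusion, assuming independence
    return p + q - p * q

rain = 0.7    # "it will rain" held with 70% certainty
windy = 0.4   # "it will be windy" held with 40% certainty

print(p_and(rain, windy))   # ≈ 0.28 – "rain and wind"
print(p_or(rain, windy))    # ≈ 0.82 – "rain or wind"
print(p_not(rain))          # ≈ 0.3
```

Notice that nothing here is simply “true” or “false” – every proposition sits somewhere along a range of certainty, which is exactly the idea behind many-valued logic.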


But even if we stick with old-fashioned 2-valued logic, it doesn’t follow that we have to accept an extreme position in a given situation, simply because it’s a sort of “opposite” of another position that we find distasteful.  The classic example is, “You’re either with us or against us.”  False dichotomies are often used by groups to squash dissent and increase conformity.  Oftentimes, when someone tries to raise a valid criticism of a group position, they are met with, “Whose side are you on?” or “If you don’t like it, why don’t you go somewhere else?”  Blind loyalty to a group is not a valid argument.


In a previous post, I talked about the “authoritarian personality.”  These are people who tend to attach themselves to a particular authority, and once attached, show great loyalty to that authority.  Loyalty of course is not necessarily a bad trait.  But uncritical loyalty can produce some very unpleasant things.  At the Nuremberg trials, one Nazi after another tried to argue that their hearts were not in the Holocaust, they were just being loyal to their superiors.  When an authority is issuing unethical orders, loyalty has to give way to larger ethical considerations.  The argument that “I was just following orders” doesn’t cut it.


At the risk of sounding like a broken record, I’ll say it again.  An argument has to stand on its merits, not on the fact that it is the position of a group that demands conformity.  And there is often a middle ground, a gray area between extreme positions or conclusions.  There is a segment of American society that bemoans “situational ethics.”  But the fact is, almost ALL morality is situational.  Is killing another human being wrong?  Almost everyone would say that it depends on the circumstances.  Most people believe that killing in war or for self-defense is justified.  How about stealing?  Lying?  The most basic ethical admonitions, virtually universal among human societies, are almost without exception situational.  Saying so doesn’t make you “less moral.”  Moral absolutism turns out to be a kind of “reverse straw man” – what is sometimes called an iron man argument – a position that is not actually held by the person making the argument, yet defended as if it is.  “If you don’t have my morals, you don’t have any.”  Expressed that way it sounds very unimaginative.  That’s because it is.


Ironically, absolutism can produce the very kind of nightmarish consequences that many absolutists fear, because extreme inflexibility can blind us to very real, very unpleasant consequences.  During the Vietnam War, a military figure notoriously said “It became necessary to destroy the town to save it” as a justification for the massacre of civilians.   In 1978, blind devotion led more than 900 people to commit suicide.  Many of them murdered their children before taking their own lives.  There was no middle ground for them.  It was their leader, their community, their way, or it was death.


A common variant of the false dichotomy is the slippery slope argument.  This is when the person acknowledges that there is an intermediate position, but claims that it will inevitably lead to an extreme situation.  The extreme situation is unacceptable, so the intermediate position must be as well.  Slippery slope arguments were once popular in high school anti-drug films.  The idea was that you should avoid marijuana, because marijuana would inevitably lead to more dangerous drugs.  The validity of the slippery slope of course depends on how slippery it really is.  Slippery slope promoters often make the same mistake as moral absolutists – If I abandon my extreme position, nothing will stop me from going to the opposite extreme.  This seems to come about because many people are clinging to a rock of absolutism, fearful that if they let it go, they will be carried off in the current, out of control, to be dashed against some other rock.  But if you’re just clinging to something out of fear, how do you know what you believe?


In any case, there is no reason to think that a given position, intermediate between extremes, will necessarily lead to one or the other.  Freedom does not necessarily lead to anarchy.  Women voting does not necessarily lead to hamsters voting.  Nuanced ethics do not necessarily lead to rampant immorality.  The movie Pleasantville is an excellent visual exposition of this.   The world is complex, far more so than can be expressed in only 2 colors, black and white.  Don’t be bamboozled by false dichotomies.





I’m Not a Doctor, But I Play One on TV – Argument from Authority

Years ago, a now-famous series of experiments was done, the Milgram experiments.  One of the participants was called the Teacher.  The Teacher presented a series of simple word problems to a person they could not see, called the Learner.  A third person, the Experimenter, supervised.  Each time the Learner got a problem wrong, the Teacher would give them an electric shock.  With each wrong answer, the voltage would increase.


What the Teachers didn’t realize was that they were the actual subjects of the experiment.  The Learner was not actually getting shocked.  But as the voltage increased, they would pretend to be – banging on a wall, screaming in pain, begging for the experiment to stop.  Naturally, as the Teacher began to hear these protestations, he/she usually hesitated to continue.  But the Experimenter would prod them on.  “Please continue.  The experiment requires that you continue.  It is absolutely essential that you continue.  You have no other choice, you must go on.”  And the Experimenter said that they would take responsibility for any negative effects of the experiment.  But they never physically coerced or threatened the Teacher.  The result?  65 PERCENT of Teacher subjects continued to administer shocks, despite hearing protests from the Learner, up to the maximum voltage of 450 volts.  By that time the Learner had long since gone from repeatedly screaming in agony and complaining of a heart condition, to DEAD SILENCE.


These Teacher subjects were neither sociopaths nor psychotics.  They were very ordinary people.  Many of them experienced great stress and needed counseling afterward.  During the experiment, some of them would have laughing fits or even seizures.  But the point is, most of them continued to give the shocks, long after they had obviously “killed” the Learner.  The Milgram experiments, which have been repeated many times, are a powerful statement about the human tendency to obey authority.  These people did something most of them would normally never have considered doing.  Why?  Only because a total stranger, not a parent, policeman, or clergyman, but only an “authoritative” stranger, insisted that they keep going.  How much more are we vulnerable to authority when it is a recognized one?


Of course, most of us don’t participate in experiments where total strangers try to get us to administer electric shocks.  Yet every day, manipulators exploit our tendency to accept authority uncritically.  An actor puts on a lab coat and tries to sell us some medicine.  A politician gets an endorsement from an army general.  A religious figure holds up the Bible to justify their doctrines.  “What’s wrong with that?” you might ask.  “Don’t authorities tend to know what they’re talking about, more than random people?”  Maybe, maybe not.  The point is often made that an appeal to authority can be useful.  Most of us can’t be experts on every subject.  We often have to rely on the expertise of others.  And that’s fine, as far as it goes.  But it’s not something to be used as a crutch for lazy-mindedness.  An appeal to authority is not a substitute for a valid argument.


It is important to note that “authority” here does not necessarily mean someone or something you consider “above” you.  The authority of one’s peers can be equally persuasive, if not more so.  Testimonials are often cleverly staged to present the target audience with “one of their own.”  A commercial for the Texas lottery comes to mind, featuring a woman with a strong Southern drawl:  “I weyent to the stawur to bah some sawur crame – and gayess what?  I wan the lattery!”  Lotteries are not targeted toward doctors, lawyers, or business executives.  They are targeted at “regular folks.”  People “just like you and me” can win, and it changes their lives, for the better of course.


Testimonials, celebrity endorsements, actors in lab coats, they all amount to the same thing – the argument from authority.  And every authority, every single one, must be questioned.  The alternative is to be manipulated by clever people who have your number – scammers, hucksters, charlatans, media clowns, and just plain salespeople.




Everyone Else Is Doing It – Bandwagoning

Logicians call bandwagoning argumentum ad populum (argument by popularity).  How many times have you seen a product advertised as “the number one seller,” or “America’s favorite”?  Not too long ago I heard a politician make reference to “Louisiana values” to make an argument.  All of these rely on the same principle – My ideas or products are popular, therefore you should buy them.


Successfully bandwagoning you depends on your assumption that if you believe something and large numbers of people disagree, you must be wrong.  It depends on your own lack of confidence in your ability to judge the merits of something.  Of course, if large numbers of people disagree with you, you should question yourself.  But you should question yourself regardless!  Rule number 1 of critical thinking:  Question EVERYTHING.  The point is that a proposition must stand ON ITS OWN MERITS, not its popularity.

Let me give you an example.  Look at this image:


Is square A darker than square B?  Most people say yes.  In fact almost all people say yes.  Seeing is believing, right?  In fact, they are both exactly the same color, as we can see by connecting the two with a bar of the same color:


If you’re still not convinced, go ahead and clip out a piece of one square and place it over the other.  There are many optical illusions like this, which take advantage of your brain’s expectations about the world.  If squares A and B were on an actual board, they would of course be different colors, because a color in shadow would appear darker than the same color in bright light.  That is how your brain interprets this image.  But it’s just a 2-dimensional image.  Here’s an even more dramatic example, from the famous artist M.C. Escher:


Escher was very good at these kinds of illusions, and produced many of them.  Again your brain interprets this as a 3-dimensional structure.  But it is merely a 2-dimensional pattern, cleverly made to resemble a 3-dimensional structure.  Of course in 3 dimensions you can’t have a stairway that goes continually upward (or downward) and ends up right where it started.  The existence of such illusions should provide us with two of the most important lessons we can learn in life – question everything, and just because everyone thinks so doesn’t make it so.  DON’T JUMP ON BANDWAGONS.  Think for yourself.  As Lincoln observed, you can fool all of the people some of the time.  Judge everything on its merits, not its popularity.
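The checker-shadow point can even be stated in code.  A hypothetical sketch: the image data assigns the identical RGB value to both squares, whatever our eyes – or however many millions of eyes – report.  The specific gray value below is an assumption for illustration, not a measurement.

```python
# The illusion lives in perception, not in the pixel data.  A hypothetical
# image represented as named pixels: square "A" sits among light tiles,
# square "B" in a simulated shadow, yet both store the same RGB value.

GRAY = (120, 120, 120)   # assumed value, for illustration only

image = {
    "A": GRAY,                        # "dark" tile in bright light
    "B": GRAY,                        # "light" tile in shadow
    "shadow_neighbor": (60, 60, 60),
    "bright_neighbor": (200, 200, 200),
}

# The data settles the question, no matter how many people "see" a difference:
print(image["A"] == image["B"])   # True
```

This is the whole argument against bandwagoning in one line: a million people perceiving the squares as different does not change a single byte of the file.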




Did You Hear About THOSE People? – False Generalization

Many arguments are generalizations.  Most of science is in fact generalization – taking specific observations and drawing a general inference.  This is called induction, as opposed to deduction, which is common in mathematics.  Induction is always a tricky business, because it’s based on the idea that what we didn’t look at will conform to the general pattern of what we did look at.  If not, we are making a false generalization.

As with many other logical fallacies, false generalizations often come about because our minds have a tendency to draw conclusions too quickly.  Our ancestors’ lives often depended on a quick judgment about whether someone was a friend or foe.  So we tend to generalize from limited information.  Perhaps more significantly, our ancestors didn’t live in a world of mass communication, in which events are presented to us vividly, right in our living rooms.  If something appeared threatening to them, it was close by, and probably was a threat.  By contrast, we are routinely presented with images of happenings that are either far away or very rare, simply because they are spectacular.  Spectacular events, like plane crashes or terrorist attacks, pull in viewers and readers.


Plane crashes are an excellent example because commercial airplane crashes have become so incredibly rare.  At this moment there are thousands upon thousands of commercial planes in flight, taking off, and landing.  This goes on 24 hours a day, every day.  (You can see for yourself here:  https://flightaware.com/live/.)  Yet crashes almost never happen.  When they do, nowadays they are often due to malicious intent, not equipment failure.  Air travel is highly regulated, with numerous levels of safety built into the planes, the pilots, and air traffic control.  But when a plane crash does occur, we see intense media attention.  The problem is that our minds, evolved in a much simpler time, are predisposed to think that something so vivid must be important, and we then generalize, greatly overestimating the risk to us of such occurrences.  This is what logicians call misleading vividness.


News media reports are full of rare happenings, presented to us every day, which our minds naturally tend to overestimate in importance.  When a mass shooting occurs, it evokes a huge amount of national hand-wringing.  Yet in the last 10 years, the total number of Americans killed in mass shootings numbers only in the hundreds.  By contrast, everyday, mundane shootings, many of them involving family members, numbered in the HUNDREDS OF THOUSANDS.  Your risk of dying in such a murder is far, far greater than in a mass shooting.  But your risk of dying from a gun, period, is far lower than you might be led to believe, watching media reports.  Your odds of dying in a car crash are much greater.  But car crashes do not get the kind of media attention that mundane gun murders get, which in turn get far less attention than mass shootings.  Just as our minds tend to overestimate the importance of these very rare events, they tend to underestimate the importance of overlooked risks in our lives.
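Plugging in the rough orders of magnitude from the paragraph above makes the arithmetic of relative risk stark.  The specific counts below are illustrative placeholders consistent with “hundreds” versus “hundreds of thousands” – they are not official statistics.

```python
# Back-of-envelope comparison using the order-of-magnitude figures above.
# The specific counts are illustrative placeholders, not official statistics.
mass_shooting_deaths = 500        # "hundreds" over ten years
ordinary_gun_homicides = 150_000  # "hundreds of thousands" over ten years

ratio = ordinary_gun_homicides / mass_shooting_deaths
print(ratio)   # 300.0 – mundane shootings outnumber mass-shooting deaths ~300 to 1
```

Whatever the exact figures turn out to be, a gap of two to three orders of magnitude dwarfs any uncertainty in the inputs – which is precisely what misleading vividness hides from us.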


The other very common overgeneralization people make is the one based on anecdotal information.  We take a few observations, either directly made, or provided to us, often by friends or family over the internet, and make sweeping judgments.  Oftentimes we combine this with confirmation bias – our tendency to focus on information that confirms our beliefs, and ignore that which doesn’t.  This kind of thing is all over the internet.  “A friend of ours had his house broken into.  A bunch of their stuff was stolen.  A black man tried to sell it at a pawn shop.”  Conclusion – black men are thieves, and must be viewed with suspicion.  This may seem like an obvious error, but it’s the kind of error we make all the time, often subconsciously.  We always have to be on our guard against our own tendency to draw hasty generalizations.  We have a strong tendency toward stereotyping – assigning characteristics, often caricatures, to broad groups.


Our tendency to view the world in caricatures is a huge vulnerability.  A caricature is a distortion of reality.  Often this results from our attempts to simplify the complexities of reality.  These distortions become “facts” in our minds, and lead us to draw conclusions and even form world views that have little relationship to the real world.  Some simplification of reality is often necessary for us to get our minds around the complexities.  But caricatures don’t help us at all – Valid arguments are not built on distortions.  By promoting caricatures, scammers and manipulators can take advantage of us.  As usual, the most powerful and most basic principle is to question our conclusions.  Have we been too hasty in making generalizations?  Are our conclusions even based on direct, solid knowledge of the issues, or on limited pieces of information filtered through limited sources?  Maybe we should dig deeper.  Our health, our pocketbooks, our very lives might depend on it.





It Only Proved What I Already Knew – False Premises

Psychologists have made enormous progress in understanding how the human mind works.  Our minds did not evolve in a world of high-tech mass communications.  They are designed to deal with a virtually no-tech world.  It was a world in which anything you could see clearly was close to you.  A world in which quick decisions were often a matter of life and death.  A world in which your survival often depended on the success of your group.  Your ancestors and mine were the ones whose brains responded accordingly.  Otherwise we wouldn’t be here.


As a result, our minds are quite vulnerable to what’s called the anchoring fallacy.  It’s a fancy name for something quite simple and familiar – first impressions are hard to change.  We are naturally prone to make rapid, snap judgments based on limited information.  Once they are formed, they tend to stick.  Numerous psychological studies reveal how strong this tendency is.  In science, a single observation is not considered very useful, for obvious reasons.  It could be a fluke, an outlier.  The scientific method demands that we make additional observations, and give the first observation no more weight than any other.  But the anchoring fallacy means that our minds resist this in everyday life.  There is no magic trick to defeating this.  Like most any bias, the best way to counter it is to be aware of it, to accept that it’s there, it’s strong, and strong, conscious effort is required to overcome it.


Perhaps even more pernicious is what’s called cherry picking.  This is our strong tendency to pick and choose facts that tend to confirm what we already believe, and ignore those that don’t.  In science we call this confirmation bias.  It is absolutely ubiquitous in the world of politics, where ideological camps constantly wage war against one another using bogus arguments built on carefully chosen facts.  The facts themselves may be quite correct.  But even if they are, are they being twisted and contorted to create a premise that is not?


Mark Twain famously popularized the line, “There are lies, damned lies, and statistics.”  Persuaders know that numbers are more impressive than vague, qualitative statements.  The problem is that it’s all too easy to manipulate numbers to conform to your conclusions.  As usual, Homer Simpson expresses it hilariously:  “People can come up with statistics to prove anything.  14% of people know that.”  The mere fact that information is presented in the form of a figure or a set of numbers does not give it legitimacy.  The information may in fact be correct, just not supportive of the conclusion being drawn.
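One classic way the very same numbers can be made to “prove” opposite conclusions is Simpson’s paradox: slice the data by subgroup and treatment A wins every comparison, pool it and treatment B looks better overall.  The figures below are hypothetical, chosen only to exhibit the effect.

```python
# Simpson's paradox: the same numbers support opposite conclusions
# depending on how they are sliced.  Figures are hypothetical.
success = {
    # (treatment, patient_group): (successes, total)
    ("A", "mild"):   (81, 87),
    ("A", "severe"): (192, 263),
    ("B", "mild"):   (234, 270),
    ("B", "severe"): (55, 80),
}

def rate(successes, total):
    return successes / total

# Sliced by group, treatment A wins BOTH comparisons...
assert rate(*success[("A", "mild")]) > rate(*success[("B", "mild")])
assert rate(*success[("A", "severe")]) > rate(*success[("B", "severe")])

# ...yet pooled together, treatment B looks better overall.
a_total = (81 + 192, 87 + 263)   # (273, 350)
b_total = (234 + 55, 270 + 80)   # (289, 350)
print(rate(*a_total))   # 0.78
print(rate(*b_total))   # ≈ 0.826
```

A persuader who wants to sell treatment A quotes the subgroup rates; one who wants to sell treatment B quotes the totals.  Both are reciting correct facts, and at least one conclusion is still wrong.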


At this point, dear reader, you might be throwing up your hands and saying, if I can’t trust “the facts,” where am I supposed to start?  One of the most effective techniques is to avoid looking at just one source, or only a few sources, of information.  In politics that’s not too hard, since for almost any position there is usually an opposing position (or often more than one).  Once the arguments have been presented to you, dig deeper and see how the premises hold up.  The point is never to take anything for granted – not the so-called “facts,” and not the conclusion.  Dig deep.  Question.  Everything.






Look over there! A shiny thing! – Distraction

I’ve always liked magic.  To be more precise, illusions.  Illusionists can do amazing things right in front of you.  When I was a kid, I had a little magic guillotine.  It had an actual metal blade, quite sharp, that could cut through a small carrot.  The trick behind it was quite simple, and the effect was amazing.  The blade would seem to pass right through your finger without so much as a scratch.  Illusionists do a lot of what they do by understanding human psychology.  Recently I was watching a show that revealed how a lot of classic illusions are done.  Some of them are so ridiculously simple that, once you know the secret, you feel pretty sheepish.  Many illusions involve distraction – the ability to pull your attention away from something critical to performing the trick, often for just a moment.  Great illusionists are absolute masters of this.  Of course, the vast majority of illusionists do not claim to have any supernatural ability.  They are not trying to make us believe in Santa Claus, the Easter Bunny, psychic powers, or spirit communication – only in their ability to amaze us.


Distraction in arguments is quite another matter, and it comes in many, many forms.  Politicians are perhaps the most egregious offenders when it comes to distraction.  They often resort to it when asked to defend an unpopular or controversial position – or even simply to state what their position actually is.  In early 2016, presidential candidate Donald Trump met with the editorial board of the Washington Post.  One of the board members asked the following question:

“If you could substantially reduce the risk of harm to ground troops, would you use a battlefield nuclear weapon to take out ISIS?”

Trump’s response, word for word, was:

“I don’t want to use, I don’t want to start the process of nuclear. Remember the one thing that everybody has said, I’m a counterpuncher. Rubio hit me. Bush hit me. When I said low energy, he’s a low-energy individual, he hit me first. I spent, by the way he spent 18 million dollars’ worth of negative ads on me. That’s putting….”

At this point the questioner interrupted him:

“This is about ISIS. You would not use a tactical nuclear weapon against ISIS?”

Trump then responded with:

“I’ll tell you one thing, this is a very good looking group of people here.  Could I just go around so I know who the hell I’m talking to?”

The question was never answered.  Although this is a particularly blatant example, politicians use such methods all the time.  It’s not uncommon for an interviewer to repeat a question 5 or 6 times before getting an answer out of a politician, particularly one running for office.


When someone is making an argument, ask yourself, are they really addressing the issue at hand?  Appeals to emotion are very common.  “Tattooed parents are 400% more likely to introduce their children to drugs.”  “Choosy mothers choose Jif.”  “Why are we listening to that guy?  He doesn’t even have hair.”  We are particularly susceptible to fear-based distractions.  “Do you really want to pass up this security system, and leave your family vulnerable to rapists and murderers?”  “We need to keep homosexuals out of our group because we don’t want our children molested.”  Many urban myths are fear-based, such as the classic “kidney-stealing” story that was so popular when the internet was young.


Another very common distraction variant is what logicians call the straw man.  This consists of misrepresenting an opposing view, and then shooting down this characterization, the straw man.  This argument against the straw man may be perfectly valid.  But the straw man is not the position in question.  A classic example is to attack Darwinism on the basis that it promotes racism and class warfare.  Darwinism has nothing to say about racism or society.  It merely tells us that populations change over time in response to the pressures of natural selection.  SOCIAL Darwinism, on the other hand, is a completely different thing.  It says that some people and some groups are inherently superior to others.  Unsurprisingly this leads to racism and a host of other isms.  Social Darwinism is Darwinism’s perennial straw man.


Yet another common distraction is the appeal to motive, or Bulverism.  Instead of addressing the merits of the argument, the person questions the motives of the person making the argument.  “The only reason you oppose this landfill is that it’s in your back yard.”  “Why would I listen to anything a Muslim says?”  It is very tempting to try to invalidate someone’s argument because you think they have ulterior motives, or a hidden agenda.  But the argument must stand or fall on its merits, regardless of who is making it.


Again, dealing with distraction comes down to one very simple thing.  At the end of the argument, however convoluted, you must ask yourself, did the person in question actually address the issue?  Did the conclusion even relate to the premises?  Was there even a conclusion to the argument, or did the person simply drift off into another subject?  It’s not that complicated.  But sometimes it can seem like it, when dealing with clever wordsmithing.






Logical Fallacies

There are literally thousands of ways an argument can fail.  Many of them have names – red herring, straw man, ad hominem, argumentum ad populum, etc., etc., etc.  But that’s part of the problem.  Learning to think critically is not about memorizing a bunch of names.  Not that there’s anything wrong with delving into the depths of logic and logical fallacies.  But in practical terms, critical thinking is about developing habits of mind that can cut through questionable claims and conclusions.


Arguments that fail are called logical fallacies.  That is, to the extent that arguments use logic at all.  If an “argument” isn’t even making a pretense at logic, well, you’re on your own.  That’s not an ARGUMENT.  It’s all well and good to go with your gut feelings or your intuition.  But if you do, there’s really nothing more to say about it.  The trouble is, everyone seems to have different gut feelings, different intuitions, and different deeply held beliefs.  What everyone seems to agree on is logic, which is precisely why persuaders and manipulators try to use it on us.  When was the last time you saw a politician, religious figure, or corporation tell you to “just go with your feelings” in making a decision, and leave it at that?  The whole point of persuasion is to try to get your position to correspond with someone else’s.  Persuaders always have an argument, even if they’re trying to appeal to emotion.


I will focus on 6 common forms of failed arguments, because these cover an awful lot of the persuasion in our lives, and learning to deal with them is a good start in developing the ability to defend yourself from nonsense.


1. Distraction – Distraction is perhaps the most commonly encountered phenomenon in persuasion.  Logicians have a fancy name for arguments that involve distraction.  They call them red herrings.  Distraction can take many different forms – personal attacks, appeals to emotion, appeals to tradition, guilt by association, the list goes on and on.  In sales, distraction is often called “bait-and-switch.”  The customer is lured with something attractive, or something they actually want, then directed to what the salesman really wants them to buy.

In argument, distraction is often used to get you to overlook faulty reasoning or inconvenient facts.  Sometimes it’s even used to conceal a conclusion!  A candidate running for office has one basic position – Vote for me.  If his/her particular position on a particular issue is unpopular, he/she will often use distraction to avoid stating it.  Personal attacks are very popular.  “You’re just saying that because you’re a _____.”  Fill in the blank.  A personal attack is not an argument.  It’s just a distraction.


2. False Premises – Any argument is built on a foundation.  The question is, is the foundation solid?  All it takes is for one premise to be incorrect, and the whole argument will fail.  Human beings are quite vulnerable to first impressions.  Logicians call this the anchoring fallacy.  The first impression tends to “anchor” you, coloring your subsequent observations and conclusions.  Another very common failure is what’s called cherry picking – choosing the facts that support your conclusion, while ignoring those that don’t.  Psychologists are well aware of the phenomenon of confirmation bias – our tendency to screen out anything that rebuts our preformed conclusions, and focus on anything that confirms them.


3. False Generalization – The basic facts and premises may be perfectly sound, but in the process of making generalizations from them, we often overreach.  A classic example is what’s called misleading vividness.  When a spectacular event occurs, such as a plane crash or major terrorist attack, it is vividly presented to us in the media.  Our minds tend to think that something right in front of us, especially something threatening, must indeed be an important threat.  We greatly overestimate the risks of such events, while underestimating the much more prevalent risks of mundane, everyday events that receive little media attention.  Another very common example is reliance on anecdotal information – drawing broad conclusions based on just a few observations, which are usually not collected systematically.  Radio call-in shows are a classic example.  Trying to build a generalization based on the statements of people who call radio shows is like building a tower on a leaning foundation.  The sample is anything but representative of the larger population.


4. Bandwagoning – To me, The Simpsons is like an instruction manual for the human species.  If an alien civilization wants to understand us, they can’t do much better.  There’s a great line from an old Simpsons episode, in which Bart complains to his father, “Dad, you’re giving in to mob mentality!”  Homer’s response is, “No I’m not, I’m jumping on the bandwagon!”  Going with the herd is perhaps one of the most pernicious logical fallacies.  Human beings are social beings, and the urge to join the group is strong, very strong in many people.  In politics, one of the most obvious examples is what journalists call “momentum.”  When a candidate begins to win elections, these wins will often start to snowball.  Many people, especially those who don’t feel strongly about any particular candidate, feel the urge to “get with the winning team.”  And of course there is the behavior of mobs.  People will do things within a group that they would never even consider doing alone.  It is precisely this kind of thing that made the founding fathers add the Bill of Rights to the Constitution – a safeguard against the so-called “tyranny of the majority.”  The majority is often tempted to deprive a minority of their rights, to impose conformity, to squash individuality.


5. Argument from Authority – This one and number 4 might be considered special cases of distraction, but I list them separately because they really deserve special treatment.  There are many sources of “authority” in our lives – parents, teachers, police officers, scientists, churches, government.  Of course most of us obey rules because we know there are consequences if we don’t.  But there’s much more to it than that.  We tend to rely on some source of authority to tell us what we should do, regardless of consequences.  And whatever the source of authority, we tend to believe it when it makes an argument.

Advertisers and others use this principle all the time, in the form of the testimonial.  Someone who looks or sounds authoritative gives the product their endorsement.  One of the old Simpsons episodes has Mr. Burns running for governor, and trying to convince the audience that the 3-eyed fish caught near his nuclear plant is actually a superfish.  What does he do?  “Let’s ask an actor portraying Charles Darwin what he thinks!”  Actors routinely drape themselves with lab coats to peddle medicines on television.  If it didn’t work, they wouldn’t be doing it.

I am reminded of the scene in Miracle on 34th Street, in which little Tommy Mara points out Santa Claus in the courtroom.  The defense attorney asks him, “Tommy, how can you be so sure that’s Santa Claus?”  “Because my daddy told me,” Tommy replies.  “And you believe your daddy, don’t you Tommy, he’s a very honest man?”  “Of course he is,” the boy replies.  “My daddy wouldn’t tell me anything that wasn’t so.”  All of us have a little bit of Tommy Mara in us.  But authoritative or not, any argument must stand or fail on its own merits, nothing else.


6. False Dichotomy – This is definitely one of the most common forms of argument.  You see it all the time in politics and religion.  “You’re either with me or against me.”  “If you vote for him, America is doomed.”  “If you’re pro-choice, you’re not a Christian.”  False dichotomies take advantage of our inherent tendency to engage in what’s called typological thinking – in other words, to split things into categories, even if they lie along a continuum.  Language itself is an exercise in splitting concepts into categories.  No wonder we are vulnerable to these kinds of arguments.


All of these fallacies take advantage of the fact that we are not blank slates.  We are born with biases, needs, and expectations.  There are mountains of psychological and sociological research that give us insights into our own vulnerabilities.  Critical thinking gives us tools to keep us from becoming our own worst enemies.


It is worth noting that all of these so-called fallacies become a lot more justifiable if you don’t have much information.  That’s one reason children are susceptible to them.  Take bandwagoning for example.  Small children don’t even question what the adults around them are doing.  They simply imitate.  If you don’t have access to the facts, you may have to rely on discerning someone’s hidden agenda when they are making an argument.  Or if you only have a few facts, and you must make a generalization – well, that’s life, sometimes.  Life is full of decisions that have to be made based on limited information.  But the point is that we should be as informed as possible, and apply the skills of critical thinking when making them.  As I have said, human beings are not blank slates.  They come preprogrammed with all kinds of predispositions, many of which worked just fine when people had little or no technology.  But in our modern, technologically advanced, highly interconnected, information society, many of these predispositions are obsolete and can hurt us.

Religion and critical thinking – Is there a conflict?

The 20th century saw amazing technological advancements.  They must have been astounding to many Americans born the previous century.  Telephone, radio, television, vaccines, antibiotics, washing machines, refrigerators, air conditioning, automobiles, airplanes, helicopters, rockets, nuclear technology – All of these things were met with a mixture of awe and dread.  The seemingly innocuous telephone was seen by some as an instrument of the devil.  But most Americans embraced new technology because generally it made life better.


In the early 20th century there was a huge move away from farms into cities and suburbs.  Modern agricultural methods and machinery made vast numbers of farm workers obsolete.  In 1900, 41% of the U.S. workforce was employed in agriculture. By 1930, this figure was cut in half, and by 1945 it was down to 16%.  This percentage continued to decline during the course of the 20th century.  Today less than 2% of the U.S. workforce is employed in agriculture, yet America continues to be one of the world’s great breadbaskets.


Along with this transition came increasing levels of education and mobility.  Americans became much more exposed to a variety of people and ideas.  A tension developed between rural people, who remained relatively isolated and parochial, and urban people, whose experiences inevitably made them less rigid and more tolerant in their thinking.  The fight for and against the prohibition of alcohol was largely a struggle between rural and urban America, between white Protestant rural America and much more ethnically, religiously, and culturally diverse urbanites.


During this time, there was a struggle within American Protestantism, between mainline churches who wanted to modify their doctrines to accommodate the changes in society, and fundamentalists, who fought fiercely against this.  In fact, the word fundamentalist comes from a series of essays published between 1910 and 1915, essays intended to express the “fundamentals” of Protestantism that should be defended.  These essays attacked liberal Protestantism, Catholicism, Mormonism, a whole host of isms, but most notably, modernism itself.  Fundamentalism insisted on such things as the historicity of Biblical accounts, such as the stories in Genesis, and in general, a literal interpretation of scripture.  It is for this reason that the subject of the origin of man became such a point of contention, leading to the infamous Scopes “monkey trial” in 1925.


Meanwhile, mainline Christian denominations, such as Methodism and Presbyterianism, were moving away from such positions.  Faced with incontrovertible evidence of the age of the earth, the origin of man, and extensive scholarship concerning the origin of scriptures, these churches began to embrace the view that while the scriptures were indeed God’s word, they must be interpreted through the lens of the cultures in which they were written, and examined using God-given reason.  They began to reject the view that everything in the Bible was literally true, holding instead that the Genesis accounts, for example, should be viewed allegorically, not as literal history.


The response of fundamentalists was to either establish separate fundamentalist congregations within these denominations, or more often, create new denominations.  Thus Pentecostalism was born.  Religious revivals spread across the country, building enthusiasm for fundamentalism.  Along with Biblical literalism, these movements reinforced their adherents’ rejection of many aspects of progressive thought – particularly tolerance of outside religious, ethnic, and cultural groups.  In the early 20th century there was a struggle between the fundamentalists and mainline Protestants, mirroring the battles between rural and urban America.  By mid-century, it seemed that mainline Protestantism had won, at least insofar as the fundamentalists were not able to have great influence in the national political sphere.



After World War II, the baby boomer generation was born.  In 1957, the Soviet Union launched Sputnik, the first artificial satellite, into orbit.  This had an enormous impact on American politics and culture.  America went into a virtual panic over the fact that this supposedly backward, communist country had achieved such an advanced technical feat.  Immediately there was an emphasis on education in general, and science fields in particular.  This carried through the 1960’s, and the baby boomers went to college in numbers never before seen.  College enrollment in the U.S. increased 120% during the 1960’s.  There was a pervasive sense that old parochialisms needed to step aside and make room for the “Space Age,” a new age in which science and technology would further improve lives.  Mainline religious institutions moderated many of their doctrines further, and religious fundamentalism seemed to be a voice in the wilderness.  There was a general understanding that sectarian religion was to be kept out of politics.


But as America beat the Soviet Union to the moon, and high profile assassinations made many Americans cynical about the future, this push for education waned as fast as it had waxed.  The idealism and optimism of the 1960’s gave way to a strong cynicism and materialism in the 1970’s.  Mainline Christian denominations lost membership, and many young people felt adrift, wanting something to attach themselves to, something that would promise rewards here and now.  That something was just waiting to provide it – a new variety of fundamentalism, the charismatic movement.  It used many of the same methods as the religious revivals of the early 20th century – music (this time with a contemporary feel), noisy, lively services, faith healing, and inspiring messages.  But this time there was an important difference.  The prosperity gospel.


Both Protestantism and Catholicism had long taught humility and service to others, and mainline Christianity was still teaching that.  But the charismatic movement brought a very different message.  Not only would your faith heal your body and bring you closer to God, but it would give you material wealth.  Your shiny new house and shiny new car meant that you were one of God’s chosen.  Your rewards were not just in the afterlife, but here and now.  And you didn’t need an education, they said.  All you needed was your faith.  In fact, most of them vilified higher education, as a bastion of secular, ungodly forces.


Millions of young Americans joined these churches.  At the same time, educational attainment in the U.S. dropped.  After years of dramatic increases in the percentage of Americans obtaining Bachelor’s degrees, between 1975 and 1980 that number actually declined sharply.  At the same time, fundamentalists began to dominate the religious airwaves.  Although televangelism was familiar to many people in the 1960’s, in the form of Billy Graham’s crusades, Graham’s message was not a fundamentalist one.  He did not spend his crusades ranting against abortion, evolution, liberalism, or the welfare state – his message was overwhelmingly focused on Christ and salvation.  By contrast, televangelists that sprang up in the 1970’s, such as Pat Robertson, Jerry Falwell, Jim Bakker, and Jimmy Swaggart, were overwhelmingly fundamentalist, and in many cases, overtly political.


Then along came Ronald Reagan.  Not only did Reagan legitimize the materialism that had already established itself in many young minds, inspiring many young people to go into business and law rather than engineering and science, he ushered in the intrusion of religion into American politics that we have been dealing with ever since.  Fundamentalist churches catering to white suburbanites and exurbanites exploded during the 1980’s.  The megachurch, emphasizing entertainment and excitement and greatly minimizing self-sacrifice and social justice, came into its own.  Many fundamentalist churches, and of course televangelists, were highly political, emphasizing abortion, intolerance of homosexuality, and other “culture war” issues.


Today, Christianity is deeply split.  There are liberal, mainline denominations, emphasizing religious and ethnic tolerance, humility, social justice, and service to others, many of which are organized under the National Council of Churches.  And there are fundamentalist churches, emphasizing Biblical literalism, abortion, evangelism, material prosperity, and political conservatism.  Almost every mainline Christian denomination has its liberal wing and fundamentalist wing.  In many cases these are actually separate denominations.  Fundamentalists tend to be fragmented, isolated from other church organizations.  The Southern Baptist Convention, the largest fundamentalist Baptist denomination in the country, is completely separate from other BAPTIST organizations.  But fundamentalist churches tend to be very politically active and often have a presence on radio and/or television.  Fundamentalist organizations like the American Family Association, The Christian Coalition, and the now-defunct Moral Majority are not denominations of Christianity, or even, strictly speaking, religious organizations.  They are political advocacy groups who have a strong presence in the media and in halls of power, constantly fighting for the intrusion of fundamentalist Christianity into the public sphere, and promoting nonsense such as young earth creationism that was rejected by mainline Christian churches decades ago.


By contrast, The National Council of Churches not only includes an array of Protestant denominations, but even some Catholic and Orthodox churches.  Yet its political presence is much less apparent, partly because the very doctrines of these churches include the separation of church and state.  Although these churches do fight for tolerance and social justice, they do so more quietly, and by and large do not proselytize on the airwaves.


Why have I taken you through this recent history of American religion?  Because the question at hand is whether religion is in conflict with critical thinking, and I wanted you to understand that the word religion encompasses a huge range of theologies and political positions, even within Protestantism, which is only one branch of Christianity.  Critical thinking, by and large, is something Americans learn in college.  About 33% of Americans get Bachelor’s degrees these days – let’s see how this relates to religious groupings.  Fully 35% of Methodists and 51% of Presbyterians in America get Bachelor’s degrees.  These are mainline Christian denominations.  Among Jews and Hindus the percentages are even higher – 58% and 67%.  Yet only 16% of Pentecostals and 24% of Assemblies of God members obtain these degrees.  These are fundamentalist denominations.  Fundamentalists have consistently devalued higher education as secular and ungodly, rejected basic science when it conflicts with Biblical literalism, and preached that only faith is necessary for success, here and in the afterlife.


When fundamentalism came into prominence in American culture in the 1970’s, and captured the minds of many young Americans, what happened?  After years of dramatic increases in the percentage of Americans obtaining Bachelor’s degrees, between 1975 and 1980 that number actually declined sharply.  And over the next 20 years, the percentage of Americans receiving Bachelor’s degrees didn’t increase at all.  The flattening of educational attainment after 1975 was particularly dramatic among American men.  Over the next 10 years, the number of American men receiving Bachelor’s degrees declined by more than 5 million, despite the fact that the U.S. male population grew by more than 10 million.  After 35 YEARS, in 2010, the percentage of American men who were obtaining Bachelor’s degrees still had not reached the level seen in 1975.  State support of higher education has declined steadily since the late 1970’s.  Political conservatives have consistently fought against funding for higher education, and encouraged the intrusion of religion into American politics.  Many of the politicians on the forefront of such efforts have come right out of fundamentalist Christianity.


Some would argue, and do argue, that religious belief in general is incompatible with critical thinking, which demands evidence and reasoned argument, not blind faith, in making judgments and forming conclusions.  But I think this misses the point, much as the debate between modernism and postmodernism misses the point.  The fact that someone belongs to an organized religion does not mean they are incapable of critical thinking.  And more importantly, organized religion per se is not an impediment to higher education.  It is in fact religious, often devout people who have done much of the scholarship leading away from fundamentalist interpretations of scripture.  Religious fundamentalism, on the other hand, has demonstrated time and again that it is an adversary of critical thinking and higher education.  Fundamentalism demands adherence to absurdities that any person of intellectual integrity, religious or not, must reject.


I will finish with an excerpt from an essay written by a religious man, Martin Luther King.  During his time in seminary, he wrote a paper on Old Testament stories and their relationships to Babylonian and Sumerian myths.  His conclusion:  “Modern archaeology has proven to us that many of the ideas of the Old Testament have their roots in the ideas of surrounding cultures….we must conclude that many of the things which we have accepted as true historical happenings are merely mythological….If we accept the Old Testament as being ‘true’ we will find it full of errors, contradictions, and obvious impossibilities….”  Dr. King accepted that the Old Testament contains “truths.”  But because he was a man of intellectual integrity, he refused to say it was literally “true.”  Many devout, religious people feel the same way.
