Sunday, December 06, 2009

A fallacy in need of a name

At a party I attended earlier this year, one particularly vehement guest announced that "Russia was justified in bombing civilian targets in Georgia because Georgia had killed ethnic Russian civilians in South Ossetia." I pointed out to the vehement fellow that neither Russian law, nor Georgian law, nor international law, nor moral reason allows for collective capital punishment based on ethnicity or nationality, and that it therefore makes no sense to claim that if the Georgian military killed ethnic Russian civilians, the Russian military was entitled to kill Georgian civilians. Unable to refute this logic, he switched to another argument employing the same logical fallacy, claiming that we as Americans cannot object to the behavior of the Russian government because the American government has killed so many civilians in so many countries. When I pointed out to him the possibility that the Russian, Georgian, and American governments may all, simultaneously, be culpable for their own actions, and that we need not be apologists for any war crimes, despite the multiplicity of criminals, my wife suggested it was time we leave the party. This was a very sensible suggestion, so we did.

I've recently written about the importance of having names for common logical fallacies, and this got me wondering what the name for this guy's fallacy is. It is an extremely common one, and in no way original to him. The fallacy is this: assuming or implying that if one side in a debate/argument/conflict is wrong/guilty, the other side is therefore right/justified. This most often comes up in moral contexts, where the misdeeds of one party are used as a defense for the misdeeds of the other party, but it is also used in disagreements about fact. For example, when it comes out that an evolutionary biologist was wrong about anything, creationists make the leap to treating this as proof of Creation, rather than as simply a flaw in the thinking of one biologist. One side is wrong, so the other must be right, even if the flaw is immaterial to the disagreement. Similarly, it is common in interpersonal disagreements to hear one party respond to an accusation of misbehavior only by pointing out a misbehavior of the other party, without ever replying to the statement about their own behavior. This often goes well beyond "two wrongs make a right" to "my behavior is necessarily right because yours is wrong."

This is also distinct from (and almost the opposite of) the idea of moral equivalence. In the Arab-Israeli conflict, for example, some people argue that the crimes committed by supporters of the two sides are morally equivalent, and that therefore neither side can be held responsible. Each side often responds to this nonsense with the counterfallacy that our behavior, whatever it may be, is justified by their crimes: they are criminals, therefore we are just. I would tend to assume that each party or individual is responsible for its own actions, and that whether or not the crimes are equivalent is morally irrelevant.

So this raises three sets of questions for me:

1. What is this fallacy called? If we want to describe a response as following this pattern, how should we refer to the pattern?

2. Why do humans do this so freely? On some deep level, are we programmed by evolution to use this type of rationalization? Is this culture-specific, or do all humans do this?

3. Why is this particular type of illogic useful/successful in arguments? Perhaps it is particularly effective because it allows one to move from being on the defensive to being on the offensive, to accuse rather than excuse (or to effectively do both at once)? Or maybe it is just very hard for people to find fault in both sides of a conflict at once, so by making them see the fault in the other side you make them forget about the fault in your own?

I'd appreciate hearing your thoughts on this, particularly what this type of fallacy should be called, or if it already has a name. I've checked Wikipedia's list of fallacies, and it ain't there.


gml said...

I searched the web and found no clear explanation and no satisfactory term; "excuses for revenge" merely labels the phenomenon without explaining anything. The best I can do is to guess that this form of illogic often comes from the natural human desire to hurt others who have hurt us ("an eye for an eye"; "lex talionis"; "tit for tat"), followed by the threshold of justification being lowered in response to that intense desire. It can be called rationalization, though that term applies to two nearly opposite actions. The first is inventing a superficially rational excuse for something we want to do anyway; the second is actually making something that is irrational and poorly organized into something rational and better organized.

Dan Levitis said...

It is, in a sense, a form of rationalization, but that is a name for the purpose for which the fallacy is employed, rather than for the fallacy itself.

jte said...

If you're going to try to come up with terms that specify types of fallacies, you should keep them distinct. You are clearly describing more than one type of fallacy here. There's the "two wrongs" fallacy, which is distinct from the "they wrong therefore we right" fallacy (because in the first, both wrongs are made "right," whereas in the second the claim is only that one of the wrongs is made right by virtue of the existence of the other wrong). Some of this reminds me of the notion of reciprocity--maybe you want to specify a "reciprocal" fallacy, one that encompasses fallacies of the sort: "our behavior is justified so long as it matches the behavior of others." An alternative name might be the "ungolden rule" fallacy--morality means treating others as they have treated you. These more closely match the "two wrongs" type fallacy than the "they wrong/we right" fallacy. I can't think of a compact name for that second type... Maybe the "rubber-glue" fallacy? (From the youthful taunt, "I'm rubber, you're glue, whatever you say bounces off of me and sticks to you.") (So, not to be confused with some other conceivable "rubber cement" fallacy--something that might be troublesome in translation.)

As for why: my personal and longstanding hypothesis is that rationalization is a deep psychological need among human beings, certainly the Western culture human beings I've known. It's almost as though "rationality" is an emotional state, rather than a method for thinking through a situation. People want to feel as though they are rational, and will grasp at whatever straws are handiest to attach that feeling to whatever illogical belief they are simultaneously expressing.

Dan Levitis said...

I'm not clear which fallacies you're suggesting I'm conflating. I make a clear distinction between the "two wrongs" fallacy (justification through moral equivalence) and the other fallacy, for which we don't have a good name. Perhaps the Concentrated Fault Fallacy? All fault must lie in one place, and because you are at fault, I am not.

gml said...

The point may not be concentration of fault, exactly; it may be balance of fault, as symbolized by the scales of justice. "Thou art weighed in the balance and found wanting." Perhaps enough load of badness in another lifts up your (good) side of the scales; it is much more comfortable to locate the badness outside yourself. GML

GreenEngineer said...

Hi Dan! I somehow lost track of your blog, but I've added it to my Google Reader list now.

Anyway, you're looking at what I tend to think are two different effects that often overlap.

The tendency to respond to an accusation with a counteraccusation is, I think, just a very common defensive/coping mechanism. There are any number of ways to redirect an accuser's attention, but putting them on the defensive is often one of the most effective.

The other effect is, I think, more profound. I think of it as a propensity for false dichotomy: the tendency to think of everything in either/or terms. It's so very common that I think it probably reflects a basic limitation in human cognition. Also, it has obvious utility as a form of reflexive reductionism, a way to reduce the complexity of a situation to a few binary possibilities. In many cases, particularly simple ones, it's a pretty decent approximation.

It is not well adapted for the complex thinking required to understand modern technical, economic, or political issues, but that's true of most of humanity's thinking tools. We're very much better with proximate, concrete, immediate problems than we are with abstract, distant, future problems.

As far as I can tell, this cognitive limitation is more easily overcome by "smart" people than by "average" people, but education and training are really the more important factors. (In fact, insofar as "smart" people tend to expect to actually understand things and have definite answers, those expectations can encourage this failing.) It's something that I have tried to teach myself to be on the lookout for, but I still fall into the pattern if I'm not careful.