The Futility of Argument
Nicholas Fletcher
Marion Technical College
February 23, 2014
Futility 2

People argue a lot, and about quite a wide range of topics, from incredibly simple questions like where to order pizza from or what color to wear, to extremely polarizing ones like gun control or the death penalty. Everyone has an opinion on these topics, and some will argue their chosen side for hours. However, almost no one ever changes their mind about anything important because of these arguments. This is due to a long list of faults in human reasoning known as cognitive biases (Heuer Jr., 1999). These cognitive biases make most arguments mere exercises in futility.

Before even taking these biases into account, there is a huge stumbling block in the path of any intelligent argument. Everyone is different; people have different values and different experiences. Combine different values and experiences with the charged emotions that come with an argument, and each person arguing sees the other as stupid, crazy, or both. By assuming the other person is crazy or stupid, people can simply ignore the ideas of the person they are arguing with.

Even if the people arguing can understand how each other think, there is still a long way to go before anyone could be talked into changing their mind. Oftentimes, people think the person they are arguing with is being deceitful about their claims, even when they are being truthful; this gap between how truthful someone seems and how truthful they actually are is known as the "trust gap" (Dean, 2010). The real problem lies in the fact that when someone decides that another person is lying, they rarely discover that they are wrong; they just feel good about having spotted a liar so easily (Dean, 2010). The most dangerous part of this effect is its self-reinforcing nature.
If someone thinks everyone is lying to them, they will be convinced that what they hear is a lie, which in turn gives them more evidence that everyone is a liar, and so on in an endless loop (Dean, 2010). The trust gap is where claims like "Those liberals don't really want to help the poor, they're just secretly communists!" come from; their rivals do not trust them and therefore assume they are lying. As before, when someone assumes the person they are arguing with is lying, they do not need to address what that person actually said; they can just ignore the other person's argument.

Then there are issues with probability. The human mind actually has several issues with probability and statistics. One such problem is known as the availability rule. Under the availability rule, people see events they hear more about as being more likely, with "availability refer[ring] to imaginability or retrievability from memory" (Heuer Jr., 1999). This normally works rather well; things that people hear about more often usually are happening more often. However, the availability rule can result in some very skewed priorities. For example, events that receive a lot of media coverage will be seen as a bigger problem than other events, even if those other events have a greater impact overall. Think about school shootings compared to industrial accidents; people hear more about the former, and therefore care more about the former, even though the latter kills more people. But if someone tries to argue that the latter is more important, most people would react by saying something in the vein of "But think about the kids!" This is how emotions can amplify the filtering of the availability rule; if people care more about one thing, they will discount other things even further.

In addition to being skewed by hearing about something all the time, experiencing something personally can really skew a person's perception of probabilities (Heuer Jr., 1999).
For example, imagine two people, identical in every way, except that one of them had a sibling die in a skydiving accident. If these two were both offered a free skydiving trip, the one with the dead sibling would see it as riskier, even though the risk is the same for both of them. This too is caused by the availability rule; the one with the dead sibling is more personally aware of the risks of skydiving and so sees such risks as more likely.

Of course, the availability rule is not the only issue the human mind has with probability and numbers. There is also a phenomenon known as anchoring, in which people attach great significance to the first number they hear in a given situation (Heuer Jr., 1999). By attaching significance to the first number, other numbers are distorted by comparison; if the first number is too low, further adjustments are likely to all be too low (Heuer Jr., 1999). For example, car prices are listed as higher than the salesman's target price, so when the buyer talks the salesman down from the listed price, the buyer feels good about getting a discount, but the salesman got exactly what he wanted. Just about any number can form an anchor and skew further estimates; even random and unrelated numbers can form this anchor and skew logic (Heuer Jr., 1999).

There is a range of other issues the mind has with probability and numbers, but even these two can sink an argument. One person tries to use statistics and the other misunderstands them or ignores them entirely. When emotions get into the mix, such as in an argument about a polarizing topic, the effect of these number errors is amplified, making statistics useless in heated arguments. Even if people could truly understand the numbers their argumentative partner was using, there is a good chance they might ignore them anyway. This is due to an effect known as confirmation bias.
Confirmation bias is the strengthening of one's argument in response to those of rivals, and it is often caused by ignoring evidence contrary to one's position while focusing on evidence that confirms it (Crowley & Zentall, 2013, pp. 286-287). For example, think about the common superstition that emergency rooms have more admissions during the full moon. Someone who has heard of this might notice a night with lots of admissions that happens to fall on a full moon and treat it as evidence that the superstition is true. However, that same person would not notice or care about high-admission nights that were not on the full moon, or full-moon nights with low admissions. That is confirmation bias in action. In arguments, people often apply confirmation bias to any evidence used by the people they disagree with. With people disregarding and ignoring evidence that does not fit what they have chosen to believe, it is quite difficult to get someone to change their mind.

Another common issue is that more vivid information is more likely to be taken into account (Heuer Jr., 1999). This ties into the availability rule, but it is a problem on its own as well. Information is easier to remember if it is easier to relate to and understand. Stories about individuals are treated as more important than data about large groups; things seen directly are treated as more important than things heard about indirectly (Heuer Jr., 1999). This usually acts as a fair guide (after all, "You can prove anything with statistics"), but sometimes it can throw ideas well out of balance and inflate their importance far above what they would have in a purely logical mind. Imagine a new model of car. According to surveys of thousands of people, it is a nice and reliable car; according to one guy talking on the street, it is a horrid car that barely works.
The logical thing to do is to consider that one guy as part of a larger data set and maybe try to figure out why he is such an outlier; however, most people would consider that one guy's words to be worth as much as the surveys (Heuer Jr., 1999). They may even consider his words to be worth more than the surveys if they are friends of the man or related to him. If two people are arguing about things they saw, both have very vivid evidence and will find it very hard to convince the other about what they saw. Conversely, if two people are arguing about some abstract idea, neither has vivid evidence to use, leading to the same problem. Either way, the vividness of evidence can cause great trouble for any attempted intelligent argument.

Overall, it seems the human mind is not built for intelligent arguments. Between difficulties understanding others, not trusting people, and difficulties understanding what evidence is important, the human mind just does not seem capable of reasonable discussion of heated topics. So why do people argue? One common theory answering this question is known as the Argumentative Theory of Reasoning, which states that argument was invented and developed not as a method to convince others that you were right, but as a method to establish dominance in communication and to spread information by force (Crowley & Zentall, 2013, pp. 282-238). As a result, people consider their enemies to be liars (leading to the trust gap) and evidence counter to their position to be worthless (thus confirmation bias). Therefore, a rational, intelligent argument is simply something the human mind is nearly incapable of having.
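The selective counting behind confirmation bias can be made concrete with a short simulation of the full-moon superstition discussed above. This is a hypothetical illustration with made-up numbers, not drawn from any of the sources cited here:

```python
import random

random.seed(1)  # make the illustration repeatable

# Simulate a year of emergency-room nights. Admissions are drawn from the
# same range every night, so the full moon has no real effect at all.
nights = []
for day in range(365):
    full_moon = (day % 29 == 0)          # roughly one full moon per lunar cycle
    admissions = random.randint(20, 60)  # identical distribution every night
    nights.append((full_moon, admissions))

# Call a night "busy" if it has more than 50 admissions.
busy_full = sum(1 for fm, a in nights if fm and a > 50)
busy_other = sum(1 for fm, a in nights if not fm and a > 50)
full_count = sum(1 for fm, _ in nights if fm)

# An unbiased observer compares busy rates across all nights...
print("busy rate on full-moon nights:", busy_full / full_count)
print("busy rate on other nights:   ", busy_other / (365 - full_count))

# ...but a biased observer counts only busy full-moon nights (busy_full)
# and ignores the other cases, "confirming" the superstition.
```

Because every night draws from the same distribution, any apparent full-moon effect in a single run is pure noise; that only becomes visible when all four combinations of (full moon, busy) are counted, which is exactly the comparison confirmation bias skips.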
References

Crowley, P. H., & Zentall, T. R. (Eds.). (2013). Comparative decision making. Retrieved March 23, 2014, from http://rave.ohiolink.edu/ebooks/ebc/9780199856800

Dean, J. (2010). The trust gap: Why people are so cynical. Retrieved March 23, 2014, from Psych Central: http://psychcentral.com/blog/archives/2010/03/30/the-trust-gap-why-people-are-so-cynical/

Heuer Jr., R. J. (1999). Psychology of intelligence analysis. Retrieved March 23, 2014, from The Air University: http://www.au.af.mil/au/awc/awcgate/psych-intel/index.html