Arguments in Context Unit III An Introduction to Evaluation Chapter 9 Bias and Motivated Reasoning


CHAPTER 9 Bias and Motivated Reasoning

SECTION I INTRODUCTION

We know how to unpack and analyze arguments, and we know the relevant features for evaluating them. Now all we need to do is put our knowledge and skill to work, right?

Unfortunately, no. Things are not so simple, since we are subject to a variety of biases and cognitive illusions that can make accurate and objective evaluation difficult. Although these illusions and biases are pervasive and often unconscious, we can adopt a number of strategies to limit their effects on us. In this chapter we will start with a consideration of one of the most pervasive forms of bias, confirmation bias, and see that simply having a preference for an idea can bias our thinking about it. We will then look at some of the factors that influence our preferences, and conclude with some techniques to mitigate the effects of bias.

SECTION II CONFIRMATION BIAS

We have all seen cases of biased thinking. Take, for instance, the serious sports fan who unfairly dismisses the possibility that her team might lose, the political partisan who can't see past his own views, or the mother who simply won't believe that her child could have done such a thing. Although we see it in others, we tend to think that we are, on the whole, fair and impartial. Other people are biased, but we are not. Unfortunately, we are often wrong about this. At least sometimes, we are not the neutral, detached evaluators we think we are, and we are guilty of the same biased thinking we see in other people. In general, a bias is a preference that inhibits impartial evaluation. In this chapter we will focus on a broad category of bias, namely confirmation bias. Confirmation bias is the propensity to let our impressions, beliefs, and interests preferentially influence our evaluation of evidence. In general, we have a preference for views that conform to our existing beliefs and interests, and it can be difficult to separate our preferences from the evidence. What is especially troubling about confirmation bias is that biased reasoning can feel the same from the inside as unbiased reasoning! That is, the fact that it feels like we are being fair and neutral in our evaluation of the evidence is no guarantee that we are actually being fair and neutral.
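A quick way to see how preferential weighting of evidence works is with a toy calculation. The sketch below is purely illustrative (the evidence items, weights, and the 50% discount are made-up assumptions, not figures from this chapter): two evaluators receive the same perfectly balanced evidence, but the biased one counts evidence against the preferred view at only half its real weight.

```python
# Toy model of confirmation bias as asymmetric evidence weighting.
# All numbers are illustrative assumptions.

def confidence(evidence, opposing_discount=1.0):
    """Sum evidence weights. Positive weights support the preferred view,
    negative weights oppose it; opposing items are scaled by
    `opposing_discount` (1.0 = fair, below 1.0 = biased)."""
    total = 0.0
    for weight in evidence:
        if weight < 0:
            total += weight * opposing_discount  # discounted opposition
        else:
            total += weight                      # full credit for support
    return total

# Four items for the view and four equally strong items against it.
mixed_evidence = [+1, +1, +1, +1, -1, -1, -1, -1]

fair = confidence(mixed_evidence)                           # 0.0: balanced
biased = confidence(mixed_evidence, opposing_discount=0.5)  # 2.0: net support
```

Both evaluators saw exactly the same eight items, yet the biased tally comes out positive. From the inside, the biased evaluator has simply "followed the evidence."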
The effects of confirmation bias can vary. In general, however, the more personally invested we are in the truth or falsity of some claim, the more difficult it is to objectively assess the evidence for or against it. When we are personally invested in the truth or falsity of some claim, we tend to focus selectively on information that supports our preferred view. Moreover, we tend to overestimate the evidential value of this supportive information, and to underestimate the evidential value of information that challenges our preferred view. For example, a person who has politically liberal beliefs will tend to pay closer attention to information that supports his political views, and less to information that raises doubts about them. Furthermore, such a person will tend to see the evidence for his political beliefs as being stronger than it really is, and the evidence against his view as being weaker than it really is. In this vein, consider the old saying that a man who is his own lawyer has a fool for a client. Unlike many proverbs, this one offers good advice, and confirmation bias explains why: when your guilt or innocence is on the line, it can be very difficult to objectively consider the merits and faults of the case against you, as well as the strength of your own defense. Thus, it is best to step aside and let someone who has less personal investment take over.

100 THADDEUS ROBINSON

However, we need not have a strong personal investment in some claim in order for the effects of confirmation bias to kick in. The impartiality of our evaluations can be undermined by our own perspective in a variety of ways. A particularly infamous case of this is the justification for the U.S.-led invasion of Iraq in 2003. In February of 2003 the Secretary of State, Colin Powell, stood before the United Nations and made a case for the invasion of Iraq. Central to this case was his assertion that Iraq was in possession of weapons of mass destruction. Soon thereafter Iraq was invaded, but no evidence of such weapons was ever found. A subsequent review of how the intelligence community could have made such a mistake came to the conclusion that what we have called confirmation bias played a big role. The review noted that the Intelligence Community (IC) suffered from "a collective presumption that Iraq had an active and growing weapons of mass destruction (WMD) program. This dynamic led Intelligence Community analysts, collectors and managers to both interpret ambiguous evidence as conclusively indicative of a WMD program as well as ignore or minimize evidence that Iraq did not have active and expanding weapons of mass destruction programs." According to this report, the intelligence community collectively expected Iraq to have such a program, and this shared expectation led to an unintentionally biased evaluation of the evidence, and ultimately to a decision to take military action.

Consider another case. Scientists who study the efficacy of new drugs or treatments know that people's wishes, desires, and expectations can bias their studies. Obviously, a patient will want the new drug or treatment to work, and this can affect how they evaluate their own state. In this way, a patient might feel as if the drug or treatment is working even when it is not, for example. In part to limit these kinds of effects, researchers normally split subjects into two groups: a group that gets the experimental treatment and a group that does not. The group that does not get the experimental treatment gets a placebo instead (a treatment with no medicinal value, e.g., a sugar pill). Neither group knows whether they've received the experimental treatment or a placebo; they are "blind" to this factor, as researchers put it. Since the test subjects do not know what treatment they received, researchers can separate biased effects that are the result of people's preferences from effects that are the result of the treatment in question. (Note: we will talk more about the reasoning behind this kind of experimental design in the unit on Scientific Reasoning.) Subjects are not the only possible source of bias in this kind of context. The researchers' own preferences for one result or another can lead to an unconsciously partial evaluation of the evidence as well. After all, researchers want these experimental drugs or treatments to work too.

In order to prevent these biases from coloring the results, researchers can effectively blind themselves so they do not know who has received the placebo and who has not (until the end of the study). Studies conducted in this way are called double-blind studies, since neither patients nor researchers know who is getting the treatment and who is merely getting a placebo.
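The blinding procedure described above can be sketched as a short program. This is a simplified illustration of random assignment and coding, not an account of how any actual trial is run; the function name, the "kit code" scheme, and the subject labels are all invented for the example.

```python
# Sketch of double-blind assignment. Real trials add consent,
# stratification, and regulatory safeguards; this is illustrative only.
import random

def assign_double_blind(subject_ids, seed=0):
    """Randomly split subjects into treatment and placebo arms.
    Returns (codes, key): `codes` maps each subject to an opaque kit
    code that subjects and raters see; `key` maps each subject to the
    true arm and stays sealed until the study ends."""
    rng = random.Random(seed)
    shuffled = subject_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    key = {}  # subject -> true arm (kept sealed during the study)
    for s in shuffled[:half]:
        key[s] = "treatment"
    for s in shuffled[half:]:
        key[s] = "placebo"
    # Everyone involved day-to-day sees only an opaque code, not the arm.
    codes = {s: f"kit-{rng.randrange(10**6):06d}" for s in subject_ids}
    return codes, key

subjects = ["s1", "s2", "s3", "s4", "s5", "s6"]
codes, sealed_key = assign_double_blind(subjects)
# Raters record outcomes against codes; the key is opened only at analysis.
```

Because subjects and raters handle only the opaque kit codes, neither side's expectations can steer the assessments; the `sealed_key` mapping is consulted only once the outcome data are in.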

Moreover, we know that blinding makes a difference. Partly to emphasize the importance of blinding, a group of researchers studying treatments for multiple sclerosis (a debilitating and currently incurable disease that attacks the nervous system) decided to conduct two versions of the same study: a blinded version and an unblinded version. The medical aim of the study was to discover whether a promising new treatment was really more effective than a placebo. Here is how they set up the two versions of the study. First, researchers divided patients into different categories: some patients received the actual treatment while others received a placebo. Second, they split the researchers into two groups. One group knew which patients had received the placebo and which had not; that is, they conducted the study unblinded. The other group of researchers were blinded; that is, they were prevented from knowing who had received the treatment and who had not. Both blinded and unblinded researchers were asked to examine the test subjects at regular intervals over a period of years.

Unfortunately, the treatments turned out to be no more effective than a placebo. However, on the whole, the unblinded researchers determined quite the opposite: they took the treatments to be effective! The researchers surely did not set out to make biased judgments. Presumably, they wanted to know whether the treatment was effective as much as anybody else and sought to be as objective as possible. Nonetheless, their assessment was biased, and this illustrates the deceptiveness of bias. Again, the problem is not just that our evaluations can be biased by our own preferences, but that they can be biased despite our sense that they are not!

SECTION III SOCIAL INFLUENCES

Given that even slight preferences can, unbeknownst to us, influence our evaluations, it is important to have a sense for the forces that influence our preferences. Of course, we live in a complex world, and there are all kinds of factors that influence us.
Nevertheless, perhaps the most important influence on our preferences is social. The crucial observation here is that beliefs can have social value. We first raised this point in an earlier chapter, where we noted that our desire to be seen by others in a particular way can undermine the goals of cooperative dialogue. In addition, the social value of an idea can create preferences that undermine fair evaluation. Let's see how this works. The social value of a belief has to do with what other people will think upon finding out that you have it. More specifically, when somebody else thinks better of you because you have a particular belief, that belief may have positive social value for you, and when they think worse of you in virtue of it, the belief may have negative social value for you. Whether a specific belief or idea has positive or negative social value for you depends on a number of factors that are worth making explicit. First, it is important to understand that not every idea has social value; in fact, many do not. For example, suppose someone says, "I think the grocery store should get new grocery carts." Here she is expressing her opinion, and while people might agree or disagree with her about this, it is unlikely that anybody is going to think better or worse of her in virtue of this opinion. Second, the social value of an idea depends on who is doing the judging. Suppose, for example, that she says, "I think that people who go to church on Sunday are pretty much wasting their time." We can easily imagine that people's responses to this claim might vary. Suppose that upon hearing it, her friend Maddy responds with disappointment, whereas another friend, Lori, responds in a different way, saying, "At last, somebody brave enough to tell it like it is." In this case, her belief will have positive social value for her with respect to Lori, but negative social value with respect to Maddy. Third, the social value of an idea depends on whether, and to what extent, we care what the person who is doing the judging thinks. In general, we want to be liked by other people, but this does not mean that we care what every single one of our acquaintances thinks of us. Moreover, we care to varying degrees. What your mom or best friend thinks probably matters a lot more to you than what your neighbor or dentist thinks about you. To illustrate, suppose that she has recently fallen out with Maddy, no longer regards her as a friend, and as a result does not particularly care what Maddy thinks. In this case, her belief may have little to no social value with respect to Maddy, even though Maddy thinks less of her because of her belief.
How does the fact that ideas or beliefs can have social value relate back to the question of bias?

The answer is that our desire to be liked by people who matter to us can give us a preference for ideas and beliefs that we think might have positive social value for us, and a preference against ideas that we think might have negative social value for us. As we have seen, simply having a preference for an idea can unknowingly lead us to biased and unfair thinking. More specifically, this means that we will tend not to fully investigate or fairly evaluate ideas that have negative social value for us, and not be adequately critical of beliefs that have positive social value for us. This is something we need to keep in mind, so that, at least when it really matters, we stop to think about the extent to which our thinking is being influenced by our desire to be liked (or not disliked) by other people.

SECTION IV PEOPLE AND IDEAS

Ideas can have social value for us because we care about what our friends and family think. However, we can turn this around and think about it from the opposite perspective as well. After all, just as you care what your friends and family think about you, so too your friends and family care about what you think about them. As such, you contribute to the social value of ideas for your friends and family. That is, if you think less of one of your friends for believing something, and they care about what you think, then that belief may have negative social value for your friend. The fact that we can have this effect raises a question about what kinds of attitudes we should take toward people on the basis of their ideas and beliefs. How should we think about people when they disagree with us, propose dubious ideas, or ask questions we don't like?

The short answer to this question is that we should only think worse of a person on this basis if we are justified in doing so. Of course, it is a much tougher question to settle when we are justified in doing so, and like many questions we will take up, there is no simple answer. However, there are a couple of observations worth keeping in mind. First, recall from an earlier chapter that we have a tendency to unfairly vilify people who disagree with us, or who disagree with our in-group (a group of people we identify with as members). That is, we have a tendency to unjustifiably think less of people in these circumstances. There is no doubt that there are people who are ignorant and unethical in ways that justify thinking less of them as people. However, it is important to emphasize that the mere fact that somebody disagrees with us does not, all by itself, give us much reason to think less of them as a person. After all, there are many explanations for why someone might disagree with you apart from some moral or intellectual flaw: it may be that you have relevant information or experiences they do not have; it may be that they have relevant information or experiences that you do not; alternatively, it may be that you have the same information, but draw on different but reasonable principles or values to evaluate it. Given this, in many cases, leaping to the conclusion that there is something wrong with a person, or thinking less of them solely because they disagree with you, is not only poor reasoning but is also unfair to the other person. This brings us to the second point. We should be careful about thinking worse of people in cases of disagreement because doing so can undermine our own ability to think clearly and fairly. After all, judgments about a person's character or intelligence often themselves embody and generate preferences for and against individual people.
As we have seen, once we have even a slight preference for or against a particular person, the engine of bias can kick in with respect to other things that person says. Once a person disagrees with us, for example, we might begin to think less of them as a person. If we think less of them, then we may be more inclined to unfairly evaluate things they subsequently tell us. This can hurt us: after all, when we think less of others and thereby prematurely discount what they say, we can unnecessarily lose out on relevant information or points of view that might otherwise inform and improve our own thinking. In sum, the fact that ideas can have social value raises a question about what we should infer about a person when they disagree with us. This is complicated, but given that (i) we have a natural tendency to unfairly think worse of those who disagree with us, and (ii) doing so can lead us to think in biased ways, we should be very careful about thinking less of a person on the basis of disagreement.

SECTION V PREFERENCES FOR PEOPLE

We have just seen that a preference for or against a person can influence how we think about what they say. As it turns out, this is very common, and it happens in all kinds of ways and circumstances. People who have worked in sales, for example, know that if customers are favorably inclined toward them personally, they are more likely to accept what they say (all things considered), and if they are less favorably inclined toward them, their customers will be less likely to accept what they say (again, all things considered).

In his book Influence: Science and Practice, Robert Cialdini summarizes a number of the elements that can influence our feelings about other people. He begins by pointing out that we tend to be favorably inclined toward people we find physically attractive in some way. Cialdini notes that a great deal of psychological research has shown that we have a propensity to unconsciously attribute a range of favorable traits to people we find physically attractive. We tend to see people we find attractive in some way as being more intelligent, more honest, and more talented than we otherwise would. There is evidence that attractive job applicants are more likely to get hired than less attractive, but equally qualified, job applicants, and that attractive defendants (in criminal cases) tend to get lighter sentences than unattractive people in the same position. Moreover, there are a number of studies that suggest that physical appearance is an element in our decisions about whom to vote for. In one study, Alexander Todorov and his colleagues presented people with pictures of candidates for the Senate and House of Representatives and asked them, on the basis of the pictures alone, to rate each candidate's competence. He then compared the perceived competence of the candidates to election outcomes and found that the more competent-looking candidate for the Senate won roughly 70% of the time, while the more competent-looking candidate for the House of Representatives won roughly two-thirds of the time. These results suggest something surprising: namely, that the mere physical appearance of a candidate contributes to our voting choices. A second factor is similarity. We tend to be favorably inclined toward people who are similar to us in some notable way: dress, background, beliefs, hobbies, etc. Thus, we are likely to be favorably inclined toward people who are from the same area of the country as we are, or like the same music we do, or have the same political beliefs. On one level, this is probably not all that surprising.
What may be surprising is the extent to which this influences our behavior. For example, Cialdini draws attention to a study which found that people are twice as likely to complete and return a survey if the survey is sent by a person with a similar name! Third, a great deal of research in social psychology has shown that we tend to be favorably inclined toward people with whom we are working on a shared task. Thus, for example, players on a team tend to be favorably inclined toward their teammates. Again, this is probably not too surprising. What is important to note is that people can take advantage of this phenomenon by creating a shared task or goal for us. Cialdini notes that a common technique among car salespeople during price negotiations is to act as if they have taken your side against the sales manager. In doing so they have created a shared task: together, you and the salesperson are working against the sales manager. As such, you will tend to be more favorably inclined toward the salesperson than you might otherwise have been, and consequently more likely to be less skeptical of his or her claims. Another example is the police tactic of playing Good Cop/Bad Cop during interrogations of suspects. In these cases, one officer plays the role of Bad Cop by adopting an actively hostile and suspicious stance toward the suspect. The other officer plays the role of Good Cop by adopting a friendly and helpful stance toward the suspect. In this situation it is easy for the suspect to see the officer playing the Good Cop as an ally against the Bad Cop, and consequently to be favorably inclined and less guarded toward them. As these examples show, we have a propensity to assume that when a person has one positive feature, they probably have other positive features as well. Psychologists refer to these kinds of associations as halo effects.
The idea is that positive qualities radiate a halo that makes other features of a person look positive too (note that this works in reverse: negative features can radiate a halo that makes other features look negative as well).

It is important to emphasize that in these cases nobody is thinking to themselves, "Since he is handsome, I bet what he says is true too!" Put so explicitly, we all recognize that this is a poor argument. Nor are most people consciously deciding whom to hire, convict, or vote for on the basis of appearance. Most of us recognize that making decisions on such a basis would be manifestly unfair. These halo effects, like the biases we have looked at, occur largely below the level of conscious awareness. Further, these factors do not dictate or determine our choices; the effects are much more limited. Nevertheless, they are still troubling. After all, factors like appearance and similarity are not relevant in most cases, and so should not influence our judgments and decisions. In light of this, we need to keep in mind that when we walk away from a personal interaction with a vague fondness for or aversion to a person, we may well be experiencing a halo effect based on their similarity to us, their attractiveness, their agreement with our ideas, etc. This is not to say that our interactions with other people cannot give us good evidence that they are honest or trustworthy; surely they can. The problem is that our impressions are not always based on relevant features of people, and moreover, we often do not have any idea what features or cues our impressions were based on in the first place. Ultimately, then, when it comes to important decisions involving other people (hiring, for example) we should take the extra time to explicitly identify relevant factors as a way of limiting the impact of inappropriate halo effects.

SECTION VI COUNTERING THE EFFECTS OF BIAS

As we have seen in our discussion of biases, we are often unaware of our preferences and unaware that they are influencing our reasoning.
There is something disturbing about this: we think that we are in control of what we believe and decide, but these biases suggest otherwise. They suggest that our thinking and deciding are often influenced by external forces outside of our conscious awareness. When it comes to biases, then, the question is: how can we take control of our thinking and avoid biased thinking and deciding? The most important thing we can do is be aware that we are subject to these biases, and pay attention to cases where they are especially likely to be at work. Simply knowing that you have a real personal stake in a conclusion, and that as a result you are likely to be biased towards it, can have dramatic effects. After all, this realization puts you in a position to monitor your own thinking, and to ask yourself whether you have overestimated the evidence in favor of your preferred view, ignored relevant information, or underestimated the evidence against it. This might sound like obvious advice, and it is, but it is advice that we tend not to follow! As we saw in an earlier chapter, in everyday life we naturally respond to our environment in a variety of ways. Automatic and unconscious reasoning processes generate intuitive reactions to, and impressions of, people, places, and ideas, and these impressions inform our conscious reasoning processes. Halo effects are good examples: it is a largely automatic reasoning process that gives us a favorable impression of a person that we find appealing or similar to us in some way. Many of the intuitions and impressions formed through these processes rise to the level of consciousness (though not all), at which point they become available for use in conscious reasoning processes. Think back to the ball and bat example from an earlier chapter. When you first looked at the question, the answer presumably came immediately to mind. At the conscious level we can either accept the results of automatic processes, that is, we can take for granted our intuitions, impressions, and ways of thinking, or we can treat them skeptically, as we should in the case of the bat and the ball. As it turns out, however, people rarely question their impressions, intuitions, and ways of thinking. We normally take our impressions for granted and use them as the starting points for our thinking. We do this partly because overriding our impressions and normal ways of thinking is hard work, and in general we tend to avoid expending mental energy if we can. The Nobel laureate Daniel Kahneman explains:

A general "law of least effort" applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.

Just as we tend to complete physical tasks using as little energy as possible, so too do we tend to avoid mental exertion if we can. Kahneman puts this in terms of laziness, but other psychologists have used different terms. Keith Stanovich, for example, makes this point by saying that we are "cognitive misers" who are stingy with our energy. Stanovich makes a particularly effective case for the importance of learning to distinguish and evaluate our automatic impressions and ways of thinking. He writes:

Humans are cognitive misers because their basic tendency is to default to automatic processing mechanisms of low computational expense... Nevertheless, this strong bias to default to the simplest cognitive mechanism means that humans are often less than rational. Increasingly in the modern world, we are presented with decisions and problems that require more accurate responses than those generated by automatic processing. These processes often provide a quick solution that is a first approximation to an optimal response. But modern life often requires more precise thought than this. Modern technological societies are in fact hostile environments for people reliant on only the most easily computed automatic response. Think of the advertising industry that has been designed to exploit just this tendency... When we are on automatic processing we lose personal autonomy. We give up our thinking to those who manipulate our environments, and we let our actions be determined by those who can create the stimuli that best trigger our shallow automatic processing.

The point is that it is important that we learn to question our intuitive responses and ways of thinking, and to learn that we cannot always trust the way things seem to us. Not only are our impressions sometimes inaccurate, but people can take advantage of these largely automatic processes to manipulate us. The obvious solution to this problem is to treat our impressions more skeptically.
Of course, we cannot do this all of the time. We have too much to do, our environment changes too fast, and we do not have enough energy to stop and think about all of our impressions. The simple fact is that we will have to trust most of our intuitive responses and ways of thinking. However, when something important is at stake we should slow down and carefully identify and evaluate the impressions that drive our thought. We do so by slowing down and identifying the sources of our impressions: What about this person or product or situation is striking me as good or bad?

Are those features really indicators of goodness or badness?

We will not always be able to isolate these features, but in taking the time to briefly ask ourselves these questions we thereby exercise more control over our thinking and deciding than we would have otherwise. In addition to simply being aware of the possibility of bias, what other steps can we take to mitigate bias?

The most important step you can take is to separate yourself from the inquiry or idea. In some cases, you can literally do this, as in the case of the lawyer who entrusts his defense to someone else, or the researcher who blinds herself to the experimental identity of the subjects. This strategy, however, is not realistic for most everyday situations. A second strategy for mitigating bias is to discuss the issue with other people, that is, to seek cooperative dialogue. A person without the same set of preferences may be able to see weaknesses (or strengths) that you would have had a hard time seeing on your own. Unfortunately, cooperative dialogue is not always an option either. In this event, the best we can do is try to separate ourselves from the idea using our imagination. The idea is to try to envision the claim in question from a critic's perspective. How would the critic object?

Put otherwise, the idea is to play devil's advocate with your own views. There are a number of ways to do this. One way is to imagine that you have to defend your view in front of a critical audience. What objections would such an audience raise?

How would they argue against your position?

What evidence would they draw upon?

An alternative is to pretend that you have the opposite view to the one you actually have. What would you say on behalf of your view?

How would you criticize your opponents?

Alternatively, you might imagine a future in which it turns out your view was mistaken or your decision the wrong one. You can then think about the information you would have to gather in order to understand why you made the mistake. Once you have that information in hand, you can take it into account in coming to your view or making your decision. In sum, taking a different perspective in these ways can reveal unseen strengths and weaknesses in our views. There is no doubt that playing devil's advocate with your own views is hard cognitive work, but when we want to know the truth about important matters it can be well worth the effort.

SECTION VII EPISTEMIC BUBBLES AND ECHO CHAMBERS

In closing this chapter, it is important to point out some closely related situations that are wider in scope. On this score, the philosopher C. Thi Nguyen makes a useful distinction between epistemic bubbles and echo chambers. Let us start with epistemic bubbles. The term "epistemic" means having to do with knowledge, and an epistemic bubble refers to an information source or network that omits, overlooks, or filters out relevant facts, arguments, or perspectives. In order to illustrate this point, consider the fact that our social media accounts tend to be connected with friends or people we like, respect, or are positively inclined toward in some other way. In itself, this is not problematic. The problem is that many people get their news, analysis, and commentary from social media as well. Because people we like tend to be similar to us, the information, analysis, and commentary we get from social media tend to be skewed toward our existing beliefs. That is, our social media feeds can act as a filter that screens out information and perspectives that do not fit with our existing beliefs or the beliefs of people who are similar to us. As Nguyen points out, friends make for good parties, but not necessarily good information networks. Further, we may not be aware that our information has been filtered, since we never see the information or arguments we are missing. Beyond social media, the proliferation of news channels and the 24-hour news cycle make it easy to pursue social, political, and economic news and commentary that fits with our existing beliefs and values (almost whatever they are). This need not be intentional. As we saw in our discussion of confirmation bias, an inclination toward an idea or perspective can start up the engine of bias and lead us to look for and pursue information that fits our existing system of beliefs. To be in an epistemic bubble is to lack certain information, arguments, or perspectives. An echo chamber, on the other hand, is a community that actively undermines dissenting voices. Within such a community, dissenting voices are unfairly discredited and dismissed as insincere, corrupt, or otherwise untrustworthy.
A person in an echo chamber may be exposed to dissenting perspectives, arguments, and data, but they will set these opposing views aside since, according to trusted voices within the echo chamber, these views come from unreliable sources. Given this description, we can see that echo chambers are simply an extension of people's predisposition to vilify those who disagree with them. However, in an echo chamber this tendency is amplified, and it has an isolating effect, since it insulates members of the community from questions and information that challenge the community's views. Of course, there is nothing wrong with setting aside a person's claims because they are being insincere or because they are an unreliable source on the issue. The problem with an echo chamber is that it does so unfairly. Dissenting voices are undermined and dismissed simply because they question or dissent from the community's views. That is, in an echo chamber, voices are set aside or disparaged when there is no good prior reason to doubt them. In addition, we can think of both epistemic bubbles and echo chambers as coming in degrees. When it comes to echo chambers, a community can go to greater or lesser lengths to discredit opposition, and similarly a person within an echo chamber can be more or less insulated from inconsistent information or dissenting voices. As a final point, echo chambers can grow up around all kinds of phenomena. Nguyen gives as examples communities centered around political positions, specific diets and exercise programs, activism, and various techniques and marketing programs. Avoiding epistemic bubbles and echo chambers is not complicated if we are paying attention. In the case of epistemic bubbles, we need to make sure to get our information from varied sources, and to pay attention to other people's sources of information. This makes sense particularly when people disagree with us. This is an opportunity to ask: where is this opposing information coming from?

Is it a good source?

When it comes to echo chambers, we need to pay attention to how a community handles dissent. Does it consider new information or perspectives, or does it seek to vilify those who raise questions?

Does the community fairly represent criticisms, or does it misrepresent them (straw man) or mock them?

EXERCISES

Exercise Set

Directions: In order to practice combating bias, lay out the strongest case a critic might make against each of the following claims. Note: this might be uncomfortable.

1. The legal drinking age should be lowered to 18.
2. Each person has only one true soulmate.
3. Corporations only care about profit.
4. Firefighters are heroes.
5. Banning books always amounts to unacceptable censorship.
6. Standardized testing is a waste of everyone's time.

Exercise Set

Some of the claims in the previous exercise set are provocative. Why do you think you were asked to consider those claims in particular?

Evaluate the following argument: "It seems to me that I have impartially evaluated the evidence in this case. So, I probably have." What do you make of it?

Outside of sports and political contexts, where have you seen biased reasoning?

Give at least one example .

As we have seen, ideas, questions, and arguments can have social value. How can this contribute to the formation of echo chambers in particular?

What sources of news or information do you trust ?

Are there sources you do trust, but that others do not, or vice versa?

In general , what features or characteristics of a source give you reason to trust it , or not trust it ?

Have you ever seen, experienced, or heard of anything like an echo chamber as defined above?

Explain.

Notes

1. Lord, C., Ross, L., and Lepper, M. (1979). "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence." Journal of Personality and Social Psychology, 37(11).
2. U.S. Congress, Senate Select Committee on Intelligence, Report of the Select Committee on Intelligence on the U.S. Intelligence Community's Prewar Intelligence Assessments on Iraq, 18.
3. Noseworthy, J., et al. (1994). "The Impact of Blinding on the Results of a Randomized, Placebo-Controlled Multiple Sclerosis Clinical Trial." Neurology, 44.
4. Garcia and King call this kind of inference a vicious fallacy. See Garcia, Robert, and King, Nathan (2016). "Toward Intellectually Virtuous Discourse: Two Vicious Fallacies and the Virtues that Inhibit Them," in Intellectual Virtues and Education. New York: Routledge.
5. Cialdini, Robert (2008). Influence: Science and Practice, 5th ed. New York: College Publishers.
6. This is a general tendency. Indeed, people who are typically regarded as very attractive in some way may be more likely to be regarded unfavorably in some respects.
7. Todorov, Alexander; Mandisodza, Anesu; Goren, Amir; and Hall, Crystal (2005). "Inferences of Competence from Faces Predict Election Outcomes." Science, 308(5728).
8. Kahneman, Daniel (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 35.
9. Stanovich, Keith (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven: Yale University Press. Stanovich distinguishes between Type 1 and Type 2 systems of reasoning. Since I have not adopted this same terminology, I have substituted the term "automatic" in parentheses.
10. Lord, C., Lepper, M., and Preston, E. (1984). "Considering the Opposite: A Corrective Strategy for Social Judgment." Journal of Personality and Social Psychology, 47.
11. Nguyen, C. Thi (2020). "Echo Chambers and Epistemic Bubbles." Episteme, 17.

Unit Summary

In this unit we took a closer look at the process of evaluating arguments. We began by discussing how to apply the standards of factual correctness and logical strength. We then characterized different levels of logical strength, and introduced the related terminology of deductive and inductive arguments. We then turned to factual correctness, focusing on two potential complications. First we looked at conditional claims, and discussed when claims like these are true and false. Second, we looked at a common kind of fallacious argument: the Straw Man. Last we looked at how bias can undermine our reasoning, and identified some techniques for mitigating its effects.

KEY TERMS

Soundness, Strict Factual Correctness, Strong Logical Strength, Ambiguous, Undetermined Arguments, Straw Man Argument, Deductive Arguments, Contextual Straw Man, Inductive Arguments, Bias, Inductively Strong, Confirmation Bias, Inductively Weak, Studies, Not Factually Correct, Social Value, Undetermined, Halo Effects, True, Cognitive Miser, Standard Form, Devil's Advocate, Antecedent, Epistemic Bubble, Consequent, Echo Chamber

FURTHER READING

For a broad overview of the psychological literature on motivated reasoning, see "Motivated Cognition in Social Thought" by David Dunning in the APA Handbook of Personality and Social Psychology. For more on associations and some interesting examples, see Smart Thinking by Richard. If you are interested in learning about how advertisers use halo effects and other techniques to persuade, see Sold on Language: How Advertisers Talk to You and What This Says About You by Julie Sedivy and Greg Carlson. Finally, for an accessible discussion of epistemic bubbles and echo chambers, see Nguyen's contribution "Escape the Echo Chamber" in the online magazine Aeon.