On reducing anti-atheist prejudice

In my last post, I discussed research by Edgell et al. showing that atheists are the most disliked group in America. I also described a series of studies by Will Gervais and his colleagues showing that anti-atheist prejudice is strongly related to distrust. In another paper published last year, Gervais tested the hypothesis that prejudice against atheists would be reduced when atheists are perceived to be numerous. This is an interesting hypothesis, because prejudice towards an outgroup typically increases as the outgroup gets larger.

“For example, anti-Black prejudice is stronger where Black people hold a larger relative share of local populations in the United States (e.g., Fosset & Kiecolt, 1989; Giles & Evans, 1986; Pettigrew, 1959). The disparity in racial attitudes between North and South in the United States is statistically nonsignificant after local Black population share is controlled (Taylor, 1998)”

Given this tendency, why should atheists be different from any other outgroup? According to Gervais, different groups represent different perceived threats to those who are prejudiced against them. If members of an outgroup are seen as posing a physical threat, fear and prejudice will increase as the outgroup increases in size. Distrust, in contrast to fear of physical threat, may decrease as the outgroup becomes larger. We can also observe that countries with relatively more atheists (such as Denmark and Sweden) show less anti-atheist prejudice than countries with fewer atheists (such as the US). To test whether prejudice towards atheists really is lower where atheists are more prevalent, Gervais analysed survey data from 40,271 respondents across 54 countries who expressed a belief in God. He found a significant negative correlation between atheist prevalence and anti-atheist prejudice.
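
To give a concrete sense of what such an analysis looks like, here is a minimal Python sketch of a country-level correlation of this kind. The prevalence and prejudice numbers below are invented for illustration – this is not Gervais’ data or analysis code.

from scipy.stats import pearsonr

# Hypothetical country-level data (made up for illustration):
# atheist_prevalence = proportion of the population identifying as atheist
# prejudice = mean anti-atheist prejudice score among believers (higher = more prejudice)
atheist_prevalence = [0.02, 0.05, 0.10, 0.18, 0.25, 0.31, 0.40]
prejudice = [4.8, 4.5, 3.9, 3.6, 3.1, 2.9, 2.4]

r, p_value = pearsonr(atheist_prevalence, prejudice)
print(f"r = {r:.2f}, p = {p_value:.3f}")  # a negative r mirrors the reported pattern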

Having established that religious people living in countries with more atheists tend to be less prejudiced against them, Gervais conducted an experiment in which he manipulated the perceived prevalence of atheists in order to determine whether this would cause a reduction in anti-atheist prejudice. One hundred and twelve undergraduate students were split into two groups and given different information about how many atheists there are. One group was told that around 5% of the students at their university were atheists, and that atheists are rare all over the world. The other group was told that 50% of the students at their university were atheists, and that atheists are the fourth largest “religious” group in the world (this last point is actually true: only Christians, Muslims, and Hindus are more numerous than atheists). The participants were then asked to rate two statements on a 7-point scale (from “strongly disagree” to “strongly agree”). The two statements were “Atheists are dishonest” and “Atheists are trustworthy”. The second item was reverse scored and the two responses were added together to form a single measure of atheist distrust. The results showed that the students who were told that atheists are common gave lower distrust ratings than the students who were told that atheists are rare, lending support to Gervais’ hypothesis that distrust decreases as perceived outgroup prevalence increases.
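
For readers who like to see the mechanics, here is a minimal sketch of how a two-item composite like this could be scored and compared between conditions. The ratings are invented and the t-test is just one reasonable way to compare the groups – this is not the study’s actual data or analysis.

from scipy.stats import ttest_ind

def distrust_score(dishonest, trustworthy, scale_max=7):
    # 'Atheists are dishonest' is scored as-is; 'Atheists are trustworthy'
    # is reverse scored (on a 1-7 scale, a rating of r becomes 8 - r)
    return dishonest + (scale_max + 1 - trustworthy)

# Hypothetical (dishonest, trustworthy) ratings for each participant
atheists_rare = [distrust_score(d, t) for d, t in [(5, 3), (6, 2), (4, 3), (5, 2)]]
atheists_common = [distrust_score(d, t) for d, t in [(3, 5), (2, 6), (4, 5), (3, 6)]]

t, p = ttest_ind(atheists_common, atheists_rare)
print(f"t = {t:.2f}, p = {p:.3f}")  # lower scores in the 'common' condition mean less distrust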

An obvious implication of this research is that atheists should seek to become more visible, so that others perceive them as more prevalent. This has clearly been happening in recent years – the bestselling books about atheism and the billboard and bus campaigns have certainly brought atheists to the attention of the population at large. Has all this effort led to a reduction in distrust of atheists? It’s hard to say, but if Gervais’ findings are generalisable, then the outspoken atheists who have been criticized for being outspoken may be helping to reduce prejudice simply by altering people’s perception of the prevalence of atheists.



Gervais, W. M. (2011). Finding the faithless: Perceived atheist prevalence reduces anti-atheist prejudice. Personality and Social Psychology Bulletin, 37(4), 543–556. PMID: 21343437

On dislike and distrust of atheists

As many people reported back in 2006, atheists are perceived in rather a bad light by many Americans. According to a survey conducted by Penny Edgell and colleagues at the University of Minnesota:

Atheists are at the top of the list of groups that Americans find problematic in both public and private life, and the gap between acceptance of atheists and acceptance of other racial and religious minorities is large and persistent. It is striking that the rejection of atheists is so much more common than rejection of other stigmatized groups. For example, while rejection of Muslims may have spiked in post-9/11 America, rejection of atheists was higher. The possibility of same-sex marriage has widely been seen as a threat to a biblical definition of marriage, as Massachusetts, Hawaii, and California have tested the idea, and the debate over the ordination of openly gay clergy has become a central point of controversy within many churches. In our survey, however, concerns about atheists were stronger than concerns about homosexuals.

Edgell and colleagues conducted a telephone survey of 2081 American households over the summer of 2003. First, participants were given a list of racial, ethnic, and religious groups, and asked whether “people in this group agree with YOUR vision of American society—almost completely, mostly, somewhat, or not at all?” Second, respondents were asked “Suppose your son or daughter wanted to marry [a person in given category]. Would you approve of this choice, disapprove of it, or wouldn’t it make any difference at all one way or the other?” For the first question, respondents were asked about recent immigrants and homosexuals, in addition to groups such as Conservative Christians, White Americans, African Americans, Jews, Muslims, and Atheists.

Only 2.2% of respondents answered “not at all” when asked whether White Americans agreed with their vision of America; White Americans can thus be seen as the least “problematic” group. At the other end of the scale are Homosexuals (22.6%), Muslims (26.3%), and Atheists (39.6%). Turning to the second question, respondents said they would most disapprove of their children marrying African Americans (27.2%), Muslims (33.5%), and Atheists (47.6%).

From these results, it would certainly seem that a lot of people don’t care too much for atheists. Two immediate questions arise. First, what is the basis of this dislike? Is it fear? Distrust? Moral outrage? Second, is this rejection of atheism still the case? The data discussed above were collected in 2003. Not really a long time ago, but before the current conception of ‘New Atheism’ became an object of discussion in the mass media. Before Dawkins’ The God Delusion. Before Harris’ The End of Faith. Before Hitchens’ God Is Not Great and Dennett’s Breaking The Spell.

Both questions are addressed in a more recent paper from Will Gervais, Azim Shariff, and Ara Norenzayan. Their conclusion is that:

Atheists are among the least liked groups of people in many parts of the world, and the present studies help to explain why. The present six studies converged on the conclusion that distrust is at the core of this particularly powerful, peculiar, and prevalent form of prejudice.

In the first of their six studies, Gervais et al. set out to replicate Edgell et al.’s finding that atheists are liked less than homosexuals. Three hundred and fifty-one participants were asked to rate atheists, gay men, and people in general on three scales from 0 to 100: general feeling toward each group, trustworthiness of each group, and disgust towards each group. The results showed that atheists were viewed less favorably than the other groups, that atheists were viewed with more distrust than gay men, but that gay men were viewed with more disgust than atheists. These findings indicate that anti-gay prejudice is rooted in disgust, whereas anti-atheist prejudice is rooted in distrust.

The authors then conducted several further studies examining this distrust in more detail. To do so, they modified a classic psychological task, first made famous by Tversky and Kahneman (1983). In the original version of the task, participants were presented with the following problem:

Linda is 31, single, outspoken, and very bright. She majored in philosophy in college. As a student, she was deeply concerned with discrimination and other social issues, and participated in anti-nuclear demonstrations. Which statement is more likely?

a. Linda is a bank teller.

b. Linda is a bank teller and active in the feminist movement.

Tversky and Kahneman found that 85% of participants answered ‘b’. This response is an instance of the conjunction fallacy; it is fallacious because the likelihood of both A and B occurring together can never be greater than the likelihood of A alone (or of B alone). Every outcome in which both A and B are true is also an outcome in which A is true, so the conjunction cannot be more probable than either of its parts. To illustrate why this is the case, consider another example:

Which statement is more likely?
a. You will be hit by lightning
b. You will be hit by lightning and you will win the lottery

Clearly both being hit by lightning and winning the lottery are unlikely, but it should also be clear that either of these events on its own is more likely than both occurring together. So why do people answer that Linda is more likely to be both a bank teller and a feminist? According to Tversky and Kahneman, it is because people tend not to reason using pure logic. Instead we rely on heuristics to guide our decisions. In this case, participants used the representativeness heuristic to guide their judgments. The description of Linda is more representative of a feminist than of a bank teller, so people pick the answer that includes the word feminist, ignoring the fact that the conjunction of feminist and bank teller is less likely than either attribute alone.
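
The arithmetic is easy to check for yourself. Here is a toy Python illustration using made-up round-number probabilities (they are not real lightning or lottery statistics):

# Hypothetical probabilities, invented for illustration
p_lightning = 1 / 100_000      # chance of being struck by lightning (made up)
p_lottery = 1 / 10_000_000     # chance of winning the lottery (made up)

# If the two events are independent, the probability of both is their product
p_both = p_lightning * p_lottery

print(p_both < p_lightning)  # True: the conjunction is far less likely than either event alone
print(p_both < p_lottery)    # True

However small or large the individual probabilities, the joint probability can never exceed either of them.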

Gervais et al. modified this task to compare the extent to which participants committed the conjunction fallacy for different target groups, including atheists. Participants were given the following description:

Richard is 31 years old. On his way to work one day, he accidentally backed his car into a parked van. Because pedestrians were watching, he got out of his car. He pretended to write down his insurance information. He then tucked the blank note into the van’s window before getting back into his car and driving away.

Later the same day, Richard found a wallet on the sidewalk. Nobody was looking, so he took all of the money out of the wallet. He then threw the wallet in a trash can.

Participants were then asked if it is more likely that Richard is
a. A teacher
b. A teacher and a [Christian/Muslim/rapist/atheist]

Participants were placed in four groups, and each group saw one of the four alternatives presented in answer ‘b’. The point of this task is that Richard is presented as someone who is untrustworthy. As a result, participants should commit the conjunction fallacy if they consider the target group presented in answer ‘b’ to be untrustworthy. Given that prejudice against atheists seems to be driven by distrust, people should be more likely to commit the conjunction fallacy when answer ‘b’ is “teacher and atheist” than for, say, “teacher and Christian”. Gervais et al. found that participants were significantly more likely to commit the conjunction fallacy for ‘atheist’ than for ‘Christian’ or ‘Muslim’. Also – and very alarmingly – there was no significant difference in the frequency of conjunction errors between ‘atheist’ and ‘rapist’!
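
As a rough illustration of how error rates like these might be compared across the four groups, here is a sketch using a chi-square test on invented counts; it is not necessarily the analysis Gervais et al. actually ran.

from scipy.stats import chi2_contingency

# Rows: target group named in answer 'b'; columns: [committed the error, did not]
# All counts are invented for illustration
counts = [
    [5, 20],   # teacher and Christian
    [6, 19],   # teacher and Muslim
    [13, 12],  # teacher and rapist
    [14, 11],  # teacher and atheist
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")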

I am going to skip over studies 3-5. Suffice it to say, they all paint a picture of the perception of atheists as untrustworthy. Furthermore, Gervais et al. found that belief in God was associated with higher levels of atheist distrust. In study 6, participants were given the task of choosing between two (fictional) job candidates for two different jobs: waitress and daycare worker. You can probably guess that one of the candidates was presented as religious and one as an atheist. You may also guess that participants preferred the religious candidate for the daycare position, as this job requires someone who is trustworthy. The authors also found that participants with a strong belief in God were less likely to hire the atheist candidate for the high-trust position than participants who did not have a strong belief in God.



Edgell, P., Gerteis, J., & Hartmann, D. (2006). Atheists as “other”: Moral boundaries and cultural membership in American society. American Sociological Review, 71(2), 211–234. DOI: 10.1177/000312240607100203

Gervais, W. M., Shariff, A. F., & Norenzayan, A. (2011). Do you believe in atheists? Distrust is central to anti-atheist prejudice. Journal of Personality and Social Psychology, 101(6), 1189–1206. PMID: 22059841

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293–315. DOI: 10.1037/0033-295X.90.4.293

“Low-Effort Thought Promotes Political Conservatism”

… is the title of a fascinating paper that came out in the Personality & Social Psychology Bulletin last month. Scott Eidelman and colleagues tested “the hypothesis that low-effort thought promotes political conservatism”. This hypothesis grew out of the existing literature comparing conservatives and liberals and examining how ideological differences map onto psychological processes. For example, it has been found that conservatives are more likely to give person, rather than situation, explanations for behavior. In other words, conservatives are more likely to explain the actions of individuals in terms of personal motivation and responsibility than in terms of wider economic or societal constraints. “He is unemployed because he is too lazy to get a job” would be an example of such a person explanation. The emphasis on personal responsibility is a typical component of conservative ideology, and we also know that people are more likely to give person explanations when they do not put much effort into thinking about the issue.

A second example of the negative relationship between conservative ideology and effortful thinking is acceptance of hierarchy. It turns out that we are very good at noticing cues relating to relative status and at remembering hierarchical relationships; because this kind of information is processed quickly and with little effort, hierarchy is easy to accept by default. Preference for hierarchical systems where some people are at the top and others are at the bottom implies an opposition to equality – another hallmark of conservatism. A third component discussed by Eidelman et al. is preference for the status quo, a conservative value that is easy to endorse with little effort.

Given these mappings between components of conservative ideology and the ease with which they are accepted when we do not think hard about them, the authors predicted that people could be manipulated into endorsing conservative attitudes by restricting their thought processes. To test this prediction, Eidelman et al. carried out four studies examining the effects of restricted thought on conservative attitudes. In the first study, the researchers examined the extent to which alcohol intoxication affects attitudes. Researchers stood outside a bar, measured the blood alcohol content of patrons as they left, and asked them to fill in a short survey designed to measure conservative attitudes. Blood alcohol content and the participants’ self-identification as liberal or conservative were both significant predictors of conservative attitudes. Unsurprisingly, self-identified conservatives expressed more conservative attitudes than liberals, but the key finding is that intoxication was associated with more conservative attitudes for both conservative and liberal participants.
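
To make the structure of that analysis concrete, here is a sketch of a multiple regression with blood alcohol content and self-identification as predictors of conservative attitudes. The data are simulated and the effect sizes are invented – this is not the study’s dataset or analysis code.

import numpy as np

rng = np.random.default_rng(0)
n = 100
bac = rng.uniform(0.0, 0.15, n)          # hypothetical blood alcohol content
conservative_id = rng.integers(0, 2, n)  # 1 = self-identified conservative, 0 = liberal
# Hypothetical attitude scores that rise with both predictors, plus noise
attitudes = 3.0 + 8.0 * bac + 1.2 * conservative_id + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), bac, conservative_id])  # intercept + two predictors
coefs, *_ = np.linalg.lstsq(X, attitudes, rcond=None)
print(dict(zip(["intercept", "bac", "conservative_id"], coefs.round(2))))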

In the second study, the researchers actively manipulated the cognitive resources available to participants. All participants completed scales measuring both liberal and conservative attitudes, but half of them did so while performing a second task in which they had to count a series of audio tones. The participants who had to divide their attention between the questionnaires and the counting task endorsed conservative values more strongly, and liberal values less strongly, than participants who were not under cognitive load.

In study three, the researchers manipulated time pressure. In this experiment participants were asked to respond to 25 terms relating to conservatism (e.g., private property) and 25 terms relating to liberalism (e.g., civil rights). For each term, participants indicated the extent to which they agreed or disagreed with the concept. Half the participants were given as long as they wanted to respond to each item and half were placed under time pressure: they only had half a second to read each term and just one second to respond. The participants placed in the high time pressure condition gave higher ratings to the conservative terms than the participants not under time pressure. There was no effect of time pressure on responses to the liberal items.

Finally, in study 4, the researchers asked participants to rate 30 items relating to conservatism and liberalism. This time, participants were placed into two groups and given different instructions. One group was told to think hard before making a response; the other was told to give an immediate response and not think too hard. As in study 3, the manipulation affected conservatism but not liberalism: those told not to think too hard gave higher ratings to the conservative terms.

Across these four studies, the authors found a fairly robust effect: when the ability to engage in effortful thinking was impaired, participants were more likely to endorse political conservatism. An important point to note regarding these findings is that the authors

“argue that low-effort thinking promotes political conservatism, not that conservatives rely on low-effort thought. Similarly, we do not assert that conservatives fail to engage in effortful, deliberative thought but rather that disengagement of effortful thinking leads to cognitions consonant with political conservatism.”

Although the authors do not make the simplistic claim that political conservatism can be explained in terms of failure to think, they do point out a serious implication of their work:

“Without the means or motive to override an initial impulse that promotes conservative ideology, the political scales may be tipped toward the right of center and may provide a contributing explanation for what has been described as a conservative bias in American politics.”

This means that unless we override default, reactionary responses to arguments and/or situations, we are likely to endorse conservative ideology. In many situations this may be relatively easy, as there may be few demands on our cognitive resources and the issue can be given lots of attention. But if we are forced to go with our gut reactions, we may be more likely to endorse a position we would not endorse given more time to think.



Eidelman, S., Crandall, C. S., Goodman, J. A., & Blanchar, J. C. (2012). Low-effort thought promotes political conservatism. Personality and Social Psychology Bulletin, 38(6), 808–820. PMID: 22427384

The Effects of Behavior on Beliefs

Richard Wiseman has an interesting article at New Statesman. He describes how

[a]t the end of the Korean War, 21 American prisoners-of-war chose to remain in communist Korea and openly sided with an enemy that had killed thousands of their comrades. In addition, a surprisingly large number of the American service personnel who did return home enthusiastically expounded the strengths of communism. The family and friends of these servicemen were stunned, and the world’s media flocked to Korea to report the story. Some researchers suggested that the Koreans had brainwashed the American soldiers with flashing lights, hypnosis or mind-altering drugs.

They were all wrong.

So how was this feat, of converting American soldiers to enthusiastic supporters of communism, achieved?

Shortly after capture, the Chinese guards asked servicemen to jot down a few short pro-communist statements (“Communism is wonderful”, and “Communism is the way of the future”). Many of the Americans were happy to oblige because the request seemed so trivial. A few weeks later the guards upped the ante and asked the prisoners to read the statements aloud to themselves. A couple of weeks later the Americans were asked to read the statements out to their fellow prisoners, and to engage in mock debates arguing why they believed the statements to be correct. Finally, fresh fruit or sweets were offered to any soldiers who were prepared to write pro-communist essays for the camp newsletter. Once again, many of the prisoners were happy to oblige.

The Chinese did not have to resort to arcane brainwashing techniques. Instead, they simply ensured that the prisoners were repeatedly encouraged to support communism, and then left them to develop beliefs that were consistent with their behaviour.

Wiseman describes findings from laboratory studies demonstrating that simply asking people to endorse certain attitudes or opinions can affect their actual beliefs. These findings conflict with our common-sense intuitions about the cause-and-effect relationship between belief and behavior. If we are not aware of this subtle effect, we can be manipulated into supporting the values and opinions of others:

Having people repeatedly sing a national anthem will make them more patriotic. Making children pray every morning will increase the likelihood of them adopting religious beliefs.

We can use Leon Festinger’s Cognitive Dissonance Theory to explain why these shifts in attitudes or beliefs occur. Festinger’s theory asserts that when people hold two inconsistent attitudes or beliefs, or behave in a way that is not in keeping with their beliefs, they experience cognitive dissonance, an uncomfortable state of tension. One way to reduce this dissonance is to alter one’s beliefs.

To test this theory Festinger and Carlsmith (1959) conducted what is now a classic study in social psychology. Participants were asked to engage in boring, repetitive tasks. They were asked to take 12 spools of thread off a board, put the spools back on, take them off again, and so on, for 30 minutes. They were then asked to turn pegs in holes to the left, then right, then back again, for another 30 minutes. The participants were next instructed to go out and meet the next participant (in reality, an accomplice of the experimenters) and tell her that the experiment was a lot of fun. Some of the participants were paid $1, others were paid $20 (quite a lot of money back in 1959!). Afterwards, the participants were asked to rate their enjoyment of the tasks they had engaged in. Those who were paid $20 gave the tasks a low rating – as one might expect – but those who were paid only $1 rated the tasks as more enjoyable.

When I describe this study to students, they are often confused – surely the person who was paid lots of money would be happy to go along with the idea that the experiment was fun? The findings seem to be the wrong way round. However, they make sense when we think about conflicting behaviors and beliefs. The participants’ verbal behavior (what they said to the next participant) was to endorse the idea that the boring, repetitive tasks were fun. This verbal behavior is in clear conflict with the participants’ experience. However, those who were paid the higher sum have an easy alternative explanation for their behavior: they were paid $20 to lie. Those who were paid only $1 are unlikely to see the money as sufficient justification for their verbal behavior, so another explanation must be found: the reason they claimed the tasks were fun must be that the tasks really were fun. Their attitudes toward the tasks changed so as to be in line with their behavior.

So be careful if someone asks you to “pretend” to endorse a belief contrary to what you actually think. It could lead to a gradual shift in your beliefs! This is also why we need to be vigilant when authority figures ask children to repeat statements of ideology. Coercing the verbal behavior of children can be a form of brainwashing.



Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58(2), 203–210. PMID: 13640824