Siri Carpenter
Scientific American Mind, April/May 2008
“There is nothing more painful to me at this stage in my life,” Jesse Jackson once told an audience, “than to walk down the street and hear footsteps and start thinking about robbery—then look around and see somebody white and feel relieved.”
Jackson’s remark illustrates a basic fact of our social existence, one that even a committed black civil-rights leader cannot escape: ideas that we may not endorse—for example, that a black stranger might harm us but a white one probably would not—can nonetheless lodge themselves in our minds and, without our permission or awareness, color our perceptions, expectations and judgments.
Using a variety of sophisticated methods, psychologists have established that people unwittingly hold an astounding assortment of stereotypical beliefs and attitudes about social groups: black and white, female and male, elderly and young, gay and straight, fat and thin. Although these implicit biases inhabit us all, we vary in the particulars, depending on our own group membership, our conscious desire to avoid bias and the contours of our everyday environments. For instance, about two thirds of whites have an implicit preference for whites over blacks, whereas blacks show no average preference for one race over the other.
Such bias is far more prevalent than the more overt, or explicit, prejudice that we associate with, say, the Ku Klux Klan or the Nazis. That is emphatically not to say that explicit prejudice and discrimination have evaporated nor that they are of lesser importance than implicit bias. According to a 2005 federal report, almost 200,000 hate crimes—84 percent of them violent—occur in the U.S. every year.
The persistence of explicit bias in contemporary culture has led some critics to maintain that implicit bias is of secondary concern. But hundreds of studies of implicit bias show that its effects can be equally insidious. Most social psychologists believe that certain scenarios can automatically activate implicit stereotypes and attitudes, which then can affect our perceptions, judgments and behavior. “The data on that are incontrovertible,” concludes psychologist Russell H. Fazio of Ohio State University.
Now researchers are probing deeper. They want to know: Where exactly do such biases come from? How much do they influence our outward behavior? And if stereotypes and prejudiced attitudes are burned into our psyches, can learning more about them help each of us override them?
Sticking Together
Implicit biases grow out of normal and necessary features of human cognition, such as our tendency to categorize, to form cliques and to absorb social messages and cues. To make sense of the world around us, we put things into groups and remember relations between objects and actions or adjectives: for instance, people automatically note that cars move fast, cookies taste sweet and mosquitoes bite. Without such deductions, we would have a lot more trouble navigating our environment and surviving in it.
Such associations often reside outside conscious understanding; thus, to measure them, psychologists rely on indirect tests that do not depend on people’s ability or willingness to reflect on their feelings and thoughts. Several commonly used methods gauge the speed at which people associate words or pictures representing social groups—young and old, female and male, black and white, fat and thin, Democrat and Republican, and so on—with positive or negative words or with particular stereotypic traits.
Because closely associated concepts are essentially linked together in a person’s mind, a person will be faster to respond to a related pair of concepts—say, “hammer and nail”—than to an uncoupled pair, such as “hammer and cotton ball.” The timing of a person’s responses, therefore, can reveal hidden associations such as “black and danger” or “female and frail” that form the basis of implicit prejudice. “One of the questions that people often ask is, ‘Can we get rid of implicit associations?’ ” says psychologist Brian A. Nosek of the University of Virginia. “The answer is no, and we wouldn’t want to. If we got rid of them, we would lose a very useful tool that we need for our everyday lives.”
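To make the logic of these timing-based measures concrete, here is a minimal sketch in Python. The response times, the congruent/incongruent labels and the simple D-style score below are all illustrative assumptions for this article, not the actual scoring algorithm of any published test.

```python
# Illustrative sketch only: a toy scoring scheme in the spirit of
# response-latency measures such as the Implicit Association Test.
# All numbers are hypothetical.

from statistics import mean, stdev

# Hypothetical response times in milliseconds for one participant.
congruent_ms = [610, 655, 590, 620, 640, 600]    # stereotype-consistent pairings
incongruent_ms = [780, 820, 760, 800, 790, 810]  # stereotype-inconsistent pairings

def latency_bias_score(congruent, incongruent):
    """Difference in mean latencies, scaled by the pooled standard
    deviation of all trials (loosely modeled on the IAT's D measure)."""
    pooled_sd = stdev(congruent + incongruent)
    return (mean(incongruent) - mean(congruent)) / pooled_sd

score = latency_bias_score(congruent_ms, incongruent_ms)
print(f"Latency-based bias score: {score:.2f}")  # larger = stronger implicit association
```

Real instruments use many more trials per participant and additional safeguards, such as practice blocks and penalties for errors; the point here is only that systematic latency differences between pairings are what such a score captures.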
The problem arises when we form associations that contradict our intentions, beliefs and values. That is, many people unwittingly associate “female” with “weak,” “Arab” with “terrorist,” or “black” with “criminal,” even though such stereotypes undermine values such as fairness and equality that many of us hold dear.
Self-interest often shores up implicit biases. To bolster our own status, we are predisposed to ascribe superior characteristics to the groups to which we belong, or in-groups, and to exaggerate differences between our own group and outsiders [see “The New Psychology of Leadership,” by Stephen D. Reicher, S. Alexander Haslam and Michael J. Platow; Scientific American Mind, August/September 2007].
Even our basic visual perceptions are skewed toward our in-groups. Many studies have shown that people more readily remember faces of their own race than of other races. In recent years, scientists have begun to probe the neural basis for this phenomenon, known as the same-race memory advantage. In a 2001 study neurosurgeon Alexandra J. Golby, now at Harvard Medical School, and her colleagues used functional magnetic resonance imaging to track people’s brain activity while they viewed a series of white and black faces. The researchers found that individuals exhibited greater activity in a brain area involved in face recognition known as the fusiform face area [see “A Face in the Crowd,” by Nina Bublitz, on page 58] when they viewed faces of their own racial group than when they gazed at faces of a different race. The more strongly a person showed the same-race memory advantage, the greater this brain difference was.
This identification with a group occurs astoundingly quickly. In a 2002 study University of Washington psychologist Anthony G. Greenwald and his colleagues asked 156 people to read the names of four members of two hypothetical teams, Purple and Gold, then spend 45 seconds memorizing the names of the players on just one team. Next, the participants performed two tasks in which they quickly sorted the names of team members. In one task, they grouped members of one team under the concept “win” and those of the other team under “lose,” and in the other they linked each team with either “self” or “other.” The researchers found that the mere 45 seconds that a person spent thinking about a fictional team made them identify with that team (linking it with “self”) and implicitly view its members as “winners.”
Some implicit biases appear to be rooted in strong emotions. In a 2004 study Ohio State psychologist Wil A. Cunningham and his colleagues measured white people’s brain activity as they viewed a series of white and black faces. The team found that black faces flashed for only 30 milliseconds (too quickly for participants to notice them) triggered greater activity than white faces did in the amygdala, a brain area associated with vigilance and sometimes fear. The effect was most pronounced among people who demonstrated strong implicit racial bias. Provocatively, the same study revealed that when faces were shown for half a second—enough time for participants to consciously process them—black faces instead elicited heightened activity in prefrontal brain areas associated with detecting internal conflicts and controlling responses, hinting that individuals were consciously trying to suppress their implicit associations.
Why might black faces, in particular, provoke vigilance? Northwestern University psychologist Jennifer A. Richeson speculates that American cultural stereotypes linking young black men with crime, violence and danger are so robust that our brains may automatically give preferential attention to blacks as a category, just as they do for threatening animals such as snakes. In a recent unpublished study Richeson and her colleagues found that white college students’ visual attention was drawn more quickly to photographs of black versus white men, even though the images were flashed so quickly that participants did not consciously notice them. This heightened vigilance did not appear, however, when the men in the pictures were looking away from the camera. (Averted eye gaze, a signal of submission in humans and other animals, extinguishes explicit perceptions of threat.)
Whatever the neural underpinnings of implicit bias, cultural factors—such as shopworn ethnic jokes, careless catchphrases and playground taunts dispensed by peers, parents or the media—often reinforce such prejudice. Subtle sociocultural signals may carry particularly insidious power. In a recent unpublished study psychologist Luigi Castelli of the University of Padova in Italy and his colleagues examined racial attitudes and behavior in 72 white Italian families. They found that young children’s racial preferences were unaffected by their parents’ explicit racial attitudes (perhaps because those attitudes were muted). Children whose mothers had more negative implicit attitudes toward blacks, however, tended to choose a white over a black playmate and ascribed more negative traits to a fictional black child than to a white child; children whose mothers showed less implicit racial bias were less likely to exhibit such preferences.
Many of our implicit associations about social groups form before we are old enough to consider them rationally. In an unpublished experiment Mahzarin R. Banaji, a psychologist at Harvard University, and Yarrow Dunham, now a psychologist at the University of California, Merced, found that white preschoolers tended to categorize racially ambiguous angry faces as black rather than white; they did not do so for happy faces. And a 2006 study by Banaji and Harvard graduate student Andrew S. Baron shows that full-fledged implicit racial bias emerges by age six and never retreats. “These filters through which people see the world are present very early,” Baron concludes.
Dangerous Games
On February 4, 1999, four New York City police officers knocked on the apartment door of a 23-year-old West African immigrant named Amadou Diallo. They intended to question him because his physical description matched that of a suspected rapist. Moments later Diallo lay dead. The officers, believing that Diallo was reaching for a gun, had fired 41 shots at him, 19 of which struck their target. The item that Diallo had been pulling from his pocket was not a gun but his wallet. The officers were charged with second-degree murder but argued that at the time of the shooting they believed their lives were in danger. Their argument was successful, and they were acquitted.
In the Diallo case, the officers’ split-second decision to open fire had massive, and tragic, consequences, and the court proceedings and public outcry that followed the shooting raised a number of troubling questions. To what degree are our decisions swayed by implicit social biases? How do those implicit biases interact with our more deliberate choices?
A growing body of work indicates that implicit attitudes do, in fact, contaminate our behavior. Reflexive actions and snap judgments may be especially vulnerable to implicit associations. A number of studies have shown, for instance, that both blacks and whites tend to mistake a harmless object such as a cell phone or hand tool for a gun if a black face accompanies the object. This “weapon bias” is especially strong when people have to judge the situation very quickly.
In a 2002 study of racial attitudes and nonverbal behavior, psychologist John F. Dovidio, now at Yale University, and his colleagues measured explicit and implicit racial attitudes among 40 white college students. The researchers then asked the white participants to chat with one black and one white person while the researchers videotaped the interaction. Dovidio and his colleagues found that in these interracial interactions, the white participants’ explicit attitudes best predicted the kinds of behavior they could easily control, such as the friendliness of their spoken words. Participants’ nonverbal signals, however, such as the amount of eye contact they made, depended on their implicit attitudes.
As a result, Dovidio says, whites and blacks came away from the conversation with very different impressions of how it had gone. Whites typically thought the interactions had gone well, but blacks, attuned to whites’ nonverbal behavior, thought otherwise. Blacks also assumed that the whites were conscious of their nonverbal behavior and blamed white prejudice. “Our society is really characterized by this lack of perspective,” Dovidio says. “Understanding both implicit and explicit attitudes helps you understand how whites and blacks could look at the same thing and not understand how the other person saw it differently.”
Implicit biases can infect more deliberate decisions, too. In a 2007 study Rutgers University psychologists Laurie A. Rudman and Richard D. Ashmore found that white people who exhibited greater implicit bias toward black people also reported a stronger tendency to engage in a variety of discriminatory acts in their everyday lives. These included avoiding or excluding blacks socially, uttering racial slurs and jokes, and insulting, threatening or physically harming black people.
In a second study reported in the same paper, Rudman and Ashmore set up a laboratory scenario to further examine the link between implicit bias against Jews, Asians and blacks and discriminatory behavior toward each of those groups. They asked research participants to examine a budget proposal ostensibly under consideration at their university and to make recommendations for allocating funding to student organizations. Students who exhibited greater implicit bias toward a given minority group tended to suggest budgets that discriminated more against organizations devoted to that group’s interests.
Implicit bias may sway hiring decisions. In a recent unpublished field experiment economist Dan-Olof Rooth of the University of Kalmar in Sweden sent corporate employers identical job applications on behalf of fictional male candidates—under either Arab-Muslim or Swedish names. Next he tracked down the 193 human resources professionals who had evaluated the applications and measured their implicit biases concerning Arab-Muslim men. Rooth discovered that the greater the employer’s bias, the less likely he or she was to call an applicant with a name such as Mohammed or Reza for an interview. Employers’ explicit attitudes toward Muslims did not correspond to their decision to interview (or fail to consider) someone with a Muslim name, possibly because many recruiters were reluctant to reveal those attitudes.
Unconscious racial bias may also infect critical medical decisions. In a 2007 study Banaji and her Harvard colleagues presented 287 internal medicine and emergency care physicians with a photograph and brief clinical vignette describing a middle-aged patient—in some cases black and in others white—who came to the hospital complaining of chest pain. Most physicians did not acknowledge racial bias, but on average they showed (on an implicit bias test) a moderate to large implicit antiblack bias. And the greater a physician’s racial bias, the less likely he or she was to give a black patient clot-busting thrombolytic drugs.
Beating Back Prejudice
Researchers long believed that because implicit associations develop early in our lives, and because we are often unaware of their influence, they may be virtually impervious to change. But recent work suggests that we can reshape our implicit attitudes and beliefs—or at least curb their effects on our behavior.
Seeing targeted groups in more favorable social contexts can help thwart biased attitudes. In laboratory studies, seeing a black face with a church as a background instead of a dilapidated street corner, considering familiar examples of admired blacks such as actor Denzel Washington and athlete Michael Jordan, and reading about Arab-Muslims’ contributions to society all weaken people’s implicit racial and ethnic biases. In real college classrooms, students taking a course on prejudice reduction who had a black professor showed greater reductions in both implicit and explicit prejudice at the end of the semester than did those who had a white professor. And in a recent unpublished study Nilanjana Dasgupta, a psychologist at the University of Massachusetts Amherst, found that female engineering students who had a male professor held negative implicit attitudes toward math and implicitly viewed math as masculine. Students with a female engineering professor did not.
More than half a century ago the eminent social psychologist Gordon Allport called group labels “nouns that cut slices,” pointing to the power of mere words to shape how we categorize and perceive others. New research underscores that words exert equal potency at an implicit level. In a 2003 study Harvard psychologist Jason Mitchell, along with Nosek and Banaji, instructed white female college students to sort a series of stereotypically black female and white male names according to either race or gender. The group found that categorizing the names according to their race prompted a prowhite bias, but categorizing the same set of names according to their gender prompted an implicit profemale (and hence problack) bias. “These attitudes can form quickly, and they can change quickly” if we restructure our environments to crowd out stereotypical associations and replace them with egalitarian ones, Dasgupta concludes.
In other words, changes in external stimuli, many of which lie outside our control, can trick our brains into making new associations. But an even more obvious tactic would be to confront such biases head-on with conscious effort. And some evidence suggests willpower can work. Among the doctors in the thrombolytic drug study who were aware of the study’s purpose, those who showed more implicit racial bias were more likely to prescribe thrombolytic treatment to black patients than were those with less bias, suggesting that recognizing the presence of implicit bias helped them offset it.
In addition, people who report a strong personal motivation to be nonprejudiced tend to harbor less implicit bias. And some studies indicate that people who are good at using logic and willpower to control their more primitive urges, such as trained meditators, exhibit less implicit bias. Brain research suggests that the people who are best at inhibiting implicit stereotypes are those who are especially skilled at detecting mismatches between their intentions and their actions.
But wresting control over automatic processes is tiring and can backfire. If people leave interracial interactions feeling mentally and emotionally drained, they may simply avoid contact with people of a different race or foreign culture. “If you boil it down, the solution sounds kind of easy: just maximize control,” says psychologist B. Keith Payne of the University of North Carolina at Chapel Hill. “But how do you do that? As it plays out in the real world, it’s not so easy.”
Other research suggests that developing simple but concrete plans to supplant stereotypes in particular situations can also short-circuit implicit biases. In an unpublished study Payne and his colleague Brandon D. Stewart, now a postdoctoral fellow at the University of Queensland in Australia, found that those who simply resolved to think of the word “safe” whenever they saw a black face showed dramatic reductions in implicit racial bias. “You don’t necessarily have to beat people over the head with it,” Payne observes. “You can just have this little plan in your pocket [think ‘safe’] that you can pull out when you need it. Once you’ve gone to the work of making that specific plan, it becomes automatic.”
Taking Control
Despite such data, some psychologists still question the concept of implicit bias. In a 2004 article in the journal Psychological Inquiry, psychologists Hal R. Arkes of Ohio State and Philip E. Tetlock of the University of California, Berkeley, suggest that implicit associations between, for example, black people and negative words may not necessarily reflect implicit hostility toward blacks. They could as easily reflect other negative feelings, such as shame about black people’s historical treatment at the hands of whites. They also argue that any unfavorable associations about black people we do hold may simply echo shared knowledge of stereotypes in the culture. In that sense, Arkes and Tetlock maintain, implicit measures do not signify anything meaningful about people’s internal state, nor do they deserve to be labeled “prejudiced”—a term they feel should be reserved for attitudes a person deliberately endorses.
Others dispute the significance of such a distinction. “There is no clear boundary between the self and society—and this may be particularly true at the automatic level,” write Rudman and Ashmore in a 2007 article in the journal Group Processes & Intergroup Relations. “Growing up in a culture where some people are valued more than others is likely to permeate our private orientations, no matter how discomfiting the fact.”
If we accept this tenet of the human condition, then we have a choice about how to respond. We can respond with sadness or, worse, with apathy. Or we can react with a determination to overcome bias. “The capacity for change is deep and great in us,” Banaji says. “But do we want the change? That’s the question for each of us as individuals—individual scientists, and teachers, and judges, and businesspeople, and the communities to which we belong.”