Police show more respect to whites than blacks during traffic stops, according to a computer analysis of conversations recorded by police body cameras in Oakland, California.

Video footage of police exchanges with people of color has quickly become a mainstay of public — and often viral — stories about law enforcement practices in the U.S. But it remains unclear whether these videos represent isolated incidents or a general pattern of racial bias.

By relying on computers, this new study from Stanford University provides an impartial take on policing during traffic stops as well as a new automated method for assessing the behavior of cops based on the language they use.

What they studied

  • Stanford psychologist Jennifer Eberhardt led a team of linguists and computer scientists who examined police body camera footage from one month — April 2014 — of routine traffic stops in the racially diverse city of Oakland, California.
  • They examined 981 stops, involving 682 black and 299 white drivers. These numbers mirror a national trend: of the 26 million traffic stops recorded each year, a disproportionate share involve black drivers.
  • The stops involved 245 different officers of varying races (102 white, 39 black, 36 Asian, 57 Hispanic and 11 marked as “other”). A large majority — 224 of the officers — were male.
  • Researchers reviewed 183 hours — roughly 7.5 days — of body camera footage, from which they examined the language used in 36,000 exchanges between drivers and cops.
  • “We don’t know of any other department right now taking this kind of approach to the footage,” Eberhardt said of her study published Monday in PNAS. By leveraging body camera footage for a better understanding of police-community relations, “we can learn a lot more about the millions of interactions happening during these routine stops than we can from the popularized isolated cases.”
  • To measure officer treatment, the researchers conducted three experiments.
  • The first took a subset of the officer statements (312 directed at blacks, 102 at whites) and then had an independent panel of 70 people rank — on a four-point scale — how respectful, polite, friendly, formal and impartial the officer was in each exchange. The panel members did not know the racial makeup of the drivers, though they did see what the drivers said right before the cops responded.
  • The second replaced the human panel with computers armed with linguistic algorithms. These programs recorded when the officers used language that drives perceptions of respect — such as giving agency, softening commands, saying thanks, apologizing or using formal titles rather than informal addresses like dude, bro, boss, man, brotha, sista or chief. This experiment aimed to determine whether computers could gauge respectful language as well as the humans in the first experiment did (a rough, hypothetical sketch of this kind of scoring follows this list).
  • The third experiment set loose the computer algorithms to sift through the full set of 36,000 exchanges by police officers.
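
As a rough illustration of the linguistic cues described in the second experiment, the Python sketch below assigns a crude respect score to a single officer utterance by counting a handful of those cues. The word lists, weights and simple keyword matching here are invented for this example and are far simpler than the models the researchers actually used.

```python
# A minimal, hypothetical sketch of scoring an officer utterance for
# respect-associated cues named in the article (apologies, thanks, formal
# titles, softened commands, informal address terms). The word lists and
# weights are invented for illustration; this is NOT the study's model.

APOLOGIES = {"sorry", "apologize"}
GRATITUDE = {"thanks", "thank"}
FORMAL_TITLES = {"sir", "ma'am", "mr", "ms", "mrs"}
INFORMAL_ADDRESS = {"dude", "bro", "boss", "man", "brotha", "sista", "chief"}
SOFTENERS = {"please", "could", "would", "mind"}

# Hypothetical weights: respect-signaling features raise the score,
# informal address terms lower it.
WEIGHTS = {
    "apology": 1.0,
    "gratitude": 1.0,
    "formal_title": 0.5,
    "softener": 0.5,
    "informal_address": -1.0,
}

def respect_score(utterance: str) -> float:
    """Crude respect score for one officer utterance, from keyword counts."""
    tokens = [t.strip(".,!?").lower() for t in utterance.split()]
    counts = {
        "apology": sum(t in APOLOGIES for t in tokens),
        "gratitude": sum(t in GRATITUDE for t in tokens),
        "formal_title": sum(t in FORMAL_TITLES for t in tokens),
        "softener": sum(t in SOFTENERS for t in tokens),
        "informal_address": sum(t in INFORMAL_ADDRESS for t in tokens),
    }
    return sum(WEIGHTS[feature] * n for feature, n in counts.items())

if __name__ == "__main__":
    print(respect_score("Sorry to stop you, sir. Could you hand me your license, please?"))
    print(respect_score("Hands on the wheel, bro."))
```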

What they found

  • The human panel judged that officer behavior varied most widely when it came to being respectful, with white members of the community receiving more respect than blacks. “Even though the people who were reading the statements had no idea about the driver’s race, we found they judged the officer language directed at black motorists to be less respectful than language directed at white motorists,” Eberhardt said.
  • Other aspects of police behavior — such as formality toward drivers — did not change based on the driver’s race.
  • The race of the police officers did not alter these patterns, nor did the severity of the traffic offense or the location of the stop within the city.
  • The computer algorithms identified almost exactly the same trends, and their measures of respect closely matched the human panel’s judgments.
  • The computer analysis could also quantify the scale of the problem. White community members were 57 percent more likely to have an exchange that fell into the most respectful category, whereas black drivers were 61 percent more likely to experience an exchange that fell into the least respectful category (see the sketch after this list for how such a comparison can be computed).
  • Moreover, over the course of an entire traffic stop, the use of respectful language increased more quickly for whites than blacks. The trend means “even when the community member hasn’t had much time to say very much at all, there’s already a race gap in respect,” Eberhardt said.
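
For readers curious how a figure like “57 percent more likely” is computed, the hypothetical Python sketch below bins made-up utterance scores by driver race and compares how often each group lands in a most-respectful bin. None of the numbers come from the study.

```python
# A hypothetical sketch of the kind of comparison behind statements like
# "57 percent more likely": compare how often each group's utterances fall
# into a top (most respectful) score bin. All scores below are made up.

def bin_rates(scores_by_group, threshold):
    """Fraction of each group's utterance scores at or above a threshold."""
    return {
        group: sum(score >= threshold for score in scores) / len(scores)
        for group, scores in scores_by_group.items()
    }

def relative_likelihood(rate_a, rate_b):
    """How much more likely rate_a is than rate_b, as a fraction (0.57 = 57%)."""
    return rate_a / rate_b - 1.0

if __name__ == "__main__":
    # Placeholder respect scores for officer utterances, keyed by driver race.
    scores_by_group = {
        "white": [0.9, 0.8, 0.7, 0.4, 0.6, 0.9, 0.3, 0.8],
        "black": [0.5, 0.8, 0.4, 0.3, 0.9, 0.2, 0.5, 0.4],
    }
    top_bin = bin_rates(scores_by_group, threshold=0.75)
    gap = relative_likelihood(top_bin["white"], top_bin["black"])
    print(f"White drivers' stops hit the top respect bin {gap:.0%} more often.")
```

Applying the same comparison to a bottom, least-respectful bin would yield the counterpart to the 61 percent figure.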

Why it matters

  • Despite the social cues and psychological biases that shape much of human interaction, the computers were able to detect consistent and meaningful disparities in how officers behaved toward drivers from language alone.
  • “That’s what we want to do. We want to automate this to a point where it’s not laborious to go through the footage,” Eberhardt said.
  • Community members want body cameras because the devices provide accountability and transparency, she said. Surveys show mixed support among law enforcement for body cameras, though both police and ordinary citizens accept their power to settle disputes over the sequence of events in an incident. Eberhardt wants studies like hers to take matters a step further and identify what contributes to police-community relations — both positive and negative experiences — in the first place.

What happens next

  • Her team wants to explore whether motorists provoke this race disparity in respectful behavior from police officers — though two aspects of the current study suggest drivers may play less of a role.
  • First, the human panel saw what the drivers said right before the officers responded, which provided context for the exchange.
  • Second, as noted above, the use of respectful language increased more quickly for whites than for blacks over the course of a single traffic stop, meaning that “even when the community member hasn’t had much time to say very much at all, there’s already a race gap in respect.”
