Commodifying Discrimination

LEGAL • TECHNOLOGY 04.11.18

UPDATED 01.21; 01.24

Using commercial, social and historical data mining to determine threat levels.

IN 2017, POLICE fatally shot 987 people, "two dozen" more than in 2016. Even though the number of black males killed fell slightly, presumably due to the spotlight of the Black Lives Matter movement, "black males continue to be shot at disproportionately high rates," according to the Washington Post's ongoing database project tracking fatal police shootings.

Data mining systems associate people of color with higher levels of suspicion due to an artifact of the technology itself even when the tech is used in good faith.

Police departments all over the country use data mining and predictive policing systems "sold in part as a 'neutral' method to counteract unconscious biases" and as a "more cost-efficient way to do policing." But when you zoom in, instead of removing bias, these systems associate people of color with higher levels of suspicion, Andrew Selbst shows in his Georgia Law Review paper on big data policing.

In real life, this means increased suspicion and nuisance-crime arrests in vulnerable communities, an "artifact of technology itself." It happens even when police departments use the tools in good faith. Yet neither the departments nor the public has any way to determine what the ultimate social effects of these tools will be.
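To make that mechanism concrete, here is a minimal, purely illustrative sketch of the feedback loop researchers have described in hotspot-style predictive policing. Everything in it is hypothetical: the neighborhoods, starting counts, and allocation rule are invented for illustration and do not represent any vendor's actual system.

```python
# Toy feedback-loop simulation with HYPOTHETICAL numbers.
# Neighborhoods A and B have the SAME true crime rate; only the historical
# record is skewed. A facially "neutral" hotspot rule sends patrols wherever
# the most incidents were recorded, and new incidents are only recorded
# where officers actually patrol.

TRUE_CRIME_RATE = 0.05            # identical in A and B
recorded = {"A": 120, "B": 100}   # skewed starting data
STOPS_PER_YEAR = 2000             # stops the department makes each year

for year in range(1, 6):
    hotspot = max(recorded, key=recorded.get)   # the "predicted" hotspot
    recorded[hotspot] += STOPS_PER_YEAR * TRUE_CRIME_RATE
    share_a = 100 * recorded["A"] / sum(recorded.values())
    print(f"year {year}: patrolled {hotspot}; A now holds {share_a:.1f}% of records")
```

No step in the loop is intentionally biased, yet neighborhood A's share of recorded incidents climbs from about 55 percent to over 86 percent in five simulated years, because the skewed record keeps steering patrols back to A. That compounding is the kind of technological artifact described above.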

Algorithmic Impact Statements.

Selbst persuasively proposes Algorithmic Impact Statements modeled on the environmental impact statements (EISs) required by the National Environmental Policy Act (NEPA). AI Now has proposed a related Algorithmic Impact Assessment (AIA) framework for New York City's Automated Decision Systems Task Force.

Current law doesn't address unintentional discrimination.

Police conduct is customarily regulated by the Fourth and Fourteenth Amendments; however, neither amendment clearly covers "unintentional discrimination."

Unintentional discrimination can occur when police "rely on machines for their suspicion detection," even though an individual officer might not be "subconsciously racist." The same goes for the developers of the algorithms, who are not "likely to be relying, even unconsciously, on racial stereotypes."

Yet data from the Missouri Attorney General, the ACLU, and the states of Connecticut, Illinois, Maryland, Nebraska, North Carolina, and Rhode Island show that African Americans and Latinos are stopped by police at rates higher than their share of the population, even though contraband is found at higher rates among the whites who are stopped, Michael Selmi finds in his paper on statistical inequality and discrimination, citing reporting by the New York Times.
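The pattern in those reports can be checked with what researchers call an outcome test: compare each group's share of stops to its share of the population, and compare the hit rate, meaning how often a stop actually turns up contraband. The sketch below uses invented placeholder numbers, not figures from the cited data, purely to show the arithmetic.

```python
# Outcome-test arithmetic with HYPOTHETICAL placeholder numbers.
# A group can be stopped far out of proportion to its population share
# even while stops of that group find contraband less often.

groups = {
    # group:   (population, stops, stops where contraband was found)
    "white":   (700_000, 7_000, 1_750),
    "black":   (200_000, 6_000,   900),
    "latino":  (100_000, 2_500,   375),
}

total_pop = sum(pop for pop, _, _ in groups.values())
total_stops = sum(stops for _, stops, _ in groups.values())

for name, (pop, stops, hits) in groups.items():
    pop_share = pop / total_pop       # group's share of residents
    stop_share = stops / total_stops  # group's share of police stops
    hit_rate = hits / stops           # how often a stop finds contraband
    print(f"{name:6s}  pop {pop_share:5.1%}  stops {stop_share:5.1%}  "
          f"hit rate {hit_rate:5.1%}")
```

In this made-up example, whites are 70 percent of residents but only about 45 percent of stops, and their stops have the highest hit rate; lower hit rates for more heavily stopped groups are the statistical signal of the disparity Selmi describes.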

Accepting discrimination to achieve other policing goals.

Selbst observes that tolerating discrimination as a trade-off for achieving other policing goals commodifies it, citing Steven Kelman's ethical critique of cost-benefit analysis.
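Kelman's point is easiest to see in the arithmetic itself: the moment discriminatory harm is entered into a cost-benefit ledger alongside other policing goals, the analysis has implicitly put a price on it. The figures below are hypothetical, chosen only to expose that implied price.

```python
# HYPOTHETICAL cost-benefit ledger, to show how the trade-off prices harm.
claimed_benefit = 2_000_000          # asserted annual value of the tool ($)
operating_cost = 500_000             # licensing and staffing ($)
excess_discriminatory_stops = 3_000  # projected extra stops of one group

net_benefit = claimed_benefit - operating_cost
# Approving the tool on these numbers treats each discriminatory stop
# as costing society no more than this amount:
implied_price = net_benefit / excess_discriminatory_stops
print(f"implied price per discriminatory stop: ${implied_price:,.0f}")
```

That per-stop dollar figure is the commodification Kelman objects to: a harm most people would refuse to price gets one anyway, as a by-product of the analysis.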

There is concern over data mining in other areas of civil rights protection as well. A 2014 White House report warned of the potential of "big data analytics" to "eclipse longstanding civil rights protections" (the Podesta Report and its 2015 Interim Progress Report).

Programs that create public accountability address implicit bias.

Contemporary discrimination research is captivated by the concept of "implicit bias," observes Selmi. Implicit bias is usually said to be "unconscious," "less blameworthy," and "beyond one's control." Yet it is a different shade of discrimination, and it needs to be addressed.

There is literature showing that implicit bias can be controlled when people are held accountable, as Frank Dobbin and Alexandra Kalev note in their paper on workplace diversity, which finds that "programs that create public accountability" lead to "increases in the presence" of "unrepresented" groups. (Updated 12.18)

Last April, McKinsey featured Dobbin and Kalev's work in an interview in which Kalev asserted the value of a "task force" made up of people from different departments and ranks to address discrimination.

Elaine Sarduy is a freelance writer and content developer @Listing Debuts