Weaponizing the White Space: Stop-and-Frisk Policing

Afton and I led a class discussion on the book No Place on the Corner, about the effects of stop-and-frisk policing on communities of color in the Bronx. This reading was extremely interesting, especially in the context of some of the other readings we’ve done, including Crook County and “The Inheritance of the Ghetto”. It was disturbing to see the system come full circle: from New Deal policies that marked primarily Black communities as liabilities, to White flight, to targeted and abusive policing, to corrupt courts. In addition, it shed light on issues of policing often ignored in the news cycle and provided context for tragic police shootings of young Black men.

After reading and discussing Algorithms of Oppression in class, I realized that COMP-STAT policing, based on computer software designed by former NYPD commissioner William Bratton, is important to consider in the context of biased algorithms. In Algorithms of Oppression, the author notes that no one is free of implicit bias, and that computer programmers, knowingly or unknowingly, code that bias into their programs. However, the public perceives these programs to be completely objective. This is indeed the case for the COMP-STAT software. Its creator, William Bratton, was recently accused of threatening an officer who had come forward saying he was ordered to arrest minorities (https://www.nydailynews.com/new-york/ny-bratton-bronx-quotas-roll-call-20191205-teuunpsiznecxlndk3n6htzxna-story.html). This officer was assigned to the 40th precinct in the South Bronx, a precinct discussed in No Place on the Corner. It is possible that Bratton’s implicit bias made its way into the software he designed, which is now used to direct NYPD operations.

Regardless of Bratton’s influence, COMP-STAT software, compounded by “unofficial” summons quotas and broken-windows policing, reinforces police officers’ implicit biases. When officers are pushed to meet quotas, they often target minority communities, such as the ones discussed in the book. When unnecessary stops are made, officers enter that data into the COMP-STAT system in order to “bloat their numbers”. The COMP-STAT system then makes it seem as though some communities (typically minority communities) are committing more crimes than they actually are. Police officers then use this skewed data to guide their future decisions about whom to stop on the street, leading them to stop more people in already over-targeted communities of color. This process is summarized well in the USA Today article “When policing stats do more harm than good” by Joseph Giacalone and Alex Vitale: “All across the country, many of the complaints about excessive and heavy handed policing are driven by unnecessary and counterproductive over-policing in an attempt to ‘get the numbers up’”.

The idea that algorithms are perceived as objective yet are inherently biased can be extended to other parts of society. How often do we see a statistic, or a news story, and immediately assume it’s true? How does this change how we see the world? We see arrest statistics and crime rates all the time, but rarely do we (as a society) question the social algorithms that skew the data. It’s easy to imagine that this is the case with policing in New York. It’s all too easy to look at an institution like a police department and immediately assume that it’s always right, and that it will always do the right thing. This assumption only serves to perpetuate racial stereotypes and further alienate and discriminate against communities of color.

One thought on “Weaponizing the White Space: Stop-and-Frisk Policing”

  1. agordy

    I think the relationship between Algorithms of Oppression and No Place on the Corner is a really insightful one. When the government uses these flawed technologies that replicate systemic racism, these inequalities get further entrenched. It isn’t just isolated to policing technology. Some states use algorithms to remove double-registered voters from the voter rolls; however, there is a high inaccuracy rate, and citizens of color are more likely to be wrongly excluded. It seems like our society wants desperately to modernize without considering the ways in which those modernizations further marginalize communities of color. It brings up questions of what progress means, and who progress is for.