[News] Police Tech Isn’t Designed to Be Accurate -- It’s Made to Exert Social Control

Anti-Imperialist News news at freedomarchives.org
Sun Dec 17 19:42:58 EST 2023


truthout.org 
<https://truthout.org/articles/police-tech-isnt-designed-to-be-accurate-its-made-to-exert-social-control/> 



  Police Tech Isn’t Designed to Be Accurate -- It’s Made to Exert Social
  Control

James Kilgore - December 16, 2023
------------------------------------------------------------------------

In the past 15 years, policing has grown its reach, largely through an 
array of technologies that record and store our personal details and 
daily activities. Using algorithms and other formulae, authorities are 
able to repurpose data to meet the emerging demands of the criminal 
legal and immigration systems. From predictive policing to GPS-enabled 
ankle monitors to gunshot trackers to massive interlinked databases, 
police are extending their capacity to track and control. But in recent 
years, communities, researchers and activists have begun to build a 
critique of these technologies. Their critique may ultimately take us 
well beyond liberal notions of privacy to address fundamental questions 
of political power and freedom.


    *Predictive Policing*

One key target has been predictive policing. Implemented as early as 
2008, predictive policing gathers data on incidents of crime and people 
who commit crime to predict future events and trends. Over the years, 
various versions of this policing technology, such as LASER or Hot Spot, 
have proven problematic. The most recent exposé of this widely used 
technology surfaced in an October 2023 piece 
<https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes?utm_source=TMP-Newsletter&utm_campaign=ea6b675321-EMAIL_CAMPAIGN_2023_10_03_11_07&utm_medium=email&utm_term=0_5e02cdad9d-ea6b675321-%5BLIST_EMAIL_ID%5D> 
by Aaron Sankin and Surya Mattu, published jointly by /The Markup/ and 
/Wired/. The authors’ findings revealed that the policing technology of 
the widely contracted company Geolitica (formerly PredPol) had a success 
rate of less than 1 percent in its mission of predicting the time and 
place of a crime. Drawing on more than 23,000 predictions from 360 
locations 
<https://themarkup.org/show-your-work/2023/10/02/how-we-assessed-the-accuracy-of-predictive-policing-software> 
in Plainfield, New Jersey, the authors found a success rate of 0.6 
percent for burglary and 0.1 percent for assaults and robberies. Part of 
the reason for these disastrous results is a statistical model that 
yields a large number of predictions in the hope of capturing at least 
some crime incidents in its net — a little like buying 1,000 lottery 
tickets in the hope of getting at least one winner, regardless of how 
much is lost along the way.
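
To see why a flood of predictions can “capture” some crimes while 
remaining practically useless, consider a minimal back-of-the-envelope 
sketch in Python. The counts are illustrative, loosely based on the 
Plainfield figures reported by /The Markup/; they are not Geolitica’s 
actual model or data.

    def hit_rate(predictions_issued, predictions_matching_a_crime):
        """Share of issued predictions that coincided with a real incident."""
        return predictions_matching_a_crime / predictions_issued

    # Roughly 23,000 predictions; a 0.6 percent success rate implies ~138 "hits."
    print(f"Burglary: {hit_rate(23_000, 138):.1%}")        # ~0.6%
    print(f"Robbery/assault: {hit_rate(23_000, 23):.1%}")  # ~0.1%

    # Issuing ever more predictions raises the odds of catching *some*
    # incidents, but the value of any single prediction stays negligible --
    # the "1,000 lottery tickets" strategy described above.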

Predictive policing algorithms also incorporate racial bias, often 
directing law enforcement to communities already rife with police, 
surveillance and high arrest rates. The Electronic Frontier Foundation 
describes predictive policing as a “self-fulfilling prophecy,” meaning 
that if authorities direct more police to an area or at a targeted 
group, police will make more arrests there regardless of the presence of 
crime.
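
A toy simulation makes that feedback loop concrete. This is purely an 
illustration of the dynamic described above, not any vendor’s actual 
algorithm; the neighborhoods, patrol shares and offense rates are 
invented for the example.

    # Two neighborhoods with the same real crime rate, but one starts out
    # more heavily policed. Arrests scale with patrol presence, and the
    # "prediction" sends police wherever past arrests were recorded.
    actual_offending = [1.0, 1.0]
    patrol_share = [0.7, 0.3]
    cumulative_arrests = [0.0, 0.0]

    for year in range(10):
        arrests = [actual_offending[i] * patrol_share[i] for i in range(2)]
        cumulative_arrests = [cumulative_arrests[i] + arrests[i] for i in range(2)]
        total = sum(cumulative_arrests)
        patrol_share = [c / total for c in cumulative_arrests]

    print(patrol_share)        # stays at roughly [0.7, 0.3]: the data "confirms" the initial bias
    print(cumulative_arrests)  # the arrest-record gap between identical areas keeps growing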

The shortcomings of predictive policing led Plainfield authorities to 
follow in the footsteps of Los Angeles and other former clients of 
Geolitica and cancel their contract. Los Angeles’s cancellation grew out 
of a campaign led by the Stop LAPD Spying Coalition, whose activists 
revealed the racist bias in the technology’s predictions and the false 
logic of the company’s claim 
<https://stoplapdspying.org/wp-content/uploads/2018/05/Before-the-Bullet-Hits-the-Body-Report-Summary.pdf> 
that “criminal offenders are essentially hunter-gatherers; they forage 
for opportunities to commit crimes.”


    *GPS Monitoring*

Studies of GPS-enabled electronic monitors reveal patterns of 
inaccuracy. In 2023, a data scrape led by freelance data journalist Matt 
Chapman uncovered 
<https://thetriibe.com/2022/11/many-on-house-arrest-in-cook-county-bombarded-with-texts-from-sheriffs-contractor/> 
gross inaccuracies in the pretrial GPS monitoring program in Cook 
County, Illinois — the largest in the nation. Chapman found the devices 
generated thousands of false alerts, often leading to police raids and 
baseless arrests. A separate 2021 Cook County study 
<https://www.documentcloud.org/documents/22052939-presentation-gps-em-location-alert-analysis-nov-2021> 
concluded that 80 percent of the alarms for violation of electronic 
monitoring rules were “false positives.” These false alerts can have 
serious consequences. One respondent described the trauma of receiving 
six texts per day over a period of 18 months that delivered false alerts 
about alleged electronic monitoring violations. One of those false 
alerts led to a two-day stint in jail. His fate was not unique. 
/Truthout/ has talked with dozens of people across the country who have 
been wrongly sent back to prison after their “tracking” device reported 
that they were located several blocks, even several miles, away from 
where they actually were. One Chicago woman told us that a false alert 
led to her arrest. She subsequently fell in her jail cell, fractured her 
jaw and needed surgery when she was released.


    *Gunshot Trackers*

SoundThinking (formerly ShotSpotter) is a detection technology that 
claims to track and trace the sounds of gunshots in urban areas. But 
studies 
<https://apnews.com/article/artificial-intelligence-algorithm-technology-police-crime-7e3345485aa668c97606d4b54f9b6220> 
in several of the more than 100 cities where SoundThinking has contracts 
paint an alarming picture of inaccuracy. Despite complaints that false 
alerts disproportionately target Black and Brown neighborhoods, most 
decision-makers maintain their infatuation with the product. For its 
part, SoundThinking remains content with business as usual. In over 20 
years of operation, the company has not produced 
<https://www.macarthurjustice.org/shotspotter-generated-over-40000-dead-end-police-deployments-in-chicago-in-21-months-according-to-new-study/> 
a single scientific study testing how reliably its technology can tell 
the difference between the sound of gunfire and other loud noises. 
Instead, the company aggressively defends the secrecy of its product 
design. When a SoundThinking alert in Chicago led to the arrest of an 
individual, the company refused a court order to bring forward evidence 
of how it assessed gunshot sounds. The firm chose instead to accept a 
contempt of court 
<https://chicagoreader.com/news-politics/shotspotter-held-in-contempt-of-court/> 
charge. Chicago Mayor Brandon Johnson has pledged to not renew the 
city’s contract with SoundThinking in 2024. City leaders in Dayton, 
Atlanta and Seattle have taken similar steps by recently blocking or 
ending 
<https://ohiocapitaljournal.com/2023/07/18/why-dayton-quit-shotspotter-a-surveillance-tool-many-cities-still-embrace/> 
SoundThinking contracts.


    *Other Technologies*

Racial bias has surfaced in other technologies, most notably in facial 
recognition apps <https://www.youtube.com/watch?v=eRUEVYndh9c> that have 
led to the misidentification, and in some cases arrest, of at least six 
Black men in a number of cities including Detroit, New Orleans and 
Baltimore. Moreover, a 2023 New Orleans study 
<https://www.politico.com/news/2023/10/31/new-orleans-police-facial-recognition-00121427> 
contended that this technology fell short of proponents’ claims that it 
could help solve crime.

Risk assessment tools 
<https://pretrialrisk.com/the-danger/impacts-of-biased-risk-assessments/> 
that build algorithms based on data from racist criminal legal 
institutions and social service agencies have also come under fire from 
several scholars and researchers arguing that they wrongly classify 
people’s suitability for pretrial release or the appropriateness of a 
sentence.


    *Less Regulated Than Toasters*

Part of the explanation for these inaccuracies lies with the failure to 
adequately test these technologies before marketing. While toaster 
producers must conform to stringent regulations 
<https://www.itcindia.org/iec-60335-2-9-particular-requirements-for-toasters/> 
and subject their products to rigorous testing, in the high-stakes world 
of policing, producers often get a free pass.

The only national-level technical requirement for an electronic ankle 
monitor is an optional set of standards 
<https://www.ojp.gov/pdffiles1/nij/249810.pdf> produced in 2016 by the 
National Institute of Justice, which calls for geolocation accuracy 
within 98 feet. Most residences, especially urban apartments, are far 
smaller than 98 feet across, so a location error of that size can place 
a person well outside their home and register as a violation of 
household restrictions even though they never left.
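
A hypothetical geofence check, sketched below, shows how an error still 
within that 98-foot allowance can flag someone who never left home. The 
home-zone radius and distances are assumptions for illustration, not 
any monitoring vendor’s actual parameters.

    HOME_RADIUS_FT = 40  # assumed "house arrest" zone around the base unit
    GPS_ERROR_FT = 98    # worst-case error still permitted by the 2016 NIJ standard

    def appears_in_violation(true_distance_from_home_ft, error_ft):
        """True if the *reported* position falls outside the home zone."""
        reported_distance = true_distance_from_home_ft + error_ft  # error pushes the fix outward
        return reported_distance > HOME_RADIUS_FT

    # Someone sitting 10 feet from the base unit, inside an ordinary apartment:
    print(appears_in_violation(10, GPS_ERROR_FT))  # True -- logged as a "violation"
    print(appears_in_violation(10, 0))             # False -- with a perfect fix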

Meanwhile, Black computer scientist Joy Buolamwini used research 
<https://www.npr.org/2023/11/28/1215529902/unmasking-ai-facial-recognition-technology-joy-buolamwini> 
on her own face to expose what she labeled the “coded gaze.” The coded 
gaze refers to the database of faces used to build and test these 
prediction models. In Buolamwini’s assessment, that database is 
disproportionately white and male, making the software far more 
accurate at recognizing white, male faces than anyone else’s. In fact, 
Buolamwini, who is a dark-skinned Black woman, found that the technology 
could not even detect her face, apparently because she fell outside 
that norm.

Rather than being subjected to rigorous pre-market testing protocols, 
these technologies, as tech writer Dhruv Mehrotra told /Truthout/, “are 
tested in the field.” Dillon Reisman, founder of the American Civil Liberties 
Union of New Jersey’s Automated Injustice Project, told 
<https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes?utm_source=TMP-Newsletter&utm_campaign=ea6b675321-EMAIL_CAMPAIGN_2023_10_03_11_07&utm_medium=email&utm_term=0_5e02cdad9d-ea6b675321-%5BLIST_EMAIL_ID%5D> 
/The Markup/ that all over New Jersey, companies are selling “unproven, 
untested tools that promise to solve all of law enforcement’s needs, 
and, in the end, all they do is worsen the inequalities of policing and 
for no benefit to public safety.”

Instead of providing test results, police technology companies primarily 
rely on promoting individual success stories 
<https://www.soundthinking.com/shotspotter-public-safety-results/?utm_term=gunshot%20detection&utm_campaign=Non-Branded+-+Services&utm_source=adwords&utm_medium=ppc&hsa_acc=8557512895&hsa_cam=19121679341&hsa_grp=142764973374&hsa_ad=655799149081&hsa_src=g&hsa_tgt=kwd-322691664378&hsa_kw=gunshot%20detection&hsa_mt=p&hsa_net=adwords&hsa_ver=3&gclid=Cj0KCQiAjMKqBhCgARIsAPDgWlzmZM6L0etGibUF2pGRpZ1udRlQsVxzQqefPjpZWY_5PXAoLnDV-CkaAkkSEALw_wcB> 
or simplistically attributing reductions in crime and the saving 
<https://www.youtube.com/watch?v=d1HbknbpFXQ> of lives to the presence 
of their technologies without considering other factors. Dayton, 
Ohio-based human rights activist Julio Mateo told /Truthout/ that 
SoundThinking tries “to play up the situations in which these 
technologies help and try to make invisible the times when people are 
searched and traumatized.”

Companies and decision-makers seem not to consider the opportunity costs 
or ancillary impact of using these devices. For example, in voting for 
the reinstatement of SoundThinking in New Orleans after a two-year ban, 
Black city councilor Eugene Green proclaimed 
<https://www.politico.com/news/2023/10/31/new-orleans-police-facial-recognition-00121427>, 
“If we have it for 10 years and it only solves one crime, but there’s no 
abuse, then that’s a victory for the citizens of New Orleans.” Like most 
supporters of police technology, Green failed to acknowledge that the 
financial and human resources devoted to SoundThinking could have gone 
to programs proven to prevent violence by providing direct benefits to 
impacted populations in the form of services such as mental wellness, 
after-school activities and job training. Similarly, Green’s comments 
overlooked the trauma of people subjected to repeated false alerts.

On the surface, these outrageous failures to test police technologies 
with even the rigor demanded of a toaster appear puzzling. We expect 
our phones, laptops, tablets, and every other device we use to meet a 
certain consumer standard. A cellphone that consistently connected us to 
the wrong number or jumbled the entries in our contact lists would have 
a very short shelf life. But completely different standards apply to 
technologies of control and oppression, especially those that deal with 
Black people and other marginalized populations.


    *Why the Paradox Continues*

This apparent paradox exists for several reasons. At a systems level, 
the decentralized structure of policing and law enforcement facilitates 
the expansion of these technologies. Local authorities typically make 
their own decisions on surveillance and policing. For the purveyors of 
these technologies, local decision-making offers a huge and welcoming 
marketplace. While cities like Boston 
<https://www.npr.org/sections/live-updates-protests-for-racial-justice/2020/06/24/883107627/boston-lawmakers-vote-to-ban-use-of-facial-recognition-technology-by-the-city#:~:text=Gaming-,Boston%20Lawmakers%20Vote%20To%20Ban%20Use%20Of%20Facial%20Recognition%20Technology,inaccurate%20for%20people%20of%20color.> 
and San Francisco 
<https://www.vox.com/recode/2019/5/14/18623897/san-francisco-facial-recognition-ban-explained> 
have banned facial recognition, most smaller jurisdictions lack the 
technical expertise and resources to conduct serious investigations into 
police technology. They rarely have policies or research agendas to 
address the potential perils of apps like facial recognition or gunshot 
trackers. As a result, the main sources of information for local 
government are frequently the company representatives themselves. In 
many cases, local police or sheriffs, operating through their own 
networks, become the major promoters of these technologies across 
regions, largely because these tools enhance the image of technical 
efficiency surrounding their operations.

The decentralized structure also makes mounting national opposition 
campaigns more challenging, especially since federal authorities have 
chosen not to impose regulations. In fact, in many instances, federal 
authorities promote such usage, offering free access to surplus military 
equipment and invasive surveillance technology through the Law 
Enforcement Support Office’s 1033 
<https://www.dla.mil/Disposition-Services/Offers/Law-Enforcement/Join-The-Program/> 
Program as well as grants 
<https://www.aclu.org/news/national-security/does-your-local-government-have-black-budget-too#:~:text=Federal%20grants%20to%20local%20and,governments%20via%20opaque%20grant%20programs.> 
operating through the Department of Homeland Security and National 
Security Agency. As of 2021, more than 10,000 federal, state and local 
law enforcement agencies were participating 
<https://www.aclu.org/news/criminal-law-reform/federal-militarization-of-law-enforcement-must-end> 
in the 1033 Program. Further, the emergence of COVID-19 relief funds 
through the American Rescue Plan Act (ARPA) directed 
<https://epic.org/two-years-in-covid-19-relief-money-fueling-rise-of-police-surveillance/> 
new resource flows to local authorities for police surveillance 
technologies such as automatic license plate-readers, facial recognition 
systems, gunshot detection programs and phone hacking tools. President 
Joe Biden encouraged such expenditures during an address to a Gun 
Violence Prevention Task Force meeting in 2022, urging 
<https://epic.org/two-years-in-covid-19-relief-money-fueling-rise-of-police-surveillance/> 
cities to purchase “gun-fighting technologies, like technologies that 
hears, locates gunshots so there can be immediate response because you 
know exactly where it came from.” The nonprofit Electronic Privacy 
Information Center estimated 
<https://epic.org/wp-content/uploads/2023/03/EPIC-ARPA-Surveillance-Funding-Table.pdf> 
that as of September 2022, at least 70 local governments had allocated 
ARPA funding to surveillance technology.

In addition to systemic factors, police technology also requires a 
controlling narrative. What researcher Evgeny Morozov calls 
technological-solutionism 
<https://www.publicbooks.org/the-folly-of-technological-solutionism-an-interview-with-evgeny-morozov/> 
is essential to that narrative. Technological-solutionism influences 
decision-makers and thought leaders to ignore options for addressing 
deep social problems like white supremacy or the need to redistribute 
income and resources. Instead, technological-solutionism recasts complex 
social phenomena as “neatly defined problems with definite, computable 
solutions or as transparent and self-evident processes that can be 
easily optimized — if only the right algorithms are in place!” In 
contemporary capitalism such solutions enhance the profits and the power 
of Big Tech <https://hbr.org/2022/01/can-big-tech-be-disrupted> while 
making claims to address inequities, particularly those based on race. 
This obsession with technological solutions dampens efforts at critique 
and provides space for expanding or tweaking police technology. 
Moreover, technological-solutionism has emerged amid a fundamental 
restructuring of contemporary capitalism, characterized by the rise of 
Big Tech and the expansion of policing in all its forms. This 
transformation has enabled a range of “solutions” unimaginable less than 
two decades ago, including the technologies discussed here.


    *We Desperately Need a New Framework for Tech*

However, we are only in the early days of what I refer to as “digital 
colonialism,” a period that began with the launch of the first iPhone in 
2007. In the world of digital colonialism, solutions come from tech 
giants like Google, Microsoft, Apple, Meta and Amazon. In the manner of 
colonialists of the past, Big Tech leads the establishment of a settler 
regime within the unconquered territory of the digital world. The 
companies set the rules, control the technology and dictate the regime 
of accumulation. Like colonial states, these powers value order and 
hierarchies based on race, ethnicity and gender. Just as colonial states 
offered the Bible, Western education and the products of 
industrialization, so Amazon and its ilk offer the digital world of 
Chrome, cellphones and Uber in exchange for the essential raw material 
for their empire: data.

As immense as the data on current computer clouds may seem, the colonial 
oligarchs are just starting to figure out how to deploy artificial 
intelligence to collect and use people’s data to both maximize their 
profits and extend 
<https://www.nytimes.com/2023/06/10/technology/ai-humanity.html> the 
depth of social control. Data from facial recognition, crudely racist as 
it may be, is only beginning to intersect with other punitive and 
controlling technologies. While research has unearthed several of the 
shortcomings of predictive policing and gunshot locators, exposing these 
flaws represents only a baby step on the path to challenging the immense 
power of the digital monopolists.

For the moment, to borrow a phrase 
<https://collectiveliberation.org/wp-content/uploads/2013/01/Lorde_The_Masters_Tools.pdf> 
from Audre Lorde, critics are using the master’s tools to contest the 
power of Big Tech. Like the first discoverers of gold in South Africa, 
activists and researchers are grabbing a few nuggets of consumer 
products while handing over a lot more wealth in terms of biometrics and 
other data. Transforming these power dynamics won’t come from merely 
attacking the inaccuracies or racial bias baked into modern surveillance 
and policing. In fact, enhancing the technical capacity or reducing the 
racial bias in these technologies may only create more efficient 
punitive regimes.

Many of these technologies simply have no place in a world that respects 
life. Databases have many uses, especially in tracking climate change or 
air quality, but only if informed by a social justice framework that is 
driven neither by profit nor by dogmatic paradigms that either deify or 
totally reject technology.

We remain a long way from putting such frameworks in place. At a moment 
when the cutting edge of technology and surveillance and the world’s 
political attention are trained on Gaza, a tiny strip of land that is 
perhaps the ultimate laboratory 
<https://www.versobooks.com/products/2684-the-palestine-laboratory> for 
these technologies, building that framework becomes all the more urgent.

/Thanks to Teresa Barnes, Dhruv Mehrotra, Matt Chapman and Julio Mateo 
for providing the comments and information used to compile this article./


