<!DOCTYPE html>
<html>
  <head>
    <meta http-equiv="content-type" content="text/html; charset=UTF-8">
    <title>Police Tech Isn’t Designed to Be Accurate — It’s Made to Exert Social Control</title>
  </head>
  <body>
    <div dir="ltr" lang="en-US">
      <div class="header">
        <a href="https://truthout.org/articles/police-tech-isnt-designed-to-be-accurate-its-made-to-exert-social-control/">truthout.org</a>
        <h1>Police Tech Isn’t Designed to Be Accurate — It’s Made to Exert Social Control</h1>
        <div class="credits">James Kilgore - December 16, 2023</div>
      </div>
      <hr>
      <div class="content">
                <p>In the past 15 years, policing has grown its reach,
                  largely through an array of technologies that record
                  and store our personal details and daily activities.
                  Using algorithms and other formulae, authorities are
                  able to repurpose data to meet the emerging demands of
                  the criminal legal and immigration systems. From
                  predictive policing to GPS-enabled ankle monitors to
                  gunshot trackers to massive interlinked databases,
                  police are extending their capacity to track and
                  control. But in recent years, communities, researchers
                  and activists have begun to build a critique of these
                  technologies. Their critique may ultimately take us
                  well beyond liberal notions of privacy to address
                  fundamental questions of political power and freedom.</p>
                <h2><strong>Predictive Policing</strong></h2>
                <p>One key target has been predictive policing.
                  Implemented as early as 2008, predictive policing
                  gathers data on incidents of crime and the people who
                  commit them in order to predict future events and
                  trends. Over the years, various versions of this
                  policing technology, such as LASER or Hot Spot, have
                  proven problematic. The most recent exposé of this
                  widely used technology surfaced in an October 2023
                  <a href="https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes?utm_source=TMP-Newsletter&utm_campaign=ea6b675321-EMAIL_CAMPAIGN_2023_10_03_11_07&utm_medium=email&utm_term=0_5e02cdad9d-ea6b675321-%5BLIST_EMAIL_ID%5D">piece</a>
                  by Aaron Sankin and Surya Mattu, published jointly by
                  <em>The Markup</em> and <em>Wired</em>. The authors’
                  findings revealed that the policing technology of the
                  widely contracted company Geolitica (formerly PredPol)
                  had a success rate of less than 1 percent in its
                  mission of predicting the time and place of a crime.
                  Drawing on more than 23,000 predictions from 360
                  <a href="https://themarkup.org/show-your-work/2023/10/02/how-we-assessed-the-accuracy-of-predictive-policing-software">locations</a>
                  in Plainfield, New Jersey, the authors found a success
                  rate of 0.6 percent for burglaries and 0.1 percent for
                  assaults and robberies. Part of the reason for these
                  disastrous results was a statistical model that yields
                  a large number of predictions in the hope of capturing
                  at least some crime incidents in its net — a little
                  like buying 1,000 lottery tickets in the hope of
                  getting at least one winner, regardless of how much is
                  lost along the way.</p>
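                <p>To see why a flood of predictions still produces a
                  sub-1-percent hit rate, the arithmetic is worth
                  spelling out. The back-of-envelope sketch below uses
                  the hit rates reported in the Plainfield analysis; the
                  assumption that every prediction targets burglary (the
                  best case reported) is purely for illustration, not
                  the authors’ methodology.</p>
                <pre><code># Back-of-envelope check on the Plainfield numbers cited above.
# Assumption for illustration: every prediction targets burglary,
# the best-performing category the authors reported.
predictions = 23_000              # "more than 23,000 predictions"
burglary_hit_rate = 0.006         # 0.6 percent success for burglary
assault_robbery_hit_rate = 0.001  # 0.1 percent for assaults/robberies

hits = predictions * burglary_hit_rate
print(f"correct predictions: ~{hits:.0f}")   # ~138 of 23,000
print(f"misses: ~{predictions - hits:.0f}")  # ~22,862 dead ends
</code></pre>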
                <p>Predictive policing algorithms also incorporate
                  racial bias, often directing law enforcement to
                  communities already rife with police, surveillance and
                  high arrest rates. The Electronic Frontier Foundation
                  describes predictive policing as a “self-fulfilling
                  prophecy,” meaning that if authorities direct more
                  police to an area or at a targeted group, police will
                  make more arrests there regardless of the presence of
                  crime.</p>
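                <p>A toy simulation makes the feedback loop concrete.
                  This is an illustrative sketch only: the two
                  districts, the numbers and the allocation rule are
                  assumptions, not any vendor’s actual algorithm. The
                  point is that if patrols always go where the most
                  arrests have been recorded, and arrests can only be
                  recorded where patrols are present, a tiny initial
                  disparity locks in permanently even when underlying
                  crime is identical.</p>
                <pre><code># Toy model of the "self-fulfilling prophecy" critique.
# Assumption: two districts with IDENTICAL true crime rates; patrols
# go to whichever district has the most recorded arrests, and arrests
# are only recorded where patrols are sent.
recorded_arrests = [11, 10]  # a one-arrest head start for district 0

for year in range(5):
    target = recorded_arrests.index(max(recorded_arrests))
    recorded_arrests[target] += 5  # patrols there log more arrests
    print(f"year {year}: recorded arrests = {recorded_arrests}")

# District 0's head start attracts every future patrol, so only its
# numbers grow -- the data "confirms" the allocation it caused.
</code></pre>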
                <p>The shortcomings of predictive policing led
                  Plainfield authorities to follow in the footsteps of
                  Los Angeles and other former clients of Geolitica and
                  cancel their contract. Los Angeles’s cancellation grew
                  out of a campaign led by the Stop LAPD Spying
                  Coalition, whose activists revealed the racist bias in
                  the technology’s predictions and the false logic of
                  the company’s <a href="https://stoplapdspying.org/wp-content/uploads/2018/05/Before-the-Bullet-Hits-the-Body-Report-Summary.pdf">claim</a>
                  that “criminal offenders are essentially
                  hunter-gatherers; they forage for opportunities to
                  commit crimes.”</p>
                <h2><strong>GPS Monitoring</strong></h2>
                <p>Studies of GPS-enabled electronic monitors reveal
                  patterns of inaccuracy. In 2022, a data scrape led by
                  freelance data journalist Matt Chapman <a href="https://thetriibe.com/2022/11/many-on-house-arrest-in-cook-county-bombarded-with-texts-from-sheriffs-contractor/">uncovered</a>
                  gross inaccuracies in the pretrial GPS monitoring
                  program in Cook County, Illinois — the largest in the
                  nation. Chapman found the devices generated thousands
                  of false alerts, often leading to police raids and
                  baseless arrests. A separate 2021 Cook County <a href="https://www.documentcloud.org/documents/22052939-presentation-gps-em-location-alert-analysis-nov-2021">study</a>
                  concluded that 80 percent of the alarms for violations
                  of electronic monitoring rules were “false positives.”
                  These false alerts can have serious consequences. One
                  respondent described the trauma of receiving six texts
                  per day over a period of 18 months delivering false
                  alerts about alleged electronic monitoring violations.
                  One of those false alerts led to a two-day stint in
                  jail. His fate was not unique. <em>Truthout</em> has
                  talked with dozens of people across the country who
                  have been wrongly sent back to prison after their
                  “tracking” device reported that they were located
                  several blocks, even several miles, away from where
                  they actually were. One Chicago woman told us that a
                  false alert led to her arrest. She subsequently fell
                  in her jail cell and fractured her jaw, requiring
                  surgery after her release.</p>
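                <p>The arithmetic behind mostly false alarms is simple:
                  when true violations are rare, even a small per-check
                  error rate means false alerts swamp real ones. The
                  numbers in this sketch are illustrative assumptions
                  chosen to reproduce an 80 percent share, not figures
                  from the Cook County study.</p>
                <pre><code># Why most alerts can be false even when the device is "usually right."
# All rates below are illustrative assumptions.
checks = 1_000                # location checks across a monitored group
true_violation_rate = 0.001   # 1 in 1,000 checks is a real violation
false_alarm_rate = 0.004      # device errs on 0.4% of innocent checks

real_alerts = checks * true_violation_rate                            # ~1
false_alerts = checks * (1 - true_violation_rate) * false_alarm_rate  # ~4
share_false = false_alerts / (real_alerts + false_alerts)
print(f"{share_false:.0%} of alerts are false positives")  # ~80%
</code></pre>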
                <h2><strong>Gunshot Trackers</strong></h2>
                <p>SoundThinking (formerly ShotSpotter) is a detection
                  technology that claims to track and trace the sounds
                  of gunshots in urban areas. But <a href="https://apnews.com/article/artificial-intelligence-algorithm-technology-police-crime-7e3345485aa668c97606d4b54f9b6220">studies</a>
                  in several of the more than 100 cities where
                  SoundThinking has contracts paint an alarming picture
                  of inaccuracy. Despite complaints that false alerts
                  disproportionately target Black and Brown
                  neighborhoods, most decision-makers maintain their
                  infatuation with the product. For its part,
                  SoundThinking remains content with business as usual.
                  In over 20 years of operation, the company has not <a href="https://www.macarthurjustice.org/shotspotter-generated-over-40000-dead-end-police-deployments-in-chicago-in-21-months-according-to-new-study/">produced</a>
                  a single scientific study testing how reliably its
                  technology can tell the difference between the sound
                  of gunfire and other loud noises. Instead, the company
                  aggressively defends the secrecy of its product
                  design. When a SoundThinking alert in Chicago led to
                  the arrest of an individual, the company refused a
                  court order to bring forward evidence of how it
                  assessed gunshot sounds. The firm chose instead to
                  accept a <a href="https://chicagoreader.com/news-politics/shotspotter-held-in-contempt-of-court/">contempt of court</a>
                  charge. Chicago Mayor Brandon Johnson has pledged not
                  to renew the city’s contract with SoundThinking in
                  2024. City leaders in Dayton, Atlanta and Seattle have
                  taken similar steps by recently blocking or <a href="https://ohiocapitaljournal.com/2023/07/18/why-dayton-quit-shotspotter-a-surveillance-tool-many-cities-still-embrace/">ending</a>
                  SoundThinking contracts.</p>
                <h2><strong>Other Technologies</strong></h2>
                <p>Racial bias has surfaced in other technologies, most
                  notably in <a href="https://www.youtube.com/watch?v=eRUEVYndh9c">facial recognition apps</a>
                  that have led to the misidentification, and in some
                  cases arrest, of at least six Black men in a number of
                  cities, including Detroit, New Orleans and Baltimore.
                  Moreover, a 2023 New Orleans <a href="https://www.politico.com/news/2023/10/31/new-orleans-police-facial-recognition-00121427">study</a>
                  contended that the technology fell short of
                  proponents’ claims that it could solve crime.</p>
                <p><a href="https://pretrialrisk.com/the-danger/impacts-of-biased-risk-assessments/">Risk assessment tools</a>
                  that build algorithms based on data from racist
                  criminal legal institutions and social service
                  agencies have also come under fire from several
                  scholars and researchers arguing that they wrongly
                  classify people’s suitability for pretrial release or
                  the appropriateness of a sentence.</p>
                <h2><strong>Less Regulated Than Toasters</strong></h2>
                <p>Part of the explanation for these inaccuracies lies
                  in the failure to adequately test these technologies
                  before bringing them to market. While toaster
                  producers must conform to stringent <a href="https://www.itcindia.org/iec-60335-2-9-particular-requirements-for-toasters/">regulations</a>
                  and subject their products to rigorous testing, in the
                  high-stakes world of policing, producers often get a
                  free pass.</p>
                <blockquote>
                  <p>Many of these technologies simply have no place in
                    a world that respects life.</p>
                </blockquote>
                <p>The only technical requirement for an electronic
                  ankle monitor at the national level is an optional set
                  of <a href="https://www.ojp.gov/pdffiles1/nij/249810.pdf">standards</a>
                  produced in 2016 by the National Institute of Justice,
                  which requires a geolocation accuracy of only 98 feet.
                  Most residences, especially urban apartments, are not
                  large enough for a person to stand 98 feet from the
                  geolocator box and still be inside. Hence a
                  miscalculation of 98 feet could register as a
                  violation of household restrictions even when the
                  person never left home.</p>
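                <p>A minimal sketch of the geofence check at issue makes
                  this concrete. The implementation below is
                  hypothetical (vendors’ actual logic is proprietary):
                  a device whose reading drifts by the full 98 feet the
                  standard permits reports a person as outside a typical
                  apartment-sized zone even though they never left
                  home.</p>
                <pre><code>import math

# Hypothetical house-arrest geofence check; real monitors' logic is
# proprietary. Distances in feet on a flat local grid for simplicity.
HOME = (0.0, 0.0)     # the geolocator base station
ZONE_RADIUS = 50.0    # most apartments are far smaller than 98 ft

def violates(reported, zone_radius=ZONE_RADIUS):
    dx, dy = reported[0] - HOME[0], reported[1] - HOME[1]
    return math.hypot(dx, dy) > zone_radius

# The person is standing at home, but the device is off by the 98 feet
# the 2016 NIJ standard allows:
reported_position = (98.0, 0.0)
print(violates(reported_position))  # True -> flagged as a violation
</code></pre>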
                <p>Meanwhile, Black computer scientist Joy Buolamwini
                  used <a href="https://www.npr.org/2023/11/28/1215529902/unmasking-ai-facial-recognition-technology-joy-buolamwini">research</a>
                  on her own face to expose what she labeled the “coded
                  gaze.” The coded gaze refers to the database of faces
                  used to create models for prediction. In Buolamwini’s
                  assessment, the database of faces used to test this
                  technology is disproportionately white and male,
                  making the software more likely to identify a face as
                  white and male. In fact, Buolamwini, who is a
                  dark-skinned Black woman, found that the technology
                  could not even see her face, apparently because she
                  fell outside the database’s norm.</p>
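                <p>A small sketch shows how a skewed benchmark hides
                  this failure: the headline accuracy number can look
                  excellent while the least-represented group fares
                  worst. All percentages below are illustrative
                  assumptions, not Buolamwini’s measured results.</p>
                <pre><code># How a skewed test set hides subgroup failure. All numbers are
# illustrative assumptions, not measured results.
benchmark_share = {   # share of faces in the test database
    "lighter-skinned men": 0.60,
    "lighter-skinned women": 0.25,
    "darker-skinned men": 0.10,
    "darker-skinned women": 0.05,
}
accuracy = {          # hypothetical per-group accuracy
    "lighter-skinned men": 0.99,
    "lighter-skinned women": 0.95,
    "darker-skinned men": 0.90,
    "darker-skinned women": 0.65,
}

headline = sum(benchmark_share[g] * accuracy[g] for g in benchmark_share)
print(f"headline accuracy: {headline:.1%}")  # ~95.4% -- looks fine
print(f"worst-served group: {min(accuracy, key=accuracy.get)}")
</code></pre>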
                <p>Rather than being subjected to rigorous pre-marketing
                  testing protocols, these technologies, as tech writer
                  Dhruv Mehrotra told <em>Truthout</em>, “are tested in
                  the field.” Dillon Reisman, founder of the American
                  Civil Liberties Union of New Jersey’s Automated
                  Injustice Project, <a href="https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes?utm_source=TMP-Newsletter&utm_campaign=ea6b675321-EMAIL_CAMPAIGN_2023_10_03_11_07&utm_medium=email&utm_term=0_5e02cdad9d-ea6b675321-%5BLIST_EMAIL_ID%5D">told</a>
                  <em>The Markup</em> that all over New Jersey,
                  companies are selling “unproven, untested tools that
                  promise to solve all of law enforcement’s needs, and,
                  in the end, all they do is worsen the inequalities of
                  policing and for no benefit to public safety.”</p>
                <p>Instead of providing test results, police technology
                  companies primarily rely on promoting individual <a href="https://www.soundthinking.com/shotspotter-public-safety-results/?utm_term=gunshot%20detection&utm_campaign=Non-Branded+-+Services&utm_source=adwords&utm_medium=ppc&hsa_acc=8557512895&hsa_cam=19121679341&hsa_grp=142764973374&hsa_ad=655799149081&hsa_src=g&hsa_tgt=kwd-322691664378&hsa_kw=gunshot%20detection&hsa_mt=p&hsa_net=adwords&hsa_ver=3&gclid=Cj0KCQiAjMKqBhCgARIsAPDgWlzmZM6L0etGibUF2pGRpZ1udRlQsVxzQqefPjpZWY_5PXAoLnDV-CkaAkkSEALw_wcB">success stories</a>
                  or simplistically attributing reductions in crime and
                  the <a href="https://www.youtube.com/watch?v=d1HbknbpFXQ">saving</a>
                  of lives to the presence of their technologies without
                  considering other factors. Dayton, Ohio-based human
                  rights activist Julio Mateo told <em>Truthout</em>
                  that SoundThinking tries “to play up the situations in
                  which these technologies help and try to make
                  invisible the times when people are searched and
                  traumatized.”</p>
                <p>Companies and decision-makers seem not to consider
                  the opportunity costs or ancillary impact of using
                  these devices. For example, in voting for the
                  reinstatement of SoundThinking in New Orleans after a
                  two-year ban, Black city councilor Eugene Green <a href="https://www.politico.com/news/2023/10/31/new-orleans-police-facial-recognition-00121427">proclaimed</a>,
                  “If we have it for 10 years and it only solves one
                  crime, but there’s no abuse, then that’s a victory for
                  the citizens of New Orleans.” Like most supporters of
                  police technology, Green failed to acknowledge that
                  the financial and human resources devoted to
                  SoundThinking could have gone to programs proven to
                  prevent violence by providing direct benefits to
                  impacted populations in the form of services such as
                  mental wellness, after-school activities and job
                  training. Similarly, Green’s comments overlooked the
                  trauma of people subjected to repeated false alerts.</p>
                <p>On the surface, these outrageous failures to test
                  police technologies with even the rigor demanded of a
                  toaster appear puzzling. We expect our phones,
                  laptops, tablets and every other device we use to meet
                  a certain consumer standard. A cellphone that
                  consistently connected us to the wrong number or
                  jumbled the entries in our contact lists would have a
                  very short shelf life. But completely different
                  standards apply to technologies of control and
                  oppression, especially those deployed against Black
                  people and other marginalized populations.</p>
                <h2><strong>Why the Paradox Continues</strong></h2>
                <p>This apparent paradox exists for several reasons. At
                  a systems level, the decentralized structure of
                  policing and law enforcement facilitates the expansion
                  of these technologies. Local authorities typically
                  make their own decisions on surveillance and policing.
                  For the purveyors of these technologies, local
                  decision-making offers a huge and welcoming
                  marketplace. While cities like <a href="https://www.npr.org/sections/live-updates-protests-for-racial-justice/2020/06/24/883107627/boston-lawmakers-vote-to-ban-use-of-facial-recognition-technology-by-the-city#:~:text=Gaming-,Boston%20Lawmakers%20Vote%20To%20Ban%20Use%20Of%20Facial%20Recognition%20Technology,inaccurate%20for%20people%20of%20color.">Boston</a>
                  and <a href="https://www.vox.com/recode/2019/5/14/18623897/san-francisco-facial-recognition-ban-explained">San Francisco</a>
                  have banned facial recognition, most smaller
                  jurisdictions lack the technical expertise and
                  resources to conduct serious investigations into
                  police technology. They rarely have policies or
                  research agendas to address the potential perils of
                  apps like facial recognition or gunshot trackers. As a
                  result, the main sources of information for local
                  government are frequently the company representatives
                  themselves. In many cases, local police or sheriffs,
                  operating through their own networks, become the major
                  promoters of these technologies across regions,
                  largely because the tools enhance the image of
                  technical efficiency surrounding their operations.</p>
                <p>The decentralized structure also makes mounting
                  national opposition campaigns more challenging,
                  especially since federal authorities have chosen not
                  to impose regulations. In fact, in many instances,
                  federal authorities promote such usage, offering free
                  access to surplus military equipment and invasive
                  surveillance technology through the <a href="https://www.dla.mil/Disposition-Services/Offers/Law-Enforcement/Join-The-Program/">Law Enforcement Support Office’s 1033</a>
                  Program as well as <a href="https://www.aclu.org/news/national-security/does-your-local-government-have-black-budget-too#:~:text=Federal%20grants%20to%20local%20and,governments%20via%20opaque%20grant%20programs.">grants</a>
                  operating through the Department of Homeland Security
                  and National Security Agency. As of 2021, more than
                  10,000 federal, state and local law enforcement
                  agencies were <a href="https://www.aclu.org/news/criminal-law-reform/federal-militarization-of-law-enforcement-must-end">participating</a>
                  in the 1033 Program. Further, the emergence of
                  COVID-19 relief funds through the American Rescue Plan
                  Act (ARPA) <a href="https://epic.org/two-years-in-covid-19-relief-money-fueling-rise-of-police-surveillance/">directed</a>
                  new resource flows to local authorities for police
                  surveillance technologies such as automated license
                  plate readers, facial recognition systems, gunshot
                  detection programs and phone hacking tools. President
                  Joe Biden encouraged such expenditures during an
                  address to a Gun Violence Prevention Task Force
                  meeting in 2022, <a href="https://epic.org/two-years-in-covid-19-relief-money-fueling-rise-of-police-surveillance/">urging</a>
                  cities to purchase “gun-fighting technologies, like
                  technologies that hears, locates gunshots so there can
                  be immediate response because you know exactly where
                  it came from.” The nonprofit Electronic Privacy
                  Information Center <a href="https://epic.org/wp-content/uploads/2023/03/EPIC-ARPA-Surveillance-Funding-Table.pdf">estimated</a>
                  that as of September 2022, at least 70 local
                  governments had allocated ARPA funding to surveillance
                  technology.</p>
                <p>In addition to systemic factors, police technology
                  also requires a controlling narrative. What researcher
                  Evgeny Morozov calls <a href="https://www.publicbooks.org/the-folly-of-technological-solutionism-an-interview-with-evgeny-morozov/">technological solutionism</a>
                  is essential to that narrative. Technological
                  solutionism influences decision-makers and thought
                  leaders to ignore options for addressing deep social
                  problems like white supremacy or the need to
                  redistribute income and resources. Instead,
                  technological solutionism recasts complex social
                  phenomena as “neatly defined problems with definite,
                  computable solutions or as transparent and
                  self-evident processes that can be easily optimized —
                  if only the right algorithms are in place!” In
                  contemporary capitalism, such solutions enhance the
                  profits and the power of <a href="https://hbr.org/2022/01/can-big-tech-be-disrupted">Big Tech</a>
                  while making claims to address inequities,
                  particularly those based on race. This obsession with
                  technological solutions dampens efforts at critique
                  and provides space for expanding or tweaking police
                  technology. Moreover, technological solutionism has
                  emerged amid a fundamental restructuring of
                  contemporary capitalism, characterized by the rise of
                  Big Tech and the expansion of policing in all its
                  forms. This transformation has enabled a range of
                  “solutions” unimaginable less than two decades ago,
                  including the technologies discussed here.</p>
                <h2><strong>We Desperately Need a New Framework for Tech</strong></h2>
                <p>However, we are only in the early days of what I
                  refer to as “digital colonialism,” a period that began
                  with the launch of the first iPhone in 2007. In the
                  world of digital colonialism, solutions come from tech
                  giants like Google, Microsoft, Apple, Meta and Amazon.
                  In the manner of colonialists of the past, Big Tech
                  leads the establishment of a settler regime within the
                  unconquered territory of the digital world. The
                  companies set the rules, control the technology and
                  dictate the regime of accumulation. Like colonial
                  states, these powers value order and hierarchies based
                  on race, ethnicity and gender. Just as colonial states
                  offered the Bible, Western education and the products
                  of industrialization, so Amazon and its ilk offer the
                  digital world of Chrome, cellphones and Uber in
                  exchange for the essential raw material of their
                  empire: data.</p>
                <p>As immense as the data on current computer clouds may
                  seem, the colonial oligarchs are just starting to
                  figure out how to deploy artificial intelligence to
                  collect and use people’s data to both maximize their
                  profits and <a href="https://www.nytimes.com/2023/06/10/technology/ai-humanity.html">extend</a>
                  the depth of social control. Data from facial
                  recognition, crudely racist as it may be, is only
                  beginning to intersect with other punitive and
                  controlling technologies. While research has unearthed
                  several of the shortcomings of predictive policing and
                  gunshot locators, exposing these flaws represents only
                  a baby step on the path to challenging the immense
                  power of the digital monopolists.</p>
                <p>For the moment, to borrow a <a href="https://collectiveliberation.org/wp-content/uploads/2013/01/Lorde_The_Masters_Tools.pdf">phrase</a>
                  from Audre Lorde, critics are using the master’s tools
                  to contest the power of Big Tech. Like the first
                  discoverers of gold in South Africa, activists and
                  researchers are grabbing a few nuggets of consumer
                  products while handing over a lot more wealth in the
                  form of biometrics and other data. Transforming these
                  power dynamics won’t come from merely attacking the
                  inaccuracies or racial bias baked into modern
                  surveillance and policing. In fact, enhancing the
                  technical capacity or reducing the racial bias of
                  these technologies may only create more efficient
                  punitive regimes.</p>
                <p>Many of these technologies simply have no place in a
                  world that respects life. Databases have many uses,
                  especially in tracking climate change or air quality,
                  but only if informed by a social justice framework
                  driven neither by profit nor by dogmatic paradigms
                  that either deify or totally reject technology.</p>
                <p>We remain a long way from putting such frameworks in
                  place. At a moment when the cutting edge of technology
                  and surveillance, along with the world’s political
                  acumen, is trained on Gaza, a tiny strip of land that
                  is perhaps the ultimate <a href="https://www.versobooks.com/products/2684-the-palestine-laboratory">laboratory</a>
                  for these technologies, building that framework looms
                  all the more urgent.</p>
                <p><em>Thanks to Teresa Barnes, Dhruv Mehrotra, Matt
                    Chapman and Julio Mateo for providing the comments
                    and information used to compile this article.</em></p>
                <div id="gmail-truth-511823789">
                  <h5>We need your help to propel Truthout into the new
                    year</h5>
                  <p><span>As we look toward the new year, we’re well
                      aware of the obstacles that lie in the path to
                      justice. But here at <i>Truthout</i>, we are
                      encouraged and emboldened by the courage of people
                      worldwide working to move us all forward — people
                      like you.<br>
                    </span></p>
                  <p><span>If you haven’t yet made your end-of-year
                      donation to support our work, this is the perfect
                      moment to do so: <b>Our year-end fundraising
                        drive is happening now, and we must raise
                        $150,000 by the end of December.</b><br>
                    </span></p>
                  <p><span>Will you stand up for truly independent,
                      honest journalism by making a contribution in the
                      amount that’s right for you? It only takes a few
                      seconds to donate by card, Apple Pay, Google Pay,
                      PayPal, or Venmo — we even accept donations of
                      cryptocurrency and stock! Just click the red
                      button below.<br>
                    </span></p>
                </div>
      </div>
    </div>
  </body>
</html>