This article was originally published by Aaron Kesel at Activist Post.
After a flurry of police brutality cases this year and protests sweeping U.S. streets, thousands of mathematicians have joined scientists and engineers in calling for a boycott of artificial intelligence work for law enforcement.
Over 2,000 mathematicians have signed a letter, to appear in a future publication of the American Mathematical Society, calling for a boycott of all collaboration with police and urging their colleagues to do the same, Shadowproof reported.
The call to action was prompted by the police killings of George Floyd, Tony McDade, Breonna Taylor, and many more this year alone.
“At some point, we all reach a breaking point, where what is right in front of our eyes becomes more obvious,” says Jayadev Athreya, a participant in the boycott and Associate Professor of Mathematics at the University of Washington. “Fundamentally, it’s a matter of justice.”
The mathematicians wrote an open letter, collecting thousands of signatures for a widespread boycott of work on policing algorithms. Every mathematician within the group’s network pledges to refuse any and all collaboration with law enforcement.
The group is organizing a wide base of mathematicians in the hopes of cutting off police from using such technologies. The letter’s authors cite “deep concerns over the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression.”
Predictive policing is one key area where some mathematicians and scientists have enabled racially biased algorithms, which direct police to treat specific areas as “hotspots” for potential crime. Activists and organizations have long criticized the bias in these practices: algorithms trained on data produced by racist policing will reproduce that prejudice to “predict” where crime will be committed and who is potentially a criminal.
“The data does not speak for itself, it’s not neutral,” explains Brendan McQuade, author of Pacifying the Homeland: Intelligence Fusion and Mass Supervision. Police data is “dirty data,” because it does not represent crime, but policing and arrests.
“So what are its predictions going to find? That police should deploy their resources in the same place police have traditionally deployed their resources.”
Several, if not all, U.S. states and major cities are thought to use some type of predictive policing or pre-crime software; known users include Chicago, Atlanta, Tacoma, New York, and Los Angeles, though not without public protest. As Activist Post previously reported, many of these jurisdictions have been exposed for using Palantir software for their predictive crime algorithms, like Florida, where police terrorized and monitored residents of Pasco County.
These police organizations across the U.S. have been using what is known as “heat lists” or pre-crime databases for years. What is a “heat list,” you may ask?
Well, “heat lists” are essentially databases, compiled by algorithms, of people that police suspect may commit a crime. Yes, you read that right: a person who might commit a crime. How these lists are generated, and what factors determine that an individual “may commit a crime,” is unknown.
Chicago wasn’t the only major police department exposed for using predictive crime algorithms. The Los Angeles Police Department was also caught, in 2018, by activists from the Stop LAPD Spying Coalition, as Activist Post reported.
The heat list idea in local law enforcement actually originated in Miami and was then rolled out in Chicago in 2013. However, Activist Post may have missed other cities that received less media attention; and as this writer will discuss shortly, the idea itself comes from a federal database.
A paper released last year by MIT, entitled “Technical Flaws of Pretrial Risk Assessments Raise Grave Concerns,” has been signed by some of the highest-level university experts in the fields of A.I. and law, who warn about the “technical flaws” of these pre-crime systems, Activist Post reported.
Fortunately for us, as Nicholas West noted, the pushback has already started in several cities, and a few police departments have dropped their programs after becoming aware of the inaccuracies. In 2018, for example, New Orleans suspended its six-year-old pre-crime program after its secret predictive policing software was exposed.
The scariest part of all this is that the New Orleans and LA police departments were both linked to Palantir Technologies, which works directly with the CIA and is suspected of being the current fork of the PROMIS Main Core software. PROMIS pre-dates all of these local police heat lists; its algorithms put suspected “domestic terrorists” onto their own round-up lists and heavily scrutinized their tracked purchases. It was first created by Oliver North for President Ronald Reagan and Vice President George H.W. Bush under FEMA’s Readiness Exercise 1984 (REX-84).
The use of Palantir’s pre-crime algorithm software suggests that other police departments may be utilizing the same software for their own pre-crime programs. Palantir is also the same company working with the U.S. Immigration and Customs Enforcement agency on its own lists to catch illegal immigrants, as Activist Post and investigative journalist Barrett Brown originally reported.
You may remember Palantir from journalist Barrett Brown, Anonymous’ hack of HBGary, or accusations that the company provided the technology enabling the NSA’s mass surveillance program PRISM, the alleged successor to PROMIS. Palantir’s software is in many ways similar to the stolen Prosecutor’s Management Information System (PROMIS) software and its Main Core offshoot, and may be the next evolution of that code. In 2008, Salon.com published details about a top-secret government database that might have been at the heart of the Bush administration’s domestic spying operations. That database, known as “Main Core,” reportedly collected and stored vast amounts of personal and financial data about millions of Americans to be used in the event of an emergency like martial law.
PROMIS was forked into many reported use cases for the U.S. government, including an intelligence application onboard the nuclear submarines of the United States and Great Britain, and use by both the U.S. government and certain allied governments to track inventories of nuclear materials and long-range ballistic missiles. But the most bizarre and frightening use was keeping track of dissident Americans under Main Core.
The Main Core database isn’t just a rumor or conspiracy theory; PROMIS software was used by Iran-Contra fall guy, then-National Security Council staffer Lt. Col. Oliver North, to create the dissidents list for REX-84 that would later evolve into Main Core. Serving as a liaison to FEMA, North used PROMIS software in 1982 at the Department of Justice and at the White House to compile a list of American dissidents to be rounded up if the government ever invoked Ronald Reagan’s Continuity of Government (COG) program.
In 1993, Wired described North’s use of PROMIS in compiling the Main Core database:
Using PROMIS, sources point out, North could have drawn up lists of anyone ever arrested for a political protest, for example, or anyone who had ever refused to pay their taxes. Compared to PROMIS, Richard Nixon’s enemies list or Sen. Joe McCarthy’s blacklist look downright crude.
Access to the Main Core database was restricted to a handful of individuals, meaning most government officials had no knowledge the program ever existed. The database was passed from administration to administration through National Security channels, according to sources.
This writer wrote extensively on Main Core and PROMIS in an investigation into the cover-up of the stolen Inslaw software and the murders of journalists Danny Casolaro and Anson Ng, CIA intelligence operative Ian Spiro, and NSA employee Alan Standorf. See: “Octopus PROMIS: The Conspiracy Against INSLAW Software, And The Murders To Cover Up A Scandal Bigger Than Watergate.”
Palantir was founded with early investment from the CIA, is heavily used by the military, and is a government subcontractor in its own right. The company was even featured in the Senate’s grilling of Facebook, when U.S. Senator Maria Cantwell of Washington State asked CEO Mark Zuckerberg, “Do you know who Palantir is?” because Peter Thiel, Palantir’s co-founder, sits on Facebook’s board.
Palantir’s Gotham software allows Fusion Center police to track citizens beyond social media and online web accounts with people record searches, vehicle record searches, a Histogram tool, a Map tool, and an Object Explorer tool.
According to DHS, “Fusion centers operate as state and major urban area focal points for the receipt, analysis, gathering, and sharing of threat-related information between federal; state, local, tribal, territorial (SLTT); and private sector partners” like Palantir. Further, fusion centers are locally owned and operated arms of the “intelligence community,” i.e. the 17 intelligence agencies coordinated by the National Counterterrorism Center (NCTC). Sometimes, however, the buildings are staffed by trained NSA personnel, as happened in Mexico City, according to a 2010 Defense Department (DOD) memorandum.
Tarik Aougab, an Assistant Professor of Mathematics at Haverford College, is one of the many mathematicians who saw the recent protests against police as an opportunity to take action against these practices. “If there is already disproportionately large amounts of time and energy being spent criminalizing Black and brown people,” Aougab said, “the predictions the algorithm puts forth are just going to reflect that. It’s a way to perpetuate that over-criminalization.”
The mathematicians question whether predictive policing is just a self-fulfilling prophecy.
“There’s a big question here: is predictive policing really getting ahead of events, or is it just a self-fulfilling prophecy?” McQuade explains that “crime statistics” are more accurately referred to as “arrest statistics.” They measure police behavior, which is not directly correlated with crime and violence. These arrests justify and perpetuate more arrests.
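The feedback loop McQuade describes can be made concrete with a deliberately simplified toy simulation (an illustration only, not a model of any real department’s software). Two hypothetical neighborhoods, A and B, have the same underlying offense rate; the only difference is biased historical arrest data. Patrols are allocated in proportion to past arrests, and patrols in turn generate new recorded arrests:

```python
# Toy simulation of the predictive-policing feedback loop.
# Assumption for illustration: both neighborhoods have the SAME
# underlying offense rate; only the historical arrest data differs.
TRUE_RATE = 0.05         # chance a patrol produces a recorded arrest
PATROLS_PER_DAY = 100

arrests = {"A": 60.0, "B": 40.0}   # biased historical "dirty data"

for day in range(365):
    total = arrests["A"] + arrests["B"]
    for hood in ("A", "B"):
        # "Predictive" allocation: patrols follow past arrest counts.
        share = arrests[hood] / total
        patrols = PATROLS_PER_DAY * share
        # More patrols yield more recorded arrests at the same rate,
        # which feed back into tomorrow's allocation.
        arrests[hood] += patrols * TRUE_RATE

print({hood: round(n) for hood, n in arrests.items()})
```

Because patrol allocation chases arrest counts rather than the identical underlying rates, neighborhood A’s recorded-arrest lead grows from 20 to roughly 385 over the simulated year: the prediction manufactures its own confirmation.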
Athreya explains that the boycotters will accomplish their goals by collaborating with criminal justice organizations:
“We want to work through issues of how various algorithms are used in the criminal justice system, for things from facial recognition to DNA matching algorithms, where community groups and mathematicians can have a say.”
In fact, one study conducted last year by the AI Now Institute investigated predictive policing systems and determined that “in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially-biased, and sometimes unlawful practices and policies.”
Another 2019 audit, of predictive A.I. use in Los Angeles, found a serious lack of oversight or procedures around the tools, rendering them utterly useless. Researchers have also noticed that police tend to pursue their own “hotspots” rather than follow the technology, making the tools an enabler for police to label and categorize individuals without reason, Science magazine reported.
This is only the beginning of the fight, and it’s going to be a drawn-out battle to prevent the use of this technology, not just here in the U.S. but worldwide. There’s no telling how long these projects have been active, and trusting the police to tell us honestly is like trusting the fox to guard the henhouse. However, with mathematicians as well as scientists and engineers on our side, we have a fighting chance.