Mathematicians Call For End Of Police’s Predictive Crime AI

Predictive policing, via ActivistPost
Law enforcement’s use of AI to determine where crime will take place and by whom is incredibly dangerous to a free society, amounting to little more than the use of a crystal ball or tarot cards. Thousands of mathematicians are calling for an end to it. ⁃ TN Editor

After a flurry of police brutality cases this year and protests sweeping U.S. streets, thousands of mathematicians have joined scientists and engineers in calling for a boycott of artificial intelligence work for law enforcement.

Over 2,000 mathematicians have signed a letter pledging to boycott all collaboration with police and urging their colleagues to do the same; the letter is to appear in a future publication of the American Mathematical Society, Shadowproof reported.

What spurred the mathematicians to action were the police killings of George Floyd, Tony McDade, Breonna Taylor, and many more this year alone.

“At some point, we all reach a breaking point, where what is right in front of our eyes becomes more obvious,” says Jayadev Athreya, a participant in the boycott and Associate Professor of Mathematics at the University of Washington. “Fundamentally, it’s a matter of justice.”

The mathematicians’ open letter has collected thousands of signatures in support of a widespread boycott of algorithmic policing. Every mathematician within the group’s network pledges to refuse any and all collaboration with law enforcement.

The group is organizing a wide base of mathematicians in the hopes of cutting off police from using such technologies. The letter’s authors cite “deep concerns over the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression.”

Predictive policing is one key area where some mathematicians and scientists have enabled the racist algorithms, which tell cops to treat specific areas as “hotspots” for potential crime. Activists and organizations have long criticized the bias in these practices. Algorithms trained on data produced by racist policing will reproduce that prejudice to “predict” where crime will be committed and who is potentially a criminal.

“The data does not speak for itself, it’s not neutral,” explains Brendan McQuade, author of Pacifying the Homeland: Intelligence Fusion and Mass Supervision. Police data is “dirty data” because it represents not crime itself but policing and arrests.

“So what are its predictions going to find? That police should deploy their resources in the same place police have traditionally deployed their resources.”
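McQuade’s point can be made concrete with a toy simulation. The sketch below is a hypothetical illustration, not any vendor’s actual algorithm: two districts with identical true crime rates start with an unequal patrol deployment, arrests track patrol presence rather than crime, and a “predictive” model that reallocates patrols in proportion to past arrests simply locks in the initial bias. All numbers and district names are invented for the example.

```python
# Toy model of the predictive-policing feedback loop.
# Two districts, A and B, have the SAME true crime rate by construction.
# District A starts with twice the patrol presence of district B.

true_crime_rate = {"A": 0.10, "B": 0.10}   # identical underlying crime
patrol_share = {"A": 2 / 3, "B": 1 / 3}    # initial (biased) deployment

for year in range(5):
    # Arrests depend on where police are looking, not just on crime:
    # arrests ~ crime rate x patrol presence x population (held equal).
    arrests = {d: true_crime_rate[d] * patrol_share[d] * 1000
               for d in patrol_share}

    # A "predictive" model trained on arrest counts redeploys patrols
    # in proportion to past arrests -- the feedback loop.
    total = sum(arrests.values())
    patrol_share = {d: arrests[d] / total for d in arrests}

    # Because crime rates are equal, arrests are proportional to patrol
    # presence, so the biased 2/3 vs 1/3 split never changes.
    print(year, {d: round(share, 3) for d, share in patrol_share.items()})
```

Run it and the patrol shares stay pinned at their initial values every year: the model never discovers that the two districts are identical, because its training data only ever reflects where officers were sent.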

Several, if not all, U.S. states and major cities are thought to use some type of predictive policing or pre-crime software; known users include Chicago, Atlanta, Tacoma, New York, and Los Angeles, though not without public protest. As Activist Post previously reported, many of these jurisdictions use Palantir software for their predictive crime algorithms and have been exposed for doing so, such as Florida, where police terrorized and monitored residents of Pasco County.

These police organizations across the U.S. have been using what is known as “heat lists” or pre-crime databases for years. What is a “heat list,” you may ask?

Well, “heat lists” are basically databases, compiled by algorithms, of people whom police suspect may commit a crime. Yes, you read that right: a person who might commit a crime. How these lists are generated, and what factors determine that an individual “may commit a crime,” is unknown.



About the Author

Patrick Wood
Patrick Wood is a leading and critical expert on Sustainable Development, Green Economy, Agenda 21, 2030 Agenda and historic Technocracy. He is the author of Technocracy Rising: The Trojan Horse of Global Transformation (2015) and co-author of Trilaterals Over Washington, Volumes I and II (1978-1980) with the late Antony C. Sutton.
Rodney

The idea is to get rid of so many expensive police and their huge pensions, and just have a squad that is told whom the AI has targeted to be crushed… sounds ideal. 1984 is here and has been surpassed; even Commie Canada under turdo is now having Insurrection Camps built, “To keep us Safe.”


Statist

“The computer is the logical further development of man: intelligence without morality.” (John Osborne, English dramatist, 1921–1994) I watched the Arte documentary on predictive policing. Hm. I don’t believe in AI anyway. The software combs databases for hotspots and spits them back out with flashy HD graphics. I could write those algorithms on the side. What AI cannot do, and never will be able to do, is give meaning to a number. Psychology had to simplify “intelligence” in order to make it measurable. In practice, you now have a computing program in front of you that has been fed the programmer’s predetermined “solutions.” When he says: that’s how it is… Read more »