
AI Technology Puts a Target on Your Back: It Could Be A Matter of Time Before You Find Yourself Wrongly Accused

By John W. Whitehead & Nisha Whitehead

May 26, 2022

“The government solution to a problem is usually as bad as the problem and very often makes the problem worse.” — Milton Friedman

You’ve been flagged as a threat.

Before long, every household in America will be similarly flagged and assigned a threat score.

Without having ever knowingly committed a crime or been convicted of one, you and your fellow citizens have likely been assessed for behaviors the government might consider devious, dangerous or concerning; assigned a threat score based on your associations, activities and viewpoints; and cataloged in a government database according to how you should be approached by police and other government agencies based on your particular threat level.

If you’re not unnerved over the ramifications of how such a program could be used and abused, keep reading.

It’s just a matter of time before you find yourself wrongly accused, investigated and confronted by police based on a data-driven algorithm or risk assessment compiled by a computer program run by artificial intelligence.

Consider the case of Michael Williams, who spent almost a year in jail for a crime he didn’t commit. Williams was behind the wheel when a passing car fired at his vehicle, killing his 25-year-old passenger Safarian Herring, who had hitched a ride.

Despite the fact that there were no eyewitnesses to the shooting and no gun was found in the car, police charged the 65-year-old man with first-degree murder based on ShotSpotter, a gunshot detection program that had picked up a loud bang on its network of surveillance microphones and triangulated the noise to correspond with a noiseless security video showing Williams’ car driving through an intersection. The case was eventually dismissed for lack of evidence.

Although gunshot detection programs like ShotSpotter are gaining popularity with law enforcement agencies, prosecutors and courts alike, they are riddled with flaws, mistaking “dumpsters, trucks, motorcycles, helicopters, fireworks, construction, trash pickup and church bells…for gunshots.”

As an Associated Press investigation found, “the system can miss live gunfire right under its microphones or misclassify the sounds of fireworks or cars backfiring as gunshots.”

In one community, ShotSpotter worked less than 50% of the time.
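
For readers curious how such a system locates a “loud bang” in the first place, here is a minimal sketch in Python of time-difference-of-arrival triangulation, the general technique acoustic sensor networks rely on. The microphone positions and timestamps below are invented for illustration; ShotSpotter’s actual classification and location algorithms are proprietary.

# A minimal sketch of time-difference-of-arrival (TDOA) triangulation, the
# general technique acoustic sensor networks use to locate a sound. All
# positions and timings here are invented; ShotSpotter's actual algorithms
# are proprietary.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 C

# Hypothetical (x, y) positions, in meters, of four rooftop microphones.
mics = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])

# Hypothetical times (seconds) at which each microphone registered the bang,
# synthesized here from a source at (100, 200) so the fit has a true answer.
arrival_times = np.array([0.652, 1.304, 0.922, 1.458])

def residuals(guess):
    # guess = (x, y, t0): candidate source location and emission time.
    x, y, t0 = guess
    distances = np.hypot(mics[:, 0] - x, mics[:, 1] - y)
    # Predicted arrival time at each microphone = emission time + travel time.
    return (t0 + distances / SPEED_OF_SOUND) - arrival_times

# Find the location and firing time that best explain the arrival times.
fit = least_squares(residuals, x0=[250.0, 250.0, 0.0])
x, y, t0 = fit.x
print(f"Estimated source: ({x:.0f} m, {y:.0f} m) at t = {t0:.2f} s")
# Crucially, this math only runs after the system has already decided the
# sound was a gunshot, the classification step that, per the AP, confuses
# fireworks, backfiring cars and church bells for gunfire.

The geometry is the easy part; as the AP investigation above suggests, the fragile step is deciding that a sound was a gunshot at all.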

The same company that owns ShotSpotter also owns a predictive policing program that aims to use gunshot detection data to “predict” crime before it happens. Both Presidents Biden and Trump have pushed for greater use of these predictive programs to combat gun violence in communities, despite the fact that they have not been found to reduce gun violence or increase community safety.

The rationale behind this fusion of widespread surveillance, behavior prediction technologies, data mining, precognitive technology, neighborhood and family snitch programs is purportedly to enable the government to take preemptive steps to combat crime (or whatever the government has chosen to outlaw at any given time).

This is precrime, straight out of the realm of dystopian science fiction movies such as Minority Report. It purports to prevent crimes before they happen, but in fact it’s just another means of putting the citizenry in the government’s crosshairs in order to lock down the nation.

Even social services are getting in on the action, with computer algorithms attempting to predict which households might be guilty of child abuse and neglect.

All it takes is an AI bot flagging a household for potential neglect for a family to be investigated, found guilty and its children placed in foster care.

Mind you, potential neglect can include everything from inadequate housing to poor hygiene.

According to an investigative report by the Associated Press, once incidents of potential neglect are reported to a child protection hotline, the reports are run through a screening process that pulls together “personal data collected from birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets.” The algorithm then calculates the child’s potential risk and assigns a score of 1 to 20 to predict the risk that a child will be placed in foster care within two years after they are investigated. “The higher the number, the greater the risk. Social workers then use their discretion to decide whether to investigate.”
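
To make those mechanics concrete, here is a minimal, hypothetical sketch in Python of how such records might be folded into a 1-to-20 score. The fields, weights and cutoff below are invented purely for illustration; the actual tool described in the AP report is proprietary. The point it illustrates is the article’s: every recorded contact with a public system, however benign, can push the number up.

# A minimal, hypothetical sketch of how a screening tool of this kind might
# fold government records into a 1-to-20 risk score. The fields, weights and
# threshold are invented; the actual algorithm in the AP report is proprietary.
from dataclasses import dataclass

@dataclass
class HouseholdRecords:
    prior_hotline_reports: int     # earlier referrals, substantiated or not
    medicaid_flags: int            # e.g., missed pediatric appointments
    substance_abuse_records: int
    mental_health_records: int
    jail_or_probation_records: int

def risk_score(h: HouseholdRecords) -> int:
    # Invented weights: every recorded contact with a public system pushes
    # the score upward, regardless of context or outcome.
    raw = (2.0 * h.prior_hotline_reports
           + 1.5 * h.medicaid_flags
           + 1.5 * h.substance_abuse_records
           + 1.0 * h.mental_health_records
           + 2.5 * h.jail_or_probation_records)
    return max(1, min(20, round(raw)))  # clamp to the 1-20 scale

household = HouseholdRecords(prior_hotline_reports=2, medicaid_flags=3,
                             substance_abuse_records=0,
                             mental_health_records=1,
                             jail_or_probation_records=1)
score = risk_score(household)
# Per the report, social workers then weigh the number in deciding whether
# to investigate; 12 is an invented cutoff used here for illustration.
print(f"Risk score: {score}/20 ->",
      "flag for investigation" if score >= 12 else "screen out")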

This technology is also far from infallible. Fallible or not, these AI predictive screening programs are being used widely across the country by government agencies to surveil and target families for investigation.

The impact of these kinds of AI predictive tools is being felt in almost every area of life.

Under the pretext of helping overwhelmed government agencies work more efficiently, AI predictive and surveillance technologies are being used to classify, segregate and flag the populace with little concern for privacy rights or due process.

All of this sorting, sifting and calculating is being done swiftly, secretly and incessantly with the help of AI technology and a surveillance state that monitors your every move.

Where this becomes particularly dangerous is when the government takes preemptive steps to combat crime or abuse, or whatever the government has chosen to outlaw at any given time.

With the advent of government-funded AI predictive policing programs that surveil and flag someone as a potential threat to be investigated and treated as dangerous, there can be no assurance of due process: you have already been turned into a suspect.

To disentangle yourself from the fallout of such a threat assessment, the burden of proof rests on you to prove your innocence.

You see the problem?

It used to be that every person had the right to be assumed innocent until proven guilty, and the burden of proof rested with one’s accusers. That assumption of innocence has since been turned on its head by a surveillance state that renders us all suspects and by overcriminalization, which renders us all potentially guilty of some wrongdoing or other.

Combine predictive AI technology with surveillance and overcriminalization, then add militarized police crashing through doors in the middle of the night to serve a routine warrant, and you’ll be lucky to escape with your life.

As I make clear in my book Battlefield America: The War on the American People and in its fictional counterpart The Erik Blair Diaries, if you’re not scared yet, you should be.

Editor’s Note:

For more information about the case of Michael Williams, Visit: https://apnews.com/article/artificial-intelligence-algorithm-technology-police-crime-7e3345485aa668c97606d4b54f9b6220

Many local communities, such as Hallandale Beach, West Palm Beach and Miami, are using ShotSpotter technology. Time will tell how reliable this new technology proves to be.

John Whitehead is an attorney and author who has written, debated and practiced widely in the areas of constitutional law, human rights and popular culture. John Whitehead’s commentary reflects his own views, and he is open to discussion; he can be contacted at johnw@rutherford.org. Information about The Rutherford Institute is available at www.rutherford.org.
