AI surveillance is becoming more widespread than most people realize. According to a Forbes report, automatic license plate recognition (ALPR) systems use public traffic cameras as well as business-owned security cameras to compile databases of cars, including license plate numbers, makes, and models. They then use artificial intelligence to track those cars and their owners. That information is sometimes sold to police departments across the country, some of which use the data as justification to search anyone whose driving behavior is deemed suspicious.
It's like that scene in The Dark Knight when Bruce Wayne taps into every Gotham citizen's cell phone to try to catch the Joker. This time, though, there's no cautious Morgan Freeman warning the police about how intrusive the system is to people's privacy.
What's even more disturbing is that this sort of AI surveillance has been at work in the background for some time. Forbes highlights the case of David Zayas, who, in March 2022, was traveling on New York's Hutchinson River Parkway when he was pulled over for repeatedly driving routes typical of drug traffickers. When police searched Zayas' car, they found 112 grams of crack cocaine, $34,000 in cash, and a handgun. The AI surveillance software correctly identified criminal activity in that scenario, but Zayas' attorney argues that the search of his car, based solely on his driving route, was unlawful.
There are several companies that make ALPR software and sell data to police departments. Rekor is the largest, followed by competitors such as Flock and Jenoptik. These firms are also selling their services to fast-food chains that use their existing security cameras to monitor how often customers visit, the times they eat, and what they order to create customer profiles. According to Forbes, both McDonald's and White Castle already use this sort of tech.
In certain states, police can use this tech not only to monitor people for suspicious behavior but also to create gender, race, and sexual orientation profiles. Taking it one step further, state police departments can file requests to get ALPR data from other states. For instance, according to the Sacramento Bee, police departments in Texas, Oklahoma, and Alabama—where abortion is now illegal—submitted requests to California police departments for ALPR data to find anyone who sought an out-of-state abortion.
The American Civil Liberties Union has been fighting against the use of such technologies in several states, arguing that any sort of government surveillance without a warrant is a Fourth Amendment violation. Brett Max Kaufman, senior staff attorney at the ACLU, told Forbes that this sort of mass monitoring is “quite horrifying.”
This software has clear benefits when used strictly for cases where criminal activity is rightly suspected, warrants are issued, and the proper legal channels are taken. However, continuously surveilling all cars to create behavior and identity profiles without any oversight is incredibly problematic.