Be careful out there, the robocops are watching
The long arm of the law just got longer and scarier, thanks to AI.
Pull over to the side of the road and show me your algorithms. Source: Midjourney.
David Zayas wasn't doing anything unusual as he drove his gray Chevrolet down the Hutchinson River Parkway in March 2022. He wasn't speeding. He wasn't weaving or driving erratically. He didn't even have a broken tail light. But when the cops pulled him over, they found 112 grams of crack cocaine, a semi-automatic, and $34,000 in cash.
How did they know to stop him? Because AI told them to.
As Thomas Brewster of Forbes reported, the Westchester County Police Department was using data collected by Automatic License Plate Recognition (ALPR) technology combined with an AI system created by a company called Rekor. [1]
Rekor's software, which can also capture a vehicle's make/model and color, examined two years' worth of footage from Westchester County's 480 license plate cameras to determine that Zayas' driving patterns matched those used by known drug couriers. That was enough probable cause for the WCPD.
An AI-powered ALPR camera, keeping its eyes on the road. Source: Sacramento Bee.
The software analyzed 1.6 billion images of license plates to establish the pattern used by traffickers: similar departure and destination points, very quick turnarounds, the same cars. News reports are silent as to how the cops actually nabbed Zayas; did the cameras alert them as his car passed by? Were they camped on the side of the Parkway, waiting for him? I don't know. But if I were one of the drivers behind the other 1.599 billion or so scans being scrutinized for potential criminal activity, without probable cause, I'd want to know.
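Rekor hasn't published how its pattern matching actually works, so take the following as nothing more than a toy illustration of the general idea: a few lines of Python that flag plates making repeated, quick out-and-back runs between the same two cameras. Every name, number, and threshold in it is invented for this example.

```python
# Toy sketch of a "courier pattern" heuristic, NOT Rekor's actual algorithm.
# All field names, thresholds, and criteria here are invented for illustration.

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical plate reads: (plate, camera_id, timestamp)
reads = [
    ("ABC1234", "cam_A", datetime(2022, 3, 1, 8, 5)),
    ("ABC1234", "cam_B", datetime(2022, 3, 1, 11, 40)),
    ("ABC1234", "cam_A", datetime(2022, 3, 1, 15, 10)),
    ("ABC1234", "cam_A", datetime(2022, 3, 8, 8, 2)),
    ("ABC1234", "cam_B", datetime(2022, 3, 8, 11, 55)),
    ("ABC1234", "cam_A", datetime(2022, 3, 8, 15, 20)),
]

def flag_quick_round_trips(reads, max_turnaround=timedelta(hours=8), min_trips=2):
    """Flag plates that repeatedly bounce between the same two cameras
    within a short window, a crude stand-in for 'known courier patterns'."""
    by_plate = defaultdict(list)
    for plate, cam, ts in reads:
        by_plate[plate].append((ts, cam))

    flagged = {}
    for plate, sightings in by_plate.items():
        sightings.sort()
        round_trips = defaultdict(int)
        # Look at every consecutive triple of sightings: out, away, and back.
        for (t1, c1), (t2, c2), (t3, c3) in zip(sightings, sightings[1:], sightings[2:]):
            if c1 == c3 and c2 != c1 and (t3 - t1) <= max_turnaround:
                round_trips[(c1, c2)] += 1
        hits = {route: n for route, n in round_trips.items() if n >= min_trips}
        if hits:
            flagged[plate] = hits
    return flagged

print(flag_quick_round_trips(reads))
# {'ABC1234': {('cam_A', 'cam_B'): 2}}
```

The unnerving part isn't the cleverness of any one rule like this; it's that rules like it can be run against billions of reads from every car that ever passed a camera.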
Doughnut pass go
In principle, the idea of using cameras and computers to monitor traffic is not much different from a couple of cops sitting in a squad car, eating doughnuts, watching the same car pass by multiple times, and wondering, 'What's up with that?'
In practice, it's enormously different. AI enables mass surveillance at scale. It allows law enforcement to look at everyone, all the time. You are a suspect simply for having passed by a camera at a particular moment in time. [2] And now there's a machine recording your movements and inferring your intent.
Let's assume that this David Zayas [3] is not a productive member of society and deserves what's coming to him. Even so, he was not arrested because of what he was doing at that moment; he was arrested because of what a machine inferred he was doing. The fact that the AI got it right this time should not assuage anyone's concerns.
Let's say you meet Mrs. Jones every day at the same cafe at 6:30, and no one else knows she'll be there. You're holding hands and making all kinds of plans. AI-powered cameras may well decide you got a thing goin' on, and that you both know it's wrong but it's much too strong to let it go now.
Maybe Mrs. Jones is a palm reader, and you're working through some things. Maybe she's tutoring you in Spanish or teaching you how to trade bitcoin. But the AI is pretty damned sure you're having an affair, and if Sheriff Jones finds out, boy is he gonna be pissed.
Go directly to jail
One of the problems inherent in this kind of surveillance is that the machines will inevitably get it wrong. Your normal patterns of behavior could be similar to those of someone with criminal or worse intent.
Another is that this data could end up being stolen, accidentally leaked, or sold. [4] Location data is big business: the location analytics market was worth roughly $16 billion in 2022 and is growing at around 16% per year.
A third pitfall is that this data could be used to target individuals who aren't breaking the law but might, say, have the wrong bumper stickers on their car. As we have all learned very painfully over the last few years, law enforcement officials occasionally abuse their authority; this could lead to abuse on steroids.
Our ability to disappear is disappearing.
A fourth arises when this data is shared (legally or otherwise) with other authorities. For example, some California counties have shared license plate data with other states, including those that have recently outlawed abortion. Theoretically, these states (some of which forbid crossing state lines to get an abortion, because they've turned the clock back to the year 1823) could use that data to prosecute women once they return to their home states.
The Electronic Frontier Foundation, which has been very active in attempting to uncover and curb abuses of this technology, puts it this way:
ALPRs are a form of location surveillance: the data they collect can reveal our travel patterns and daily routines, the places we visit, and the people with whom we associate. In addition to the civil liberties threat, these data systems also create great security risks, with multiple known breaches of ALPR data and technology occurring over the last few years.
‘Safety’ first, privacy last
Nearly all surveillance technologies are sold on the premise that they will "make us safer." For example, license plate cameras are useful for tracking stolen vehicles. [5] Everybody likes it when car thieves get caught (except for the thieves). They're also used for AMBER alerts and toll collection. But inevitably, somebody comes up with yet another use for that data that has nothing to do with making anyone safer. It snowballs from there.
And I haven't even gotten to facial recognition yet. (That's a topic for another depressing blog post.) What it all means is that our ability to disappear is disappearing. When the cameras are everywhere and AI is doing the watching, there's really nowhere left to hide.
In the next episode of COMYAI, I’ll talk about what we can do to fight back.
How would you act if you knew you were always being watched? Share your thoughts (or, better yet, a joke) in the comments below.
[1] Rekor is not the only vendor of this stuff, or even the biggest. A company called Flock has deployed ALPRs in more than 2,000 US cities.
[2] The US has an estimated 85 million surveillance cameras in total, about one for every four humans, including things like private Ring doorbell cams (whose footage Amazon has shared with police without a warrant). That puts us on par with China.
[3] Not to be confused with character actor David Zayas (who's appeared in "Blue Bloods," "Dexter," and a million other TV shows) or the 80-year-old David Zayas in Zephyrhills, Florida, who shot his daughter's 13-year-old chocolate lab because it pooped in the house. (The dog survived, and that David Zayas was arrested for aggravated cruelty to animals, proving that Florida has at least some laws that make sense.)
[4] The California DMV, for example, makes $50+ million a year selling personal driver's license information to third parties. The state does make it illegal to sell data from license plate cameras to unauthorized parties, but it has done a spotty-at-best job of enforcing that law. Like, almost not at all.
[5] Pro tip: If you're planning to boost a car, be sure to swap the plates first.