License Plate Camera Company Halts Federal Cooperation Amid Privacy Concerns

Flock Safety, a company that deploys license plate-reading cameras, has suspended cooperation with federal agencies over concerns about how its data is being used, particularly in Illinois. The company’s cameras capture billions of license plate photos every month, with the local agencies that buy the systems controlling the data and responding to law enforcement inquiries. Illinois Secretary of State Alexi Giannoulias raised concerns after discovering that Customs and Border Protection had accessed Illinois data, potentially violating a state law that restricts data sharing related to abortion and immigration. In response, Flock Safety has revised its system to clearly identify federal inquiries and limit search capabilities.

Read the original article here

License plate camera company halts cooperation with federal agencies. That’s the big story, and it’s got a lot of people talking. But first, let’s rewind a bit and get a picture of what we’re dealing with. Flock Safety, the company at the heart of this, builds networks of cameras: a widespread web of devices designed to capture license plate information. These systems are sold to local agencies and even private businesses. And some folks are asking whether Flock anticipated the data being used in ways it was never originally intended for. I mean, how could they *not* have seen this coming?

These cameras are everywhere now, and they’re not just recording where you’ve been. Some reports say they’re being used in a way that’s downright concerning – predicting whether you’re acting suspiciously. It’s one thing to use them to find a specific car of interest; it’s another to have an AI system actively evaluating everyone and flagging potential suspects. We’re talking about the system deciding who should be reported to law enforcement, potentially based on nothing more than their movement patterns. So, yeah, it makes you wonder, doesn’t it?

Well, the halt in cooperation is a significant development. This pullback from federal agencies shows just how serious the privacy concerns around license plate tracking have become. It’s a stark shift, and you have to wonder what brought it about. It suggests the company is now worried about the potential fallout, and maybe, just maybe, trying to protect its bottom line. Let’s face it, Flock locks municipalities into a subscription model. Promising the data will only be used for specific, limited purposes and then letting this kind of thing happen? It’s all about money.

This whole situation is a bit ironic, isn’t it? Laws are already in place to prevent government agencies from tracking people, yet these same agencies are buying technology from private companies to do just that. The same companies that are now facing pushback. It seems a bit hypocritical, doesn’t it? Why does it feel like history is repeating itself?

There are serious questions to be asked about the security of these systems, too. They’re designed to run with minimal infrastructure, relying on things like LTE networking and solar power, and that independence cuts both ways. A plastic bag over the solar panel, for instance, can bring a camera to a halt. And what about the data itself? There are reports of unsecured cameras still running on default credentials. That’s a massive breach of trust.
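To show why default credentials matter, here’s a minimal audit sketch an operator might run against their own device. The login endpoint, the credential list, and the use of HTTP basic auth are all assumptions for illustration; nothing here reflects Flock’s actual interface.

```python
# Hypothetical audit sketch: check whether a camera's web interface still
# accepts a factory-default login. The endpoint, credential list, and use of
# HTTP basic auth are illustrative assumptions, not Flock's actual interface.
import requests

DEFAULT_CREDS = [("admin", "admin"), ("admin", "12345")]  # common factory defaults

def accepts_default_login(host: str) -> bool:
    """Return True if the device's login endpoint accepts a known default pair."""
    for user, password in DEFAULT_CREDS:
        try:
            resp = requests.get(f"https://{host}/login", auth=(user, password), timeout=5)
        except requests.RequestException:
            continue  # unreachable or TLS error; try the next pair
        if resp.status_code == 200:
            return True
    return False

if __name__ == "__main__":
    if accepts_default_login("camera.example.local"):
        print("WARNING: device still accepts factory-default credentials")
    else:
        print("OK: default credentials rejected")
```

A device that fails a check like this is wide open to anyone who finds it on the network, which is exactly the kind of exposure those reports describe.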

Adding to the complexity, the company says it now filters searches based on flagged terms. But a term filter is pretty useless if a different choice of words gets you the same information. That’s like putting a lock on the front door but leaving the windows open.
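To make that weakness concrete, here’s a toy sketch of how a blocklist on a search’s stated reason could work, and how easily it is sidestepped by rephrasing. The function, the flagged terms, and the reason strings are all hypothetical; this is not Flock’s implementation.

```python
# Toy illustration: a blocklist on the stated "reason" for a search is easy to
# evade by rephrasing. The terms, function, and reason strings are hypothetical.

BLOCKED_TERMS = {"immigration", "abortion"}  # flagged terms the filter rejects

def search_allowed(reason: str) -> bool:
    """Reject a query only if its stated reason contains a flagged term."""
    words = set(reason.lower().split())
    return not (words & BLOCKED_TERMS)

print(search_allowed("immigration enforcement sweep"))            # False: blocked
print(search_allowed("locate subject for federal removal case"))  # True: sails through
```

Any filter that keys on the words the requester chooses to type, rather than on who is asking and why, has exactly this problem.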

And it’s not just agencies and businesses using the cameras. Large homeowner associations and entire cities are installing them too. This is big business. So, did they anticipate people using these to chat with delivery drivers, or just to watch wildlife in their yards at night? Either way, this is much bigger than a simple convenience feature.

There’s a debate here about whether a computer algorithm predicting criminal activity is even problematic. Some people suggest that as long as there’s due diligence and confirmation of the criminal nature of the activity, what’s the issue? However, that argument ignores the potential for misuse and the chilling effect it can have on innocent people’s lives. It’s a slippery slope.

It’s not uncommon for what these systems are promised to do to be far narrower than what they actually do. When questioned, the vendors will admit they’ll turn over footage to the feds if asked; at most, they’re obligated to notify you in advance so you can prepare a legal defense. If these cameras, which are largely installed in blue states, are feeding data to the federal government, that’s a huge issue.

And let’s be honest, the political climate makes this situation even more precarious. This company stands to lose a lot of money, because it’s about to be targeted by people who love surveillance states; a database like this is far too juicy to ignore. Local PDs may refuse to assist with federal immigration crackdowns, but federal agencies might simply bypass them and download all the data.

Of course, some people are against the cameras entirely and are talking about disabling them. Which raises the question: does that fall under free speech, or can the police charge people with destroying private property?

The very fact that these cameras are being installed on public property without public input is a problem. In a free society, it shouldn’t be necessary to explain your movements to law enforcement because a computer program determined you might be up to something. The AI drawing the attention of law enforcement to a certain person or vehicle because they fit a pattern is profiling, and it’s not something most of us want to see as standard.

And it’s not just about the technology itself. It’s about the potential for abuse. These systems can lead to “driving while Black” scenarios, or the “you don’t look like you live here” kind of profiling. The fear is that the AI will be used to target specific groups unfairly.

Ultimately, the accuracy of the predictors and the algorithms behind them will shape how much harm this does. But looking ahead, there’s plenty to be concerned about.