
Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated

A sign attached to a metal pole with white letters on a red background reading "NOTICE - LICENSE PLATE READER 24/7 - Flock Safety."
Independent reviewers of new surveillance technology play a crucial role in safeguarding our right to privacy.
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
March 6, 2024

American communities are being confronted with a wave of new police technology these days, much of which involves surveillance or otherwise raises the question: "Are we as a community comfortable with our police deploying this new technology?" A critical follow-up question is: "Does it even work, and if so, how well?" It's hard for communities, their political leaders, and their police departments to know what to buy if they don't know what works and to what degree.

One thing I've learned from following new law enforcement technology for over 20 years is that there is an awful lot of snake oil out there. When a new capability arrives on the scene, whether it's face recognition, emotion recognition, video analytics, or "big data" pattern analysis, some companies will rush to promote the technology long before it is good enough for deployment, which sometimes never happens. That may be even more true today in the age of artificial intelligence. "AI" is a term that often amounts to no more than trendy marketing jargon.

Given all this, communities and city councils should not adopt new technology that has not been subject to testing and evaluation by an independent, disinterested party. That's true for all types of technology, but doubly so for technologies that have the potential to change the balance of power between the government and the governed, like surveillance equipment. After all, there's no reason to get wrapped up in big debates about privacy, security, and government power if the tech doesn't even work.

One example of a company refusing to allow independent review of its product is the license plate recognition company Flock, which is pushing its surveillance devices into many American communities and tying them into a centralized national network. (We wrote more about this company in a 2022 white paper.) Flock has steadfastly refused to allow the independent security technology reporting and testing outlet IPVM to obtain one of its license plate readers for testing, though IPVM has tested all of Flock's major competitors. That hasn't stopped Flock from claiming that "Flock Safety technology is best-in-class, consistently performing above other vendors." Claims like these are puzzling, even laughable, when the company doesn't appear to have enough confidence in its product to let IPVM test it.

Communities considering installing Flock cameras should take note. That is especially the case when errors by Flock's and other companies' license plate readers can lead to innocent drivers finding themselves surrounded by police, facing jittery officers pointing guns at them. Such errors can also expose police departments and cities to lawsuits.

Even worse is when a company pretends that its product has been subject to independent review when it hasn't. The metal detector company Evolv, which sells, wait for it, AI metal detectors, submitted its technology for testing by a supposedly independent lab operated by the University of Southern Mississippi, and publicly touted the results of the tests. But IPVM and the BBC reported that the lab, the National Center for Spectator Sports Safety and Security (NCS4), had colluded with Evolv to manipulate the report and hide negative findings about the effectiveness of the company's product. Like Flock, Evolv refuses to allow IPVM to obtain one of its units for testing. (We wrote about Evolv and its product here.)

One of the reasons these companies can prevent a tough, independent reviewer such as IPVM from obtaining their equipment is their subscription and/or cloud-based architecture. "Most companies in the industry still operate on the more traditional model of having open systems," IPVM Government Research Director Conor Healy told me. "But there's a rise in demand for cloud-based surveillance, where people can store things in the cloud, access them on their phone, see the cameras. Cloud-based surveillance by definition involves central control by the company that's providing the cloud services." Cloud-based architectures can worsen the privacy risks created by a surveillance system. Another consequence of that centralized control is that it increases the company's ability to dictate who can carry out an independent review.

We're living in an era in which a lot of new technology is emerging, and many companies are racing to be first to market. As Healy told me, "We see a lot of claims of AI, all the time. At this point, almost every product I see out there that gets launched has some component of AI." But like other technologies before them, these products often arrive in immature, inaccurate, or outright deceptive forms, relying on little more than the use of "AI" as a buzzword.

It's vital for independent reviewers to contribute to our ongoing local and national conversations about new surveillance and other police technologies. It's hard to see why a company with genuine faith in its product would try to block independent review, and that refusal is itself something buyers deserve to know about before they decide what to purchase.
