When we talk about defunding the police, we focus on what we can see. We imagine hiring fewer cops to patrol subway stations and wander sidewalks. We picture fewer high-priced tanks and military-grade tools of war in our communities. But today’s policing infrastructure also spends millions of dollars on an invisible, sprawling data surveillance industry. Policing with data has weaponized our private information, turning it into a tool that tracks us and builds cases for stopping, detaining, and even deporting us. Planning a less-policed future requires decreasing physical policing and dismantling big data policing.
More than with any other policing tool, academics and legal professionals play a role in supporting the police data surveillance industry. The “gold standard” data policing products come from Thomson Reuters and RELX (the company that owns both LexisNexis and Reed Elsevier). These are the same companies that produce Westlaw, Lexis, and Reed Elsevier’s journal collection. Over the last few decades, Thomson Reuters and RELX have grown into huge data analytics corporations, swallowing entire information industries and cornering the academic and legal publishing markets. They have also become major policing information vendors, building comprehensive policing surveillance products centered around their personal data collections in Thomson Reuters’ CLEAR and LexisNexis’s Accurint. In essence, their law enforcement data products are the Westlaw and Lexis of the policing surveillance world.
Thomson Reuters and RELX are the major law enforcement data brokers for US police forces. Data brokers make billions of dollars selling our information to marketing firms, political consultants, and other operations that benefit from knowing who we are on a granular level. Thomson Reuters and RELX give police huge data collections about all of us containing billions of datapoints. Their products contain every address we’ve ever lived at, every license we’ve ever held, every one of our email addresses, and data from every social media account we’ve ever used. They also contain all of our phone data, updated in real time, as well as our credit bureau data and criminal record histories. CLEAR and Accurint data help police “form an ever-evolving, 360-degree view” of our lives.
On top of brokering our data, Thomson Reuters and RELX also acquire and partner with products that help law enforcement share data between agencies and police from their computers. The companies work with predictive policing technologies like Palantir that algorithmically sort and arrange data into “heat lists” that rank people by their likelihood to commit future crimes and photo “lineups” that group pictures of potential suspects using criminal histories, location markers, social media associations, and other data. RELX helps police incorporate peoples’ DNA data into investigations, partnering its surveillance tools with a company that does DNA analysis for law enforcement. Thomson Reuters works with Vigilant, a license plate tracking company, to help police pair time-stamped license plate images with automobile owner data and other datapoints to track people in real-time.
The companies help law enforcement agencies create a pool of personal data that they share across jurisdictions. They are building super-powered police surveillance that connects data between local, state, and federal police agencies. Thomson Reuters works with Forensic Logic, the company that makes the local police data services CopLink and LEAP. Together, the companies combine data held by local and state police forces with data from federal law enforcement agencies like the FBI and ICE. RELX similarly works with over 1,300 police forces to pool information.
Thomson Reuters also sells its services directly to the most notorious law enforcement agency in the US: Immigration and Customs Enforcement (ICE). ICE feeds Thomson Reuters’ data through a Palantir-created Investigative Case Management System to give ICE agents an “ecosystem” of data that leads them to targets and helps build cases against them. Reporter McKenzie Funk has described how ICE relies on Thomson Reuters to track, detain, and deport immigrants.
Data-fueled policing shifts bad policing practices online, digitizing discriminatory policing practices and perpetuating systemic racism. Before the advent of data surveillance, policing relied on “human intelligence”: police gathering information on suspects one by one. But modern stakeouts aren’t like the ones on Miami Vice or Brooklyn Nine-Nine. Today’s stakeouts are digital. Software and data companies like Thomson Reuters and RELX supply local law enforcement with a bevy of military surveillance-grade technology tools, including facial recognition systems, video analytics, social media monitoring, predictive policing, and “stingray” cell site simulators. Local law enforcement uses these high-powered data tools to increase surveillance in their communities. BuzzFeed reporters warned that Minnesota police use Thomson Reuters surveillance tools, among others, to identify and track people protesting police brutality. At the same time, the NYPD’s police commissioner is planning to replace human police with big data policing, to protect police from liability as they encounter protesters who “second-guess” their actions.
The shift to data-based policing puts our civil liberties and human rights at risk. Internationally, the United Nations Human Rights Council has raised concerns about governments using private surveillance industry tools. In the US, civil rights groups worry that data broker-fueled policing creates “racist feedback loops” by perpetuating inherent biases. Bad, biased data like racist gang databases and data based on inherently racist assumptions about particular zip codes, activities, and associations form biased mosaics of our habits, relationships, and daily lives.
One reason that big data policing is a problem is that it’s often based on bad data. As with most personal datasets, law enforcement data is riddled with errors. Data brokers provide high volumes of low-quality data. When librarian Shea Swauger obtained his own CLEAR data, he got 41 pages of personal information, including his social security number, mortgage information, and voting records. It was full of mistakes and accompanied by a waiver saying as much. The waiver took a hands-off approach to data quality, stating that Thomson Reuters doesn’t promise that the records are comprehensive, complete, or accurate. It specifically says that it’s entirely possible that Swauger’s data could be mixed up with other people’s data. The company also disclaims any warranty that the data can be depended on “for any purpose.”
The companies also buy and sell racist data from gang databases and other data collections gathered by police forces known to over-police marginalized communities and people of color. What’s more, the algorithms the companies build and buy to parse the data are also racist. This is especially true in predictive policing systems, which integrate race-based policing policies with algorithmic biases. As Hamid Khan, the head of the Stop LAPD Spying Coalition, says, “Algorithms have no place in policing.” Data organizing tools are also dangerous. RELX products provide law enforcement with comprehensive “crime maps,” and both RELX and Thomson Reuters law enforcement tools build webs of suspects by connecting data associations. These huge data projects create policing dragnets that sweep up both suspects’ information and data that belongs to innocent bystanders, catching entire communities and social networks in surveillance snares.
When police officers use CLEAR and Accurint data, they give erroneous records the power of law, weaponizing our personal information. These products also work around constitutional and statutory privacy protections. Police agencies and immigration enforcement rely on Thomson Reuters and RELX data, in part, because the data isn’t government-owned. By using third party data, the government can work around federal data privacy laws and requirements.
Despite human rights and civil liberties concerns, Thomson Reuters and RELX continue to build surveillance products on the backs of their other businesses, including their news services, legal research products, and financial data products. When lawyers, finance professionals, and journalists pay their Thomson Reuters and RELX subscription fees or supply the companies with content, they may be funding police surveillance product research and development.
We are only just beginning to understand the implications that big data policing has on our civil rights, and how data surveillance forms oppressive systems that discriminate against communities of color, refugees, and migrants. But big data surveillance is not going anywhere. Contact tracing and pandemic-related surveillance schemes are flourishing as we grapple with COVID-19. As more of our data gets swept up in government surveillance schemes, we should pay more attention not just to big data policing platforms like Palantir and CopLink, but also to the data companies providing the data that fuels these products. The government has not yet determined the contours of our digital privacy, which gives us an opening to participate in decisions about how the government, and especially law enforcement, use data brokers’ products. There is still time to limit the role of data brokering in policing. Militarized policing practices and invasive pandemic surveillance depend on third-party data and services that are not designed to serve the public interest. We can demand that government keep data brokering tools out of our public services and defund the million-dollar contracts that police sign with data brokers.
Sarah Lamdan is a Professor of Law at CUNY School of Law in Long Island City, New York. Her work focuses on data justice, and the spectrum of data governance, from government transparency to personal data privacy. Her book, Data Cartels, is forthcoming with Stanford University Press.
Suggested citation: Sarah Lamdan, Defund the Police, and Defund Big Data Policing, Too, JURIST – Academic Commentary, June 23, 2020, https://www.jurist.org/commentary/2020/06/sarah-lamdan-data-policing
This article was prepared for publication by Brianna Bell, a JURIST Staff Editor. Please direct any questions or comments to her at firstname.lastname@example.org