This post is part of a symposium on the Political Economy of Technology. Read the rest of the posts here.
The year 2015 witnessed a dramatic rise in demands for police surveillance machines. After a series of widely shared incidents of police violence against civilians, many of them unarmed, public protests and media attention led to calls for police adoption of surveillance machines. Advocates, including the family of Michael Brown, argued that these technologies would increase transparency and accountability in police interactions with civilians by collecting and preserving data for public review. Indeed, the most contentious police-civilian encounters often turned on public disputes over the threat allegedly posed by the civilian and the propriety of the police response. Surveillance machines promised a technological layer of accountability by rendering these otherwise hidden interactions public. Now that they are being implemented, however, the political economy of police technologies raises new concerns about concentrated private power, platform dependence, and the adequate regulation of data in the future of policing.
The structure of American policing–essentially local–dictated how the adoption of surveillance machines would unfold. In 2015, the federal government offered millions of dollars in grant funding for local agencies to purchase their own surveillance machines. These federal grants required that police agencies address substantive concerns–including “privacy considerations”–in their grant applications, although without detailed specifications. That funding, along with national attention to the problem, spurred police agencies across the country to adopt surveillance machine pilot projects. Facing considerable public pressure, police chiefs around the country were understandably eager to demonstrate their willingness to engage in reform with a tangible, technological solution. Hence, most large agencies surveyed in 2016 stated that they had already adopted, or intended to adopt, a surveillance machine program.
While the speedy adoption of these machines demonstrated a visible commitment to reform, it unfolded in a manner that gives private corporate players significant power over law enforcement data and, by extension, over the very nature of policing. In essence, public police agencies are customers in highly concentrated technology markets. In the case of surveillance machines, police agencies have faced limited vendor options, had little guidance about desirable design features to request, and had few incentives to establish rules or guidelines about use prior to purchase. With surveillance machines, procurement was itself policymaking for a democratically accountable institution.
First, the surveillance machine market offered too little consumer choice to police agencies. One company, Axon, is the dominant market player. Axon has relied on long-established relationships with most of the country’s 18,000 police agencies, secured through its role as the primary vendor of electronic force compliance instruments. Those established commercial relationships reduced competition from smaller surveillance machine companies and favored Axon. Investigative journalists, for instance, uncovered several instances of Axon encouraging police agencies to enter into no-bid contracts.
Second, because police agencies are consumers, the design specifications of the surveillance machines they use are dictated by the vendor. In surveillance products, accountability and privacy are often matters of design. Whether a surveillance machine can record surreptitiously, for example, is a matter of product design. So too is whether the person using the machine has the ability–and thus the discretion–to turn it on or off. The same is true of future applications for surveillance machines, such as facial recognition technology. When the market is dominated by one company, customers–in this case, police agencies–have little choice over, or input into, the final product they procure.
Third, the adoption of surveillance machines came with little thought or deliberation about the rules governing their use. Police agencies purchased them to demonstrate their commitment to reform, but the machines alone would never guarantee police accountability. Their service to democratic policing would be as good or as bad as the rules governing them. In the rush for federal funding, some agencies adopted rules, but a great many were slow to adopt clear guidelines, and some have still failed to do so. The rush to adopt surveillance machines without consistent best practices has left their promise of accountability and transparency ill-served. While individual agency rules vary considerably, in most states the data generated by police surveillance machines are not presumptively public records. A non-exhaustive list of other issues that either vary widely or have scarcely been addressed includes data use and access rules, the inclusion of biometric identification tools, and the recording of activity protected by the First Amendment.
Furthermore, because both advocates and the police characterized surveillance machines as a response to the specific problem of violent police-civilian encounters, the surveillance machine experience obscured the nature of the technology itself. Police agencies were purchasing and investing in a platform, not just individual machines. For example, in April 2017, Axon offered every police agency a free year of surveillance machines for its officers; after a year, any dissatisfied agency could return the machines, no questions asked. That offer laid bare the vendor’s aims. Few profits arise from individual surveillance machines. Rather, the real profits lie in police agency subscriptions to the data platform: not just cloud data storage, but also the software management systems that allow agencies to control secure access, tag metadata, and use services like redaction and transcription. The company that dominates three quarters of the market is not in the surveillance machine business; it is in the police platform business. And Axon expects to introduce automated report writing and facial recognition technology to its police customers through that platform in the future.
An Apple-like platform that addresses a host of police technology needs is a difficult offer to refuse. Most police departments are ill-equipped to handle technology matters internally, and few have the capability to securely store the petabytes of data generated by their officers’ surveillance machines. As currently implemented, Axon’s platform dominates.
As that platform grows in sophistication, we are likely to see increasingly uneven and invisible burdens of surveillance fall on the public. To be poor, black, or brown has historically meant being subjected to heightened police surveillance. Surveillance machines both magnify and hide that power. Artificial intelligence applied to the big data these machines generate can help the police draw inferences. A facial recognition system coupled with geofencing, for example, can alert the police when people who have left their neighborhoods match a watchlist of suspicious profiles.
Overenforcement is also a likely unanticipated consequence of widespread surveillance machine adoption. This can happen in two ways: surveillance machines equipped with artificial intelligence might be used to automate the enforcement of certain offenses, or officers may feel nudged to enforce the law more frequently with a surveillance machine watching their own work. Either way, the resulting data is likely to be generated in places where human officers have historically had a heavy presence.
Police surveillance machines are also a matter of employment privacy for police officers, who might change their behavior in the presence of surveillance. This accounts for some of the early and vocal resistance to surveillance machine adoption from police unions. The lack of clarity about use rules further increased tensions between police management and labor over worker privacy. Some of this resistance has faded, however, as more officers have come to understand that surveillance machines are useful as evidence-gathering tools and as countermeasures against accusations of police misconduct.
These features of our experience with surveillance machine adoption have led to many foreseeable consequences that hold lessons about surveillance technology markets, private power, and police reform. Yet these questions of private power, police agencies as consumers, surveillance platforms, and limited public access and input are not confined to this experience. Many new police technologies will arise in the same way. If we are to use technologies effectively to bring about a more just law enforcement system, we must design systems of police procurement and use in a manner that emphasizes democratic, not private, control. Our experience with police body cameras provides us with a cautionary tale.