Learning Like a State: Statecraft in the Digital Age


Marion Fourcade is Professor of Sociology and Director of Social Science Matrix at UC Berkeley.

Jeff Gordon (@jeffgordon12) is a second-year law student at Yale Law School and a doctoral candidate in sociology at the University of California, Berkeley.


This post is part of the symposium celebrating the inaugural issue of the Journal of Law and Political Economy.

On this blog and beyond, LPE scholars have alerted us to the new ways that law enables and encases private power in the digital age. Recent books have argued that we live in an age of “informational” or “surveillance” capitalism, a new form of market governance marked by the accumulation and assetization of information, and by the dominance of platforms as sites of value extraction. Over the last decade-plus, both actual and idealized governance have been transformed by a combination of neoliberal ideology, new technologies for tracking and ranking populations, and the normative model of the platform behemoths, which carry the banner of technological modernity. In concluding a review of Julie Cohen’s and Shoshana Zuboff’s books, Amy Kapczynski asks how we might build public power sufficient to govern the new private power. Answering that question, we believe, requires an honest reckoning with how public power has been warped by the same ideological, technological, and legal forces that brought about informational capitalism.

In our contribution to the inaugural JLPE issue, we argue that governments and their agents are starting to conceive of their role differently than in previous techno-social moments. Our jumping-off point is the observation that what may first appear as mere shifts in the state’s use of technology—from the “open data” movement to the NSA’s massive surveillance operation—actually herald a deeper transformation in the nature of statecraft itself. By “statecraft,” we mean the state’s mode of learning about society and intervening in it. We contrast what we call the “dataist” state with its high modernist predecessor, as portrayed memorably by the anthropologist James C. Scott, and with neoliberal governmentality, described by, among others, Michel Foucault and Wendy Brown.

The high modernist state expanded the scope of sovereignty by imposing borders, taking censuses, and coercing those on the outskirts of society into legibility through broad categorical lenses. It deployed its power to support large public projects, such as the reorganization of urban infrastructure. As the ideological zeitgeist evolved toward neoliberalism in the 1970s, however, the priority shifted to shoring up markets, and the imperative of legibility trickled down to the individual level. The poor and working class were left to fend for themselves, their rights and benefits conditioned on market fitness and responsibility, while large corporations and the wealthy benefited handsomely.

As a political rationality, dataism builds on both of these threads by pursuing a project of total measurement in a neoliberal fashion—that is, by allocating rights and benefits to citizens and organizations according to (questionable) estimates of moral desert, and by re-assembling a legible society from the bottom up. Weakened by decades of anti-government ideology and concomitantly eroded capacity, privatization, and symbolic degradation, Western states have determined to manage social problems as they bubble up into crises rather than affirmatively seeking to intervene in their causes. The dataist state sets its sights on an expanse of emergent opportunities and threats. Its focus is not on control or competition, but on “readiness.” Its object is neither the population nor a putative homo economicus, but (as Gilles Deleuze put it) “dividuals,” that is, discrete slices of people and things (e.g. hospital visits, police stops, commuting trips). Under dataism, a well-governed society is one where events (not persons) are aligned to the state’s models and predictions, no matter how disorderly in high modernist terms or how irrational in neoliberal terms.

The Downside of Data-Driven Governance

We conceptualize dataism as a philosophy of governing—a form of “governmentality” in Foucault’s terms. Like all modern organizations, modern states are subject to a “data imperative”: a mandate to mine data and to decide what to manage based on what can be measured. In service of data-hungry machine learning techniques, the state and its contractors find themselves compelled not only to seek and demand new kinds of data, but also to mine it in a somewhat agnostic fashion to find the relations that stick. It is no longer necessary to flatten society to make it legible (as high modernism required); instead, ubiquitous data capture means that categories emerge inductively from regularities observed in the data. And while we praise recent work in administrative law demanding oversight for algorithmic “rules” that change over time as they process new training data, and demanding that affected parties have “a right to a well-calibrated machine decision,” we caution that there is a difference between holding the state to account for its decisions—i.e., as justified on the basis of the available data—and holding the state accountable for what sort of data it collects in the first place.

This is especially so because the prevailing atheoretical approach to measurement is not a neutral one—bureaucrats are not simply “following where the evidence leads” (as if that were ever possible). As Bernard Harcourt, Safiya Noble, Ruha Benjamin, Sarah Brayne, and others have suggested, the long-lasting structural inequities embodied in historical datasets, such as those commonly used in police and judicial algorithms, reproduce stigmatization and injustice. Moreover, when it comes time to intervene, the dataist state’s go-to tool is behavioral manipulation, or nudging. One concern, voiced by Yuval Harari and others, is that this kind of control is profoundly anti-democratic—witness, for instance, China’s ambitious plans to scale up its social credit system to cover the whole nation. But an opposite concern is that nudging may be feckless, especially because its effect is limited to the near-term future. Dataism seems to have little answer for long-gestating problems that outlive the relevance of the current data.

The Private Appropriation of State Statistics and Functions

The way these dynamics play out depends on the way the dataist state interfaces with increasingly state-like (and dataist) corporations. The state has historically sat in a privileged position for data production, which we analogize to its unique power over coinage. Today, the state can still induce the production of all sorts of data that would not be recorded in its absence. Through compulsory interactions—from policing to border security to tax collection to census-taking to education—the state mints data about people, and specifically about populations under special care or scrutiny or both: children, prisoners, foreigners, the poor. More recently, agencies at all levels of government have also begun to cast voluntary data sharing as a form of good citizenship, much like paying taxes.

And yet, when the state defines itself as a statistical authority, an open data portal, or a provider of digital services, it opens itself up to claims by private parties eager to piggyback on its data-minting functions and to challenge its supremacy on the grounds that they, the private technologists, are better equipped technically, more trustworthy, or both. These new relationships are made material (and legal) in the data-sharing agreement, a contract regulating access to and use of data produced either by firms or by the state. As firms like Palantir enter the business of extracting information from state-collected data (often matched with data from other sources), a key question is: who can claim rights to the raw materials, intermediary analyses, and final products? We suggest that the private appropriation of public data, the downgrading of the state as the legitimate producer of informational truth, and the takeover of traditional state functions by a small corporate elite may all go hand in hand.

From Dataism to Digital Socialism

None of this is inevitable nor inscribed in the technology, however. The tools of big data analysis might be adaptable to another political rationality, which we call “seeing like a citizen.” Seeing like a citizen is a mode of statecraft that identifies social problems—including those problems stemming from the deployment of dataism itself—from the perspective of those affected. Although not in exactly these terms, LPE scholars have begun to emphasize the radical power of governing by empowering those affected by public policy. Sabeel Rahman and Hollie Russon Gilman argue for infusing bureaucratic decision making with participation by non-experts, and Conor Dwyer-Reynolds advocates calling on citizen panels, known as administrative juries, to make environmental regulatory decisions.

Digital technologies could help make these protocols both more inclusive and more effective. For example, Evgeny Morozov offers a series of innovative proposals for using “digital feedback infrastructures” to help citizens get their local problems onto a public agenda, organize relatively decentralized government bureaucracies, and build public platforms for citizens to donate and sell products and services to each other, with production directed by citizens’ expressed needs. A complete program for “digital socialism” would also include plans for reorganizing basic digital infrastructure, where feasible, as public utilities rather than profit-seeking firms, and decommodifying a number of essential internet-based services—including, possibly, search and social media. Depending on the size, function, and type of power embodied by particular platform businesses, some may be suitable for public ownership, others for cooperative ownership, devolution to open-source non-ownership, or outright abolition.

Given our view that the fundamental technologies of the digital age (distributed systems, machine learning) are not going anywhere, the question is whether we can put them to more solidaristic and less extractive uses. There is no escape in Luddism. Now may be the time for civic technologists to build systems for incorporating each citizen’s information and amplifying each citizen’s voice, and for lawyers and bureaucrats to bend the course of technological progress away from corporatist or authoritarian dataism, and towards a democratic, publicly-oriented digital socialism.
