The last few decades have been characterized by the return of market fundamentalism: the belief that society can and should be organized through the institutional mechanism of “self-regulating markets.” Many expected that the 2008 financial crisis might constitute a blow to pervasive market expansion and a check on the global dominance of private corporations. Not so.
Are social media platforms more like common carriers or newspapers? The answer is neither. And that answer has significant implications for how courts should treat regulation of content moderation.
As markets began to usurp other forms of social regulation throughout the 20th century, metrics became increasingly central to the coordination of new spheres of market-mediated relations. More recently, digital metrics have been operationalized to facilitate the platformization of those domains. Platforms use automated scoring systems to rank content and actors across the markets they mediate. Search engines, e-commerce sites, and social media feeds all have ways to rank material and deliver it to users according to their calculation of “relevance.” This post explores metrics and gatekeeper power through the Google Scholar platform and its intermediation of the “scholarly economy”—the domain in which research is produced, consumed, bought and sold.
As law and political economy scholars take aim at the deficiencies of dominant modes of legal thought and chart a path for law to promote a more just and egalitarian society, they must also attend to the role of algorithmic systems and algorithmic thought in shaping political imaginations. By the same token, computer and information scientists interested in computation’s role in social reforms would do well to learn from the critiques and proposals of the LPE community.
Both law and technology have played a foundational role in constructing, maintaining, and extending neoliberal modes of governance. Technological implementations have given new life to the longstanding neoliberal separation of economic and political domains, and legal methods that have facilitated the neoliberal political economy have also enabled new technologies. As critiques of the centrality of neoliberal economic logic gain traction, we must take care that such work does not simply clear the path for an emerging hegemony of neoliberal computational logic. Instead, we must be attentive to proponents of the epistemic and political dominance of computational mechanisms, and we must critique them on similar grounds and with similar urgency. In addition, theories of the legal programs and methods required to democratize the economy must not ignore the role digital technologies may play in achieving these goals.
Would reconsidering the state/private action divide in First Amendment jurisprudence just unleash a torrent of endlessly abusive communications and misinformation of all kinds? Can antitrust categories help to solve the problem?
Whether or not an information fiduciary model would be the best way to regulate data governance, it is not guilty of many of the accusations that Lina Khan and David Pozen lob at it.
Rather than seeing data as a person-like or property-like entity, egalitarians should focus on data relations: the way data’s collection and use puts people (or entities) into relation with one another.
We do not want to have to choose between John Roberts and Mark Zuckerberg as the guardians of democracy, though that is what current doctrine seems to require. Luckily, the contemporary framework is not the only one available to us.
The exact trajectory predictive legal analytics will take in the coming years is anyone’s guess, but we can be fairly sure that the discourse of cost cutting and increased rationality via technology will be coming for the legal system, piece by piece. As such, it is worthwhile for left legal theory to take legal futurism, and even the idea of a legal singularity, seriously. Doing so, even for critics of liberalism, requires revisiting the liberal principle of the rule of law.
Like all modern organizations, modern states are subject to a “data imperative”: a mandate to mine data and to decide what to manage based on what can be measured. In service of data-hungry machine learning techniques, the state (and its contractors) finds itself compelled not only to seek and demand new kinds of data, but also to mine it in a somewhat agnostic fashion to find the relations that stick. It is no longer necessary to flatten society to make it legible (as high modernism required); instead, ubiquitous data capture means that categories emerge inductively from regularities observed in the data.
Automated hiring systems are paradoxical. Although the stated reason for adopting them is often to curtail human bias, they frequently end up exacerbating the very biases they are meant to correct. Even as employment discrimination continues to morph with the introduction of new technologies, so, too, should the law change to meet it head on.
Sandeep Vaheesan interviews Frank Pasquale about his forthcoming book, New Laws of Robotics: Defending Human Expertise in the Age of AI.
By now, many of the societal, political, and distributive harms caused by large technology companies and so-called “social” media companies (Amazon, Facebook, Google, etc.) have been surfaced. They invade our privacy, decrease market competition, erode our sense of self and, despite their euphemistic label, our sense of community. Shoshana Zuboff’s new book—The Age of Surveillance…
About a decade ago, when legal employment dipped sharply, there was a raging debate on the future of the legal profession. Some said the drop reflected a permanent decrease in legal work. The logic here was simple: computers were increasingly capable of completing more sophisticated projects. Having eclipsed paralegals in some document review tasks, they…