Platforms' Governance: Analyzing Digital Platforms' Policy Preferences

Emerging platform governance: antitrust reform and non-competitive harms in digital platform markets

Information, Communication & Society, 2023

Following the Snowden revelations, the Cambridge Analytica scandal, and a policy vacuum created by technological convergence and neoliberal reforms, policy efforts to articulate oversight of digital platform markets gathered policymaker support and public attention internationally. In the U.S., examined here as a case study of these international policy efforts, competition policy emerged as a prominent governance mechanism over digital platforms, resulting in the current antitrust scrutiny of tech giants like Google and Meta. Drawing on policy documents, fieldwork, and expert interviews, I trace how antitrust reform proposals, pitched as reclaiming democratic governance over private markets, came to dominate platform policy discussions. I examine how policy efforts to address platform power via competition grappled with non-competitive harms arising in digital markets, such as threats to user privacy and disinformation flows. Finally, I show how these debates began to converge on the contours of an emergent governance paradigm for digital platform oversight. I argue that this governance framework, which seeks to optimize market mechanisms to discipline platform markets, has significant limitations, notably in addressing issues associated with big data commodification and quantification.

Digital Platform Policy and Regulation: Toward a Radical Democratic Turn

International Journal of Communication, 2020

This article considers challenges to policy and regulation presented by the dominant digital platforms. A radical democratic framing of the deliberative process is developed to acknowledge the full complexity of power relations that are at play in policy and regulatory debates, and this view is contrasted with a traditional liberal democratic perspective. We show how these different framings have informed historical and contemporary approaches to the challenges presented by conflicting interests in economic value and a range of public values in the context of media content, communication infrastructure, and digital platform policy and regulation. We argue for an agonistic approach to digital platform policy and regulatory debate so as to encourage a denaturalization of the prevailing logics of commercial datafication. We offer some suggestions about how such a generative discourse might be encouraged in such a way that it starts to yield a new common sense about the further development of digital platforms, one that might favor a digital ecology better attuned to consumer and citizen interests in democratic societies.

Public Policy Exigencies for Platform Economy & Social Media Regulation: Is Politics Underhandedly Mining Social Networks?

RL, Vol. XII, No. CCCIX, MMXV

Social media, through its heavy reliance on memes, is reshaping human language via an unprecedented mixing of idioms, dialects, and alphabets. What long-term effects will it have on the way we speak, write, and listen? Relative anonymity in social media is a double-edged sword: while users can express their ideas more freely, the space is also crowded with false alarms and an even newer player in the field, clandestine influencers who are learning the lexicon of new media. How do we balance anonymity with veracity? Netscape founder Marc Andreessen wrote a widely read essay in 2011 entitled 'Why Software Is Eating the World', but it was not taken seriously because many believed it was merely a metaphor. Now, the world faces the challenge of extracting itself from the jaws of the Internet (McNamee, 2018). The platform economy is economic and social activity facilitated by platforms. Such platforms are typically online matchmakers or technology frameworks. By far the most common type is the 'transaction platform', also known as the 'digital matchmaker'. A second type is the 'innovation platform', which provides a common technology framework upon which others can build, such as the many independent developers who work on Microsoft's platform. Forerunners to contemporary digital economic platforms can be found throughout history, especially in the second half of the 20th century, yet it was only around the year 2000 that the 'platform' metaphor started to be widely used to describe digital matchmakers and innovation platforms. Especially after the financial crisis of 2008, companies operating with the new 'platform business model' swiftly came to control an increasing share of the world's overall economic activity, often by disrupting traditional businesses. The conflict between the desire to speak publicly and the fear of the consequences of doing so, or of the burden of responsibility, derives from the fact that the right to freely express one's thoughts and feelings is a natural human right. Public policy on social media regulation is a legal, ethical, and moral dispensation by the executive branch of government, carved out of constitutional, legislative, and administrative law, applied to a class of issues in a manner consistent with law and institutional custom. Social media must be regulated without hampering the constitutional right to freedom of expression; yet for pundits such regulation suggests itself and seems within reach, only to elude, and appears readily practicable, only to resist realisation. "Social media could be the start of a slippery slope leading to an Orwellian world controlled by Big Data, accelerated by fusing with the sensors in our devices and rapid advances in artificial intelligence. Authoritarian regimes are already marshalling them to exercise control on a supreme scale. It takes significant effort to assert and defend the freedom of mind. Moreover, there is a real chance that, once lost, those who grow up in the digital age in which the power to shape people's attention is increasingly concentrated in the hands of a few companies, will have difficulty regaining it" (Annan, 2018; Soros, 2018).

Governance of and by platforms

2016

Platforms rose up out of the exquisite chaos of the web. Their founders were inspired by the freedom it promised, but also hoped to provide spaces for the web's best and most social aspects. But as these platforms grew, the chaos found its way back onto them, for obvious reasons: if I want to say something, be it inspiring or reprehensible, I want to say it where people are likely to hear me. Today, we by and large speak on platforms when we're online. Social media platforms put people at "zero distance" (Searls, 2016) from one another, afford them new opportunities to speak and interact, and organize them into networked publics (Varnelis, 2008; boyd, 2011), and though the benefits of this may be obvious, even seem utopian at times, the perils are also painfully apparent. While scholars have long discussed the dynamics of free speech online, much of that thinking preceded the dramatic migration of online discourse to platforms (Balkin, 2004; Godwin, 2003; Lessig, 1999; Litman, 1999). By platforms, I mean sites and services that host public expression, store it on and serve it up from the cloud, organize access to it through search and recommendation, or install it onto mobile devices.

Should We Regulate Digital Platforms? A New Framework for Evaluating Policy Options

Policy & Internet

The economic and societal impact of digital platforms raises a number of questions for policymakers, including whether existing regulatory approaches and instruments are sufficient to promote and safeguard public interests. This article develops a practical framework that provides structure and guidance to policymakers who design policies for the digital economy. The framework differs from other approaches in taking the digital business models of platforms as the starting point for the analysis. The framework consists of three pillars: determining a platform's characteristics, relating these to public interests, and formulating policy options. The framework then invokes a return-path analysis to assess how an intervention affects the business model, whether it has the desired effect on public interests, and whether it produces undesired side effects on other public interests. The framework puts forward two key messages for current discussions on digital platforms. First, one should look at the underlying characteristics of platforms rather than trying to understand digital platforms as a single category. Second, policymakers should explore existing rules and policy options, as these appear suited to deal with several characteristics of digital platforms in a time frame that matches the rapid development of platform technologies and business models.

Reconciling private market governance and law: A policy primer for digital platforms

This paper considers the development of digital platforms, their economic role, and the policy approach that should apply to them. Digital platforms are heterogeneous, so no single flavour of 'platform regulation' is likely to be appropriate across all platforms, whilst horizontal laws apply to platforms and other business models alike. However, a unifying theme in relation to platforms is that they provide market governance via codes of conduct and software code. Rather than thinking of law as something that should simply be imposed on the market, we should therefore search for an accommodation or balance between law and 'code', given that both provide governance.

Governing online platforms: From contested to cooperative responsibility

The Information Society

Online platforms, from Facebook to Twitter, and from Coursera to Uber, have become deeply involved in a wide range of public activities, including journalism, civic engagement, education, and transport. As such, they have started to play a vital role in the realization of important public values and policy objectives associated with these activities. Based on insights from theories about risk sharing and the problem of many hands, this article develops a conceptual framework for the governance of the public role of platforms, and elaborates on the concept of cooperative responsibility for the realization of critical public policy objectives in Europe. It argues that the realization of public values in platform-based public activities cannot be adequately achieved by allocating responsibility to one central actor (as is currently common practice), but should be the result of dynamic interaction between platforms, users, and public institutions.

No innocents: Platforms, politics, and media struggling with digital governance

Communications

In retrospect, the communication world was so different in February 2020, when scholarly members of the Euromedia Research Group applied to become a Jean Monnet Network focusing on media and platform policy (EuromediApp). Shortly after the application was sent off, Covid-19 conquered the planet and jeopardized the main objective of such networks, namely, to strengthen ties between network nodes. When the three-year network started operating in October 2020, it immediately became clear that dominant features of the pandemic would be fake news and harmful content online. Additionally, it was evident that digital platforms would play an even more central role in opinion-shaping during lockdowns than they had before. Over the following three years, it turned out that the concept of the EuromediApp network was smart. Focusing on digital platforms, their relations to mass communication, and their performance regarding democracy and human rights allowed the network to organize cutting-edge workshops and conferences. For these events, it invited scholars to contribute state-of-the-art scientific texts and presentations on this fast-moving topic. This special issue of Communications serves to consolidate the learnings from that journey, addressing burning issues in digital platform governance in a timely manner. It explores questions such as how to limit hate speech and other harmful content online, how to hold digital platforms accountable for publishing it, how to accommodate automated decision-making (a.k.a. artificial intelligence), and how to economically balance platform profits achieved at the expense of mass media. Several attempts have been made over the last years to allow digital platform communication to thrive within the boundaries of the wider policy concept of

Towards a Regulatory Theory of Platform Rule: Corporate "Sovereignty" Through Immunities

St. Mary's Law Journal, 2024

The scale of inflammatory, divisive, false, and harmful online content has prompted much soul-searching about its sources, causes, and possible responses. This has brought the sweeping immunity in section 230 of the Communications Decency Act (intended to empower platforms as moderators) under intense scrutiny. Far from providing relief, it appears to have turned platforms into a source of the problem. This Article offers a fresh take on section 230, which, despite its apparent commonplaceness, is shown to be an extraordinary legal intervention, as it gives important actors, otherwise key to controlling a space, a "carte blanche immunity for wrongful conduct." That extraordinariness requires an explanation going beyond standard arguments about giving young internet companies some "breathing space" or removing disincentives for content moderation. The discussion starts with the proposition that an immunity entails self-governance, not as a matter of cause and effect, but in purely analytical terms: being immune means to self-govern within the scope of the immunity, that is, to act without legal accountability. Building on this basic understanding of an immunity as self-governance, the Article traces the provenance of section 230 and its sweeping application to online platforms through three very different, but complementary, legal contexts: first, within the landscape of immunities as extraordinary legal devices often employed in support of governing activity; second, within the conception of the corporation as a self-governing institution embedded in immunities and impunities; and, third, within the constitutional framework and its capacity to recognise the "sovereignty-sharing" arrangement of government and platform in cyberspace. The Article's overarching argument is that section 230 taps into the governing propensity of platforms not just as intermediaries or gatekeepers of online content, but as corporate actors which are, it is argued, inherently immune and self-governing, with a long-standing history of "sovereignty-sharing" with government. Through this corporate prism, the extraordinary "sovereign" role of platforms in cyberspace becomes intelligible. Normatively, the argument recasts platforms as hybrid private-public actors, consistent with the body of corporate scholarship that postulates the sui generis nature of the corporation as neither quite private nor quite public. Section 230 intensifies this argument in the case of online platforms. Repositioning online platforms as sitting "on the fence" of the private-public constitutional divide then provides the foundation for asking how constitutional restraints applicable to government may be adapted to ensure platform accountability.