Every day, billions of people wake up and reach for a screen. They scroll through feeds curated by algorithms they did not design, shaped by rules they did not agree to, enforced by companies that answer to no democratic body. The question of who governs this vast digital commons — and whether it can be governed at all — is one of the defining political challenges of our era. It is a question that touches free speech, national security, electoral integrity, and the very structure of public life in democratic societies.

John P. Wihbey has spent years thinking carefully about this problem. An associate professor of media innovation at Northeastern University, co-founder of the Institute for Information, the Internet & Democracy, and director of the AI-Media Strategies Lab, Wihbey brings to the debate both scholarly rigor and a journalist's instinct for the concrete and the consequential. His book, Governing Babel: The Debate over Social Media Platforms and Free Speech — and What Comes Next, published by MIT Press, is one of the most serious attempts yet to imagine what a legitimate, workable framework for platform governance might look like. Efforts to regulate platforms have so far produced wildly divergent results — Europe's Digital Services Act represents one ambitious attempt at accountability, while the United States has moved haltingly, and authoritarian governments have simply weaponised the question of governance to serve their own ends.

The title draws on a powerful image. Babel, in the biblical story, is the moment when shared language collapses and communication fractures into confusion. For Wihbey, it is an apt metaphor for the contemporary information environment — where the same platforms that gave voice to Arab Spring protesters also enabled genocide incitement in Myanmar, where the same algorithms that connect diaspora communities also fuel political polarisation and coordinated disinformation campaigns. In democracies like India, governments have directed platforms to remove posts or suppress content under the broad banners of national security, public order, and electoral integrity — creating legal cover for what critics argue is straightforward political self-interest. The pattern is neither uniquely Indian nor uniquely Asian; it is a global trend, spreading through democratic and authoritarian systems alike.

DiploPolis spoke with Wihbey about his central arguments, the prospects for meaningful platform regulation, and why he believes the choices made in the next decade will determine the shape of democratic life for a century to come.
THE INTERVIEW
Q. Social media platforms are in serious need of an intervention, but the key question is, who truly 'governs' Babel today — should it be the government, the platforms themselves, or some other mechanism entirely?
We need regulations that examine and target broad patterns of harm, with a decision-making mechanism that draws its credibility from a democratic source. I could see a hybrid regulator, with appointed commissioners serving alongside citizen juries or assemblies that help create legitimacy.
Q. 'Governing' and 'free speech' are ideas that pull in opposite directions. What is the kind of governance that can balance freedom with responsibility in the online world?
The kind of governance that can balance freedom with responsibility in the online world could be organized around what we might call a 'response principle' — an approach rooted not in censorship but in active stewardship and demonstrable duty of care. The key is to judge platforms not on individual posts or instances, but on broad patterns or the totality of their actions in different problem areas — their good faith and diligence in responding to systemic harms. Such governance would be 'effects-based,' meaning regulation does not specify how platforms attend to online harms but rather focuses on outcomes in terms of mitigating or reducing those harms. This requires transparency of evidence, capacity for revision, and orderly procedures for iterating new rules. Platforms should be encouraged to preserve speech rather than remove it wherever possible, while vigorously promoting counterspeech, labeling, right of reply, and other algorithmic tools. We should also interrogate the defaults and design of their products.
Q. Many governments now invoke 'free speech' selectively to justify either censorship or inaction. How do we escape such politicisation of the free-speech debate? How can societies protect open expression and still demand accountability from platforms that have become global gatekeepers of information?
To escape the politicisation of the free-speech debate, we must return to deeper democratic traditions and international human rights law as our North Star, rather than allowing free speech to become a selective political cudgel. The solution lies in smart regulation of internet intermediaries — ensuring their compliance with human rights due diligence, meaningful transparency, and due process requirements, rather than viewpoint- or content-based regulation. This means social media companies should not automatically comply with state requests that violate international human rights law; rather, states should bring their laws into conformity with their UN commitments. A country's courts and judicial system are the appropriate places to render verdicts on whether content is illegal, not politicians in executive branch positions making unilateral decisions. We need regulatory frameworks that are grounded in the deep structures of open societies and continuous with democratic traditions — frameworks that resist both authoritarian censorship and laissez-faire abandonment of responsibility.
Q. Social media has been a double-edged sword — giving voice to the voiceless, but also playing a divisive role. Do you believe we can rebuild trust in the digital world, or has the age of universal public discourse ended for good?
While the age of universal public discourse faces profound challenges, I do not believe it has ended for good — though rebuilding trust will require a fundamental shift in how we approach platform governance. The current moment resembles the early days of broadcast regulation, when new media technologies seemed chaotic and unmanageable. History shows us that it took from roughly 1912 to the 1960s to formulate rules around mass media in the United States; we are likely still decades away from fully getting our arms around social platforms and the new world of networked expression. Trust can be rebuilt, in part, through what we might call the transparency of evidence and orderly procedures for rule-making and revision. Platforms must demonstrate active response to harms, not merely voluntary and selective attention to problems. This means regular public reporting, compliance baselines, and proof that companies are exercising social responsibility as stewards of the information commons.
Q. Many people today trust neither governments nor tech platforms. What, in your view, are the first small steps we can take to rebuild shared trust and shared facts — without falling into censorship or cynicism?
Unfortunately, there are very challenging trends relating to political economy and technological innovation that run deeper than content moderation problems. My general view is that we need platforms to help reinvigorate journalism and other knowledge institutions. Platforms will increasingly come under pressure as the underlying societies in which they operate begin to deteriorate politically. I have no faith that platforms will suddenly become more prosocial, but I think they may see incentives to help rebuild the public sphere. They might also think more about what we might call the 'social epistemology' facilitated by their algorithms and design. Ideally, they would see their task as establishing instrumental value — helping users understand the origin, source, and context of information — and knowledge value, so users can make good judgments and formulate true, justified beliefs. We must also ensure platforms treat people fairly as subjects of discussion, creating systems that listen carefully and respond to coordinated harms.
Q. In Nepal, the government's temporary ban on social media platforms ignited public anger and unrest, culminating in the collapse of the ruling administration. How do you interpret such state actions — as legitimate attempts to restore order, or as warning signs of how easily 'governing Babel' can slide into digital authoritarianism?
Such state actions represent warning signs of how easily 'governing Babel' can slide into digital authoritarianism when governments lack transparent, rights-respecting frameworks for addressing online volatility. The pattern we see globally — from Uganda to Russia to India — reveals what happens when executive authorities make unilateral content decisions without judicial oversight or adherence to international human rights norms. These shutdowns reflect the deeper problem that nearly every society faces: the growing contest between platforms and government power, often resulting in damage to human rights and free speech. To manage online volatility without resorting to shutdowns, democracies need mechanisms rooted in what Irene Khan, the UN Special Rapporteur for Freedom of Opinion and Expression, calls 'survivor rights' — the recognition that access to verifiable information from outside conflicts is essential during times of strife. Companies must hold strong against government requests that violate international human rights law, while states must bring their laws into conformity with UN commitments. New, rights-respecting platform regulatory bodies across countries could serve as a clearinghouse for government-platform communications, documenting interactions and providing legitimate lanes for urgent concerns while preventing coercion.
Q. The US and Europe have very different traditions of free expression, and countries like India, Brazil, and Kenya are experimenting with their own regulatory paths. Do you see the future of online speech governance becoming more globalised or more fragmented along cultural and political lines?
The future of online speech governance appears destined to become more fragmented along cultural and political lines, at least in the near term, though the aspiration for synchronised global norms remains vital. Our own public opinion research across democracies revealed stark differences: Americans are clear outliers in supporting uninhibited free expression and resisting censorship measures, while publics in South Korea, Mexico, and the United Kingdom show greater support for stringent moderation and regulation. Europe's 'Brussels Effect' through the Digital Services Act suggests one path toward alignment, as companies potentially synchronise global efforts with EU rules as prevailing norms. Yet we are increasingly witnessing what Ian Bremmer calls a 'technopolar' world, where technological forces make the cohesion of international, state-based rules more difficult over time. The question of whether fragmentation or alignment prevails may ultimately depend on whether major democracies can articulate compelling visions that give countries a choice rather than forcing them to choose between American, European, and Chinese models.
Q. From the Arab Spring to the wars in Ukraine and Gaza, social media has become both a battlefield and a diplomatic tool. Do you think digital platforms have strengthened or weakened the ability of states to manage conflict, shape narratives, and build consensus internationally?
Digital platforms have both strengthened and weakened the ability of states to manage conflict and build consensus, creating a paradoxical situation where governments face unprecedented challenges to narrative control while simultaneously gaining new tools for surveillance and information warfare. The technology has fundamentally altered the landscape, showing how initial democratic promise can collapse when platforms lack the local contextual knowledge, language expertise, and human resources to properly manage communal violence or contested elections. Platforms have become speech police without adequate accountability to either government or civil society, leaving them pulled by competing incentives — profit and growth, political power and conformity, and the relentless creative energies of human users. Whether digital platforms ultimately strengthen or weaken state power depends less on the technology itself than on whether democracies can establish frameworks that hold platforms accountable as stewards of the information commons while resisting authoritarian demands for censorship.
Q. Social media has become a new arena of geopolitical competition, with governments and state-linked actors using it to spread disinformation and influence politics beyond their borders. How do you see this weaponisation of digital communication altering the balance of power among nations?
The weaponisation of digital communication as a tool of geopolitical competition represents perhaps the defining challenge of our technopolar moment, fundamentally altering the balance of power by creating what we might call an information arms race that transcends traditional state capabilities. China's full-court press through information and communications technology, Russia's deployment of social media for political warfare, and other state-linked actors' efforts to shape narratives beyond their borders all exploit the underlying logic of hands-off communications rules that emerged from Section 230's original intent. To contain this weaponisation, democracies need diplomatic approaches that acknowledge countries now have choices — between American libertarian models, European regulatory frameworks, and Chinese authoritarian alternatives. Defending the integrity of information ecosystems requires not just technical solutions but sustained investment in trust and safety professionals, local contextual knowledge, and the kind of active response that demonstrates duty of care — treating this as a systemic risk comparable to challenges in financial or health industries.
Q. As we look ahead to the next decade, do you see the digital communication landscape moving toward greater openness and accountability, or deeper fragmentation and control? What will determine which path we take?
The digital communication landscape stands at a historical pivot point where the trajectory toward either openness with accountability or deeper fragmentation and control remains genuinely uncertain, though current signs are not encouraging. The path we take will be determined, in large part, by whether the United States can articulate a compelling vision for platform governance rooted in a response principle — or whether the non-regulation status quo and race to the bottom will persist. Other critical factors include whether democracies can establish regulatory frameworks before generative AI agents saturate the online landscape, whether international human rights law gains force as a standard for corporate social responsibility, and whether major powers can offer models that give countries genuine choices. My reading of history and the current moment suggests that without urgent action, we are more likely to see deeper fragmentation than meaningful alignment, as the clouds of moral and political darkness continue spreading across the globe. Yet if we take the long view that history demands — recognising we are somewhere in the early 1930s of this hundred-year journey — there remains hope that new rules rooted in democratic traditions can emerge, preserving innovation and free speech while acknowledging the power of networked communications to inflict harm. The choice between openness and control is ultimately ours to make, but the urgency of response cannot be overstated.
ABOUT THE AUTHOR
John P. Wihbey is Associate Professor of Media Innovation at Northeastern University, where he co-founded the Institute for Information, the Internet & Democracy and directs the AI-Media Strategies Lab. He has served as a research consultant to social media companies, foundations, and government. Governing Babel: The Debate over Social Media Platforms and Free Speech — and What Comes Next is published by MIT Press (2025).