Grey Skies and Clear Eyes: Confronting Disinformation at the Cambridge Summit
We arrived on April 23, 2025, under overcast skies and with a sense of anticipation. What followed was a week of deeply provocative discussion: an invitation to step into the fog of online misinformation and try, together, to see clearly.
The Cambridge Disinformation Summit, hosted by the Cambridge Judge Business School, brought together a diverse mix of voices: journalists who risk their safety to speak the truth, former advisors grappling with the collapse of institutional trust, and researchers building the scaffolding for future resistance.
There was a striking tension throughout: between catastrophe and creativity, distrust and design, resignation and resolve. Disinformation today is no longer just a threat—it's an industry, an ecosystem, a narrative economy.
From Threat to Architecture
Rather than framing disinformation as merely "false information," several speakers encouraged us to look at the architecture of belief. What incentives, infrastructures, and emotional hooks are at play? From crypto communities that sell freedom narratives to social media's attention economy, this isn't just about what's true—it's about what works.
Eliot Higgins of Bellingcat outlined a framework rooted in verification, deliberation, and accountability. Yet even those anchors can be manipulated when the very act of "verifying" becomes performative.
As one presenter noted, we've entered the era of "the performance of democracy." Verification gets distorted. Liberation becomes division. Accountability turns into deflection.
We must ask: Are we participating in an information ecosystem, or are we trapped in an echo chamber built to extract attention and reinforce identities?
Psychology, Power, and Participation
One of the most eye-opening threads came from discussions of cognitive inoculation: efforts to "immunize" people against misinformation by teaching them to recognize manipulation tactics. Yet, as researchers like Michael Cohen and Fabio Carella pointed out, many of these interventions don't work as intended. Transparency pop-ups and warnings that you're being microtargeted might make you pause, but they rarely change your mind.
Why? Because we don't just consume narratives—we belong to them.
We heard how conspiracy movements like QAnon begin not with misinformation but with distrust. They're then fueled by identity, loss, crisis, and a desire for meaning. Crypto platforms, we were reminded, don't just sell digital coins—they sell community, mobility, freedom, and a new kind of belonging.
Disinformation is effective because it leverages the same psychological tools that modern consumerism and politics employ: emotional resonance, rapid gratification, and narrative coherence.
Research, Responsibility, and Radical Imagination
As members of the Internet User Behavior Lab, we found it powerful to see our shared mission echoed in so many forms. Whether in the Cambridge Online Trust & Safety Index (COTSI) or in workshops on how misinformation shapes political perception, the call was clear: we need frameworks that are not just reactive but anticipatory.
We need tools, coalitions, and open-access systems that expose agendas rather than obscure them. We need media and education systems that teach people not just what to think but how to be curious.
To quote the late media theorist Stuart Hall, "Power is not just the capacity to act directly, but also the capacity to define the terms of the debate." That's not just a warning for public relations—it's a call for critical inquiry and public resilience.
What's Next: From Critical Thinking to Critical Infrastructure
On a personal note, a true highlight of the week was the chance for two of the three co-founders of the Internet User Behavior Lab to finally meet in person after years of collaboration across the Atlantic. With the generous support of the Internet Society Foundation, we've already had a meaningful impact through research, tools, and policy engagement.
Meeting in Cambridge, just a stone's throw from where Alan Turing once worked, gave us space to think ambitiously about IUBL 2.0. We aim to help build an internet that is not only safer but also true to its open and principled foundations.
At the heart of our work is a deeper question: What kind of internet culture do we want to build, not just resist?
To everyone who curated this summit—especially Professor Alan Jagolinzer and his team at Cambridge Judge Business School—thank you for creating a space filled with nuance, debate, and community.
By Theo Richardson-Gool and Bryan Boots
On behalf of the Internet User Behavior Lab (IUBL)