Tribalism Fuels Thai-Cambodian Conflict: How Echo Chambers Weaponize Belief Online
Echo chambers amplify biases: How conflict entrepreneurs monetize and weaponize tribal instincts online, fueling real-world violence.
Why is it so easy to weaponize belief? The recent skirmishes along the Thai-Cambodian border offer a grimly familiar case study: landmines, injuries, accusations, denials, and a media landscape choked with unverified reports, nationalistic fervor, and a deep-seated unwillingness to question one’s own side. It’s a local conflict, but it illuminates a global plague: the seductive power of tribalism in the information age, a power that’s not just about what we choose to believe, but about why we need to believe something, anything, to feel secure.
Khaosod details the echo chamber effect, where Thai citizens “naturally” trust Thai sources. Why wouldn’t they? To question their own narrative feels like betrayal. To entertain the possibility of their nation’s wrongdoing is even more unthinkable. This creates fertile ground for misinformation, readily accepted and amplified because it confirms existing biases.
As far as I know, not a single journalist has reported from the front lines of the fighting between Thailand and Cambodia, let alone witnessed the clashes firsthand.
The specifics of the Thai-Cambodian dispute—border claims, historical grievances—matter less than the underlying psychological mechanism at play. Humans are tribal. We crave belonging. Our identities are intricately linked to our national and cultural affiliations. And that belonging is often reinforced by a shared belief in a common narrative, a story that paints “us” as virtuous and “them” as suspect. This isn’t just about political affiliation; it’s about the fundamental human need for cognitive coherence, the psychological discomfort of holding contradictory beliefs. We resolve that discomfort, often unconsciously, by prioritizing information that confirms our existing worldview, even if that information is demonstrably false. This is where the system truly breaks down, when the need to feel right trumps the pursuit of being right.
Consider the Cold War. Both the US and the USSR maintained elaborate propaganda machines. Each side painted the other as an existential threat, manipulating information to solidify public support for their respective ideologies. This wasn’t simply about geopolitics; it was about shaping individual identities, creating a sense of shared purpose, and demonizing dissent. The US Information Agency, for instance, didn’t just broadcast news; it curated a vision of American exceptionalism, one carefully designed to contrast with the supposed evils of Soviet communism. Or reflect on the Rwandan genocide, where radio broadcasts played a key role, using rhetoric to dehumanize Tutsis, turning neighbors against neighbors in a horrifyingly efficient campaign of hate. Before the killing began, Radio Télévision Libre des Mille Collines spent months priming its audience, meticulously constructing a narrative of Tutsi perfidy.
These examples show us that modern technology does not magically improve truth-telling. In many ways, it has empowered misinformation at unprecedented scale. We now exist in personalized echo chambers where algorithms feed us information designed to reinforce our pre-existing beliefs. Social media platforms, optimized for engagement, incentivize outrage and polarization. The incentives all push towards a fractured reality, where each side crafts its own facts. But there’s another layer here: algorithms aren’t just reflecting our biases; they’re amplifying them, turning them into a self-perpetuating cycle of radicalization. The problem isn’t just that we choose to believe certain things; it’s that the very architecture of the digital world is designed to nudge us towards those beliefs, often without us even realizing it.
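The amplification loop described above can be sketched in a toy simulation. Everything here is an illustrative assumption, not any real platform’s ranking model: a recommender scores content purely by predicted engagement, engagement rewards both agreement and extremity, and each consumed item nudges the user’s belief slightly toward its own slant.

```python
import random

def engagement(item_slant, user_belief):
    """Toy engagement score (illustrative assumption, not a real objective):
    agreement with the user's current belief, multiplied by an 'outrage
    bonus' that makes more extreme content score higher."""
    affinity = 1.0 - abs(item_slant - user_belief) / 2.0   # 1.0 = perfect agreement
    extremity = 0.5 + abs(item_slant)                      # extreme content engages more
    return affinity * extremity

def recommend(user_belief, inventory, k=5):
    """Rank candidate items purely by predicted engagement.
    Note: there is no truth or diversity signal anywhere in the objective."""
    return sorted(inventory, key=lambda s: -engagement(s, user_belief))[:k]

def simulate(initial_lean=0.1, steps=50, seed=0):
    """A user's belief sits on a -1..1 axis; each consumed item nudges
    belief a little toward that item's slant, then the feed re-ranks."""
    rng = random.Random(seed)
    belief = initial_lean
    for _ in range(steps):
        inventory = [rng.uniform(-1.0, 1.0) for _ in range(100)]
        for slant in recommend(belief, inventory):
            belief += 0.05 * (slant - belief)
        belief = max(-1.0, min(1.0, belief))
    return belief
```

Even a tiny initial lean drifts steadily toward the extreme of the same side, and flipping the sign of the lean drifts the other way. The radicalization comes entirely from the ranking objective rewarding agreement and outrage; no individual item needs to be false for the loop to polarize.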
The author of the Khaosod piece astutely points out the parallel between unquestioning acceptance of state narratives and the biased consumption of information regarding the Thai monarchy. Prejudice, whether directed at a rival nation or a domestic institution, acts as a filter, selectively admitting information that confirms pre-existing biases and rejecting anything that challenges the established worldview. This is confirmation bias operating at scale, amplified by nationalism and political animosity.
This raises a deeper question about the nature of truth in the digital age. If access to primary sources is limited, if journalists are unable or unwilling to risk their lives to verify information, and if individuals are predisposed to believe what they already want to believe, what hope is there for a shared understanding of reality? Sociologist Zygmunt Bauman’s concept of “liquid modernity,” wherein traditional institutions and narratives have become unstable and fluid, aptly describes the world we inhabit, where truth itself becomes a matter of interpretation. But it’s not just that truth is fluid; it’s that the very concept of a shared truth is under assault. We’re not just disagreeing on the facts; we’re disagreeing on whether there are any shared facts to begin with.
Moving forward, media literacy is not enough. We also require “narrative literacy”: the ability to discern how narratives are constructed, manipulated, and used to shape our perceptions of the world. We need a dose of humility—recognizing that our own biases inevitably color our understanding. Perhaps the most important step we can take is to embrace a sense of intellectual curiosity, a willingness to question our own assumptions and to consider alternative perspectives, even when they challenge our deeply held beliefs. But even that might not be enough. Because the problem isn’t just individual bias; it’s a system designed to exploit that bias, to monetize it, to weaponize it. Until we address the systemic incentives that reward division and misinformation, we are doomed to repeat the same cycles of conflict, forever trapped in echo chambers of our own making, not just believing what we want to believe, but being compelled to believe it.