In many parts of the Global South, elections are never merely administrative events. They arrive as social rituals—dense moments when hope, fear, memory, and aspiration circulate simultaneously.
Much like Lebaran (Eid al-Fitr) in Indonesia, elections compress social life into an intense window of reconnection and reckoning. Lebaran is not simply a holiday; it is a mass social choreography where millions return home to renew relationships through presence and shared forgiveness. Trust, in this context, is not asserted abstractly; it is renewed relationally through shared meals and stories.
Elections function in a similar register. They concentrate social energy and heighten emotional sensitivity. What matters is not only what is decided, but who speaks and how meaning is collectively affirmed. This resonates with the African philosophical notion of Ubuntu—that truth is not purely individual or procedural, but social and negotiated.
Seen through this lens, elections are not just moments of democratic choice. They are moments of epistemic exposure—periods when shared reality is tested at scale.
The Deepfake Moment: When Ritual Meets Synthetic Media
Deepfakes do not thrive in ordinary conditions. They flourish when emotions run high and verification cannot keep pace with virality. Elections provide this exact environment.
In this context, deepfakes function not simply as tools of misinformation but as ritual disruptors. They do not only challenge facts; they destabilize the shared moments through which democratic meaning is negotiated. During elections, deception does not need to be perfect—it only needs to arrive faster than trust can respond.
The Limits of Technical Verification: A Global Concern
Most policy responses today focus on detection: fact-checking, watermarking, and takedowns. These measures are necessary, but global institutions increasingly recognize that they are insufficient, because they treat the problem as purely technical rather than psychological.
The World Economic Forum (WEF) has explicitly warned that synthetic media poses a systemic risk not merely because it can evade detection, but because it undermines shared frames of reality faster than institutions can repair them. Similarly, UNESCO’s Guidelines for the Governance of Digital Platforms emphasize that technical fixes alone cannot sustain trust during high-stakes civic moments.
Detection operates on probabilities, while democracy operates on judgment. During emotionally charged elections, verification systems simply cannot scale fast enough. What emerges is “verification fatigue”—an erosion of confidence driven not just by lies, but by exhaustion.
Why the Global South Is More Exposed
The Global South is often portrayed as vulnerable to disinformation due to lower digital literacy. This view is incomplete. The fragility is not just technological; it is historical.
Unlike many European nation-states that formed gradually around shared identities, many Global South nations were born from colonial map-making. Borders often compressed multiple ethnicities and histories into single political units. As a result, national identity remains a continuous negotiation.
Elections in these regions carry a heavy symbolic burden. As the United Nations Development Programme (UNDP) has noted, elections in fragile contexts are not just technical exercises but critical moments for social cohesion. When deepfakes enter this space, they do not just distort political choices; they reactivate historical fault lines.
In these “high-context” societies, trust flows through relationships before institutions. As the OECD’s work on information integrity suggests, legitimacy depends less on perfect accuracy than on whether citizens feel included in the process of sense-making. This makes the Global South an epistemic laboratory—showing the world how democratic trust survives, adapts, or fractures under pressure.
Three Democratic Futures
Under this epistemic stress, three trajectories are emerging, echoing concerns raised by bodies like the Carnegie Endowment for International Peace regarding democratic backsliding:
1. Democratic Resilience: Societies develop “epistemic muscle,” where citizens learn to doubt content without destroying shared meaning, and institutions focus on trust-building rather than just policing content.
2. Authoritarian Populism: Confusion overwhelms patience. Leaders market “certainty” and “order” as a relief from the chaos of deepfakes, trading freedom for a stable (albeit controlled) reality.
3. Horizontal Fragmentation: Trust collapses across social lines. Communities retreat into identity-based “truth bubbles,” and democracy erodes from within—not through coups, but through epistemic balkanization.
Conclusion: Towards an Algorithm of Aspiration
Elections will pass. Technologies will grow more convincing. The deeper question remains: when the ritual ends, will democratic trust recover?
Governing AI in this era requires more than fear-driven regulation. We need to move beyond the logic of control, towards what we might call an “algorithm of aspiration.”
This mindset refuses to treat worst-case scenarios as the only compass for decision-making. Instead, it asks how we can design institutions and technologies that protect the conditions for belief. It insists that governance should not only prevent collapse but actively cultivate the spaces where democratic imagination can survive.
In the long run, the resilience of democracy in the Global South will not depend on how perfectly we detect every deepfake, but on how faithfully we preserve the rituals of trust that hold our societies together.