The Crumbling Pillars of the AI Era

January 5, 2026

Fragile foundation for the future

Our society confronts a profound yet hard-to-define crisis. 

Beneath the surface of political instability and rapid technological change, two pillars are quietly crumbling: truth and trust. Their decline is transforming the global scene more deeply than the headline-grabbing events of the day.

Truth and trust are frequently seen as moral virtues, but they act as essential conditions—requirements for cohesive societies, effective institutions, and stable global systems. Without them, even the most cutting-edge technologies can’t drive progress; without them, democratic discourse is unachievable; without them, economic and social life gradually loses its binding threads.

In previous decades, societies operated with a shared belief that truth—no matter how debated—was worth striving for. Institutions like science, media, and the courts built systems to establish, correct, and validate facts publicly. That structure has eroded. Digital networks and curated content have split public life into isolated information bubbles. The rise of synthetic media and other tools has accelerated this fragmentation. Citizens find it harder and harder to tell if what they encounter is real. Consequently, the concept of a collective reality is fraying.

This change doesn’t just increase the amount of misinformation—it reshapes the very nature of public thinking. When truth is unstable, societies lose their way. Disagreements become unresolvable because resolving them requires at least some shared points of reference. Without those, political life devolves into theater, identity politics, and mutual distrust. The term “post-truth”—named Oxford Dictionaries’ Word of the Year in 2016—now points to a deeper structural issue: the unraveling of the shared knowledge base that modern societies rely on.

Alongside this decline in truth is a loss of trust. Trust isn’t sentiment—it’s the operating system of social and political order. In societies with high trust, institutions work efficiently, governments can pursue long-term plans, and economies thrive. In low-trust societies, coordination gets more expensive, compliance falls, and politics is driven by short-term gain. Falling trust is evident worldwide: in democracies, media, business leaders, even science. It creates a climate where authority is diluted and legitimacy is fleeting. Even well-crafted policies often fail to win public support because people no longer trust the systems that create them.

The era of artificial intelligence risks worsening these trends. AI is built to speed up decision-making and expand the flow of information. But on its own, it doesn’t help societies better understand that information or trust those who share it. In fact, as algorithms become more integrated into daily life—from finance to education, healthcare, and government—the gap between decision-makers and the public can widen. When algorithmic choices seem mysterious, even small mistakes can trigger outsized distrust. The paradox of the AI age is that more information may go hand in hand with less societal unity.

This fraying unity cannot be restored by technology alone. The main challenge is institutional and cultural. Societies need to rebuild shared reference points—through open dialogue, trustworthy knowledge institutions, and common civic values. Institutions must earn back trust by confronting the problem head-on: transparency as a habit, not a show; accountability as a routine, not a slogan. The AI era requires a rebalanced relationship between institutions and citizens, one that accounts for the psychological and political effects of information overload and the opacity of the technologies now mediating public life.

The global stage faces similar strains. When countries can’t agree on basic facts, cooperation becomes shaky. When international institutions lose trust, multilateral solutions are hard to find. And when technologies that shape global politics are used without shared rules, the risk of systemic chaos rises. In this context, truth and trust aren’t just idealistic goals—they’re strategic musts.

The risks of neglecting these pillars are becoming clear. Societies with conflicting versions of reality find it harder to settle disputes peacefully. Countries where people don’t trust their own institutions often blame outsiders. Global systems weakened by distrust stall when collective action is most needed. The decline of truth and trust isn’t a side issue to the AI age’s challenges—it’s the central one. Without fixing it, progress in any other area will be limited.

The AI era will challenge every assumption from the industrial age. Success won’t just depend on how advanced our technologies are but on the strength of the ideas that hold collective life together. If truth keeps splintering and trust keeps fading, the world could enter a time of lasting instability—politically, economically, and socially. On the other hand, if we can strengthen these foundations—even a little—the AI age might still deliver on its promise of progress.

The message is clear: no society, institution, or technology can survive for long on foundations that people no longer believe in. Truth and trust are still the essential pillars of modern civilization—and how well we can repair or reimagine them will shape the future.