By now, dozens of studies suggest, if not outright demonstrate, that the sense of civilizational decline is, to a significant extent, a feature of the human mind itself. It arises from a combination of cognitive mechanisms. One of the more important is this: when evaluating the present, the mind is wired to scan for threats, and so it gravitates toward what is alarming, dysfunctional, or potentially dangerous. The past, by contrast, is filtered through a more generous lens, one that allows for distance, selectivity, and even a degree of indulgence.
In recent years, this built-in bias has been amplified by media ecosystems and social platforms. If we are predisposed to attend to negative information, the modern information environment supplies it in abundance. Positive developments, meanwhile, are less likely to reach us—not least because of the filters we construct for ourselves.
But a note of caution is in order. The fact that the mind can mislead us, tilting our perception toward unwarranted pessimism, does not by itself prove that conditions are not actually deteriorating.
If we want clarity, we need to turn to more objective indicators. Life expectancy has increased, and overall health has improved. Yet educational attainment appears to have declined, as have certain cognitive and practical competencies. Suicide rates have fallen in many places, but psychological distress has become more widespread. Consumer technologies are more accessible than ever, while housing has grown markedly less attainable.
Taken together, these trends point in different directions. Still, there are strong, arguably compelling, grounds for the claim that the Western world is entering a phase of meaningful decline, even after accounting for the cognitive biases that color our judgment.
That said, it is both more serious and more analytically precise to speak of decline in specific domains rather than to declare a wholesale civilizational collapse. And, of course, what is specific can, at least in principle, be corrected.
