Back in the late 1980s, I watched a PBS special on the human mind and learned about the terrible fate of Clive Wearing, a man unable to make new memories. He had been an accomplished musician and musical programmer for the BBC, but in 1985 he contracted herpes encephalitis, which attacks the central nervous system. Since then, he has lived in a deep mental fog that never lifts. He recalls almost nothing about his previous life, and he is unable to remember anything that happens to him for longer than about 30 seconds. (He can, however, still play the piano and conduct a choir.) His life is a continual experience of emerging to consciousness and trying to piece together what’s going on, only to forget immediately and start all over again. The most heartrending detail I recalled from the TV program was his journal: a long sequence of entries along the lines of “I am now completely awake” or “I have finally woken up,” with all the entries but the last one crossed out.
I see now that this show aired in 1988, only three years after Clive Wearing fell ill. Reading about him again for this essay, I was somewhat horrified to discover that he’s still alive, almost 40 years after disaster struck. Here’s a clip from about 20 years ago of him and his wife, the only person he recognizes: every time he sees her, even if it’s just after she ducked out of the room for a minute, he greets her joyously as if they’re being reunited after many years.
The story of Clive Wearing has always stuck with me, just because of the diabolical nature of his malady. But I’ve been thinking of him more and more in recent years, as he seems like a metaphor for the deepening cultural amnesia that now works to shorten our attention spans and obscure our connections to the past. Like Clive Wearing, contemporary culture is increasingly trapped in an endless present.
The widespread study and understanding of history are in long-term, headlong decline. The number of undergraduate history degrees awarded in 1970 stood at around 45,000; by 2020, it was a little over 23,000, a drop of roughly 50 percent in absolute numbers. Over the same period, history's share of all degrees awarded fell from roughly 5 percent to 1 percent. The falloff has been especially steep since the Great Recession: the absolute number of history degrees dropped by roughly a third between 2008 and 2020.
A similar story can be told about history PhDs. The number of history PhDs awarded annually stood at around 1,000 in 1970 and has fluctuated between that high and roughly 500 ever since, with just under 1,000 doctorates conferred in 2022. History doctorates as a share of total PhDs awarded have fallen by roughly half, from around 3 percent to around 1.5 percent. And while the supply of newly minted historians has been stagnant, demand has been dropping relentlessly: only 274 tenure-track positions were open for the 2022-2023 academic year.
The collapsing demand for new historians reflects not just the drop-off in history majors but the wider decline in history coursework by non-majors. Broad survey courses on Western civilization — known back in the day as “Plato to NATO” — and American history were once widely available and frequently required at American colleges and universities, but no longer. There is broader exposure now to world history, area studies, and black and women’s history, but it adds up to disconnected smatterings rather than anything coherent. Meanwhile, the project of ensuring that all well-educated people are well grounded in how their own larger social order and surrounding culture came to be has been abandoned.
There has been a great deal of hand-wringing within the history profession about these trends, with a number of contending explanations on offer. But surely the most important reason by far is the general shift in what students and their parents want from colleges and universities. Once they wanted a liberal education; now they want marketable skills: instead of preparing you for a well-lived life, higher education is now supposed to prepare you for a lucrative career. Extensive knowledge of the past isn't obviously practical in the same way that earning a math or engineering degree is, nor does the study of history have a reputation for special intellectual rigor that would allow it to serve as a signal of raw brainpower.
The dimming of historical consciousness has been abetted by the ongoing decline in “deep literacy,” or the practice of regular, sustained engagement with complex texts. Television started us down this road, substituting the undemanding, passive reception of visual stimuli for the active construction of meaning and sense through reading. The internet and social media have further undermined our cognitive capacities. Ironically enough, they do so by confronting us with endless torrents of written communications — which come at us in a disconnected, contextless, hyperlinked jumble, much of it in the form of semi-literate texts or tweets or other ephemera, the combined effect of which has been to scramble our ability to focus and pay attention at length. It’s hard to slog through endless long paragraphs of analysis and abstraction when your phone is constantly pinging you about some shiny new object to take a look at.
The change in the cognitive culture is obvious. Pleasure reading is down sharply — among kids and adults. All of us who remember life before the internet can attest to the damage done to our attention spans, but we at least have previously acquired habits that can help us tune out some of the distractions. Young people, however, grew up having their brains wired in this post-literate environment, and as a result they have suffered real cognitive impairment relative to previous generations. As high school teachers and college professors now regularly attest, students today simply cannot handle the reading load that their predecessors managed — see here, here, and here.
The quality of historical understanding can’t help but suffer under these circumstances. History can’t be boiled down to graphs or equations or simple formulas: historical knowledge consists of seeing and tracing patterns and connections in the events of the past, an art which is made possible by deep, detailed familiarity with those events — with the many and varied threads of historical narrative that describe the unfolding of political, economic, social, and cultural life. And that familiarity can come only from reading: deep, extensive, careful reading of the vast secondary historiographical literature combined with immersion in primary sources. Trying to advance our understanding of the past in an increasingly post-literate society is like trying to make progress in astronomy by restricting the apertures of all the telescopes.
The decline in serious reading is surely one factor in the diminished appeal of history courses on campus. Those courses typically entail a heavy reading load — who needs that hassle when you could be enjoying yourself, and impressing would-be employers, with extracurricular activities? With rampant grade inflation, it’s through outside activities that students are now best able to distinguish themselves.
The rise of the internet and the broader digital revolution are making it much easier for a few of us to learn more about the past — while at the same time feeding the broader cultural amnesia and indeed undermining the preservation of historical memory. The digitization of books, academic papers, and historical archives allows historians to accomplish at their desks in a morning what previously would have required months of traveling to the places where documents are stored and sifting through them by hand. At the far edge of our amazing new information-processing capabilities, the Vesuvius Challenge — an initiative to decipher the extensive cache of charred scrolls found in the Herculaneum ruins — has already offered the first tantalizing glimpses of what could be a revolutionary expansion of our access to ancient texts.
For researchers and avid students of history, then, the internet has been a boon. But for the rest of us, moving much of our lives online has had the effect of cutting us off from the past. Because there is so much information online — such an insane hypertrophy of data, so wildly in excess of what any of us could consume in a lifetime — it's very easy to forget that vast troves of our accumulated storehouses of knowledge have yet to be digitized and are accessible only in printed form on some shelf buried deep in the library stacks. I've seen young interns — very bright, very well-schooled, very conscientious — completely flummoxed when they tackle some research question and fail to find the answer online. We're in the process of forgetting how to remember the past.
Beyond the growing incuriosity about offline sources of knowledge, we have failed to construct our new online world in a way that preserves the past and renders it accessible. The internet’s great virtue is its decentralization: it’s a protocol for sharing information, but the information that’s posted and shared comes from all of us as we build and fill up the umpteen sites that now populate the World Wide Web. It’s up to us to maintain these websites, or not; there is no centralized authority in charge of indexing and preserving all that’s posted. The internet is “not a place in any reliable sense of the word,” writes Adrienne LaFrance. “It’s not a repository. It is not a library. It is a constantly changing patchwork of perpetual nowness.”
The internet’s decentralization, the key to its power as a communications tool, is thus also its Achilles’ heel as a storehouse of knowledge. The main problems are now known as “content drift” and “link rot.” Jonathan Zittrain explains:
It turns out that link rot and content drift are endemic to the web, which is both unsurprising and shockingly risky for a library that has “billions of books and no central filing system.” Imagine if libraries didn’t exist and there was only a “sharing economy” for physical books: People could register what books they happened to have at home, and then others who wanted them could visit and peruse them. It’s no surprise that such a system could fall out of date, with books no longer where they were advertised to be—especially if someone reported a book being in someone else’s home in 2015, and then an interested reader saw that 2015 report in 2021 and tried to visit the original home mentioned as holding it. That’s what we have right now on the web.
Which means that our whole scholarly apparatus for recording sources of information is now falling apart on a daily basis. A 2014 survey of citations in Supreme Court opinions and Harvard Law Review articles found that 50 percent of the links in court opinions since 1996, and 75 percent of the links in law review articles, were no longer operational. Meanwhile, links in more casual texts are even more ephemeral.
Several initiatives have been launched to address this problem — the Internet Archive’s “Wayback Machine,” which preserves how websites looked at selected dates in the past, is probably the one we’re most familiar with. And it’s fair to point out that this is a recurring problem for new media: most of the books from the early decades of printing are lost, as are most silent movies, and a great deal of early television programming.
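To make "link rot" a bit more concrete: below is a minimal sketch in Python (with placeholder URLs standing in for a real citation list) of the kind of check a researcher or editor might run. It tests whether each cited link still resolves and, for any that have rotted, queries the Internet Archive's public Wayback Machine availability endpoint for the closest archived copy. Detecting content drift would take a further step, comparing today's page against an archived snapshot, which this sketch doesn't attempt.

```python
# Illustrative sketch only: the URLs below are placeholders, not real citations.
import json
import urllib.parse
import urllib.request

CITED_URLS = [
    "https://example.com/some-cited-page",
    "https://example.org/another-source",
]


def is_live(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL still resolves without a client or server error."""
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except Exception:
        # DNS failures, timeouts, and 4xx/5xx responses all count as rot here.
        return False


def wayback_snapshot(url: str, timeout: float = 10.0) -> str | None:
    """Ask the Wayback Machine availability API for its closest archived copy."""
    query = "https://archive.org/wayback/available?" + urllib.parse.urlencode({"url": url})
    with urllib.request.urlopen(query, timeout=timeout) as response:
        data = json.load(response)
    closest = data.get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None


if __name__ == "__main__":
    for url in CITED_URLS:
        if is_live(url):
            print(f"OK      {url}")
        else:
            archived = wayback_snapshot(url)
            print(f"ROTTED  {url} -> archived copy: {archived or 'none found'}")
```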
I expect that we will continue to improve the preservation of online information, but still the triumph of the virtual works to weaken our historical perspective at a basic, phenomenological level. Books and archive files are physical objects located in particular, designated physical spaces, namely libraries and archives: entering those spaces is a physical act that moves you out of the ordinary world and into the domain of the past. Descending into dark and abandoned stacks feels like going back in time. And as physical objects, books and papers exist in time and are subject to its passage: older paper looks different, feels different, smells different. Just as email is an impoverished form of communication relative to face-to-face conversation because all the nonverbal cues — body language, facial expressions — are missing (which we rather pathetically try to make up for with emojis), so reading historical documents on a screen feels flat and contextless compared to holding them in your hands.
I have now mentioned a number of trends that have weakened our cultural connection to the past: the careerist turn in higher education, the decline in reading, the growing hegemony of online experience. In each of these cases, the appeal of studying history has been eclipsed by other values: the importance of landing a good job, the irresistible lure of our beeping phones. But our estrangement from the past is being driven by more than distraction. We have also made the collective determination that studying history is less important, less worthy of our time and attention, than we previously thought.
Part of this reflects the rise of the social sciences. Before economics, political science, sociology, and anthropology were developed as stand-alone academic disciplines, we had history as the master discipline for understanding human nature and social reality. In the months prior to the Constitutional Convention, young James Madison holed up in his library at Montpelier and pored over the histories of republics, ancient and modern, searching for clues for how to avoid the errors that had led all of them to eventual ruin. With that effort, he was about as well prepared as it was possible to be.
Today, however, we have developed a wealth of new social scientific techniques and knowledge to help explain and guide the workings of our highly complex social order. And there have been great advances in understanding, thanks to the reductionism of social scientific method: by abstracting out from the confusing welter of events, we can see general patterns and develop elegant, parsimonious, “thin” models of social phenomena that are then analytically tractable. History, by contrast, offers us “thick” descriptions of events organized in narrative form. To use historical knowledge in the service of present-day understanding, we draw rough-and-ready analogies: this thing we are dealing with today is similar to something that happened before.
The rigor and exactitude of the social sciences offer real advantages, but it is easy to overstate them. Social science is quantitative, precise, analytically sophisticated — and thus seems far superior to the educated guesswork of drawing lessons from history. Yet the models of social science are quantifiable and analytically manipulable only because of their reductionism — that is, because of all they leave out. The model is not reality: its relationship to reality is analogical. Accordingly, when we are presented with two different models that yield very different conclusions, how do we determine which is the better fit to current circumstances? In exactly the same way that we decide whether this or that historical precedent is more relevant — by judging which analogy is the more apt. Social science provides us with new concepts and new techniques as the basis for making analogies, but ultimately the best social scientists, like the best historians, practice the art of developing, refining, and exhibiting well-informed good judgment.
Given the sway of scientism in contemporary culture, this point is often lost. In our educational system, we generally overemphasize the acquisition of skills and techniques at the expense of amassing substantive, detailed knowledge of a subject area. We are under the impression that this is a shortcut to intellectual mastery, but really it's a path to a shallow, brittle kind of expertise — one with massive blind spots and a studied ignorance of their existence. It is altogether too common to encounter economists and political scientists with virtually no grounding at all in economic or political history. They are so many Mr. Magoos, stumbling blindly and utterly unaware of their handicap.
But the fallen status of history as an intellectual discipline is not just due to misplaced overconfidence in the social sciences. There is something deeper and darker going on, something I have written about previously: the loss of faith among our cultural elites in the animating values and constitutive institutions of our civilization. The closer one gets to the centers of intellectual and cultural influence, the more widely accepted is the view that America — and Western civilization more generally — is fundamentally rotten: a cesspool of irredeemable racism, misogyny, and transphobia, driven by runaway capitalist greed to burn and drown the world through climate change. And since we no longer like who we’ve become, we’ve lost interest in exploring our past and figuring out how we got to be this way. The only things worth learning from history are the names of the victims.
We, the residents of the advanced democracies, are the beneficiaries of a transformational expansion in human possibilities that has occurred over the past couple of centuries — lifted out of poverty into unimagined comfort and convenience, lifted out of dictatorship and disease into peace and freedom and health, lifted out of mass ignorance into the Information Age. Yes, we have problems, deep and serious ones, but it takes only the faintest hint of historical awareness to recognize that our problems are set against a backdrop of unprecedented blessings. Yet look at us now: poor little rich kids overwhelmed by the unfairness of life, with many of our best and brightest imagining that in their vast privilege they are somehow buckling under oppression and hardship.
Look, literally, at the poor little rich kids protesting on American campuses this spring. They have latched onto the tragic mess in Israel and Gaza and cast themselves, however improbably, as key actors in the drama. Their weird combination of alienation and entitlement has twisted them into shocking moral inversions, proclaiming solidarity with baby murderers and mass rapists in the name of opposing genocide. As for the attitude toward history that accompanies this worldview, recall the wave of statue-toppling iconoclasm that surged in the wake of the George Floyd protests in 2020. Things started off sensibly enough, with efforts aimed at various statues and memorials commemorating leaders of the Confederacy. But mob action invariably degenerates into mindlessness, and so it was there: among the statues toppled, removed, or vandalized were ones of Columbus, Washington, Jefferson, Lewis and Clark, Lincoln, Grant, Theodore Roosevelt, Walt Whitman, and Mahatma Gandhi (!). In Rochester, New York, dimwits even tore down a statue of Frederick Douglass, though it was later put back.
What has happened has the makings of genuine tragedy — that is, it is the signal virtue of Western civilization that has rendered it vulnerable to this dispiriting wrong turn. A fundamental distinguishing characteristic of modernity is the social order’s openness to criticism and improvement: custom and just prices replaced by the endless trial-and-error of the marketplace; the divine right of kings replaced by democratic contestation; hereditary status replaced by careers open to talents; settled dogma from sacred texts replaced by experiment and inference. All complex social orders require hierarchy and authority to function, but the hierarchies and authorities of modernity are unusually permeable and flexible: they are regularly challenged from below, and in response to such challenges they regularly bend and reconfigure.
This kind of civilizational openness is a delicate balancing act — and over the past half-century or so, things have fallen out of balance. As I see it, the excesses of romantic individualism — in particular, a knee-jerk hostility to all forms of authority as inherently oppressive — are to blame. When this ethos was confined to bohemian subcultures, romanticism offered a vital counterpoint to modernity’s mechanization and bureaucratization. But in the 1960s, it broke out into the mass cultural mainstream, and anti-authoritarianism — formerly a dissenting principle in modern life — sought to become a governing principle. Here we see the rise of what the early neoconservatives called the “adversary culture.” As Irving Kristol wrote back in 1979:
Has there ever been, in all of recorded history, a civilization whose culture was at odds with the values and ideals of that civilization itself?... The more “cultivated” a person is in our society, the more disaffected and malcontent he is likely to be – a disaffection, moreover, directed not only at the actuality of our society but at the ideality as well.
The term “adversary culture” went out of fashion, but the phenomenon continued all the same. We see its effects most broadly in the general decay in patriotism on the left side of the political spectrum: only 26 percent of Democrats say they're extremely proud of their country, and by a 52-40 majority they say they would leave the country rather than stay and fight in the event of an invasion. These attitudes reflect abysmal historical ignorance and at the same time discourage any effort to remedy that ignorance.
Of all the worrisome cultural and social trends I’ve written about on this blog, none runs more directly counter to my own personal inclinations. At some point in my preschool years, I was asked what I loved most about my mom, to which I replied — as was repeated to me many times over the years — “she’s pretty, she cooks good, and she knows a lot about history.” I got the bug from her.
Understanding the world in terms of its history and how things got to be this way has always been absolutely central to the way my brain works. As a kid deciding what football teams to root for, I fell in love with the Green Bay Packers and Notre Dame precisely because of their rich histories. I was always strongly interested in science, but most of all the historical sciences of astronomy, geology, and paleontology. Where did the universe come from and how did it develop? How did the Earth and Moon form? How did life begin and human intelligence evolve? It always seemed to me that the most important questions were historical questions. And human history proper – I could never get enough of it. I can remember the giddiness I felt when I first got my hands on H.G. Wells’s The Outline of History — the whole human story in one bulging volume!
I didn’t grow out of it. When I went to college, deciding to major in history didn’t even feel like a decision. And once I made my way into a career as a think tank scholar, I regularly brought a historical perspective to bear on any policy subject I studied and wrote about. My first book — Against the Dead Hand, about globalization — sought to explain the phenomenon by looking back at the first, Victorian era of globalization and tracing how it had risen and fallen. My next book, The Age of Abundance, was a history of postwar American culture and politics; the one after that, Human Capitalism, examined the new class divide and how it emerged out of rising economic complexity. Only my most recent, co-authored book, The Captured Economy, considers a topic without delving too much into its history.
In my view, a well-developed historical perspective is absolutely essential for making sense of contemporary economic, social, cultural, and political issues. We all know the George Santayana line, and it’s especially pertinent these days — our understanding of the rise of authoritarian populism is certainly much improved if we can hear the echoes of similar political developments from other times and places. (And indeed, cutting against the larger trend, our political troubles have elevated a number of historians as prominent explainers of what’s going on.) Yet our need for historical knowledge goes far beyond its capacity to alert us to coming disaster. Whatever the issue, the backstory provides vital context. Knowing what things have actually happened before, and what things have never happened, provides an invaluable reality check on analyses of current events and projections of future ones. And being historically well-grounded gives you a powerful bullshit detector: you’ll know that people have said things like this before, not just once but on repeated occasions, and they were proven wrong each and every time.
More broadly, I believe that the study of history is a path toward genuine wisdom. Learn enough to see real patterns in the “one damned thing after another” jumble, and you will learn to recognize triumphalism and despair as twin imposters. There are no final victories and no final defeats: today’s good is more fragile than it appears and can easily be lost, while today’s ills can one day be defeated or outlasted. And recognizing history’s penchant for irony will grace you with a degree of enlightened philosophical detachment. As you see good and evil repeatedly tangled up with each other, with gains and progress leading to troubles and catastrophes revealing silver linings, you will come to understand the narrow limits of all heavily moralistic perspectives. You will see, again and again and again, that the line between good and evil does not run between countries or classes or races but right down the middle of every individual heart.
And if interest in history has been one casualty of the romantic heresy and the alienation that it spreads, it is equally true that the study of history — and in particular, study of the enormous leaps in human wellbeing that have occurred over the past couple of centuries — can inoculate you against that particular mind virus. As I wrote in an earlier essay:
It is an important clue as to the nature of our current problems that the fact of human progress is not widely known or appreciated. Surveys reveal that most people simply have no idea how much better things are now than they were before — and, in particular, about the astonishing recent gains in human welfare outside the North Atlantic region. And this isn’t purely passive ignorance, either. There is considerable resistance to acknowledging improvement in almost any domain — from global poverty to the resource-intensiveness of economic growth to air and water quality to American race relations — for fear that doing so amounts to excuse-making for the status quo.
It is worth dwelling on our state of actively maintained ignorance and what it says about us. In a healthy culture, the revolution of the past couple of centuries would be common knowledge: schoolchildren could be expected to recite rhymes about it, as we once did about when Columbus sailed the ocean blue. The fact that most people are unaware of the real record of human progress means, quite simply, that we are lost. We don’t know where we are in history, the unique position we occupy. We don’t know the special privilege that we enjoy, or the weighty responsibility that we shoulder.
Once we face up to the fact of progress, the perverse falsity of the romantic heresy becomes obvious. The powers and riches that we enjoy and take for granted are the legacy of a massive mobilization of organization, planning, self-discipline, focus, and diligence — the antithesis of romantic antinomianism. And those powers and riches are the very reason that we are able to enjoy the luxury of a romantic streak running through our civilization. The romantic ideal of liberating the individual from oppressive constraints has indeed been progressively realized — but through mass submission to a new set of looser but still-binding constraints. Real progress is an intricately orchestrated symphony, not a cacophony of barbaric yawps.
I don’t want to overegg my argument. There is so much worth knowing about the world, so of course it’s possible to have a rich and fulfilling intellectual life without much interest in what came before. Likewise, there have been plenty of people who have combined historical erudition with abysmal judgment and worse values. I’ll confess that I don’t have the slightest idea where the “right” level of interest in or understanding of history lies. I’m just reasonably sure it’s higher than what we have today.
Goethe once said (in loose translation) that “he who cannot draw on 3,000 years is living hand to mouth.” That may not be true at the individual level, but it is for the broader culture. For our culture to make progress in the art of “living wisely and agreeably and well,” one important step is to push back against our great forgetting. There’s no future in presentism — at least none worth having.
Great piece, Brink, though I think there is a paradox here. Yes, the internet has ushered in an age of "presentism," but at the same time, our short attention spans leave us less "present" than ever. Go to a concert and you will find most people holding up their phones and recording the event to watch in the future...are they truly living in the present at all?
To your point, over the past few years I have noticed that Google searches are increasingly "presentism-cursed," and it is getting harder and harder to find information pertaining to past events. All the links that appear in the top results are designed to sell me something today...not provide me with information about something from six months ago.
I'm reminded of John Adams: "I must study politics and war that my sons may have liberty to study mathematics and philosophy. My sons ought to study mathematics and philosophy, geography, natural history, naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry, and porcelain."
The trouble with this progression is that the grandkids wind up ignorant about politics, war, etc. An unwitting recapitulation of the saying (about wealth) "from shirtsleeves to shirtsleeves in three generations".
In my own case, I studied computer science and mathematics so that I could earn enough to retire and read history and literature. I'm not wise enough to plan for multiple generations, only how to earn a living and retire to "finer things".