There’s something called wilderness therapy that offers a distinctive approach to helping troubled young people: make them live outside for a couple of months, camping and backpacking with therapists and a small group of peers. I happen to know about it because someone I know went through it.
A central idea behind wilderness therapy is that kids can get lost in the gap between actions and consequences. Indeed, one way troubled kids get to be troubled kids is by learning how to maintain and widen that gap — by lying, manipulating, and bullshitting, convincing authority figures to look the other way, let them slide, give them another chance.
But out in the woods, the gap between actions and consequences shrinks dramatically. If you leave your knife at the old campsite, you don’t have a knife anymore. If you don’t put purification tablets in your water, you’ll get diarrhea. If you don’t hang your bear bag high enough, when you wake up all your food may be gone. Wilderness therapy, then, can help to build a sense of personal responsibility because, well, it’s not nice to fool Mother Nature.
I’m not here to tout wilderness therapy. It has a mixed record, there’s a lot of variation in quality among programs, and there have been abuses and fatalities. As to its efficacy, it’s in the same boat with all other substance abuse programs and therapy regimens: it’s hard to help people change until they’ve decided for themselves that they want to.
For the person I know, though, wilderness therapy was a turning point. Before then they had made a series of wrong turns that ultimately landed them on a long, involuntary camping trip in the Rockies. Afterwards, no more wrong turns: the person I know got their life together and is doing well.
What’s stuck with me over the years is the basic insight behind wilderness therapy: the contrast between the hard, non-negotiable authority of the physical world and the soft, eminently resistible authority of the social world. And in recent years I’ve come to see that, like troubled kids, our troubled society has lost its way in the gap between actions and consequences.
Here’s a way to think about capitalism’s transition to mass affluence and what it has meant for the subsequent course of economic and social development: we have gone from a society organized around solving problems in the physical world to a society organized around solving problems inside our own heads.
For all of human existence until just recently, the lives of the overwhelming majority of people were dominated by wrestling with one particular set of problems in the physical world: ensuring adequate food, shelter, and clothing. Most people lived as subsistence farmers, engaged in a direct and uncertain confrontation with nature and seeking to wring enough sustenance from her to keep going. With industrialization, a widening buffer opened up between a growing share of the population and privation: the stakes of the economic problem softened from life or death to larger or smaller surplus. But this occurred because large numbers of people had been mobilized to solve a new set of problems in the physical world — operating and maintaining large and complex machinery in factories, extracting valuable minerals from underground, building railroads and highways, loading and unloading cargo.
With the arrival of mass affluence, however, things changed — and changed radically. Direct, continuous engagement with and feedback from the physical world was no longer at the center of human experience. First, the threat of material deprivation lifted, not just for members of the new and growing middle class, but for almost everyone: fulfillment of basic material needs became the norm and could for most people be taken entirely for granted. Meanwhile, as manufacturing’s superior productivity ensured its long-term decline as an employer, the new postindustrial service economy was born. Work moved out of farms and factories and into shops and offices, and the focus of work shifted from the manipulation of matter to the manipulation of desires and feelings. The most pressing problems in life were now no longer physical, but rather social and psychological — proximately, how to navigate the increasingly complicated social realities of highly organized modern life; and ultimately, what kind of person to be and what kind of experiences to have.
David Riesman noted this cultural shift in its early days in his celebrated book The Lonely Crowd, but his focus was on the purported shift from “inner-directed” to “other-directed” personality. For present purposes, what’s most important is the change in the culture’s basic orientation: from outward toward the external world, with the aim of improving the world for human purposes, to inward into the psyche, with the aim of managing feelings and experiences. As I’ve already written about repeatedly, this was the “postmaterialist” cultural turn documented and analyzed by Ronald Inglehart: less focus on material accumulation, more attention paid to fulfilling experiences and self-expression.
The inward turn toward the social and psychological, on its own, was a positive development. With satisfaction of basic physical needs now secure for most people, it was entirely fitting for more attention to be paid to those elements of wellbeing that are to be found in other people or in oneself. As I described this development in The Age of Abundance, it amounted to an ascent of Maslow’s famous pyramid of human needs, moving past the ancient preoccupation with subsistence to seek the higher goods of belonging, self-respect, and the fuller realization of one’s potential. Environmentalism, feminism, anti-racism, greater sexual openness and tolerance of sexual difference — all were summoned forth by mass abundance, and all were necessary steps in human progress.
The problem lay not in the inward turn itself, but in the fact that it was accompanied by the abandonment of the culture’s external orientation. If the traditional culture of scarcity could be justly condemned for its repression of social and psychic needs, the new culture of abundance was led astray by its loss of an anchor in physical reality.
What occurred was a growing problem of “out of sight, out of mind”: because physical problem-solving no longer figured centrally in our lives in the way it did before, it became easier to think, or just assume without thinking, that physical problem-solving no longer matters much — or at any rate isn’t the path to progress.
The rise of the environmental movement, badly disfigured by the anti-Promethean backlash, was a major milestone in this process. As I wrote in my last essay, it’s easy to imagine things going very differently (although I don’t believe there was ever a realistic chance that they would). Just as widespread plenty and the advent of the service economy were lulling us into discounting the importance of physical problem-solving in our own lives, the rapid spread of environmental consciousness led us to recognize that industrialization had created huge and critical physical problems that menaced both us and the world we live in. This recognition might have led us to shake off our complacency and reprioritize physical problem-solving — this time, to clean up the planet we had fouled and upgrade our technologies across the board from dirty to clean. Had the environmental movement come without the anti-Promethean backlash, it could have inspired a rededication to dynamism in the world of atoms.
But that’s not how things went. Instead, the spread of environmental awareness ended up stigmatizing human power over the physical world. Industry was too often dirty and poisonous; more powerful sources of energy meant greater powers of destruction; technology turned us into mad bulls in nature’s china shop, setting off new chains of unforeseen and regrettable consequences at every turn. In the more radical manifestations of the anti-Promethean backlash, modern economic growth — and sometimes humanity as a whole — was condemned as a cancer that needed to be excised. In the more moderate and nearly ubiquitous variant of this mindset, progress was redefined — from putting footprints on the Moon to reducing the human footprint here on Earth.
Affluence made us postmaterialist, the rise of the service economy redirected our working lives away from manipulating matter, and environmentalism sanctified that redirection by embedding the assumption that large-scale human interventions in the natural world are inescapably treacherous. Giving up our midcentury “Jetsons” dreams of bigger, faster, and higher, we came to see progress in terms of dematerialization — piling up experiences instead of accumulating stuff, and relocating dynamism from the world of atoms to the world of bits.
This relocation of dynamism occurred with the rise of two new “industries of the future” — the “virtual” industries of information technology and deregulated finance. Both soared to prominence beginning in the 1970s, just as environmentalism was sweeping the culture. Both promised clean, frictionless innovation and growth — no dirty smokestacks, no labor unrest, no controversies about environmental impact, just the miraculous alchemy of turning ones and zeroes into money. Both allowed us to think that we could retreat from the physical world without retreating from progress. Forget the Space Age: the Information Age was all we needed.
Now, decades later, we can see that it was an illusion. The promise of IT was real — although, as we have now discovered, like all technologies it has a dark side. But the idea that IT’s promise was so vast that we could afford to tune out the physical world was an intellectual bubble that has now burst. After a half-century in which non-IT productivity growth has been basically zero, it’s clear enough that swapping flying cars for 140 characters wasn’t a good bargain.
The IT revolution was a real revolution, even if it wasn’t able to carry the whole economy. The financialization revolution, by contrast, was fraudulent at its core. The essential function of finance is to connect savers with investors — i.e., people who have money with people who have ideas. To the extent that this intermediation is repressed and underdeveloped, economic performance will suffer badly. But the financial sector can go wrong by being overdeveloped as well, and that is what has happened in the United States and other advanced economies. The huge growth in the financial sector over recent decades has had very little to do with building and improving the core intermediating function of finance; instead, growth has largely consisted of facilitating and profiting off a spectacular increase in trading volumes. A certain degree of buying and selling existing financial assets is needed to ensure that prices reflect relevant market information, but the stratospheric level of contemporary financial flows is wildly in excess of what is needed for good price formation.
At its most benign, finance in its current bloated state is a parasite that creates the appearance of dynamism while rendering the economy less vital and less capable of bona fide innovation. Figuring out ever more elaborate ways of trading existing financial assets, including recombining them into derivative forms, is an essentially sterile enterprise; it can make enormous money for people who are good at it without creating anything of cognizable social value — just like lotteries, slot machines, and poker. Paul Volcker’s withering assessment of “financial innovation” is on the money: back in 2009 he said that the ATM was the only useful innovation in banking in the past 20 years (outside of banking, the rise of venture capital more or less exhausts the list of legitimate innovations). Finance acts as a parasite by draining off some of the most talented people in the country who, instead of working on AI or fusion or at least building a better mousetrap, fritter away their productive energies on building castles in the air.
And this is the best-case scenario for overdeveloped finance. As Steve Teles and I wrote about in The Captured Economy, the trend toward financialization has been accelerated by massive subsidies for leverage — i.e., for trading with borrowed money. High degrees of leverage dramatically increase the private upside of financial speculation — while creating the possibility of abysmal public downsides. Contemporary finance is a disaster waiting to happen, and the record of repeated financial crises and bailouts since the 1980s — the Continental Illinois bailout, the Third World debt crisis, the savings-and-loan collapse, the Mexican peso crisis, the Asian financial crisis, Long-Term Capital Management, and the global financial crisis of 2008-09 — makes clear that we don’t need to wait for long. Highly leveraged financialization is the rottenest of deals: vast fortunes for a few, massive wealth destruction and social dislocation for the rest of us.
Our cultural retreat to the virtual has thus left us with an economy less dynamic and productive than it otherwise could be. More fundamentally, it has warped how we think. Living in a world of artifice, no longer exposed to the unyielding hardness of physical reality, our minds now operate in a kind of cognitive bouncy castle — where all sharp edges have been eliminated and pratfalls are more exciting than dangerous. Outside on solid ground, truth amounts to correspondence with a non-negotiable reality that does not bend to suit your feelings. Inside the bouncy castle, where reality consists only of what’s inside your and other people’s heads, the truth is always negotiable — to the point where the very idea of objective truth can be dismissed as a negotiating tactic. These are the conditions that make kids vulnerable to going off the rails and winding up in wilderness therapy. And they are the conditions that make so much of contemporary culture resemble the acting out of troubled adolescents.
These are sweeping claims — what exactly am I talking about? First, I am referring to the general decline in deference to authority that has marked advanced societies since the 1960s. Ronald Inglehart insightfully distinguishes between the cultural processes of modernization and postmodernization. In the former, associated with industrialization, we see a shift in the sources of social authority from traditional to secular/rational — from the family, church, and aristocracy to science and the state; in the latter, associated with postindustrial economies, we see a general weakening of authority whatever the source. This postmodern shift can be liberating: “Question authority” was a watchword for the young Baby Boomers who marched for civil rights and against a brutal war; more recently, we have seen spontaneous upwellings of “people power” topple authoritarian regimes around the planet. But it has a darker side as well, contributing in recent years to the emergence of epistemological bubbles and “post-truth” politics.
A fascinating recent paper highlights one aspect of this ambivalent shift: a decided move from the language of thinking to the language of feeling. Analyzing Google nGram data from both fiction and nonfiction writing in English and Spanish, the authors find a steady rise between 1850 and around 1980 in “words associated with fact-based argumentation” (e.g., “determine,” “conclusion”); starting in the 1980s, however, the trend reverses and “sentiment-laden” words (e.g., “feel,” “believe”) become increasingly common.
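The paper’s basic method is easy to probe informally at toy scale. Here is a minimal Python sketch, assuming Google’s undocumented Ngram Viewer JSON endpoint still accepts the same parameters as the public graph URL; the probe words “determine” and “feel” are just the illustrative examples above, not the paper’s actual word lists, so treat the output as a rough sanity check rather than a replication.

```python
# Rough sanity check on the "thinking vs. feeling" trend using Google
# Books ngram frequencies. Assumes the undocumented JSON endpoint at
# books.google.com/ngrams/json (mirrors the Ngram Viewer's graph URL);
# the two probe words are illustrative, not the paper's full lexicon.
import requests

NGRAM_URL = "https://books.google.com/ngrams/json"

def ngram_series(phrase, start=1850, end=2019):
    """Fetch yearly relative frequencies for a phrase from Google Books."""
    params = {
        "content": phrase,
        "year_start": start,
        "year_end": end,
        "corpus": "en-2019",  # English 2019 corpus label used by the viewer
        "smoothing": 3,       # 3-year moving average, the viewer's default
    }
    resp = requests.get(NGRAM_URL, params=params, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    return data[0]["timeseries"] if data else []

fact_word = ngram_series("determine")  # stand-in for "fact-based argumentation"
feel_word = ngram_series("feel")       # stand-in for "sentiment-laden" language

# If the paper's reversal is real, the feel/determine ratio should hold
# steady or fall into the twentieth century and climb after ~1980.
for i, year in enumerate(range(1850, 1850 + len(fact_word))):
    if year % 20 == 0 and fact_word[i] > 0:
        print(f"{year}: feel/determine ratio = {feel_word[i] / fact_word[i]:.2f}")
```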
Furthermore, I am referring to the gradual eclipse of instrumental rationality — the matching of effective means to chosen ends — by other cognitive styles. For one thing, I would argue that there is a general tendency these days toward prioritizing fair processes over actually achieving desired results. In “The Procedure Fetish,” legal scholar Nicholas Bagley bemoans the current condition of the American administrative state, in which far more attention is paid to “the stringency of the constraints under which it labors” than to “how well it advances our collective goals.” And throughout American society today, concerns about “diversity, equity, and inclusion” within organizations often trump considerations of whether those organizations are achieving their goals in the world at large — a situation that has spiraled to sometimes ludicrous extremes in the nonprofit sector.
A related development is the rise of “expressive rationality” — that type of motivated reasoning that weighs evidence in accordance with how well it confirms one’s chosen identity. “Standpoint epistemology” may be based on the valid insight that what you can see depends on the angle you’re looking from, but in its popularized form it has degenerated into the idea that your personal angle (or the angle of your favored in-group) is the only one that matters. We are being scattered from the Enlightenment’s towering Babel, trading our common pursuit of “the truth” for the incoherent babble of “my truth.”
When this trend is bemoaned, it is normally criticized as a species of irrationality. But Yale legal scholar Dan Kahan argues persuasively that, in situations where there is a huge gap between personal beliefs and collective consequences, there is a kind of rationality in preferring feeling good about oneself over aligning one’s beliefs with what is actually needed to do good in the world:
Nothing an ordinary member of the public does as consumer, as voter, or participant in public discourse will have any effect on the risk that climate change poses to her or anyone else. Same for gun control, fracking, and nuclear waste disposal: her actions just don’t matter enough to influence collective behavior or policymaking. But given what positions on these issues signify about the sort of person she is, adopting a mistaken stance on one of these in her everyday interactions with other ordinary people could expose her to devastating consequences, both material and psychic. It is perfectly rational under these circumstances to process information in a manner that promotes formation of the beliefs on these issues that express her group allegiances, and to bring all her cognitive resources to bear in doing so.
These changes in how we think have had their most obvious impact in politics, and when I eventually turn my attention to the current crisis of democratic politics I will address them again at greater length. Here, where the subject is the decline of dynamism, I see the impact of “bouncy castle epistemology” in the deepening unseriousness of contemporary society in the face of deep and daunting problems. Dynamism, after all, refers to an energetic state characterized by effective, progressive change in the world. But with our lives no longer focused on manipulating the physical world to our advantage, and so heavily insulated from the harsh feedback that failure in the physical world provides, we are gradually losing the motivational and cognitive resources upon which dynamism depends.
One place the negative impact on dynamism shows up clearly is in the marked shift in what our highly trained professionals are trained to do. Check out this pair of charts (hat tips to Alex Tabarrok and Tanner Greer). Undergraduate STEM degrees conferred are flat or declining even as college enrollment has climbed by some 50 percent; the number of physics Ph.D.s awarded to American citizens peaked in the days of Project Apollo. While the U.S. fortunately remains able to attract the best and brightest from abroad, the people we raise here increasingly lack the training and skills needed to solve big problems in the physical world.
Yet if we’ve lost interest in physical reality, it would appear that reality hasn’t lost interest in us. A global pandemic has killed millions; the upsurge in extreme weather events confirms the worsening impact of climate change; war has returned to Europe, sparking fears of famine and the rattling of nuclear sabers. We’ve been floating weightlessly in the gap between actions and consequences for decades now, but consequences cannot be postponed forever. We can think of the recent COVID-19 pandemic as our privileged, misbehaving society’s first round of wilderness therapy. And with over a million Americans dead and counting, it’s obvious enough that this round didn’t take. Even with an invisible killer roaming free among us, we could not rouse ourselves to the requisite seriousness; we could not break from our habitual lying, manipulating, and bullshitting.
Reality isn’t done with us, though. More years of wandering in the wilderness are all but inevitable.
I think the absence of *maintenance* is one of the ways we lose contact with reality on a day-to-day level. If you spend a long time renting, your house's problems aren't yours to solve. Whereas if you're able to buy, you can't BS your house the way you can your boss. You have to learn how its physical systems work and how to tend them.
I worked at a startup that frequently needed ops people, and one of the best indicators we found was whether a candidate had a background in stage crew. In theater, on the prop/costume/crew side, you can't BS for long — the show must go on. So people who had worked in that role had an essential honesty and practicality (a desire to *actually* solve the problem, not just prove they had *tried*) that made them a great fit for ops.
"Living in a world of artifice, no longer exposed to the unyielding hardness of physical reality, our minds now operate in a kind of cognitive bouncy castle — where all sharp edges have been eliminated and pratfalls are more exciting than dangerous... Inside the bouncy castle, where reality consists only of what’s inside your and other people’s heads, the truth is always negotiable — to the point where the very idea of objective truth can be dismissed as a negotiating tactic."
Millions of Americans still live in a state of hyper-agency, where even minor lapses in diligence could kill them. Whether it's asthma, anaphylaxis, insulin dependence, or something else, the risk of pratfalls that are far more dangerous than exciting (nothing exciting about "I forgot my EpiPen") is still real for many. But it's a "freak" risk, a risk rooted in *your abnormality*, rather than a risk held in common, the way hunger or cold would be with less material abundance.
The risk of losing the fire flint or the food on a wilderness trek is a socially legible risk: everyone can understand it, and everyone stands to suffer somewhat themselves if they decide to help you after your lapse rather than abandon you. Everyone *could* go a bit hungry to help the guy who lost his rations, and expect to trade on that favor later. No one else catches "a bit of asthma" if the asthmatic kid loses his inhaler. Indeed, one form of "character building" abuse sometimes inflicted by well-meaning counselors is advising people physically dependent on medication to leave it behind, as if it were a psychological security blanket rather than a mitigation of real physical risk. This is rarely appropriate advice.
One would think that coping with bodily medical risk grounds one in non-negotiable reality, but in fact it does not, since access to medical risk mitigation is itself socially mediated. It's fairly normal for patients who aren't "the standard patient" to find their physical reality dismissed because of social judgments:
"[R]are diseases in the U.S. affect about 30 million people. It takes an average of seven years before a patient is properly diagnosed. Any sort of misdiagnosis doubles this diagnostic delay. Getting a psychological diagnosis extends it 2.5 to 14 times, depending on the disease. 'Once you’ve been labeled an unreliable reporter, it’s almost impossible to get your credibility back,' Dusenbery said. 'Anything you do will just reinforce the perception and the circular logic built into psychogenic theories.'"
https://healthjournalism.org/blog/2018/11/women-more-often-misdiagnosed-because-of-gaps-in-trust-and-knowledge/
Those with the authority to run medically standardized physical testing will not do so unless they can be socially persuaded to do it. When persuasion fails, the Prosperity-Gospel logic that patients can will themselves well remains appealing. It's hardly new logic, after all: Job's friends reasoned the same way, insisting that his very material difficulties were spiritual, not material, consequences.
Questions like "How can I present as a credible patient?" and "How can I persuade someone to sell me medication priced far above its manufacturing cost at a price I can afford?" – questions more about social manipulation than material reality – increasingly preoccupy the minds of those facing daily non-negotiable realities, too. And perhaps it was ever thus. Social manipulation is an instance, not a failure, of instrumental rationality, just one that treats others as means rather than ends.