I’ve written about capitalism’s global triumph, first displacing traditional agrarianism and then surviving the socialist alternative. And I’ve described the resulting dispensation as a global capitalist “monoculture” to highlight the novel circumstances in which we now find ourselves: for the first time in centuries, we live in a world with no real competition among rival social systems. As I explained in an earlier essay, that means an unprecedented and ever-increasing homogenization of the social landscape:
As it has spread around the planet, capitalism has been reducing the external sources of the cultural and institutional variety out of which it emerged. Yes, the system’s growing complexity of specialization and exchange has produced astonishing internal variety — in the number of different occupations, different organizations producing goods and services, different goods and services available for sale, and different subcultures and “consumption communities.” But this is variety within a particular kind of social order. And that particular kind of social order is now the only game in town.
Since the fall of communism 30 years ago, capitalism for the first time in its existence lacks any competition from a rival system. Yes, the capitalist system currently comes in many flavors: democratic and authoritarian, advanced and underdeveloped, inclusive and extractive, with public sectors ranging from lean to expansive. But in broader historical context, these differences are much less striking than the unprecedented degree of world-spanning uniformity. Virtually the entire inhabitable surface of the globe has been claimed by territorially exclusive states using the same basic forms of governance. Some two-thirds of working-age people worldwide work for money income, most as wage employees of private business enterprises. The majority of people now live in cities constructed from the same building materials and shaped by the same architectural styles. Everywhere you can find people wearing the same kinds of clothing, eating the same food, driving the same cars, watching the same movies, and obsessing about the same media celebrities.
Since America is the birthplace of the modern mass production economy, we tend to see capitalism’s homogenizing influence as making the rest of the world more like us — we used to call it “Westernization.” And indeed, the most dramatic recent episodes of institutional and cultural convergence — the collapse of communism, and the explosion of capitalist growth in the less developed world since that collapse — have occurred in faraway locales.
But in this essay, I want to focus on the advance of the capitalist monoculture here at home. When I was a kid, there were still extensive spaces in American social life relatively free from commercialization, guided by values and goals other than maximizing economic competitiveness and profit-taking. But those holdovers from a traditional, pre-capitalist past have been steadily weakened and then eliminated. And as a result, notwithstanding the post-1960s celebration of diversity and the subsequent explosion of subcultural variety, in the crucial realm of motives we are becoming increasingly uniform.
I was born two years after FDA approval of the birth control pill and one year before the publication of The Feminine Mystique. So I’m old enough to remember the last days of the traditional sexual division of labor, based on the idea of a woman’s world separate and apart from man’s — and in particular, that a woman’s place is in the home. Female participation in the paid labor force had been rising gradually over the course of the 20th century, but still the expectation was that a woman’s primary social role was mother and homemaker. Accordingly, while it was increasingly common for young women to work, they typically left the work force after marrying and having kids. And women who remained on the job were generally shoved into “pink collar” ghettos like teaching, nursing, and secretarial work; the most prestigious and well-paying occupations were almost all reserved for men.
Things are very different now. Since my birth year of 1962, women’s labor force participation has surged from around 38 percent to nearly 58 percent, while men’s has fallen from 82 percent down to 68 percent. One male bastion after another has fallen: female cops, firefighters, surgeons, pilots, astronauts, CEOs, and members of Congress are no longer novelties. And even though complaints about pay gaps and “glass ceilings” persist, often for good reason, female competitiveness in school and on the job has progressed to the point that we’re now starting to worry about boys and men falling behind.
I’m also old enough to remember when Sunday was still set apart, shielded by law and custom from the full brunt of commercial hustle and bustle. In the 1960s, Sunday closing laws had survived major Supreme Court challenges and were still in place in a majority of the states. And where laws were absent or laxly enforced, traditions of churchgoing and Sabbath observance were still strong enough to exert their own restraining influence. But with ongoing court challenges at the state and federal level and popular agitation for an extra weekend shopping day, laws were gradually repealed or hollowed out with exceptions, and customary inhibitions relaxed as well. Over the course of the 70s and 80s, Sundays lost most of their distinctiveness.
Just as the day of rest once stood apart, so did certain ways of making a living. Medicine and law became known as “learned professions” in medieval times, and practitioners developed special codes of ethics that were supposed to orient their work toward public service rather than commercial gain. In particular, the professional ideal dictated that the welfare of the patient or client was paramount, and “conflicts of interest” that gave professionals incentives to prioritize their own economic gain were actively discouraged. The 19th and early 20th centuries saw major initiatives to institutionalize that professional ideal through the establishment of new professional schools, new standards for education, and licensing regimes that subjected practitioners to detailed regulatory codes. Of course, in reality, members of the professions were always adept at feathering their own nests, and their ethic of public service did not interfere with their being among the wealthiest members of their communities. Still, within the context of solo or small group practices paid directly by their patients and clients, something within shouting distance of the professional ideal was a living reality.
The special educational and licensing regimes are still in place, but under conditions of mass affluence, floods of money have done much to undermine the distinction between the professions and ordinary commerce. In medicine, third-party payments by private insurers and then Medicare and Medicaid resulted in a huge and ongoing surge in healthcare spending. Now giant regional hospital networks, with hundreds of hospitals pulling in tens of billions of dollars every year, dominate the medical marketplace. In law, the deregulation of finance in the 1970s and the stock market boom that began in the 80s created enormous fees and catalyzed the growth of huge, increasingly international law firms. Now, it’s obvious enough that law and medicine are big business, and the main effect of their regulatory structures is to restrict supply and thereby further boost incomes.
Another medieval institution that survived into the modern world, the university, also stood apart. American universities were originally dedicated to molding the character of the future elite through humanistic education. Beginning in the late 19th century, under German influence they shifted toward a greater emphasis on scholarly research. But whether they focused on teaching the liberal arts or advancing the frontiers of scientific knowledge, universities were clearly governed by a distinct ethos that prioritized the “life of the mind” over the practical business of getting and spending.
Here again, mass affluence brought transformational change. The shift to an increasingly white-collar workforce created demand for mass postsecondary education: the percentage of 25-to-34-year-olds with a college degree quadrupled between 1940 and 1980, from 6 to 24 percent. Meanwhile, Cold War imperatives led the federal government to flood the universities with research dollars. Universities were now assuming a new role, more directly integrated into the larger business of society: “knowledge factories” for the new information age. Clark Kerr, president of the University of California system and an important figure in this transition, put it this way in his 1963 book The Uses of the University: “The university and segments of industry are becoming more alike. As the university becomes tied into the world of work, the professor — at least in the natural and some of the social sciences — takes on the characteristics of an entrepreneur…. The two worlds are merging physically and psychologically.”
The dramatically enlarged student body had different expectations of what an undergraduate education would provide: instead of knowledge for knowledge’s sake, students were more interested in practical skills, the equivalent of vocational training for knowledge workers. By 1970, business and education degrees accounted for over 36 percent of all college diplomas awarded. Even so, inertia served to preserve something of the old sense of the cloistered ivory tower. In 1970, over 17 percent of students still graduated with degrees in English, history, foreign languages, or philosophy, and I imagine the percentage was considerably higher among the nation’s most selective colleges. Meanwhile, course requirements for “Western Civ” and other broad surveys ensured a decent exposure to the humanities while on campus.
But the dominant trend was clearly toward increasing vocationalism, and that trend has only accelerated in recent decades. After a big spike in the college wage premium during the 1980s, college came to be seen as the only reliable ticket to the middle class. As the influential 2008 book by Claudia Goldin and Lawrence Katz put it, prospects for upward social mobility and a more inclusive society hinged on “the race between education and technology”: because of information technology’s bias in favor of more highly skilled workers, only a relentless push toward increasing educational attainment — in other words, expanding the ranks of college graduates — could keep income inequality at bay.
With college now the nearly exclusive gatekeeper for the meritocracy, the further commercialization of the campus experience proved irresistible. By 2020, history, English, foreign languages, and philosophy majors accounted for less than 5 percent of degrees conferred. Course requirements were scaled back or eliminated altogether; outside of the more rigorous STEM fields, college became a “buffet” of random specialized electives. Students started to be thought of as “customers”; climbing walls and lazy rivers proliferated.
The corruption of the old humanistic ideal reached its grim terminus in the “Varsity Blues” college admissions scandal of 2019. An FBI investigation revealed that, over a number of years, 33 parents, including some celebrities, paid more than $25 million to a college admissions consultant to secure slots for their kids at a number of different universities. The consultant bribed admissions officials, inflated test scores, and fabricated athletic and other accomplishments to earn his hefty fees. Here was the ultimate commoditization of the undergraduate degree: just another bauble for the rich to buy.
Although less important than changes in the classroom, changes on college playing fields followed the same trend. Beginning in the late 19th century, college football captured the nation’s imagination with its spirited school rivalries and the romance of the amateur athlete. Eventually, though, television turned college football into a big business. And as more and more money poured through university athletic departments, keeping it coming came to dominate all other considerations. Postseason bowl games multiplied and started naming themselves after corporate sponsors; I recall the 1990 Poulan Weed-Eater Independence Bowl as an early and especially absurd instance of this shameless money-grabbing. Money flowed under the table to get and keep star athletes; college recruiting scandals became a staple of the sports pages. More recently, the breakdown of the established athletic conferences and the associated abandonment of traditional rivalry games, the decision to allow “student-athletes” to earn money for use of their name, image, and likeness, and the rise of the transfer portal and its frictionless free agency have succeeded in removing whatever fig leaf still obscured the fact that college football is and has long been a professional sports league. And in professional sports, as TV executive Roone Arledge once said, “The answer to all your questions is money.”
The class structure that governed American life as the country entered mass affluence was another pre-modern holdover: a quasi-hereditary ruling class, the last gasp of the old aristocratic ideal. I’m speaking here of the old WASP upper class, sustained by inherited wealth, raised to value “good breeding” and despise grasping parvenus, and guided (quite inconsistently!) by an ethos of noblesse oblige to place a high value on public service. Then came the mid-century meritocratic turn, paced by elite colleges’ decision to shift their admissions policies to favor smart kids of whatever background, which gradually reconstituted the American elite and in the process reoriented it. The new elite, now open to women and ethnic minorities, owes its position entirely to competitive success — and thus is vulnerable to downward mobility should its competitiveness falter. This vulnerability induces a deep and abiding insecurity, which we see in the feverish efforts of upper-middle-class “helicopter” parents to ensure that their kids come out on top. We see it as well in the rich’s increasingly brazen sense of entitlement (see “Varsity Blues,” above): now commonly seeing themselves as “wealth creators,” they equate their net worth with their contribution to society and imagine themselves great benefactors even when, all too often, their fortunes have come at our expense.
Finally, let me discuss the oldest of the pre-modern institutions that have shaped American life: the church. It’s hard to overstate the degree to which the teachings of Jesus diverge from the materialist striving of capitalist life: “It is easier for a camel to go through the eye of a needle, than for a rich man to enter into the kingdom of God.” Of course Protestantism ingeniously succeeded in reconciling the two, yet even so Christian religiosity did exert some kind of moderating influence — at the very least by reserving the Sabbath as a day of rest. I’ve written already about the decline of organized religion, so I can be brief here: the fading of Christianity has badly weakened another source of pushback against the monoculture. Let me add just a couple of quick points. First, the victory of the monoculture can be seen in the transformation of the Christmas season into a massive, global, months-long shopping binge: Black Friday is now observed more dutifully than Good Friday. Second, it’s noteworthy that the one branch of American Christianity that has actually seen growth since the 1960s is evangelical Protestantism — and especially the Pentecostal and charismatic strains in which the “prosperity Gospel” is now so popular. God, it seems, can only maintain market share by merging with Mammon.
Just as rivers can carve deep canyons into stone, so have the torrents of money produced by capitalist wealth creation gradually eroded and effaced all cultural and institutional resistance to the logic of commerce. In the end, liquidity conquers all.
Please don’t get me wrong: I’m not nostalgic for the “good old days.” The old sexual division of labor squandered talents and thwarted dreams on a massive scale. The professionalization movement that tried to buffer law and medicine from commercial pressures turned out to be the origin story for a good chunk of today’s “captured economy.” Expanding educational opportunities outside the old-money elite was a triumph of mass uplift. The self-toppling of the old ruling caste in favor of a “career open to talents” must be seen as the WASPs’ finest hour: nothing in their reign became them like the leaving it. And as a religious nonbeliever myself, it’s hard for me to criticize others for coming to agree with me.
I feel about the passing of the pre-modern holdovers in American life much as I do about the passing of the industrial working class. I don’t mourn either’s exit, but I do lament what has followed. Blue-collar life was physically debilitating and intellectually stunted; we cannot wish for the return of filthy smokestacks and soul-deadening assembly lines. Even so, when we look out at the atomization and anomie so prevalent today outside the elite, we are hard pressed to deny that, in important ways, the quality of life for ordinary people has declined. In similar fashion, I’m no reactionary who wants to live in a world of invidiously restricted opportunity. On the contrary, I celebrate the fact that women and minorities are no longer excluded from the economic mainstream, society’s juiciest plums are no longer reserved for an inbred old boys’ network, and serious academic study is no longer available only to a cossetted few.
The demolition of the old institutions and social structures was a necessary step in social progress, clearing the landscape for new and better social forms. But we have bulldozed away the old, only to leave the social landscape flattened and denuded. Our failure to build anything new — to move past deconstruction and on to reconstruction — strikes me as a kind of regress. We have been reduced to a grim, economistic tunnel vision: all means other than competition, all ends other than profit, have been shunted to the margins of life. Some recent poll results are illustrative. In a survey of people ages 18 to 40, 75 percent said that making a good living was necessary for a fulfilling life, while only 32 percent said the same thing about marriage. Meanwhile, when parents were asked what they consider most important for their children, 88 percent said financial independence, and the same percentage said a job they enjoy; only 21 percent said marriage, while a mere 20 percent said children.
It is appropriate, I believe, to associate increasing riches with increasing variety. Increased wealth isn’t simply a matter of enjoying more and more and more of the same; it allows us to expand our horizons and enjoy new and different things. For example, consider progress within the capitalist system. The first great strides of material progress enabled by industrialization took the form of fungible, mass-produced goods that were generally lower in quality than artisanal products, but which were so cheap that they were now affordable by ordinary people. The tradeoff involved — reduced quality and variety in exchange for dramatically improved affordability — led to withering critiques of mid-century capitalism: it was churning out cookie-cutter products for cookie-cutter people living cookie-cutter lives of stifling conformism. But those complaints died down in the ensuing decades, as capitalism made the turn toward product differentiation and mass customization, triggering in the process a dizzying proliferation of subcultures organized as “consumption communities.” These days, the critiques of options under capitalism tend to run in the opposite direction: the overwhelming surfeit of different products on the shelves supposedly amounts to a stress-inducing “tyranny of choice.”
Yet outside the capitalist system of competition and maximization, our increasing riches have brought only increasing simplification and sameness. We are rich enough now that many of us no longer need to prioritize maximizing competitiveness and pecuniary gains. There are so many other important values to pursue: most obviously, love and friendship, community engagement, and the pursuit of outside interests and hobbies. It is perverse, then, that instead of using our greater wealth to develop all kinds of different and attractive ways of life, we are all crowding into one hyper-competitive rut. It seems to me that the advance of the capitalist monoculture, in which all of us are reduced to playing the same game, constitutes a serious waste of our wealth and powers. What a dreary failure of imagination, when so many different and interesting games are possible.
We want more, that’s for sure. As I wrote about in The Age of Abundance and have mentioned repeatedly on this blog, mass affluence triggered a cultural shift in values: with existential security now assured, we are free to focus more on the higher values of belonging and meaning and self-realization. And thus most of us who are doing okay financially worry about our “work-life balance,” and many of us take affirmative steps to improve it by pulling back from a full commitment to economic maximization. (For my part, I long ago traded away the greater pecuniary income from lawyering for the greater psychic income from a more intellectually rewarding life as a think tank scholar.) Yet we are hampered in our ability to achieve the wanted balance because our present conception of it is purely individualistic rather than social. The balancing act is something each of us must attempt personally, but while living and operating in a larger system that itself is badly out of balance — that is, a system that actively encourages us, as both producers and consumers, to choose economic gain and material comfort over our deep-seated psychic needs for connection and belonging.
Rising to the challenge of the permanent problem requires us to move past the monoculture: to erect on its foundations a new, reconstituted pluralism. We’ve already seen the mass customization of capitalist goods and services, and that constituted real progress. Now we need mass customization of ways of life, and enabling households, neighborhoods, and communities to provide more for themselves can create the social space within which such customization can occur.
Seeing progress in greater pluralism is counterintuitive for us. Our instinct is to regard pluralism as backward and benighted, because the pre-modern world from which we’ve emerged was itself so highly pluralistic. And in the great historical struggles to liberate ourselves from those traditional structures, those on the other side were the ones who denounced progress and defended the old ways. We thus tend to associate progress with rationalization: in particular, with the substitution of the vast impersonal “extended order” of state-regulated markets for the dense undergrowth of traditional social arrangements, and the pursuit of legal, political, and social equality within the new order.
But as Jacob Levy reminds us, in a free society the forces of rationalism and those of pluralism are always in tension. Push things too far out of balance one way or the other, and freedom is undermined. To reinvigorate our freedom, to liberate our agency and initiative from the suffocating weight of our top-heavy elite’s “immense and tutelary power,” we need to push back against the excesses of rationalization. To bring capitalism back into balance, to bring richness and variety and depth back to social life, we need more than subcultures of consumption. We need subcultures of independent production as well.
Hi Brink, I've recently come across and highly appreciate your writings, and find they really resonate with me in a way that no single source of insight on the problems in American society has provided before. I hope to be able to contribute solutions towards exactly the lack of flourishing that's growing ever more prevalent.
When you refer to "subcultures of independent production," do you mean merely alternative methods of producing goods and services? An example would be baugruppen housing with community gardening for food production. Or are you referring to the production of other intangible goods of self-actualization, community, and soul-building experiences that are not mediated through capitalistic market dynamics?
I really appreciate having discovered your writings and hope to conceptualize then actualize solutions to help Sisyphus find his happiness.
Thanks, once again, Brink, for these. I especially enjoyed the review of your journey from Ayn Rand to pluralism in your last post. And this post is a worthy successor. I have two observations about monoculture and pluralism: First, monocultures are weak because they are subject to viral diseases that rage through their unprotected genotype. I would suggest that our social monoculture is weak in the same way. And the virus that seems to be sweeping through our social "memotype" is authoritarianism. We seem to be losing the ability to think for ourselves, perhaps because it is so easy and convenient to adopt the prevailing monoculture. This seems to leave us susceptible to conspiracy theories and power-hungry leaders. Just add it to the list of reasons in support of pluralism.
Second, pluralism is hard because opting out of the prevailing system involves significant risk, both as an individual and as a community. And as we know, fear of downside risk often overwhelms upside potential because we can go down to zero, which is very painful, while upside benefits are unknown and uncertain.
To me, this argues for a form of "social insurance" on the downside that empowers experimentation and risk and enables the benefits of pluralism. This might be some form of universal income or even a literal insurance program for communities or individuals interested in trying alternative forms of social organization. Just a suggestion....