Saturday round-up
A compendium of recent finds that share a common theme: whether we can retrieve meaning
Thanks for upgrading to a paid subscription. Writing is my job. Your support of that means everything to me.
These periodic round-ups seem to arise organically. They include things I’ve had in the queue for some time, but they generally ripen when I come across one or two more pieces on the morning I decide it’s time for one. Those final finds provide an “a-ha” moment, revealing to me a thematic thread running through everything I’d had in storage, and my inspiration crosses the time-to-do-this line.
The thread I see running through the essays I’m about to turn you on to is the problem of restoring meaning in post-American collective life. Some of them get into some areas that you Precipice readers know to be ongoing preoccupations of mine: the positive and detrimental effects of the Enlightenment, the positive and detrimental effects of urbanization and industrialization, which views of the mass consumerism ushered in by the 20th century are most accurate, the shape-shifting of our sense of identity, and whether the West in its present state of atomization is prepared for the danger level on the world stage.
So here’s a stack of reads to keep you occupied this weekend. (Well, take time to go to church, and to see if the Colts returning Anthony Richardson to the starting-quarterback position has a discernible impact on tomorrow’s game with the New York Jets.)
Last month, Ian Leslie, at his Substack The Ruffian, published a post explaining “Why Giving Up on the Humanities Is Self-Destructive.” Watch for ideas he discusses to show up in subsequent pieces in today’s round-up - most prominently, the collectivist impulse to see education as the provision of automatons to the enterprise of making material things:
A drumbeat of doom accompanies the rapid progress of AI. It’s going to steal our jobs, mash our brains, subvert our democracy, defile our womenfolk - and that’s if it goes well. Believe me, I do take these threats seriously, but the future is stubbornly unpredictable, and when we focus on the problems ahead we sometimes take our eyes off the ones in front of us. Insofar as I have a razor for the whole discourse, it’s this: everything we worry AI is going to do to us is already happening, because we’re doing it to ourselves.
We worry that AI is going to flood the world with mediocre content; I worry that we won’t notice the difference (does the latest data-driven Marvel blockbuster really bear the inimitable stamp of human genius?). We worry that AI-powered machines will overpower human intelligence; I notice everyone, including me, is mesmerised by their phones. We worry about being ruled by robots, when we are already ruled by politicians who act like automatons (with a few exceptions - some admirable, others less so). Recent reports that students no longer read books add to a pile of evidence that we’ve given up on the struggle to be human.
There is a familiar story in consumer categories: an established, market-leading brand comes under attack from cheaper, lower-quality competitors. In response, rather than adapting and bolstering the qualities that made it successful in the first place, the leader cravenly attempts to imitate its challengers - and ends up being swallowed by them. This is the strategy our species seems to be pursuing in response to machines that can provide a “good enough” emulation of our most valued product attributes, like the ability to use language and solve problems. (In this case, the brand is the sponsor of its own competition, but let’s not stare at this analogy for too long).
This self-abnegating approach is embodied by the sharp decline in the study of the humanities, or liberal arts. From 2012 to 2020, the annual number of humanities bachelor's degrees awarded in the US fell by almost 16%. The share of such degrees is now at less than 10% of all bachelor's degrees awarded, the lowest level ever; English and history fell by a full third over that period. In the UK as well as the US, universities and schools are organising themselves around the primacy of STEM, cutting programmes in classics, history, music, arts and drama.
We’re abandoning the humanities. The clue is in the name; I mean it could hardly be more on the nose, could it? We’re giving up our USP in order to meet the machines on their turf. Meanwhile we’re training humans to think and act algorithmically, following rules and checking boxes. Here’s one prediction I will risk: the machines are going to be better at imitating humans than humans ever will be at imitating machines. We do not have the comparative advantage here. We should be leaning into, not away from, our humanness.
I believe people should read great books and listen to great music for their own sake rather than to make themselves better employment prospects. The humanities help us think about how to be; not just what to do. But even if we’re being utilitarian about it, ditching the humanities is a mistake; a well-rounded liberal education makes more economic and commercial sense now than it ever did. Only if we use AI to support us in a quest to be more human will we reap the rewards of the coming revolution.
If governments, universities and employers are starving the humanities of resources, that’s partly because they are under the spell of ‘human capital analysis’, pioneered by the late Gary Becker, winner of the 1992 Nobel Prize for economics. It was Becker who first made a systematic argument that education and training are investments in human capital, in the same way that businesses invest in machines or buildings.
On this basis, the intuitive inference is that if the global economy is to be dominated by AI, countries and companies should be allocating the maximum amount of human capital to these technologies. We simply don’t need graduates who are experts in Greek civilisation, nineteenth century novels or twentieth century philosophy, even if it’s nice to have them around.
The inventor of the theory took a different view, however, as I discovered via an excellent post by the economist Peter Isztin. This is from an interview with Becker:
Becker: …What people should look for then as they invest in their human capital is more flexibility. Instead of having human capital that would be particularly useful for one company or even one occupation narrowly defined, you should try to recognize that the future may involve working at another company or in a somewhat different occupation. So look for flexibility.
Interviewer: What kind of education affords such flexibility?
Becker: A liberal arts education. I wrote about this 40 years ago, but I think it’s become even more important today. In an uncertain world, where you don’t know what the economic situation will be like 20 years from now, you want an education based on general principles rather than on specific skills.
This makes sense for a few reasons. A person who is educated in the liberal arts or humanities - not necessarily instead of maths and science and engineering - is acquainted with a range of different fields and ways of thinking. That makes them better able to adapt to an economy that moves in unpredictable ways.
Leslie extols the virtue of “thinking slower”:
Reading a book is a drag because the information goes in so slowly, but learning to think well entails the sacrifice of speed for depth. Isztin, borrowing from the economist John List, defines critical thinking as the habit of “thinking slower”; of being wary of our instincts and intuitions and able to analyse them (which is not the same as ignoring them). There is no better discipline for doing that than philosophy. Socrates, the greatest innovator in Western thought, made it his business to stop smart people leaping to conclusions. A surprising number of Silicon Valley’s most successful entrepreneurs and investors are philosophy grads, Reid Hoffman being a prominent example.
Literature and history are good ways to learn about the complexity, potential, and frailty of human beings. No matter how tech-dominated our workplaces become, the biggest decisions that leaders make will always concern people, with their messy feelings and maddening, glorious irrationality. It requires something more than technical competence to get those calls right. Reading widely is no guarantee of wisdom, of course, but it does indicate a lively mind. People laugh at Elon Musk’s enthusiasm for Homer, but it’s not a coincidence that many of the most successful tech leaders are voracious readers. Success in an unpredictable world correlates with intense curiosity about all human endeavour.
A knowledge of the humanities also makes life, and work, more interesting, and in a world where the top companies are in fierce competition for the smartest minds, interestingness is valuable. This point is made by Nabeel Qureshi in a recent post on his time at Palantir, the software company founded by Peter Thiel and Alex Karp. Nabeel, as those of you who have heard him on podcasts will know, is himself the model of a twenty-first century renaissance man: a top software engineer who is at ease discussing Empson’s Seven Types of Ambiguity or comparing recordings of the Goldberg Variations (without for a moment sounding like a show-off).
At the Hedgehog Review, Olivier Zunz gives us an interesting framing, an “Age of the Average,” and tells us “What Its Loss May Mean For the Future of Democracy”:
At a symposium marking the two hundredth anniversary of Thomas Jefferson’s Virginia Statute for Religious Freedom, the philosopher Richard Rorty delivered an address titled “The Priority of Democracy to Philosophy.” Rorty began by quoting Jefferson on toleration: “It does me no injury for my neighbor to say there are twenty Gods or no God” (Notes on the State of Virginia). Rorty went on to argue that democratic citizens, following this example, should be willing to reconsider “matters of ultimate importance” that give “sense and point to their lives” when their “opinions entail public actions that cannot be justified to most of their fellow citizens.”1
There is much to ponder here now that we are living through a time when “hanging together” as Americans is proving to be a huge challenge.2 Elements of paralyzing discord surround us, and the journalistic conceit of a divided “red” and “blue” America has become a widely acknowledged reality. I want here to draw a contrast between these days of intense discord and division and another time—roughly the twenty years following World War II—when Americans of different political persuasions, who had fought together against totalitarianism, raised “consensus” to the pantheon of national values. It was in this period that Americans talked most about consensus as both a desirable and practicable ideal, and they continued to do so well into the 1960s, when confidence in that ideal (or what might even be described as consensus about the value of consensus) began to unravel until its complete disappearance now. I am not suggesting we should revive this consensus, if only because it was masking too many inequities. But the postwar years stand in such sharp contrast with our current deep dividedness that it is important to understand a distinctive feature of a world we have lost before turning to some of our challenges.
Defining This Lost Moment
I call this lost moment the age of the average in large part to contrast it with our “age of fracture.”3 How did we reach the age of the average, and what did it mean for American democracy? When we invoke the average, we often think of compromise, conformity, even mediocrity, but I do not mean this at all. I want to retrace how, not so long ago, in this age of the average, an increasingly educated citizenry invested in multiple communities of inquiry. Are we capable of doing the same in our disunity—and also under a new scientific paradigm?
The age of the average emerged from the engineering of high mass consumption during the second industrial revolution of the late nineteenth century, when tinkerers in industry joined forces with scientists to develop new products and markets. The division of labor between them became irrelevant as industrial innovation rested on advances in organic chemistry, the physics of electricity, and thermodynamics.4 Working together, these industrial engineers and managers created the modern mass market that penetrated all segments of society from the middle out. Thus, in the heyday of the Gilded Age, at the height of the inequality pitting robber barons against the “common man,” was born, unannounced but increasingly present, the “average American.” It is in searching for the average consumer that American business managers at the time drew a composite portrait of an imagined individual. Here was a person nobody ever met or knew, merely a statistical conceit, who nonetheless felt real.
This new character was not uniquely American. Forces at work in America were also operative in Europe, albeit to a lesser degree. Thus, Austrian novelist Robert Musil, who died in 1942, reflected on the average man in his unfinished modernist masterpiece, The Man Without Qualities. In the middle of his narrative, Musil paused for a moment to give a definition of the word average: “What each one of us as laymen calls, simply, the average [is] a ‘something,’ but nobody knows exactly what…. the ultimate meaning turns out to be something arrived at by taking the average of what is basically meaningless” but “[depending] on [the] law of large numbers.”5 This, I think, is a powerful definition of the American social norm in the “age of the average”: a meaningless something made real, or seemingly real, by virtue of its repetition. Economists called this average person the “representative individual” in their models of the market.6 Their complex simplification became an agreed-upon norm, at once a measure of performance and an attainable goal. It was not intended to suggest that all people are alike. As William James once approvingly quoted an acquaintance of his, “There is very little difference between one man and another; but what little there is, is very important.”7 And that remained true in the age of the average.
Majority and average converged in the prosperous postwar years when Americans not only produced most of the goods sold in the world but also lent money to other countries rebuilding their economies to purchase them. Economist W.W. Rostow, President Johnson’s adviser, was perhaps the most articulate voice describing high mass consumption as a new stage in American life.8 Being a consumer in the mass market became the sign or indicator of entry into the broad American middle class and participation in its benefits. It was a case of organization and ideas converging on American soil to the point at which a feeling of general abundance blinded too many Americans to deep inequities in their midst.
There were, of course, doubters. When I entered graduate school in history in the late 1960s, historians questioning this notion were busy poring over old censuses and other quantitative records to measure the true extent of social mobility ordinary Americans had experienced over time. Was it illusory or was it real? They questioned the assumption that American prosperity was easily attainable and shared. But they did not find what they wanted. In a flurry of “mobility studies” tracing the careers of thousands of ordinary Americans from one manuscript door-to-door census enumeration to the next, historians proved that upward mobility had never been as easy as political rhetoric claimed, but they could not deny it either.9 However small the individual increments, climbing into the middle class was real. Americans saw social mobility as a reliable mechanism for enlarging the ranks of the broad middle. Ordinary people expected financial gains, and many experienced them. Intergenerational mobility was common. Failure was often temporary, and geographic mobility offered those willing to relocate a second chance. In short, abundance helped produce the American consensus.
Political theorist James Burnham was one of the first to identify the cause of the economic transformation as “the managerial revolution” in a 1941 book of the same name. There he posited that administrators, executives, superintendents, engineers, and bureau heads came to assume a “peculiar importance” in directing and coordinating the multifarious factors of production and distribution, so that materials, machines, plants, workers, and foremen were available in their proper quantity and time.10 But it was only in the 1970s that historian Alfred Chandler Jr., from his influential position at the Harvard Business School, offered an in-depth analysis of how the “visible hand” of corporate managers had, through stages he identified, replaced the previously “invisible hand” of the market.11 Railroad managers were pioneers in accounting techniques. With vertical integration, managers in steel, oil, and electricity then came to exercise full control over both the production and the distribution of goods—from the procurement of raw material to the delivery of finished goods in the hands of the consumer. Their reach, already commanding during the Gilded Age, survived the challenges of market fluctuations and of oversight from both the courts and the regulatory state. Managers claimed full victory in the postwar period, when policymakers routinely embraced neo-Keynesian policies of pump priming to strengthen purchasing power when needed.
Chandler’s important work and that of his students was based mostly on flow charts and analysis of organizational structures. I turned to social history to give their story a human face and gain a sense of its consequences for daily life. In my own Making America Corporate (1990), a collective biography of late-nineteenth- and early twentieth-century corporate managers and office workers, I drew portraits of the men and women whose jobs it was to organize new work hierarchies and generate the production and consumption of goods reaching the market in ever-greater numbers. I hoped this foray into social history would personalize this impersonal transformation and give a more granular sense of its consequences. Prominent among the latter was my finding that the people we believed were subject to a strict corporate hierarchy or victims of repetitive tasks—or indeed both—actually had significant agency. A new middle class came to encompass a whole gamut of occupations. In the formative years of corporate capitalism, middle-level managers, engineers, white-collar employees, salesmen, and other representatives of growing corporations resolved conflicts arising from their multiple loyalties to employers, independent professional organizations, and community associations by defining a new work culture.12
Market management had consequences for the entire American social structure. Managers wanted to enlarge the broad center of consumers. At the same time, they meant to understand the distinct groups the market should recognize—according to income, taste, gender, and a few other criteria—to improve the targeting of their products. In People of Plenty (1954), a short, brilliant analysis of American society, historian David Potter described the new class structure created by the managerial revolution. He argued that if the American class structure was “in reality very unlike the classless society which we imagine, it [was] equally unlike the formalized class societies of former times, and thus it should be regarded as a new kind of social structure in which the strata may be fully demarked but where the bases of demarcation are relatively intangible.”13 Why intangible? Because one could climb or go down a notch on the ladder of consumption without changing the overall convergence toward the average of a broad middle class.
Income differences narrowed significantly. Influential economist Simon Kuznets, who established principles of national accounting and historical series of inequality measures, argued in his American Economic Association presidential address in 1954 that the advanced phase of industrial development, after an initial period of increased inequality, led to a reduction in inequality.14
Consumption grew. The theory behind the Kuznets curve added fuel to the widespread belief that more disposable income created an increasingly large middle class of consumers. The American population distributed by income had come to resemble the bell-shaped curve of a Gaussian distribution, with the majority massing around the average and the very rich and the very poor as outliers. Most simply put, with a bulging middle class, majority and average were finally converging.
This simple fact of convergence led to a conceptual change of magnitude among Americans across divides of class and race. United Auto Workers’ Walter Reuther acted on it by pushing through his famous “Treaty of Detroit,” with auto manufacturers assuring huge financial gains for autoworkers.15 Even Martin Luther King Jr. embraced consumerism as a weapon in the fight for integration. In his last book, Where Do We Go from Here: Chaos or Community? (1967), King argued, “People must be made consumers by one method or the other.”16 Once “transformed into purchasers, Negroes...will have a greater effect on discrimination” with “cash to use in their struggle.” More people feeling the tangible effect of belonging to a broad middle class could become more tolerant of one another and negotiate ideological and religious differences.
Hence convergence in politics appeared more frequently, and conversation across the aisle ensued in ways unthinkable today. Controversies certainly abounded, but historian Arthur Schlesinger could write meaningfully about a “vital center.”17 He argued that the 1952 election of Dwight Eisenhower was the epitome of the American consensus because it marked the permanent acceptance by “the Republican party, as the party of conservatism,” of “the changes wrought in the American scene by a generation of liberal reform.” In Ike’s own words, “Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history.” Ike disliked the idea of “partisanship”; he resisted that of “special interest.” In historian James Patterson’s judgment, the two presidential candidates of 1952—Eisenhower’s opponent was Illinois governor Adlai Stevenson—resembled each other. Ike “shrank from involving the presidency in controversial questions. Better, he thought, to stand above the battle and in so doing preserve his political standing. ‘Partisanship,’ moreover, was to Ike a word every bit as dirty as ‘special interest.’” Stevenson “did not differ greatly from Eisenhower. He was an ardent Cold Warrior. He opposed public housing and was ambivalent about repeal of the Taft-Hartley Act. He castigated ‘socialized medicine.’” As social critic Irving Howe summed it up, “‘Adlaism’ was ‘Ikeism’...with a touch of literacy and intelligence.”18
To describe the cumulative effect of these changes on the American psyche, sociologist Daniel Bell (over-)confidently predicted “the end of ideology.” “Old goals have been displaced,” Bell wrote in 1960, “and the American Dream has been given a new gloss.” Witness the autoworker, he wrote, “the seedling of the indigenous class-conscious radical” has grown into a consumer “working toward a ‘nice little modern house.’”19 The new left would later accuse Walter Reuther of having negotiated a Faustian bargain with capital. Feminists rebelled as well. Betty Friedan hated that “thing ridden house” turned into “the end of life in itself.”20
But Vice President Richard Nixon was oblivious to feminists’ calls when heralding the “house” and consumer products, which Friedan so despised, as symbols of American prosperity and freedom. Nixon in 1959 promoted the consumer ideology when confronting Khrushchev in the famous “kitchen debate” at Sokolniki Park, near Moscow, in a ranch house Americans had built for displaying to the Russians the benefits the American household enjoyed. In so doing, Nixon was turning consumer culture into an export product. The American middle class of consumers was the new universal class, not the working class of the Marxists reduced to silence on American soil. As German economist Werner Sombart had predicted much earlier in the century, “on the shoals of roast beef and apple pie, all socialist utopias came to nothing.”21
The Innovative Citizen
Historians often describe the age of the average as one of excessive conformity amid self-congratulation. There was also plenty of innovation. Historian Richard Hofstadter captured both trends in his classic The Age of Reform (1955) when remarking, tongue in cheek, that America was “the only country born perfect that aspired to progress.”22 “Progress” in knowledge could be seen in the vast expansion of a science-based economy aimed at servicing the average. Forces that created the age of high mass consumption also transformed our knowledge economy. Mass consumption depended on the ongoing creation of new products. Developing this point in full in Why the American Century? I commented upon the decline of the lone inventor and the rapid rise of creativity within a matrix of inquiry. This creativity found its best expression through a network of new institutions of knowledge committed to a science-based consumer economy that transformed natural resources into mass-produced goods available to the growing market of averaged Americans.
As part of the early twentieth-century Progressive Movement against the prevailing concentration of wealth, philosopher John Dewey promoted a spirit of inquiry to advance democracy. Dewey pushed relentlessly for an innovative citizen involved in the task of inquiry. Dewey first presented his ideas on inquiry and democracy in How We Think (1910). Dewey wanted to reconcile the time-honored tradition of tinkering with the newer and more rigorous methods of science. He attempted somehow to combine in the same vision for science a task-oriented pragmatism and the pursuit of “highly specialized ends.” He wanted to see citizens “adopting as the end of endeavor that attitude of mind, that habit of thought, which we call scientific.”23
Dewey’s understanding of science was met with skepticism in highbrow circles. As British philosopher Bertrand Russell argued, Dewey wrongly focused on the spirit of “inquiry” rather than “a search for truth.” Russell went on to ridicule Dewey, who presumably could not see the difference between a scientist and a bricklayer. It was a petty swipe, but Russell had a point in arguing that Dewey’s pragmatism was “in harmony with the age of industrialism and collective enterprise” and naturally “his strongest appeal should be to Americans.”24 As a matter of fact, American business managers, scientists, and the military were cooperating in a matrix of related institutions of inquiry that would only expand under the pressures of market imperatives and serve what seemed to be an endlessly growing number of takers. The American research university grew in tandem with corporate research.25 All of this underwrote a soul-searching quest for a creative relationship between basic and applied science, with the goal of satisfying the wants and needs of the average American. After World War II, engineer and inventor Vannevar Bush proclaimed science the “endless frontier” when proposing the federal government fund a National Science Foundation.26 If Bush’s idea for the new foundation’s program was basic science, the need for multiple applications soon became explicit, reflecting, as political scientist Donald Stokes put it, “a broad awareness of how deeply modern science is inspired by societal need.”27
Dewey also called for “[finding] some unity” in research and “some principle that makes for simplification.”28 One common denominator across developing and emerging fields of inquiry was an emphasis on quantification. In the age of the average, American investigators invested heavily in statistical analysis. Social scientists pushed for empirical research in large university departments. They moved away from their late-nineteenth-century roots in the Social Gospel movement and invested in the science of quantitative measurement. They sought to understand society both at its broad center and at all its strata—how the parts differed, were alike, or combined. The methods and mindset developed initially in the rarefied scientific world of nineteenth-century Belgian pioneering statistician Adolphe Quetelet, who invented “l’homme moyen,” and quantification-mad positivists such as Saint-Simon and Auguste Comte came fully into their own in this American moment, with social statistics becoming a regnant science. The American mathematical community followed suit with significant advances in probability sampling and all other forms of measurement. There was immense creativity invested in the measurement of society.29
So long as average and majority overlapped, the average, however reductive a notion it was in important ways, became an easy reference point most Americans could relate to. Americans invested faith in an easy-to-join broad center. The feeling of being in the range of the average promoted a sense of unity among otherwise diverse people. When it came to inquiry, Americans approached the task in terms close to those that John Dewey codified. To do so, they built “a skein of networks” and somehow made it work for the common good.30 Bertrand Russell’s snub notwithstanding, individuals could feel a sense of agency in embracing pragmatism by navigating in the broad matrix where “ordinary” Americans could do “extra-ordinary” things.
The End of the Age
This is a world we have largely lost, and, although it may seem contrary to the spirit of my reflections so far, I do not regret it. The statistical management of society is still very much with us. We continue to live with mass consumption, an abundance of data on everything, a world of statistics and surveys. We have an avalanche of products. But, conceptually, the age of the average is over. The formula that helped Americans hang together in the early postwar heyday of American prosperity and global influence is no longer operative. The ideology of the average rested on a vast oversimplification of American life. It is remarkable that it lasted so long, although it did so largely by effectively masking or obscuring wide differences and great inequalities in wealth, opportunity, education, justice, and other areas of economic and civic life. The challenge of exposing and analyzing the causes and consequences of these inequalities mobilized many in the generation that came of age in the 1960s, I among them. I will not tell that story here, except to say that many of us confronted the world of the average, exposed its excessive conformity, and refused to confuse equality with uniformity.
Ronald Reagan was perhaps the last president who thought he knew the average American. He recognized the fabricated character well enough to describe him or her: “By average American,” he declaimed at the 1985 Conservative Political Action Conference, “I mean the good, decent, rambunctious, and creative people who raise the families, go to church, and help out when the local library holds a fundraiser; people who have a stake in the community because they are the community.”31
But that description was already obsolete. There was a brief interlude of self-congratulation reminiscent of the 1950s in the 1990s, when the optimistic vision of Herbert Croly’s American promise briefly returned after the collapse of the Soviet Union.32 The end of the Cold War raised hope that globalization under American auspices would successfully export the American textbook vision of consumerism sustaining a middle-class ideology. Such wishful thinking lasted about a decade, from the fall of the Berlin Wall in 1989 to the attack on New York City’s Twin Towers on September 11, 2001. When I published Why the American Century? in 1998, in the midst of globalization, the image I chose for the cover was a striking 1953 photograph of a nuclear household of four—father, mother, and two children—surrounded by the cornucopia of consumer goods that abundance brought. An Italian translation came out the year after September 11, and its publisher chose as a cover, unbeknownst to me, a rendering of the Twin Towers before their destruction, with the Statue of Liberty in the foreground. A new reality was here.
Majority and average no longer converge in a broad middle class. The once-influential interpretive works of Chandler, Potter, Kuznets, Bell, and Rostow address another age. The trends they identified in middle class investment in the corporate world, belief in a flexible social structure, and faith in steady progress stand in sharp contrast with what we see in our highly polarized society. Overlapping majority and average, real or illusory, was critical in maintaining the egalitarian promise of American liberty, and with it a spirit of socially progressive inquiry. But the concept of the average American withered as the many excluded, in one fashion or another, from the assimilative center exposed its fallacy.
The New Dispensation
The questioning of the American promise came from all political persuasions. A few widely shared observations bear rehearsing, if only to underscore the demoralizing effects of our current situation.
In the field of business organization, an alternative conception of the business corporation emerged. Financial capitalism took over managerial capitalism. At the heart of Chandler’s thinking about the visible hand was the idea that managers cared about the long-term health of the organization (and theirs in the same movement). Milton Friedman had no use for Chandler when he argued that a corporate executive was merely an “agent” on behalf of his (or her) employers—i.e., the stockholders—and not a “principal.” As Friedman noted in his book Capitalism and Freedom, “there is only one social responsibility of business[:] to increase its profits...in open and free competition without deception or fraud.”33 The trend toward tangible return on investment has also affected scientific fields heretofore granted more time to produce usable outcome. Biologists must generate marketable drugs, computer scientists practical applications.34
Wealth inequality has returned, and the twenty-first century has been dubbed a new Gilded Age. The center is bulging no more.
At City Journal, Martin Gurri coins a discussion-worthy term: the Endarkenment:
Asked whether she could provide a definition of the word “woman,” Ketanji Brown Jackson, Supreme Court nominee, magna cum laude at Harvard and graduate of Harvard Law, seemed perplexed: “I’m not a biologist,” she observed. Yet we are told by Jeremi Carswell, a specialist in the field, that children know perfectly well which of many genders they wish to grow up to be “from the moment that they have any ability to express themselves.”
During the 2020 pandemic, because of safety concerns, San Francisco took draconian measures to keep adults apart and children out of school, even as it promoted and protected the use of dangerous drugs by a large homeless population. That year, 257 San Franciscans died from the virus, while the number of overdose deaths climbed to 697.
In May 2024, former president Donald Trump was convicted in a Manhattan courtroom of a crime most Americans would be hard-pressed to describe. Three months earlier, a special prosecutor found that Joe Biden had mishandled classified documents but refused to bring charges because the sitting president of the United States was “an elderly man with a poor memory.”
These recent episodes are symptoms of a mass decline in America into unreason—bordering, at times, on a psychotic breakdown. Strange fantasies have overwhelmed reality: it’s an age of delusion, impossible longings, and ritual self-mutilation. The causes are many and complex, but the syndrome deserves a name. I’m going to call it the “Endarkenment” because it rises, like an accusing specter, out of the corpse of the fallen Enlightenment.
The Endarkenment is the pathological disorientation that convulses a society after it has extinguished all sources of meaning and lost sight of all paths to a happier future. It’s the triumph of wish over facts, the infantilization of top echelons of the social pyramid—of hyper-credentialed, globally mobile people, wielders of power and wealth and media, who, on a routine basis, confuse their self-important imaginings with the world itself. It’s the widespread descent of everyone else, now deprived of teachers, preachers, and role models, into a cognitive underclass, prone to the most bizarre theories about how things work.
The Endarkenment is experienced collectively as the disintegration of institutions, a traumatic fracturing of social life, and the seemingly ceaseless perpetuation of political conflict. But it is also experienced at the personal level in the form of heightened anxiety, depression, drug addiction, “deaths of despair,” and a loss of interest in family and procreation—even in sex.
The chaos has infected every level of contemporary society. For many, its perfect avatar is Trump—a man who selects his facts out of his fantasy life. Trump is a worthy representative, but I prefer outgoing president Biden because the light has literally gone out in his eyes and in much of his mind. Though the most powerful man on earth, decider between peace and war, he is unable to complete a coherent sentence. At the fateful June presidential debate, he made Trump sound like Pericles by comparison.
Biden is a stumbler in the dark. He, or those acting on his behalf, assembled an administration of aging retreads, cliché spouters, identity maniacs, cross-dressers, and vulgar Marxists, who, from Afghanistan to the Mexican border, failed at every task they set for themselves. With Biden and his enablers, progressive politics surrendered unconditionally to the Endarkenment.
From the pinnacle of government to the youngest generation—the Zoomers—the same existential confusion prevails. Materially, the Zoomers are a privileged cohort. They benefit from more education and higher income than preceding generations. Emotionally and spiritually, however, their lives are parched of meaning and oppressed by fear of the dark. According to social psychologist Jonathan Haidt, Zoomers suffer from unprecedented levels of anxiety, depression, and suicide. Haidt blames the cell phone and social media. I would add: and the emptiness, too—the lack of anything else. The digital world, with its subjectivist distortions, has become God and religion for the Zoomers, their source of identity and measure of self-worth. It’s a generation imprisoned in a house of mirrors.
Fevered attempts to break out have only led deeper into the maze. Young gays and transsexuals, for example, have been, after the October 7 terrorist massacres in Israel, among the fiercest defenders of Hamas—an Islamist movement that condemns their behavior as a capital crime. “Be grateful that I’m not just going out and murdering Zionists,” warned a Zoomer of uncertain gender at Columbia University’s anti-Israel protests. That chilling mix of self-righteousness and verbal threat is the starting point of Endarkenment politics.
Apocalyptic prophecies cast a deep shadow over the future. Each successive year, we hear, is the hottest on record. Every moment brings us closer to climate horror: humanity will end in burning desolation. The only cure to the sickness of industrial society is “degrowth”—a monkish embrace of poverty.
Democracy keeps dying on the information sphere—because of Trump, or populism in general, or the Deep State, or social media, or white supremacy. “What is democracy if a trail of broken promises still leaves black communities behind?” wondered Biden, who, at the time he spoke these words, was still in charge of delivering on those promises.
Meantime, our public debates have come to resemble the shouts and moans emanating from a lightless lunatic asylum. Should men compete in women’s sports? Should children have the right of self-mutilation as soon as they have “any ability to express themselves”? Should the word “mother,” until now revered in every culture, be banned in polite society as too offensive to the barren? Such controversies are possible only in a place of impenetrable gloom, where the mind can trick itself into believing that it has finally overcome reality.
Two questions arise out of our current leap into the dark. First: How did we get here? Second: Can we turn on the lights again?
“The Enlightenment,” wrote historian Peter Gay, “may be summed up in two words: criticism and power.” Inspired by an almost religious faith in reason and science, that criticism swept everything before it. The remnants of feudalism in Europe, the spiritual domination of the Church, the rule of men rather than laws—all disappeared in less than a century. The American and French Revolutions were offspring of the Enlightenment. Liberalism, the official doctrine of the democratic West, must be considered its ideological grandchild (I will use the two terms, liberalism and Enlightenment, as roughly equivalent). The enlightened economy did away with tolls, tithes, and robber barons, and established the prerequisites for the Industrial Revolution, thereby normalizing affluence.
By every known measure, the Enlightenment inaugurated an unparalleled improvement of the human condition. But a penalty was to be paid: the critical impulse lacked a logical stopping place, an end of history of the kind that the Hegelians and Marxists promised. The monarchies and the old nobility were overthrown, representative democracy expanded the suffrage to all citizens, universal literacy and education were achieved, and free markets and science generated undreamed-of wealth and vastly enriched lives—yet still the criticism continued its inexorable assault on social relations.
Criticism was unrelenting because the gaze of science is total: no exceptions are tolerated, no waivers issued for angels in the great beyond. The Old World was enchanted. Meaning flowed from heaven to earth. Social structures reinforced the linkage: religion, class, guild, family, village, and neighborhood—all inserted the individual into communal arrangements rich in memory and certainty. But these were precisely the bastions of conservatism that liberal thinkers and statesmen undertook to tear down. By the mid-nineteenth century, the disruption of traditional forms had attained escape velocity. Listen to Marx in 1848: “All fixed, fast-frozen relations, with their train of ancient and venerable prejudices and opinions, are swept away, all new-formed ones become antiquated before they can ossify. All that is solid melts into air, all that is holy is profaned.”
Liberalism sought to solve the problem of meaning by privatizing it. Individuals were free to believe whatever they wished, so long as it remained within the law. But this could only be a provisional expedient. The massive weight of the culture pressed against traditional sources of shared meaning, grinding them down. One was free to believe that the sun orbited the earth, but not if one wished to be taken seriously. The same became true of Christianity and religion in general. To be enlightened—or “modern”—came to mean disenchantment, skepticism, the impossibility of settled belief. Marx’s “fast-frozen relations” had aroused powerful feelings of belonging; for these, liberalism substituted the cold scalpel of the statistician and the bureaucrat.
Having severed all connection to the absolute, the entire project seemed to hang, magically, in midair. Liberalism made claims to universality—but on what basis? Most liberal ideals, like humanism, were secularized versions of Christian virtues—how could they survive the repudiation of the original? As Darwinian organisms in an indifferent universe, what, other than discredited custom, stood in the way of a “revaluation of all values” that would exalt the superior predator—the “blond beast”? Such questions, central to those like Marx and Nietzsche, who detested the system, somehow wound up elided in the mainstream of liberal thought. We look in vain in the many pages of John Stuart Mill for a moment of anguish over the matter. A curious lack of self-awareness clouded the critical enterprise.
A little later, Gurri gets into something you’ve seen me explore here at Precipice: this whistling-past-the-graveyard way we have of assuming that the continuity of everyday life will surmount the proliferating variety of hiccups:
Can the lights be turned on again? That, I would think, is an important question, though it rarely gets asked.
Many insist that the blackout is temporary. All it takes is to put Trump in prison—then we’ll be back to normal. Or maybe if we can use really good words to explain the benefits of reason and science, light will be restored to the twenty-first century. That seems to be Steven Pinker’s premise in Enlightenment Now, where he goes on at length about “how journalists, intellectuals, and other thoughtful people . . . might avoid contributing to the widespread heedlessness of the gifts of the Enlightenment.”
It’s too late, I fear. Trump was a late-stage symptom, not the cause of death. Appeals to utility and self-interest have always failed because they lack spiritual substance. Elite voices like Pinker’s grate on the ears of the deplorable class. Humanism, torn from any transcendental framework, is thin gruel at best—and irrational at worst. The Enlightenment is over. We should turn our minds to what comes next.
Some aspects of society remain reassuringly familiar. For all our glib talk of postmodernism as the repudiation of the idea of progress, we still expect the economy to grow, and we panic when it doesn’t; we still expect to be cured of rare diseases and are shocked when we aren’t; we still expect the latest technology, like artificial intelligence, to be placed at the disposal of ordinary persons, and we’re willing to smash trillion-dollar corporations if it isn’t. And our expectations are largely met. Part of the darkness is rhetoric and hypocrisy—celebrities railing about climate change while moving around the world in private planes and enormous yachts. The Endarkenment isn’t a total eclipse—yet. It isn’t a Dark Age. We aren’t quite ready to surrender 2,500 years of Western civilization to the barbarians. That fight continues, though the outcome is uncertain.
Still, the sustained assault of the disenfranchised on the system has generated dangerous instability. The institutions of democracy and modernity totter on the edge of collapse. From one perspective, this conflict has been viewed as the equivalent of a barbarian invasion—a horde of lumpen-proletariats on the march, bent only on destruction. Yet, from inside, the movement is experienced as a revolt of the “normals” against a ruling class eager to sacrifice every shred of meaning on the altar of critical theory. There may be truth in both accounts, but that’s immaterial. The Enlightenment’s dismissal of the rabble is simply no longer viable. The deplorables, with all their anger, must somehow be brought inside the tent. The means are up for debate; the will, at present, is nonexistent.
The most radical departure from Enlightenment ideals will concern the manner in which we address the problem of meaning. Criticism is necessary for modernity. Meaning and moral aspiration are necessary for humanity. A balance must be struck that lifts us out of pure randomness and materialism to a credible—and shared—higher purpose. The famine of meaning can be fatal. The rise of totalitarianism and, to a lesser extent, the “established church” of identity and climate doom today are examples of the political deformations that occur when the balance breaks: the hunger will be satisfied in some way.
Abigail Shrier, writing at The Free Press, gives us a bracing heads-up regarding how public schools in post-America are flat-out indoctrinating their charges with Jew-hatred:
In August, the second largest teachers union chapter in the country—there are more than 35,000 members of United Teachers Los Angeles—met at the Bonaventure Hotel in L.A. to discuss, among other things, how to turn their K-12 students against Israel. In front of a PowerPoint that read, “How to be a teacher & an organizer. . . and NOT get fired,” history teacher Ron Gochez elaborated on stealth methods for indoctrinating students.
But how to transport busloads of kids to an anti-Israel rally, during the school day, without arousing suspicion?
“A lot of us that have been to those [protest] actions have brought our students. Now I don’t take the students in my personal car,” Gochez told the crowd. Then, referring to the Los Angeles Unified School District, he explained: “I have members of our organization who are not LAUSD employees. They take those students and I just happen to be at the same place and the same time with them.”
Gochez was just getting warmed up. “It’s like tomorrow I go to church and some of my students are at the church. ‘Oh, wow! Hey, how you doing?’ We just happen to be at the same place at the same time, and look! We just happen to be at a pro-Palestine action, same place, same time.”
The crowd burst into approving laughter.
Seated at a keffiyeh-draped table, Gochez said, “Some of the things that we can do as teachers is to organize. We just have to be really intelligent on how we do that. We have to know that we’re under the microscope. We have to know that Zionists and others are going to try to catch us in any way that they can to get us into trouble.”
He continued: “If you organize students, it’s at your own risk, but I think it’s something that’s necessary we have to do.” He told the audience of educators that he once caught a “Zionist teacher” looking through his files. Gochez warned the crowd to be wary of “admin trying to be all chummy with you. You got to be very careful with that, even sometimes our own students.”

John Adams Middle School teacher and panelist William Shattuc agreed, a keffiyeh around his neck. “We know that good history education is political education. And when we are coming up against political movements, like the movement for Zionism, that we disagree with, that we’re in conflict with—they [Zionists] have their own form of political education and they employ their own tools of censorship.”
What are the “tools of censorship” employed by Zionists? Apparently, they include accusing teachers who rail against Israel in the classroom of antisemitism.
“They try to say antisemitism, which is really ridiculous, right?” said Guadalupe Carrasco Cardona, ethnic studies teacher at Edward R. Roybal Learning Center in Los Angeles. Cardona recently received a National Education Association Foundation Award for excellence in teaching. “What they do is they conflate. Part of that is by putting the star on their flag,” Cardona said, referring to the Jewish Star of David. “Religion has nothing to do with it.”
But, she insists, the course she teaches, and whose curriculum she helped develop—ethnic studies—is fundamentally incompatible with supporting Israel. “ ‘Are you pro-Israel—are you for genocide?’ And if anybody were to say, ‘Okay, sure,’ that’s really not ethnic studies.” (Gochez, Shattuc, and Cardona did not return requests for comment.)
It’s tempting to dismiss this as one more bull session among radical teachers leading a far-left public-sector union. If only.
Four years ago, I was among the first journalists to expose the widespread incursion of gender ideology into our schools. Once-fringe beliefs about gender swiftly took over large swaths of society partly thanks to their inclusion in school curricula and lessons.
Today, extensive interviews with parents, teachers, and non-profit organizations that monitor the radicalism and indoctrination in schools convinced me that demonization of Israel in American primary and secondary schools is no passing fad. Nor is it confined to elite private schools serving hyper-progressive families. As one Catholic parent who exposes radicalism in schools nationwide on the Substack Undercover Mother said to me: “They’ve moved on from BLM to gender unicorn to the new thing: anti-Israel activism. Anti-Israel activism is the new gender ideology in the schools.”
Parents who watched in alarm as gender theory swept through schools will recognize the sudden, almost religious conversion to this newest ideology. And very few educators are standing against it.
Michael Lucchese makes a point at The Washington Examiner that I made often during the just-concluded election cycle: the Harris-Walz and Trump-Vance campaigns shared a dangerously myopic framing of foreign policy in terms of “ending wars”:
In his first presidential administration, President-elect Donald Trump certainly faced immense national security challenges. But nothing compares with the scope or complexity of the situation he will inherit upon returning to the White House. While counterterrorism and border security remain essential missions, the United States has fully entered a new period of great power competition that could deteriorate into global emergency at nearly any moment.
Without question, the single greatest threat to American national security is increasing cooperation between President Vladimir Putin’s Russia, the Islamist regime in Iran, and the Chinese Communist Party. Though each power has distinct ideological goals, their shared purpose is to roll back American primacy around the world. Neither liberal shibboleths nor isolationist mantras will ever be enough to deter this aggression — only the careful cultivation of American power can. If the incoming Trump administration wants to keep America safe from this threat, it must avoid simplistic populist clichés about “restraint” and “nonintervention.”
Over the last four years, the Biden administration utterly failed to contain aggression from these three enemy regimes. On their watch, Putin launched his bloody invasion of Ukraine, and the Iranian-backed “Axis of Resistance” began a new terrorism campaign across the Middle East from the Gaza Strip to the Red Sea.
Meanwhile, the CCP was not only quietly supporting Russian and Iranian aggression but also advancing its own agenda toward regional hegemony in East Asia. From a national security perspective, Joe Biden’s presidency was nearly as disastrous as his one-term forerunner Jimmy Carter.
The chief reason for Biden’s failures is simply that he could not convince enemy powers that they would suffer consequences for aggressive actions. He entered office, much like Trump today, as a president who promised to end wars. But the short-sighted policies developed from those pacific promises were really a sign to America’s enemies that he lacked any sort of resolve to defend our interests. Each of our three major rivals understood that Biden was unwilling to take major steps to deter them, and so they forged new relationships among themselves to take advantage of American weakness.
Increasingly, this anti-Western bloc is openly binding itself together with public statements and treaties. President Xi Jinping declared a “no limits” partnership between Russia and China in February 2022 and ever since has backed up that pledge with concrete action.
More recently, Russia’s foreign minister, Sergei Lavrov, has been pursuing an actual legally binding treaty with the Iranian regime. The threat is not simply informal cooperation between these enemy powers but rather the confluence of their interests and action.
One painful illustration of this problem is the presence of North Korean troops on the Ukrainian battlefield. Not only do these massed forces demonstrate the budding relationship between Putin and North Korean leader Kim Jong Un, but they also show how much passive support the CCP is giving to Putin’s war effort. There is simply no way these troops could have gotten all the way to Europe without Beijing’s consent.
Combined with the Iranian-manufactured drones Russia is still deploying against combatants and civilians alike, the war in Ukraine is a terrifying image of what global conflict will look like over the next four years and beyond. Biden simply did not have the strength or force of will to prevent these hostile connections from developing.
In his several presidential campaigns, Trump unfortunately trucked in the same kind of quasi-isolationist rhetoric that Biden did. And although his last administration was less disruptive than many anticipated, he still implemented policies that sapped American deterrence. It was Trump and then-Secretary of State Mike Pompeo, for instance, who negotiated the deal with the Taliban that eventually led to Biden’s shameful retreat from Afghanistan.
Even with the supposedly hawkish team staffing the incoming administration, Trump’s isolationist instincts and rhetoric could undermine American interests and repeat the very same mistakes his predecessor made.
What the incoming Trump administration needs to learn from this season of catastrophe, then, is that weakness invites aggression. A competent national security strategy would acknowledge the dire global situation and enact policies to put America back on top. Biden did not fail because he overextended American resources or pursued too many interventions — he failed because he refused to make national security the top priority it ought to be.
He then lists some priorities that the incoming administration ought to have, starting with massively increasing overall defense spending and shoring up alliances (such as letting Ukraine know it’s as indispensable to a West-oriented world as Israel, South Korea and Taiwan).
But it’s the third one that I want to bring some attention to. Thank you, Mr. Gurri, for your ringing defense of free trade:
Another step Trump should consider is using trade policy to solidify America’s relationship with our allies. Although he and other protectionists are clearly wrong about tariffs as a means to economic prosperity, they are entirely correct that the United States is far too reliant on China for manufactured goods. Rather than ill-considered policies designed more out of nostalgia than from strategic calculation, though, this problem would be better addressed by pursuing trade deals with countries that share our geopolitical interests. Asian allies such as Japan or the Philippines would love increased trade with America, and European allies such as the United Kingdom have indicated an openness to some kind of trade deal.
At his You Are Not Your Own Substack, O. Alan Noble advises us to “Love What Is Timeless as a Response to Chaotic Times”:
I want to argue that particularly as our culture seems to be spinning apart and the news cycle grows shorter and shorter and our anxiety increases, there is a greater need for us to cling to timeless works, particularly the Bible and works of art: poems, novels, songs, paintings, etc. But I want you to understand how I came to this insight.
Yesterday the news broke that former and future President Trump has selected Matt Gaetz to be his Attorney General. Of course, he still has to get through Congress, and some pundits think he doesn’t stand a chance, but who knows. As I’ve said before, I’ve given up making predictions. It’s a bad business to be in. When I saw this news, as well as some of Trump’s other picks (some, like Rubio, I was pleasantly surprised by!), I caught myself being pulled into the same cycle of infotainment and anxiety that gripped me for much of 2016-2020. It was a kind of political news fever driven by the belief that I had to Be Informed or else I was Being Irresponsible. The feeling that Something Big was Happening and I had to have an Opinion and Make My Voice Heard. It’s not a good feeling, but it’s not entirely a bad feeling either. It certainly hits the ol’ dopamine receptors.
But this political news fever is a physically and psychically draining activity. It sucks away all your imagination. Your creative energy is spent imagining the potential outcomes of (legitimately) awful political decisions which are entirely out of your control. It causes you to hope in ridiculous conspiracy theories which come to nothing. Then it dashes your hopes with some new headline predicting doom for the nation so that you are left in anxiety and despair about the future. You are frenzied, frantic, and feverish. Always posting, always scrolling, always secretly hoping for some new piece of terrible news about the administration to confirm your priors.
Lots of evangelicals I know were sick with this fever, and I myself struggled with it off and on for years until I finally realized what a horrible, disordered way to live this is. A part of me knew I was falling into the sin of curiositas, but as with many sins, my sinful heart denied it. I justified my curiositas by conflating it with studiositas, the desire to rightly know the truth of things. My hope was that by being informed I could advocate for justice, another virtue! I hid my vice behind virtues. I don’t think I’m alone in that. I think many people fighting for justice hide their viciousness behind a veil of virtue.
To complicate matters, I think sometimes as I engaged with the news cycle during those four years, I really was pursuing justice and rightly pursuing knowledge. But the temptation to venture into curiositas was always very strong because of technology and my weak flesh. There’s always more to learn, always another person to engage, always another angle to take, always more to read. This is the information age, and everything is available to me. The lesson here is that to pursue knowledge and justice in this media environment requires great personal vigilance and the virtue of temperance to know when to refrain from engaging, reading, watching, scrolling, etc.
So as I read this news of the appointment of Matt Gaetz to Attorney General, a position I believe he is completely and utterly unfit for, I found myself once again feeling that familiar tug to keep checking Twitter for more news of other appointments that would disappoint (a failure on my part to hope all things!). I stopped myself and prayed that Trump would make good appointments that would honor God. Rather than think about all the potential damage to this country Gaetz could cause as AG, I felt a distance from the “event” of his appointment. A healthy distance. It was not indifference (after all, I still prayed) but an acceptance that this was not something I could immediately do something about, and that frantically scrolling to read everyone’s hot takes on how bad this appointment was would solve nothing. I attribute that distance to four things:
Reading my Bible and praying in the morning.
Listening to T.S. Eliot read the entire Four Quartets on the drive to work.
Teaching Their Eyes Were Watching God to my students.
Meditating on the virtues as I write this new book, Re-Collecting Your Life.
What all four of these things have in common is that they are engagements with timeless works. The Bible, the Four Quartets, Their Eyes Were Watching God, the virtues—all of these are timeless truths worth meditating on. Obviously, the Bible is infinitely superior to the others, but they have value, too. Part of their value is that they ground us in a time out of our time. They remind us that the concerns of today will not be the concerns of tomorrow. They convict us that the human heart is terribly wicked and has always been and yet God has endured our wickedness in longsuffering and provided a way of redemption. They remind us, as Eliot does in “East Coker,” that “Houses live and die,” and so do parties, administrations, and even nations. All in God’s time.
The foundation for this practice is reading our Bible. For in it we find the cosmic scope of God’s redemptive history put into perspective, which shrinks down our current political history into a tiny pin-prick of a moment (without ever denying that real injustice is occurring and must be opposed, but with temperance!). We learn here about the importance of living peaceful lives, honoring our rulers, and praying for them (1 Timothy 2:1-2).
What the Bible and literature and all great art can also do for us is cause us to attend to something with our imagination, something meaningful—which is exactly the opposite of what the political news fever demands of us. The fever demands all of our imagination and it wants our attention to be scattered across many ephemeral items. But as I was carefully listening to Eliot read Four Quartets on the way to work yesterday, I was practicing attention. I had to carefully study and imagine each image and its significance. I had to meditate on the overarching meaning. I had to slow my thinking down. Rather than frenzied, frantic, and feverish, I felt myself become calm, attentive, and contemplative. This kind of deep reading ought to mark our engagements with the Bible, with good music, great film, and so on.
Deep reading. Kind of echoes what Ian Leslie had to say above about slower reading, I think.
Deciding to be a full human being is a conscious choice in 2024 post-America. The default position to which the machine that is our society will consign one is that of automaton: a beauty-and-virtue-starved bundle of distractions and ill-thought-out presumptions, without the vaguest notion of what a lodestar is, much less why it’s essential.
You can be much more.
The Endarkenment article by Gurri was very good, but he stopped short of the full solution, primarily because he is not there himself. He says that we need not re-embrace religion in order to fill the hole (because he hasn’t), but then he just waves his hands and says we might come up with something. Yet the source of the problem is the decline in religious faith, and only a revival of that faith can alleviate it. Otherwise it’s a good analysis. The internet was the catalyst for what had already happened in our culture but hadn’t yet become entirely apparent. We are living in the aftermath of a deep hollowing-out, and out-of-wedlock births and declining religious faith are two of its major symptoms and causes.