The Zoomer Question

REVIEW ESSAY
Gen Z, Explained:
The Art of Living in a Digital Age
by Roberta Katz, Sarah Ogilvie, Jane Shaw, and Linda Woodhead
University of Chicago Press, 2021, 280 pages

Generations: The Real Differences Between Gen Z, Millennials, Gen X, Boomers, and Silents—and What They Mean for America’s Future
by Jean Twenge
Atria, 2023, 560 pages

In the fall of 1954, the children of an industrial district of Glasgow, learning that a local vampire with iron teeth had “killed and eaten two wee boys” from their neighborhood, decided to mobilize hundreds of their best and storm the cemetery where it was thought to live. The hunt commenced one afternoon after school; a procession of students assembled and grew as they headed for the Southern Necropolis. Upon arrival, finding the gates closed, the children—“some were so young they could scarcely walk,” reported a city paper—scaled the walls. Once inside, “their excited shouts and screams became so loud that normal conversation was impossible” and continued past nightfall. The operation was the fifth of eight such hunts documented between the 1930s and 1980s, one of which—an operation against “Springheeled Jack”—involved thousands of children and lasted for several nights.

It is often said that today’s eighteen-year-olds behave like the fifteen-year-olds of the 1990s, but this is altogether too generous: one would be hard-pressed to find a grown adult alive today with the organizational competence of the midcentury Glaswegian second-grader. The hale boys and girls of our own country’s republican age, barreling down waterfalls and skirmishing among themselves over miles of town and field—“there were really no children in 19th-century America, travelers often claimed, only ‘small stuck-up caricatures of men and women’”—seem yet more foreign. And when compared to even the most precocious young activists of our time, the “abbot of the local youth” of seventeenth-century Saint-Jeanne, a small village on the French Riviera, who extorted a tax from village newlyweds with the threat of public ridicule in song from a chorus of rowdy children, appears as the right hand of God.

The full, impossible humanity of such children comes to us shrouded in a halo of myth. In our age of impotence, the notion of a child commanding a regiment or levying a tax is so fantastic, so perfectly and flippantly opposed to our own frustrations of energy, that one instantly understands the awe with which rougher, toiling generations could imagine a Paul Bunyan scooping out the Finger Lakes with his right hand, or an Alexei Stakhanov mining two hundred tons of coal in a single shift. Even the prospect of a thirteen- or fourteen-year-old teenager hitchhiking alone across the country—something which has doubtless happened many times within the lives of those still living—would, if realized today, likely make for national news.

In a time in which generations struggle to understand each other, everyone, young and old, is nevertheless in agreement that something has happened to the young. Young people are now consistently more distressed than our elders, a fact whose daily confirmation through experience is the only thing keeping us from recognizing its utter historical perversity. Youth has been stripped of its natural tendencies to energy, autonomy, and subversion. In our turning away from life, the generation christened with the albatross of the alphabet’s final letter has become a picture of something altogether foreign to the typical condition of youth: quiet resignation, despair, premature exhaustion. Recent studies of youth well-being have generated data points to this effect so hellish as to strain belief. The CDC reported this February that in 2021, 10 percent of high school students reported attempting suicide within the previous year, around twenty times the national average. (This was not an artifact of the Covid pandemic: the figure for 2019 was 9 percent.) A full quarter of high school girls reported having “made a suicide plan,” while 30 percent seriously considered attempting suicide. A high schooler may, of course, answer an official questionnaire about suicide in a mood of despair, boredom, fascination, ridicule, or unseriousness; and though a full 3 percent of high schoolers reported suffering injury from a suicide attempt, the suicide rate among those aged fifteen to twenty-four remains lower than that of any older age group. Yet a question deeper than whether they “really mean it” arises. Young people, especially girls, seem less attached to the idea of living than anyone can remember.

This distress does not appear to be subsiding with age: last school year, 47 percent of university students reported having screened positive for a mental disorder in their lifetime, while 44 percent actively screened positive for depression, 23 percent for major depression, and 37 percent for a clinical anxiety disorder. A staggering 28 percent reported self-harming in the past year, while 29 percent had taken psychiatric medication during the same time. Nor is the malaise limited to anxiety and depression. In 2019, the American College Health Association, taking 2010 as its baseline, recorded a 57 percent increase in bipolar disorder, a 100 percent increase in anorexia, a 33 percent increase in substance abuse, and a 67 percent increase in schizophrenia among undergraduates.

Such demoralizing readouts of data have attracted a fresh wave of attention to what must be termed, harshly but not inaccurately, “the problem of Generation Z.” In response to the latest CDC report, the Washington Post solemnly intoned that “teen girls across the United States are ‘engulfed in a growing wave of violence and trauma.’” A recent spate of articles has attributed the crisis to a variety of ills: social media use, supposedly increasing competition in schools, parental failure (here both authoritarian “tiger moms” and neurotic “helicopter parents” often come in for chastisement), and—rather unconvincingly—a cyclical return to teenage depression after the anomalously happy late 1990s and 2000s. Such bursts of analysis and counter-analysis have been occurring for several years, ever since psychologists like Jean Twenge and Jonathan Haidt brought the ongoing inversion in the spirit of youth to the attention of the public. A small but influential literature has sprung up since then, including such titles as Haidt’s The Coddling of the American Mind (cowritten with Greg Lukianoff in 2018) and Twenge’s iGen (2017) and Generations (2023). A team led by Stanford anthropologist Roberta Katz has also tried its hand, with the ambitiously titled Gen Z, Explained (2021).

While the discussion rambles on, one fact has already emerged which feels weightier than any particular hypothesis or piece of evidence: young people are now primarily an object of study, a darkened prism through which various shocks to human behavior may be discerned, rather than a cultural force in themselves. This is not entirely the fault of the researchers. American zoomers are indelibly stamped by one of the most totalizing and rapid perturbations of human behavior ever experienced: the coming of the iPhone and the social media which it enabled. The year 2012, when smartphone ownership reached critical mass in what Twenge calls “the fastest adoption of any technology in human history,” marks a turning point, if not in human history then at least in humanity’s experience of the same. We might term this the “iPhone shock.”

In the American context, it is easy enough to identify this as the trigger for the coterminous formation and malaise of Generation Z. Twenge, more than any of her colleagues, has demonstrated the link between the coming of the smartphone and the onset of youth distress in the United States. Around 2012, virtually every metric of teen well-being took a sharp downturn. Between 2012 and 2015, the number of teens failing to get seven hours of sleep per night increased by 22 percent, ultimately meaning that by 2015 a full 57 percent more teens were sleep-deprived than in 1991; presently, 60 to 70 percent of American teens “live with a borderline to severe sleep debt.” After 2012, the proportion of teens who regularly skip breakfast spiked from around a quarter to a third. Between that year and 2015, Twenge writes, “boys’ depression increased by 21% . . . and girls’ increased by 50%,” while between 2007 and 2015, the suicide rate among twelve-to-fourteen-year-olds doubled for boys and tripled for girls. In the years following the iPhone shock, the number of high schoolers and college students seriously considering suicide jumped by 34 percent and 60 percent, respectively. The 42 percent of teens who reported feeling persistently sad or hopeless to the CDC in 2021 represented the largest such proportion on record, with a sharp difference between boys and girls—29 percent and 57 percent, respectively.

It is no wonder, then, that the questions of technology and youth seem to have merged. Here we have the first human subjects for whom digital reality is as natural as speech and flesh, for whom the underlying grammar of technology has slipped from the sphere of rational understanding into that of instinct—and they are miserable. Have we not now reached a clear and early answer in our search for what ails us? Much, indeed, can be noted about the adverse impact of ubiquitous computing, as we know it today, on humanity in general: its swift conquest of boredom through chronic stimulation, its erosion of the individual’s capacity to concentrate attention and therefore create culture, its mediation of collective expression and instinct in such a way as to inflame social contagion and distressing habits of comparison (especially among young girls).

There is truth in all of this; the securing of a productive entente between human life and digital technology may be the most pressing social question of our time. But contemporary generational analysis tells us little about how to reach it. In fact, it tells us little about zoomers at all. None of the analysts have sufficiently explained why and how American young people were primed for such a swift emotional implosion when life was digitized. This failure becomes particularly glaring when we realize that not everywhere on earth has seen a technologically induced explosion in youth despair and neuroticism. Twenge’s arguments to the contrary are unconvincing. It may be, as she writes, that since 2012, reported teenage loneliness has doubled in Europe, the English-speaking countries, and Latin America, and has increased by 65 percent in Asia; and that the smartphone shock seems to have generally increased youth distress throughout the nations of the West, especially in the Anglosphere. But the mild increase in distress (from 15 to 20 percent) among thirteen- to fifteen-year-old Swedish boys after 2010, which she uses to prove her point, seems hardly worth the graph, especially when compared to their female compatriots and their American counterparts.

More importantly, an axiom only needs a few exceptions to become inadequate. A recent literature review has found no evidence for a smartphone-induced mental health crisis in Hungary, the Netherlands, and (strangely, though this is the one counterexample corroborated by Twenge) South Korea. Youth self-harm rates in Sweden and Denmark actually fell after the introduction of the smartphone. In Japan, frequent Instagram use is associated with decreased symptoms of distress among young adults, though Twitter has the opposite relationship. As the shock of the digital subsides and 2010s-style techno-pessimism grows stale, we have a chance to realize that, over the broad surface of the earth, different forms of life subsist. The world is thankfully larger than the American high school, though not overwhelmingly so.

Seen in this light, the iPhone shock is perhaps the least interesting thing about the American zoomer. Our mass developmental stunting cannot be laid solely at the feet of Instagram: we must also understand the institutional forces which made us this way. Compiling a list of our habits of being or quantifying the vagaries of our distress is not a substitute for this task—nor is a doleful wallow in the sorry catalog of our tics and slackened tongues, politely or obliviously ignored by the analysts and yet unavoidable for the honest observer. As the most basic problems of living reassert themselves, zoomer life becomes much more than a story of “life online”—it is a crisis to be grasped and overcome.

Therapeutic Control

Since the middle of the twentieth century, childhood in America has been slowly squeezed in a double envelopment: public disorder on one side, therapeutic smothering on the other. Key to this was the rapid collapse of urban life between the 1950s and the 1970s, with the beleaguered city displaced by the automobile-based suburb—a living environment deeply hostile to pedestrians, particularly those shorter than a car hood. This transformation dramatically reduced the physical space available for free, unsupervised play. More important, however, were the ways in which the passing of the high-trust, democratic social order—a very real crisis—echoed, twisted, and curdled in the national mind, giving rise to a new, paranoid outlook in the style of American parenting, which had long been laissez-faire to the point of anarchy.

First came the worries over juvenile delinquency in the 1950s and ’60s, including over sixty films on the subject and a successful effort by the Senate to effectively ban horror comics—an act replicated across the pond by Parliament in the wake of the Glasgow vampire hunt of ’54. Then came a more intense and less rational paranoia in the 1980s centered on “stranger danger” and increasingly lurid tales of child abduction, accompanied by more mundane campaigns of safety, such as the national push to simplify playgrounds triggered by the paralysis of the Chicago toddler Frank Nelson, who fell from a slide in 1978, one year before the infamous kidnapping and murder of Etan Patz in SoHo. This justified a parental and legal regime built around management, professional intervention, and—above all else—constant supervision.

It had long been normal for American parents to let their children roam the streets, forming malicious or merely mischievous gangs, “always stoning something, birds, or dogs, or mere inanimate marks,” as the writer William Dean Howells reminisced of his childhood on the Ohio River in the 1840s. But within a few decades of the advent of our contemporary low-trust norms of child-rearing, merely letting a child walk alone could earn parents an arrest on charges of neglect or “risk of injury to a minor.” This is almost certainly an unprecedented historical norm—and it makes our own past look like that of a foreign republic. Harry Truman’s second memory, from some time in the mid-1880s, was of being dropped by his mother out of a second-floor window into his laughing uncle’s arms. A repeat of such a stunt today would likely result in a visit from Child Protective Services, at least one arrest, a minor media firestorm, and the confiscation of the child.

A large study from the early 2000s found that most mothers restricted their children’s outdoor play, with 82 percent citing “safety concerns, including fear of crime, as reasons for doing so.” By 2021, the CDC found that 86 percent of high schoolers experienced “high parental monitoring,” defined as a student’s parents knowing “most of the time or always . . . where they were going or whom they would be with”—a rate that was remarkably consistent across racial groups. Kate Julian has also noted that “parents today spend significantly more hours caring for children than parents did 50 years ago,” despite an increase in hours worked outside the home. Between the 1970s and the turn of the millennium, children “lost about 12 hours per week in free time.” Parents lose out in this new order; so, too, do their overmanaged children.

Parental supervision, however, is not the only—or even the primary—form of therapeutic control. More important is the ongoing overreach of what we can call the therapeutic system—a disjointed and pious apparatus of well-meaning mental health professionals, social workers, and school administrators in symbiosis with the state—into both private life and civic institutions. Christopher Lasch, always more interesting as a historian than a cultural critic, traced the emergence of this system to the professional-class reaction of the late nineteenth century, but noted that it did not become hegemonic among the middle class until after the coming of the Cold War. It has remade the world of the young in its image, eroding their capacity for spontaneous organization and self-government, in favor of stage-managed dialogue, sterile “extracurriculars,” and arbitration mediated by school administrators. Along with the “urban renewal” campaigns of midcentury came the clearing away of the messy and enchanted world of children’s culture, with its recessed treehouses, deep politics, and “codes of oral legislation” for everything from contract enforcement to property disputes.

Children do not naturally submit to managerial authority, and so this process has almost always necessitated the ad hoc suspension of socialization as it arises. Most Americans under forty will remember the tendency of elementary school supervisors to interpose authority and hold “dialogue” between students in dispute, even in cases of students banding together to confront bullies. Access to peers is often withheld: I myself received a week’s suspension from recess in the fourth grade for, in the words of my teacher, “moseying down the aisle” while the other kids were standing in line; I was made to sit in silence with her in a darkened classroom as she ate her lunch and “reflect” on the harm that I had caused, a sort of Quaker hell which did me no certain good. The most direct method of demoralization is of course biochemical: around 10 percent of U.S. children are now diagnosed with ADHD, more than 90 percent of whom receive prescription stimulants.

In the context of the therapeutic school, Twenge cites the work of sociologists Bradley Campbell and Jason Manning to claim that “the United States has shifted from a culture of honor, in which people respond to a perceived slight themselves, to a culture of victimization, in which people avoid direct confrontation and instead appeal to third parties and/or public shaming to address conflict.” This is not entirely wrong, but it is disingenuous to describe mainstream American culture as being one of “honor” at any time since the nineteenth century. A society in which people are instinctually able to handle their own affairs is not necessarily ruled by dagger and vendetta; there is little reason to bring the Pashtuns, Sicilians, and Hatfields into this. A more accurate term than “honor” would be “democracy,” or perhaps “civil society.” By all accounts, millennials and zoomers are the product of a quite recent departure from the norms of self-rule, both at home and at school. This is much more a retreat from the demands of civilization than a pacification of the residual barbarian in us. (Twenge has since substituted “dignity” for “honor.”)

Learned helplessness must first be taught, and the consequences for the American polity have been grave. As Tanner Greer has written, “in the 21st century, the main question in American social life is not ‘how do we make that happen?’ but ‘how do we get management to take our side?’” The resulting failure of American institutions to produce integrated adults—let alone address the root causes of mental illness and social decay—is finally being acknowledged, if only elliptically, by mainstream opinion as the therapeutic system grows in power; Haidt’s Coddling is the most obvious example. But the discussion of zoomer dysfunction, even as it intensifies, rarely moves from the question of increased or, at best, improved therapeutic interventions to the deeper question of social organization and the renewal of national life. Remedial support for anxious children, though necessary, is not enough to reconstruct an institutional imperative toward true education and the development of children’s latent capacities.

Could the supposedly overinvested helicopter parents pick up the slack in the interim? Perhaps so—if the family itself were not rapidly weakening. The dual thrust of declining male incomes and increasing reliance on the therapeutic system (chiefly in the form of expensive childcare), described early on by Lasch in his 1977 Haven in a Heartless World, has likely contributed to this. The process has been swift: Twenge writes that “36% of [Gen Z] babies were born to unmarried mothers, up from 25% during the Millennial birth years.” The breakdown of the family, which so preoccupied midcentury sociologists as a problem peculiar to the black community, has now spread to all but the educated elite (the divorce rate is at a fifty-year low, entirely because the marriage rate is at an all-time low; in Massachusetts, the former has fallen by 32 percent since just 2011). Only 38 percent of zoomers grew up having nightly family dinners, compared to 46 percent of millennials, 59 percent of Gen Xers, 76 percent of boomers, and 84 percent of silents. Smaller family sizes have led 39 percent of zoomers to report feeling “lonely at least once a week growing up,” compared to similarly descending numbers across the older generations.

Even more remarkable, however, are the prospects for family formation among zoomers ourselves, as we deepen the trend set by the late millennials. In 2015, Twenge writes, 39 percent of high school seniors “expected to get married in the next five years,” a decline of 22 percent from 2007: the country now seems to be converging on what were previously professional class norms. Moreover, “the percentage of 18- to 29-year-olds who are married was cut in half in just eight years, from 32% in 2006 to 16% in 2014,” to which we might add that 63 percent of men under thirty are now single, up from 51 percent in 2019. Such changes are mostly driven by cultural norms and imitation, rather than material reality itself. It was not until after the advent of social media in 2012, Twenge notes, that the proportion of teens saying that they wanted children one day began to plummet (it is still in the low nineties, but falling fast). It seems that an increasing proportion of us will not just delay marriage, but forgo it altogether.

As young men and women anxiously retreat from each other’s lives and men drop out of the college circuit—soon two-thirds of college graduates will be women—we appear to be comfortably returning to premodern patterns of male success, in which the clear majority of men fail to reproduce, without a whisper of social upheaval. As Twenge writes, zoomers are “on track to be the generation with the largest number of single people in US history and the lowest birthrate on record.” It is easy to forget that the American birth rate was above replacement as recently as 2008. It is not so preposterous to think that the majority of zoomers will not have children. At work here is a logic much deeper than material decline: it is the increasing inability of men and women to reach an attractive settlement of life.

One last aspect of our collective arrested development must now be discussed: that its causes are, in part, physiological. Zoomers are the most chronically ill generation in modern history, and this despite the generational declines in smoking, drug use, precocious sex, and even fast food. Sleep deprivation and stress can explain some, but not all, of our mental and physical distress. The rapid proliferation of chronic health disorders since roughly 1970—autoimmune diseases, autism, allergies, obesity, dysbiosis and maladaptation of the human microbiome, among others—is necessarily concentrated among the newest members of society, and appears to be caused by a variety of environmental stressors: adulterated food, various toxins, perhaps some degree of iatrogenesis. Twenge, to her credit, notes that rising childhood obesity cannot be explained by dietary choices alone, and while she makes a plausible argument that the iPhone shock has led to a marginal acceleration in weight gain—“between 2012 and 2019,” she writes, “the number of preschool and elementary school age children who were physically active less than half of days doubled,” and the rate of increase in BMI doubled during the Covid pandemic—neither sedentism nor the post-2012 surge in sleep deprivation can explain the general trend. Over half of Americans now suffer from a chronic illness.

It is strange, then, that both Katz and Twenge place such stock in the expectation of rising life expectancy among the young. Twenge does acknowledge that U.S. life expectancy took a sharp downturn during the Covid pandemic, but she fails to mention that American “healthy life expectancy,” the number of years a person could expect to live in full health, has been falling since 2010. Life expectancy proper peaked in 2014 and has since declined, due mainly to a surge in drug overdoses; even without the Covid virus, American life expectancy would have still dropped by an entire year in 2020, mostly due to excess deaths among the young from overdoses (and, secondarily, problems of public order: shootings and road deaths). Over the longer term, since chronic illnesses usually accelerate the aging process, it does not appear that U.S. life expectancy will exceed its pre-Covid peak anytime soon.

American institutions seem wholly unwilling, and perhaps unable, to address the systemic roots of popular sickening. Even attempts to change consumer choices—Michael Bloomberg’s politically suicidal jihad on soft drinks, Michelle Obama’s gentler initiative to increase the number of American children able to do a jumping jack—have given way to a sort of therapeutic enabling: a credo of “acceptance” that easily crosses over into a normalization of obesity and other chronic debilitations. It seems that we will have to wait for clinical practice, and above all social organization, to catch up with medical research before we can realize the true promise of modernity: longevity graced with vigor, with one’s years spent in the full exertion of mind and body. In the meantime, care has replaced cure.

The “slow life history strategy”—later marriages, smaller family sizes, more resources per child, longer developmental processes—which Twenge postulates as a response to competition and technological progress is evidently failing to make up for institutional decay and health decline. In fact, boys and girls are now showing physiological responses that are more characteristic of a “fast life history strategy”—in which adolescents, under the stress of a resource-rich but also hostile and uncertain environment, mature more quickly in order to reproduce early and often. Since the middle of the twentieth century, the average age of onset for puberty has dropped by 1.5 years in boys, while the average age of breast budding in girls has dropped by roughly two years (making depression among teenage girls more likely). Obesity can explain some but not all of these changes; the more likely culprits are endocrine disruptors. Lastly, the rapid fertility decline in men and women alike, perhaps the most grim piece of evidence for ongoing biochemical disruption, makes it more difficult to delay family formation. The tradeoff between quantity and quality—the linchpin of demographic transition theory—is now squeezed from both sides: people are having fewer children, but those children are still made sick by an inhospitable environment.

Physiologically and socially, young people are forced into a sort of perverse adaptation. Sexual maturity comes early (even if its consummation is an increasingly rare thing), but mental maturity comes very late—if ever. Under the strain of this uneven maturation, young people are taught to transact with each other in a sort of mutually paranoid emotional surgery. A strange new ethic of detachment is emerging: one must never allow oneself to feel, to commit, to sin or to repent, to stake one’s ground. Options must remain open. As the structures which have lent direction to the individual fall away—civil society, family, courtship, apprenticeship, citizenship, patronage, education, belief—the self is at once bared to the world and made limp. Masks drop, relations flatten—and the walls around the heart and mind draw tight in defense. Just as our social order preempts any stirrings of personal development in the individual, so too must emotions be cauterized at the source. To be mature now is to fend off any incursions of the world on the self. The central task of the therapeutic system is to suspend this state indefinitely, to manage an increasingly feeble impetuousness of life—a task which, as the destruction of the human capacity for culture, is the only culture war worthy of the name.

Stillborn Radicals

In spite of all conditioning to the contrary, zoomers do seem to believe that the world can change. According to Katz, 27 percent of zoomers believe that the U.S. political system needs “some” reform, while 40 percent believe that it needs “a lot” of reform. She cites a study of zoomer life goals that finds that, after a “happy marriage or life partnership,” zoomers rank “having a positive impact on the world” above all other options, including, in order, career success, “having fun,” realizing one’s potential, and having a good education. Twenge notes that three out of four zoomers desire “significant changes” in the “fundamental design and structure” of the government. Both authors also document an emerging metropolitan hegemony in outlook among both millennials and zoomers, superficially anti-systemic in inclination or at least mood, which is subsuming the two-camp Kulturkampf of our parents. Commendably, Katz identifies a streak of “stagnatophobia” among zoomers—we are afraid not of change, but of the possibility that things will continue on as they are.

Yet there is something perverse in our reformism. We are, it would seem, “socially conscious,” strongly dissatisfied with the world as it exists, deeply skeptical of “traditional institutions,” and . . . unprecedentedly inert.

Contrast, for example, Katz’s findings of rampant anti-institutional sentiment with our near-perfect everyday subjection. Our reflexive distrust of “systems” and “institutions” in general rationalizes our own inability to build anything of substance, and leads to an implicit affirmation of today’s institutions in particular. Folk skepticism of “authority” and “religion” is not, as Katz would have it, the same as questioning authority as it is currently constituted; it seems, in fact, to be a nostalgic tic received from our elders, one which preempts the possibility of new organization. It warms the corpse of the old order strung up to keep watch over the living.

This contrast is most visible on campus, where “affinity groups” mushroom in inverse proportion to the level of real self-governance, and where activists can imagine nothing more than to demand greater administrative control and more sophisticated or invasive therapeutic services. We are united in our general frustration with life, and yet incapable of imagining ourselves advancing a concrete task of civilization or self-governance. Our frustrated energies thus turn inward. Social controls intensify in service of a vague and predictable resentment. Decorum is endlessly renegotiated. New forms of authority and litigiousness come to dominate human affairs and yet hold to no positive law. The young person is given the choice between participation in this stilted pantomime of discourse on the one hand and resigned disengagement on the other.

In the context of this reality, Katz’s claim of a “new participatory politics” arising among the young is deeply misleading, and when she opens a chapter with a credulous epigraph from a campus activist—“at the click of a button, we can start a movement”—her analysis veers into the absurd. Zoomers are likely the generation least capable of social organization in American history, simply by virtue of the institutions which formed us. In the place of public deliberation we have only inherited a pew in a choir of numbingly resourceful euphemists. The simultaneous atrophy of public life, private life, and civic organization—and the triumph of therapeutic social management within the American mainstream—has allowed us to confuse social froth for “politics.” A civilization caught in an eddy will spin quickly around its central axis, but it will not move.

Of course we know nothing of politics: we do not know its preconditions. None of us ever stormed a cemetery at the head of a teeming host of grade schoolers. Those who consider themselves high-octane activists would do well to consider that until around 1960, many American public high schools had a more energetic network of secret societies than any university existing in the developed world today. These were capable of remarkable feats of organization: in the Chicago of 1902, after football team captains arrived to practice and found, to their indignation, that they had been usurped by professional coaches—the failed drill sergeant and the Little League dad had yet to emerge as archetypes on the schoolyard scene—the societies initiated a wave of mass protests which lasted for six years, stopping only when the courts intervened. Between the Progressive Era and the Cold War, school boards and courts crushed these groups and replaced them with the supervised form of “extracurriculars” which every student knows today, but as late as 1952, a single school district in Portland, Oregon, counted 280 such societies.

Though the internet, social media, and the group chat are miraculous gifts in an age when most institutions have lost the right to our attention, American young people have yet to use these tools to regain their capacity for social organization. Against the notion that our improved communications technology and busy discourse have expanded our powers of self-rule, as Katz would suggest, consider only the words of the liberal French statesman Michel Chevalier, writing in 1835 of his experience in Cincinnati:

A half-word, they say, is enough for the wise; but cleverer than the wisest, the Yankees understand each other without speaking, and by a tacit consent direct their common efforts toward the same point. To work Boston-fashion means, in the United States, to do anything with perfect precision and without words.

Nostalgia for the political age—usually expressed in the desire for some sort of “revolution” and a return to the class organization of the early twentieth century—seems to have died with the millennials. Given their qualities, goals, and choice of tactics, their failure comes as no surprise. Zoomers are, at least, more honest; we seem to be interested in securing the basics of life, some personal settlement with a heartless world, and little else besides. There is dignity in this—a chastened, mercenary attitude is what the age demands. Where millennials made poor leaders, zoomers may yet make good followers. But we should not delude ourselves with the thought that we have broken with the fundamental attitude of the millennials. One of Katz’s interviewees put it well: “sometimes it feels like we’re screaming at these institutions to take care of us, at least together.” This is the clearest articulation of the social contract of the therapeutic system that you will find among the young. Against it there have been no credible demands for autonomy or self-rule, political or material change.

In truth we have forgotten the very meaning of “change,” because we are unable to reckon honestly with the current state of our living and to name the forces which made us this way. “Change” for us is something abstract, something to be invoked only when it is impossible to imagine. It is a picture hung superfluously on the wall of a barrack whose confines we have mistaken for the edges of the world.

Still, we can at least say that we are on the right track. The spirit of the age is closer to that of Brezhnev’s Russia, numbed and deadened into stability, than it is to the tedious self-expression of Abbie Hoffman’s America. Residual skeptics will see the burgeoning desire to have one’s own garden—ashamed though we are to acknowledge its pull—and talk only of bourgeois capitulation. But since so much of the intercourse of national life is denied us and so many of the basic functions of the social order have frayed, any escape must proceed downward. Not into ourselves, no—there is very little to be found there—but toward the very fundamentals of life; ad fontes, to the source, as the humanists said. Activism, politics, and even high policy have become increasingly unintelligible, impotent before the deeper problems of the age—to say nothing of “ideas” or “philosophy.” Our age can only be one of preconditions. Our highest and only accessible task is to take stock of our resources and our technology, and through them recover the possibility of human development.

The Hope of the Task

Considering the immense amount of attention, analysis, jealousy, resentment, and hope that the leading lights of our social order accord to the problem of youth, it is remarkable how little they actually have to say. Take, for example, the suggestions of Twenge in her concluding chapter, “Understanding—and Saving—iGen.” Our salvation is to be achieved by going outside, spending time with family and friends, attending regular therapy sessions, sleeping, eating a diet high in omega-3 fatty acids, “avoiding rumination,” and—naturally—putting down the phone. One finds little to negate here. A social animal should eat, sleep, and socialize, just as he should also breathe and excrete. How all of this is to be accomplished is anyone’s guess. You cannot simply tell people to live.

“Put down the phone,” says Twenge, “and do something else”—but what? Commentators of every stripe have tried to insert their preferred schemata into the space opened by this question, without success. A recent selection will suffice: community, movements, organizing, values, culture, identity, tradition, vitality, new religions, new cultures, new elites—nearly all without reference to particular factions or goals. These are mundane delusions of the kind that can only sit comfortably in the mouths of the educated and over-educated. Such a groping recourse to the highest level of abstraction, to such an unimaginative social functionalism, is the generalized equivalent of telling a depressive to find meaning in life. You cannot simply tell a society to live. In all of this we see what Spengler called “the horror of values supervening from nowhere.”

The skins of past ages—whether one prefers the 1960s or the Counter-Reformation—have been drawn so taut and vice-like over the mind that we seem to have forgotten that they once belonged to living creatures. “Tradition,” with its sanitized children “community” and “identity,” is only ever born as a by-product of living and acting for some purpose. Only after that process has been exhausted are the skins molted and ready for veneration. It is as if a titanic boxer has barreled his way through a forest, and all we have left from his charge are the burs which once stuck to his gloves, frozen under the microscopic gaze of our dissertations.

Skillfully ignored in all of this, in a way that only those smart enough to think themselves outside of reality can manage, is the prospect of a change in the actual conditions of our life. But this is the most pressing question for all of us, young and old. The developmental failures of the present reveal, in their negative, a great transformation of American civilization. Each task of this process, in its elegance and inner logic, may be completed essentially without regard for social utility. Yet if any piece of it is done, it will trigger an organic, natural, and irrepressible demand for the expanding possibilities revealed in the grain.

The many failures of the educational system, in particular the desiccation of the universities, call for a supplemental network of new academies and peripatetics who can provide true education for children and young adults as they negotiate the demands of institutional life. Virtually the entire food system must be remade from scratch, including not only the removal of toxins and factory farms but also the regeneration of the North American edible forest biome. Chronic illness calls for a new school of lifestyle medicine, a thorough review of the molecular spectrum of the American environment, and likely a transition in the materials used for everyday life. Much of our built environment, at every scale from interior design to architecture to urbanism, demands to be rebuilt according to human order. Rarely in American history has there been an imperative so transcendent, demanding the efforts of every field of human inquiry, of both government and society. The disaggregated legions of young men who spent their childhoods leveling mountains in Minecraft, and who are now languishing numbly in their basements, may yet be swayed by the promise of raising new cities, burning off swathes of prairie, and feasting in beer halls.

Such tasks cut to the deepest, most fundamental layer of our national life, and they are therefore the highest form of pragmatism. It may sound strange to say this in a country where the most pressing questions seem to be those of nanometer-scale semiconductor design or interest rate policy. But this is only because we are the victims of a civilizational myopia. In the course of the life of nations, there are a series of brutally simple constraints which must be broken if society is to develop: enemies as ancient as Malthusian traps, punishing heat and humidity, and backward institutions. For the last fifty years, we have been sliding back into a muck of such intractability that it threatens to restore the historical norm of permanent stagnation. The reasons for this go beyond the possible exhaustion of the industrial breakthroughs of the 1870s. A chronic illness of the social order—pervasive rentierism, excessive management, cities which are increasingly unable to grow or support life, cost disease, the decline in popular competence and energy, the shredding of national talent on non-problems—has swallowed the revolution in digital technology without much improvement to national life, and already threatens to neutralize the turn toward industrial policy and social spending.

The question facing this generation is whether modern society, a fleeting animal more strange than we can know, is still possible. Such ideals as truth, self-rule, or rule of law stir up among us a sort of discomfort, as if we do not quite know what to do with them; there is a creeping sense that the basic impulses underneath the forms have died. The crises of the social contract—the loss of popular vocation, the inaccessibility of urban life, the passing of the promise that every man and woman will have the chance to marry within their social class—are, however, felt more acutely. We are slipping into a world of caste, superstition, and quiescence. It will be a sweet and stupid world, and a long time will pass before it comes to ruin. Against this there is demand for a renewed settlement of life, but it is still inchoate—and the shape of an era is usually negotiated only once.

Our immense reservoirs of money and instrumental cleverness no longer look so impressive when compared to our inability to change the fundamentals of life. A country which neglects the development of its people, institutions, and environment will have no success in transforming itself, especially if it aims to recapture the industrial capacity that depended, ultimately, on social facts that no longer exist. We are thrown back, embarrassed on the most basic questions: not even an empire’s worth of effort can compensate for our intolerance of treehouses for children. We must learn to speak of true education, of health, food, cities, and sculpted land. We must speak intelligibly, or not at all.

This article originally appeared in American Affairs Volume VII, Number 2 (Summer 2023): 192–208.
