
America’s Unhealthy Gerontocracy

America in its present state of decline increasingly resembles the late Soviet Union, but one of the most unsettling parallels is its unmistakable slide into gerontocracy. From Trump to Biden to Sanders to Pelosi to most of the Senate, one might think that the biblical three score and ten had become a mandatory minimum for holding office in this country. By 2024, for twenty-four of the previous thirty-two years, America will have been led by people born in or before 1946.

While gerontocracy is most obvious in politics, it is present throughout American life. The average ages of university professors and administrators, banking executives and corporate CEOs, and many other leading figures have all been steadily rising for some time. Perhaps Silicon Valley has been so successful precisely because it is the only place in America where people who are not on the cusp of senility can get promoted or raise capital. Conversely, perhaps the pharma lobby is so successful because it is not only the biggest donor but probably the largest vendor to the assisted living facility that is Congress.

The fragility of this gerontocracy has been ruthlessly exposed by the Covid-19 epidemic. The crisis has also shown the damage that can be caused by a ruling class more qualified to be in long-term care than to hold important and intensely demanding positions.

The Gerontocracy Devours Its Children

One of the most amazing aspects of the global response to the coronavirus has been the total refusal to classify and treat populations according to age-related risk. Although Covid-19 can be quite deadly among older populations, it is well established at this point that it poses a fairly minor risk for people under fifty. For those under thirty, the risks associated with lockdowns—increased domestic abuse, suicide, depression, drug abuse, economic hardship—are almost certainly worse than the disease itself. There are of course exceptions (e.g., young people with compromised immune systems, old people who have already had the virus), but these are easily accounted for. Yet the gerontocrats who set policy are apparently unable to even consider these factors.

Instead of imposing blanket lockdowns on everyone, it would make far more sense to isolate the elderly, while allowing those under fifty (or even forty if one wants to be really cautious!) to go to work, visit restaurants, go on vacations, attend gatherings, and have real lives. Fortuitously, places like large office buildings, restaurants, and airports are already well equipped to check IDs and ages, as opposed to enforcing (somewhat arbitrary) “social distancing” requirements. Temporary housing would need to be arranged for multigenerational households, but these are relatively rare in the United States, and we have already shown some capacity to improvise additional housing during the crisis. All of this certainly would have been—and still would be—much better than shutting down the entire country for everyone. But it was never even considered.

Moreover, the political controversies arising from months of lockdown policies could have been largely avoided. The whining and double standards around protests, for example, were utterly farcical. The political content of a protest clearly does not affect whether it is good or bad from a “public health” perspective, but the fact that most protestors were young limited any risks. And for all the wild stories coming out of the Seattle “autonomous zone,” a Covid-19 catastrophe is not one of them—maybe because everyone there is fairly young. Likewise, rather than racialize mask mandates (as one Oregon county did) or selectively enforce them (more or less inevitable everywhere), they should be totally unnecessary unless one is in the presence of old people.

Instead, policymakers pretend that there is no difference between a thirty-year-old and a seventy-year-old. Why? Perhaps because baby boomers cannot imagine the world existing without them? Because they can’t even conceive of the existence of anyone but themselves?

One statistic not readily available is the number of young people’s lives that have been stifled or destroyed to preserve the precious egos—and assets—of the boomer generation, both now and over the past few decades. Indeed, one reason I suspect that so many statues are being torn down today (often indiscriminately) is that it is frankly easier to topple a statue than it is to displace the boomers and their failed policy consensus (though at this point many of the statues are probably not as old), even in the midst of a pandemic.

A case in point is universities: Many are refusing to open or will switch to virtual learning for the upcoming semester. The risks to the student population are exceedingly low, but seventy-plus-year-old professors say they don’t want to teach with the threat of Covid (as if they ever wanted to!). There are thousands of younger academics, however, who have faced dismal professional prospects for years, and who could probably do a better job. Yet our gerontocratic institutions would prefer to sacrifice education for everyone in order to pad the incomes of septuagenarians who should have retired years ago.

The Aging of Science

The effects of these perverse generational dynamics are not just economic. Fittingly, perhaps, they are most pronounced in the realm of science, even the science of viruses. To quote the principle named for Max Planck, “Science progresses one funeral at a time.” As Planck himself put it:

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it. . . . An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarized with the ideas from the beginning: another instance of the fact that the future lies with the youth.

This scenario plays out every day in American universities, especially in arcane specialties (most of them) where peer review is not subject to any wider scrutiny. As long as senior figures remain in their positions, articles challenging their theories are unlikely to make it past peer review. It is not difficult to argue that the slowing of scientific progress in recent decades is at least partially due to the refusal of aging academics to retire, even when they are well past their prime.

To be sure, aging boomers are not the only problem facing contemporary science. Take, for example, the case of Neil Ferguson, who is only fifty-two. Despite a career littered with failure and wildly inaccurate forecasts of basically every major epidemic during his professional life, he remained the UK’s top infectious disease expert. And despite relying on buggy code in his modeling for years, severely overestimating the coronavirus death rate, and violating his own health advice to engage in an affair at the height of the crisis, he is still cited as an expert in major media. A sociology of the scientific establishment that could explain the persistence of figures such as Ferguson is desperately needed. One harsh explanation is that public health is simply not a high-prestige discipline in academia or the private sector (though of course it is not the profession’s fault that society prefers its statisticians to become financial speculators and the like). Perhaps that will change now, but the brutal reality is that the field has not attracted the best and the brightest over the last few decades—and it shows.

American scientists also seem bound by intractable and seemingly inexplicable hierarchies in their approach to the coronavirus. For instance, after a preliminary paper came out in February arguing that the virus originated recently in a bat cave, there has been virtually no effort to investigate this claim further. But Indian scientists have shown that a highly similar virus infecting humans was discovered in an abandoned mineshaft in China as early as 2012. The Wuhan Institute of Virology initially misclassified it as a fungal infection, though the error was later corrected. If Western scientists are interested in pursuing these questions, however, it is, at the very least, not widely discussed. Perhaps too many important people would be embarrassed about being wrong for years.

The specific problems of America’s scientific gerontocracy, however, are perhaps best epitomized by the seventy-nine-year-old Dr. Anthony Fauci, who has bizarrely emerged as a media hero. Note that in criticizing Fauci, I in no way wish to exculpate Trump or his administration, which deserves all the criticism it has received for its astonishing incompetence, as do other agencies like the CDC. But the media elevation of Fauci is especially ridiculous (though, like his foil Trump, Fauci loves the camera and seems willing to say whatever he thinks his audience wants to hear).

The fact is that Fauci totally missed the significance of the virus for many months when it would have been easiest to contain it—an astonishing and inexcusable professional lapse, which Trump cannot be blamed for. As late as January 21 of this year, Fauci gave an interview to NewsmaxTV(!) downplaying the severity of the virus and its risk to Americans. At this point, the threat posed by the virus was reasonably well understood in Asia and even among lay commentators in America. Indeed, a blogger formerly known as “Mencius Moldbug” offered a more accurate picture of the virus around this time than “America’s foremost infectious disease expert” Anthony Fauci. As the kids say, let that sink in.

One might have thought this glaring failure would have chastened Fauci—or provoked his resignation—but it seems to have only fed his hunger for the spotlight and another fawning profile. At the same time, the media made little effort to scrutinize Fauci’s advice behind the scenes. Insider accounts (some reported secondhand, some not) reveal that he was hesitant to aggressively expand testing to contain the epidemic, initially over concerns about accuracy, even as late as May. On the other hand, he has been an early and aggressive promoter of Remdesivir, despite comparatively little evidence of its effectiveness. Curiously, this mirrors a previous episode of his career, when he was an early and aggressive promoter of another expensive antiviral drug, AZT, used to treat AIDS, which proved to be highly controversial and overhyped. I of course am not qualified to second-guess Fauci on AZT or Remdesivir, but it’s hard not to notice the appearance of a near-octogenarian desperately reaching back to his glory days and grasping at straws—as well as the naivete of his beatification in the media.

Regardless of how those questions are ultimately answered, it is clear that Fauci has no ability to escape ingrained path dependencies, no ability to examine issues beyond the most superficial level of case numbers, and no ability to differentiate risk. In fact, so far there is no indication that Fauci has any specialist knowledge of this particular coronavirus at all. In his public statements, he mostly just repeats the news headlines and issues vague calls for lockdowns—a sensible approach for someone with the risk profile of an eighty-year-old, but perhaps not for most of the rest of the population. Look beyond the glowing media portrait, and one sees an overwhelmed, exhausted old man, unfit to hold a demanding and critical position that he should have retired from years ago.

Would younger people have done better in that and other offices? Who knows? But they could not have done any worse. And it’s rather pathetic that this country apparently cannot find a disease expert young enough to qualify even as a baby boomer, never mind finding one under the legal retirement age!

Sunset Provisions

The same can be said for the rest of the American gerontocracy—in politics, business, academia, and beyond. The decisions the boomers and near-boomers made in their prime were mostly terrible, and their record in their twilight is even worse. It is time for younger people—that is, people of normal working age—to take over these positions.

If anything, the Covid crisis has shown that it is dangerous and irresponsible to have almost every important office filled by the population cohort most susceptible to being wiped out by a pandemic. In addition to adjusting lockdowns and quarantines to account for age, mandatory retirement policies should be instituted. (This would merely bring the United States into line with Europe.) Higher tax rates should be imposed on anyone who continues to work full time or hold “systemically important” positions after age sixty-five, and those over seventy should be automatically forced into retirement where they belong. The elderly can still serve as part-time advisers, freelance writers, and so on, but they should not hold important positions. Current political officeholders over the age of seventy should have the good sense to resign, and the Constitution should be amended to include maximum ages in addition to minimums. If a world of pandemics is the “new normal,” then let’s be serious about it.

When the baby boomers were young, they coined the expression “don’t trust anyone over thirty.” Like so much of boomer thinking, that is completely idiotic. But there is something to be said for the notion that anyone too old to drive on a busy highway should probably not be trusted to run the country’s most important institutions in a time of crisis.

In the end, and insofar as generational stereotypes have any validity, the boomers will be remembered for being born in a country in the midst of its New Deal, postwar apogee and leaving behind a neoliberal hellscape. In the prime of their life, they inherited a unipolar hegemon and turned it into an increasingly failed state. Perhaps it is no wonder, then, that they would rather preside over the destruction of every future generation than honestly confront their own mortality.

No one likes to admit it, but politics follows Planck’s principle, too.

This article is an American Affairs online exclusive, published June 25, 2020.
