The Modern Diet Is a Biosecurity Threat


In 1931, an American dentist named Weston Price closed his practice and began a long tour of the known world. Entering his early 60s, he had already enjoyed a long and illustrious career and could be counted among the most influential practitioners of dental medicine in the United States. As chair of the American Dental Association’s research committee, he had helped lead the charge—later reversed—against root canals and in favor of tooth extraction, and in his research publications he had pioneered new technologies to monitor dental health. But his voyage abroad would be no end-of-career capstone. Price was determined to find the answer to something that had started to bother him: the gnawing suspicion that the modern diet was causing a serious degradation in human health.

Price had noticed a rapid increase in dental problems roughly around the end of the nineteenth century. Before, he had been able to expect a baseline of dental health in his patients and research subjects. Now, that was rapidly slipping away, and cavities, once rare, were quickly proliferating. Tooth decay was everywhere. Something had clearly changed, and the modern diet—then only in its infancy—was the leading suspect. Medical professionals across the Western world were observing the same trend: as the Boston dentist Gustave Wiksell noted in 1902, “before American-process flour was shipped to Sweden, two generations ago, two dentists were enough for the whole city of Stockholm; now they are as thick as in an American city.”

Price wanted to find out what exactly was going on. The deterioration was due not just to the newfound presence of bad substances in diets, he thought, but also to the absence of some unknown “essential factors” that had gone missing with the new diet society was adopting. So he was determined to go to places where the modern diet had not yet found its footing, and to study the health of groups living on their traditional diets.

Over the following years, Price traveled to virtually every corner of the globe. First, it was off to the Alps, to spend time among Swiss villagers; then to the isolated Outer Hebrides off the coast of Scotland, and then to Inuit communities in Alaska, indigenous communities in northern Canada, and parts of the United States. Then he journeyed to the archipelagos of the South Pacific to survey the health of traditional Melanesian and Polynesian tribes; to central and eastern Africa, meeting thirty different tribes, from the Maasai to the Tutsi; and to Australia and New Zealand, to observe the diets of Aboriginals and Maoris. Finally, he went to Latin America to study the teeth of ancient Andeans and their Peruvian descendants. In 1939, after eight years of intensive travel, Price published his findings in a lengthy book titled Nutrition and Physical Degeneration.

Price’s book is a strange read. There is little hint of the condescension and chauvinism of his time toward “primitive” societies, or of the disinterested gaze of the contemporary professional researcher. Price seems to fall in love with nearly every traditional group he encounters, sometimes romanticizing them to the point of excess, but nonetheless treating them with a deep respect bordering on awe. “The reputation of the Maori people for splendid physiques has placed them on a pedestal of perfection”; the skill of the Maasai in killing lions with a spear is “one of the most superb of human achievements”; the Swiss of one village in the Lötschental Valley have “some of the finest physiques in all Europe,” along with such a remarkable level of social trust that they “have neither policeman nor jail, because they have no need for them.” It is a highly discursive and repetitive book, filled with lengthy asides and observations—on the splendor of Inca civilization, child-rearing among the Aboriginals, and the methods by which Torres Strait islanders kill sharks.

But the book is most interesting not as a travelogue, but as an account of Price’s painstaking medical examinations. The data they produced is documented to an exhausting degree and is accompanied by striking photographs of his examinations. Virtually everywhere he went, all those who had not yet adopted a modern diet and were still eating their ancestral foods—whether based on fish, meat, dairy, or fruit—exhibited far better dental health than those eating modern diets. People who had never seen a dentist or orthodontist were in no need of them. In many isolated groups, Price found no instances at all of tooth decay. Among the Dinkas of the Nile, for instance, he found that only 0.2 percent of their teeth were affected by decay or cavities; among a group of Inuits along a tributary of the Kuskokwim River in Alaska, 0.3 percent; among the Maasai in southern Kenya, 0.4 percent.

When those same groups started eating what he called “the foods of commerce”—white flour, sugar, jams, marmalades, canned goods—their teeth rapidly degraded, with cavities becoming endemic. He was horrified by what colonization had done to the health of the Aboriginals of Australia: “In their native life where they could get the foods that keep them well and preserve their teeth, they had no need for dentists. Now they have need, but have no dentists.”

And it was not just a matter of rotten teeth. Almost everywhere he looked, whether among Swiss breadmakers or Nilotic hunter-gatherers, he found that traditional diets were more conducive to physical development than modern ones. In traditional communities, people had levels of strength and physical ability that astonished him. Swiss village children could go barefoot in glacier-fed rivers without developing tuberculosis; the Maori could allegedly see moons of Jupiter “which are only visible to the white man’s eye with the aid of telescopes”; the swimming ability of remote Scots was “almost beyond belief.”

In the years since Price’s book was published, it has attracted a small but devoted following. His arguments—that traditional food reflected time-tested wisdom; that soil and grass quality affected food quality and thus human health; that modern food was deeply unconducive to healthy living—have only gained currency in the last few decades, and influential food writers like Michael Pollan and Gary Taubes have cited Price as an influence on their work.

Though unscientific by modern standards—Price was more of a nineteenth-century natural scientist, part ethnographer and part medical researcher—it remains a startling book. Much about Price is strange, like his monomania about poor nutrition as a cause of juvenile delinquency or war. But his book is remarkably persuasive in making the case that something had gone very wrong with the modern diet.

The Nutrition of Our Discontent

In the 2000s and 2010s, many decades after Nutrition and Physical Degeneration was published, there was a flurry of efforts to tackle the “lifestyle diseases” that had by then become endemic in Western nations. These included Michelle Obama’s ill-fated “Let’s Move” public awareness campaign about childhood obesity, complete with a companion song by Beyoncé; the banning of particularly large sugary drinks in New York City, soon struck down by the state’s courts; and a variety of short-lived government advertising pushes to encourage people to eat healthier or simply take the stairs, all of which rapidly hit the rocks of widespread apathy.

Americans did not get any healthier, or any less prone to chronic diseases. By the second half of the 2010s, political elites had quietly given up. Both Democratic and Republican platforms dropped mention of the obesity epidemic in 2016, with no reappearance in 2020.

When COVID-19 arrived in the West, population health became a key variable, with the highest death rates seen in the countries with the most obese populations. The twentieth-century shift in mortality from infectious to noncommunicable disease—what demographers call the “epidemiological transition,” linked to the “nutrition transition” that reshaped global diets—largely eradicated diseases like polio and yellow fever, but it did not lead to conditions of general health. Instead, it created populations that are chronically ill, and thus require near-constant medical attention; in turn, healthcare systems shifted from treating acute diseases to managing populations that are permanently, but “manageably,” ill. More medicine than ever, but less health.

It’s easy to view this as a necessary cost of undeniable progress. People are richer and better-fed, so more people will become fat; people are living longer, so they’ll die of cancer or heart disease, rather than typhoid fever or cholera or a spear in the side. There are negative externalities, but what matters is that the lines have been going up.

But this triumphalist narrative is only a partial account of what has changed in the last two hundred years. It is true that the “average person” is dying later than they would in prior centuries, though crude demographic averages typically present a more brutish picture of the past than is accurate. People who survived childhood could regularly expect to live about as long as Westerners do today. In his study of the !Kung bushmen of the Kalahari desert, the anthropologist Richard Borshay Lee writes that the proportion of individuals over 60 years of age “compares favorably to the percentage of the elderly in industrialized nations.” Likewise, the modal lifespan among the Tsimané of Bolivia is 70 years—not bad for a group that practices a subsistence lifestyle combining hunting, gathering, and farming.

It is not just hunter-gatherers who are overlooked by life expectancy statistics. Consider another maligned population: the working classes of mid-Victorian Britain (roughly the years between 1850 and 1870). Setting aside infant mortality, life expectancy for mid-Victorian Britons at age five was 75 for men and 73 for women, not too different from what is enjoyed today in the U.K. The numbers look even better when set against the current life expectancy of British working-class men—around 72 years—a population compositionally more comparable to the mid-Victorian one. Life expectancy for working-class women has increased only slightly, to 76 years.

Of course, in some regards health in modernized societies has improved: we do indeed live longer than in the past and are much less affected by hunger and infectious pathogens. But in other regards, health has become much worse. When scholars like Marshall Sahlins or Jared Diamond contrast hunter-gatherers with agriculturalists, they do so with the suggestion that hunter-gatherers were in many ways healthier than us. Diamond reaches the conclusion that the Neolithic agricultural revolution was “the worst mistake in the history of the human race.”

Many elements of that thesis are true: the Tsimané, the Hadza of Tanzania, and the Kitavan Islanders of Papua New Guinea are fascinating for their remarkably good health. Numerous studies document the total absence of the “diseases of civilization” that ail us today. Acne, which is so common in the West as to be mundane, simply does not exist among the Kitavan Islanders or the Aché of Paraguay. Among the Tsimané, brain atrophy with age occurs far more slowly, and the population has the lowest levels of coronary artery disease ever recorded.

But emphasizing the hunter-gatherers overlooks just how recent much of this deterioration has been. The advent of agriculture did inaugurate an epochal decline in health, with populations becoming shorter and more prone to a variety of illnesses. The advent of the modern food system, however, only took place in the nineteenth and twentieth centuries. That change brought on a deterioration in health of similarly traumatic magnitude—one that is still playing out.

Look, for instance, to a remarkable set of studies published in the 2000s in the Journal of the Royal Society of Medicine on the diet and health of mid-Victorian Britons. These found that the conventional idea of Victorian eating patterns—emaciated orphans begging for gruel—is far from the truth. The British working poor and peasantry of the 1850s and 1860s could enjoy a diet “vastly superior to that generally consumed today, one substantially in advance of current public health recommendations.”

The mid-Victorian diet, the studies found, was rich in vegetables, fruits—especially cherries and apples, with between eight and ten portions a day—Omega-3 fatty acids, nuts, whole grains, and meat. This meat, in turn, did not come from expensive prime cuts, but mainly from micronutrient-rich organ meats, drippings, and bones. Though the mid-Victorian working poor may have been hungry more frequently than we are—food was not always instantly available for “snacking” in the way it is today—they almost certainly ate more in absolute terms: “Due to the levels of physical activity routinely undertaken by the mid-Victorian working classes, calorific requirements ranged between 150-200 percent of today’s historically low values.” It was not a perfect diet—food quality, especially when it came to meat, left much to be desired—but it was by no means the starvation diet we typically think of.

All of this created a population that was far healthier than we are accustomed to believing, reared on “something closer to the Mediterranean diet or even the Paleolithic diet than the modern Western diet,” with far higher intakes of micronutrients and phytonutrients than we enjoy today. A land of widespread malnutrition, mid-Victorian Britain was not.

Corresponding to a better diet were much more vigorous lives: nearly all British workers would have qualified as very active by today’s ultra-sedentary standards. At the top end of physical hardiness were the railway builders, who could “routinely shovel up to 20 tons of earth per day from below their feet to above their heads”—a feat requiring tremendous amounts of strength that few people today could match. Leisure activities, from gardening to informal football, were similarly vigorous.

Cancer incidence was shockingly low as well. In 1869, a doctor at Charing Cross Hospital in London described lung cancer as “one of the rarer forms of a rare disease. You may probably pass the rest of your student’s life without seeing another example of it.” When they did occur, cancers were much less rapidly progressive than they are today. The physician James Paget suggested that people with Stage 3 or 4 breast cancer could expect to live four years after diagnosis, or eight years with surgery—far longer than is common today.

But for all the surprising health benefits of the mid-Victorian diet, it would prove no match for a key shift of the 1870s and 1880s: the remarkable decrease in the price of food. Technological improvements (refrigeration, steamships, railways, improved milling techniques) and free trade policies allowed for the emergence of the first truly global regime of food production and circulation.

Outside of Britain, this meant the destruction of old orders that had been in place for centuries. Entire biomes—like the sprawling grassland prairie of the United States, stretching from the Rockies to the Mississippi River—were rapidly uprooted. In the American West, the nomadic bison-hunting economy was replaced by vast cattle ranches, with bison exterminated and hunter-gatherer Plains Indian nations decimated and dispersed. With the departure of these local actors, ecological economies that had sustained the landscape for generations—far more intricate and subtle than anything their successors could invent themselves—simply collapsed. The ranching economies that followed, defined by barbed-wire fencing and artificial water developments, deeply deformed the region’s ecology in order to create huge surpluses of meat for foreign export.

Accordingly, after 1870, Britain was flooded by meat and grain from the U.S., as well as Argentina, Australia, and New Zealand. British domestic agriculture, now outcompeted, entered a long depression. For most Britons, meanwhile, food became remarkably plentiful and inexpensive. Between 1877 and 1889, the cost of a national weekly food basket fell by about 30 percent.

The fallout of this period transformed the British diet. Foods that had previously been consumed only rarely were now widely accessible. Consumption of meat and grains grew at a rapid pace, along with “luxuries” like cheese, butter, and milk. Sugar consumption enjoyed a particularly remarkable boom, with the new craving satisfied by plantations in Southeast Asia and the Caribbean. From 1820 to 1846, per capita consumption of sugar had been about 18 pounds a year; by 1901, it had grown fivefold to 91 pounds, with jam, condensed milk, and sweetened tea being prime vectors.

It was the birth of the mass modern diet: high in ultra-processed carbohydrates, sugar, and fats, and deficient in many of the phytonutrients and micronutrients that defined traditional diets. For ordinary Britons, it was a dramatic transformation in lifestyle that seemed liberatory at the outset. Cheap food—in common parlance, “big loaf, little loaf”—became a political rallying cry and a major force in Britain’s late-imperial inability to reform trade policy in spite of growing industrial competition from the United States and Germany.

But even as the British populace welcomed it, the cheapening of food led to a significant deterioration in physical health. The British diet entered a severe downturn at the end of the 1870s, one from which it has arguably never recovered. The decline of foods like offal made nutrient deficiencies much more common. The increase in sugar consumption alone did so much damage to people’s teeth that many were now unable to chew tough foods. Tooth decay had been rare in Britain until the mid-nineteenth century; now it was ubiquitous. So stark was the problem that a 1909 visitor to Sheffield found people with “loose-set mouths with bloodless gums and only here and there a useful tooth.” At the beginning of the twentieth century, there was a dramatic increase in chronic degenerative diseases—like cancer and diabetes—that had previously been rare.

During the late Victorian period, there was clear evidence of a worsening in health. By the Boer War at the turn of the century, 50 percent of young working-class recruits were so malnourished as to be fully unfit for service—a problem that had never been reported during the Asante or Zulu Wars of the 1870s. In 1901, the British infantry was forced to drop its minimum height for recruits from 5’ 4” to 5’. The British government responded by setting up a “Committee on Physical Deterioration,” in order to address a problem that had not existed a few decades prior. But this did little: a similar recruitment crisis occurred during the First World War, with David Lloyd George mourning “how many more men we could have put into the fighting ranks if the health of the country had been properly looked after”—concluding that “we have used our human material in this country prodigally, foolishly, cruelly.”

It is easy to see why so many commentators at the turn of the twentieth century perceived the overwhelming trend in physical health to be not “progress,” but some sort of deterioration. Publications like the North American Review and the British Medical Journal frequently referenced questions of “physical deterioration” and “national vitality.” These signs of physical decline were soon incorporated into a rather hyperbolic theory of general civilizational decay, expressed in books like Max Nordau’s Degeneration—which regarded modern trends, especially artistic and literary ones, as manifestations of physical illness. Nordau described the French symbolist poet Paul Verlaine as having, “in astonishing completeness, all the physical and mental marks of degeneration.”

It is through this narrative of weakening and decay—the sense, ably expressed in Kipling’s “Recessional,” of lost strength—that many of the intellectual and social trends of the turn of the century emerged: Muscular Christianity and the YMCA; physical culture, especially bodybuilding and gymnastics (symbolized by Eugen Sandow, “the perfect man,” who built his physique to resemble classical Greek statuary); Theodore Roosevelt’s ideal of “the strenuous life”; or the return to nature represented by the Boy Scouts, the Wandervogel, summer camps, and nudism. Nordau himself found something of a solution to degeneration in his embrace of Zionism, which for him meant forging a “new muscle Jew” free of the marks of physical degeneration.

Much of this discourse of degeneration was tinged with a degree of hysteria, leading to a variety of ephemeral trends that had little to do with reversing symptoms of physical decline. One fad responding to the diet problem, “Fletcherizing,” involved chewing food up to a hundred times until it was liquefied; John D. Rockefeller was an adherent. But the problem that these figures described was very real. The advent of the modern diet really was a disaster for health, one that some elites briefly felt they could fix. Even amid all the hot air, the actual evidence of physical deterioration was hard to dispute.

But the burst of enthusiasm for healthy, more naturalistic living eventually began to ebb. As late as the 1920s, writers like the British parliamentarian Pierse Loftus could reflect on the “degenerative effects” of eating white bread; but this was a losing battle. The modern diet, along with the increasingly sedentary mode of living it enabled and depended on, had become too entrenched to overturn. By the 1920s and ‘30s, responsibility for responding to the problems created by diet had moved from the cultural realm to the spheres of medicine and public health.

Any sense of a coherent, holistic response to the health problems created by the advent of industrial diet and lifestyle had been lost. Instead, institutions adopted a variety of “public health” tweaks to ameliorate the problem. Toothbrushing was an early example: campaigns encouraging it began to proliferate at the beginning of the twentieth century, with proponents claiming that it was necessary at least “until the ways of the race have become more simple, more primitive and more healthful.” The fluoridation of water followed in the 1940s and ‘50s in order to improve dental health. These interventions did improve specific health outcomes, but only by treating emergent symptoms while the underlying malignancy—a physiologically harmful way of eating and living—continued to metastasize.

The modern diet underwent a second phase of rapid degradation in the late twentieth century. With the advent of fast-food chains and addictive snacks that contained high-fructose corn syrup—consumption of which increased tenfold from 1970 to 1990—non-processed food sources eroded rapidly. Oreos, potato chips, and Coca-Cola could soon be found all over the world, owing to the lowering of trade barriers, brilliant corporate marketing strategies, and the fundamental addictiveness of the products being sold. Coca-Cola was not regularly consumed in Mexico in the 1950s; by 2019, residents of Chiapas, Mexico’s poorest state, drank an average of 2.2 liters of Coke a day.

At the same time, the international agricultural system experienced another transformation in the decades after the Second World War, fully globalizing the modern diet. In regions like the Caribbean or much of Africa, trade liberalization and development programs rewired agricultural economies away from food independence and toward cash crop exports. Local populations, increasingly pushed from peasant communities into urban slums, became dependent on processed foods imported from abroad. In others—most famously in India—the Green Revolution and its new varieties of wheat and other staples defused the Malthusian “population bomb,” which had seemed imminent at mid-century. But it also meant national agricultural systems that focused increasingly on staple crops, pushing toward a global convergence on industrial diets. India alone has lost more than 100,000 varieties of rice since 1970, and the farmland accorded to ancestral coarse cereals like sorghum and millet, highly adapted to Indian conditions and considerably more nutritious than their replacements, has declined significantly.

As homogenized rice, wheat, and corn gained global dominance, driven by huge increases in yield, their nutritional value declined. Since the mid-1960s, the mineral density of wheat—the amounts of zinc, iron, copper, and magnesium it holds—has declined significantly, coinciding with the rise of the Green Revolution’s high-yield, semi-dwarf wheat cultivars. Similar trends have been found in rice, which, combined with wheat and corn, constitutes 51 percent of global calorie intake. Of about 30,000 edible plant species worldwide, modern agricultural systems cultivate only about 150, with 30 of those providing 95 percent of calories consumed. Everywhere, ancestral staples gave way to the familiar, highly centralized dietary regime.

Nutrition was transformed and delocalized in accordance with a system that emphasized yield above all else. These changes brought to the “developing” world a dietary pattern that had already led to physical deterioration in modernized countries. The same transition that played out in Victorian Britain now took place throughout the entire periphery. Price, horrified as he was by the poor teeth of modernizing populations, only saw the beginning of what was to come in the places he studied. In New Caledonia, whose people he had praised for the “very high order” of their physical development, 64 percent of men and 60 percent of women are now overweight or obese. The global obesity rate tripled between 1975 and 2016. It would have already been harmful enough to switch from traditional diets to the late-Victorian one, with its jam and sweetened tea; but that diet seems quaint next to the sugar-packed food and drink of the late twentieth century.

The effects of this transition can be seen in the globalization of “diseases of affluence” to non-affluent countries. The country with the highest prevalence of diabetes is now not the U.S. or Britain, but Pakistan. The countries with the highest rates of obesity are the impoverished statelets of the South Pacific, like Nauru and Tonga, which now neglect nutrient-rich local staples like coconut or breadfruit in favor of processed foods imported from Australia, Vietnam, and Thailand. Even the Yucatec Mayans of southern Mexico have begun to suffer greatly from Western lifestyle diseases.

Hawaii represents a particularly tragic story of nutritional decline. Upon European contact, native Hawaiians were entirely self-sufficient in food: they cultivated taro extensively, venerating the plant as an elder brother to the Hawaiian people and eating up to 15 pounds of it a day. On this diet, they enjoyed such good health and freedom from disease that, per the Hawaiian historian Samuel Kamakau, “the art of healing” faded away simply because “there was not much sickness within the race.” But the restructuring of Hawaiian agriculture toward exports (of sugar and pineapple, and later of coffee, avocado, and macadamia nuts) in the nineteenth and twentieth centuries destroyed taro production, resulting in a rapid loss of traditional knowledge of its cultivation. Today, more than 80 percent of Hawaii’s food is supplied by imports, and taro is grown on just a few hundred acres—a shift corresponding to the advent of modern lifestyle diseases, with native Hawaiians particularly affected.

In this historical context, the “nutrition transition” of the last two centuries begins to look more like a dietary apocalypse. It represents the extirpation of traditional, local foodways across the world in favor of an inflammatory, nutrient-poor diet of processed foods, with catastrophic consequences for human health.

Morbid Symptoms

All of this would be bad enough if it meant only an increase in chronic degenerative diseases, worse teeth, or a loss in nutritional diversity and richness. But it goes deeper: we do not yet fully understand what the modern diet and lifestyle are doing to our bodies. The changes are strange, swift, and profound; they go far beyond the degradation Price observed in the 1930s.

Consider puberty: in both boys and girls, it seems to be beginning earlier and lasting longer than in the past—a shift whose consequences have received remarkably little attention. In 1835, the median age of first menstruation for girls was 16; by 1970, it was 12. Scientists have cited factors from higher rates of obesity to endocrine-disrupting chemicals to overall diet changes as possible explanations, but no precise causality has been demonstrated. The costs of this maldevelopment seem to be significant: earlier puberty has been linked to higher rates of breast cancer in women, and to the emergence of mood disorders.

The destruction of the old agricultural order has also severed modern humans from the foods and environments in which we acquire our natural gut microbiome as children—a crucial process for proper immune system functioning. We are only now beginning to understand the microbiome, and its depletion, but the implications are clear: in losing our microbiomes, we are losing an entire organ. Beyond leaving us unable to digest certain foods, microbiome loss has led to complex second-order effects, such as immune system maladaptation. This likely plays a role in the explosion of various unrelated chronic diseases—from Crohn’s disease to diabetes—seen since the beginning of the twentieth century, and accelerating in the last 50 years.

This means that both sexes are now far more likely to develop autoimmune diseases, like ulcerative colitis, multiple sclerosis, rheumatoid arthritis, and psoriasis, characterized by the immune system erroneously attacking healthy cells. These illnesses can be permanently managed but never cured—barring a restoration of microbiome health at the societal level. Individual restoration of microbiomes doesn’t work well, since we have no idea how to mechanistically recreate the complex natural ecosystem that builds immunities; therapeutic success has been limited to treating infections by the diarrhea-causing microbe Clostridium difficile.

Microbiome depletion is likely linked to the growth in mental disorders over the last few decades: researchers have increasingly focused on the “gut-brain axis” and its relationship to mental health. There is substantial evidence that the remarkable increase in depression and anxiety over the last few decades might not just be a product of loosening interpersonal bonds, but also a reverberation of these widespread physiological problems. Common deficiencies in minerals like magnesium, for instance, have been linked to depression and anxiety in women; a number of studies have found a connection between low intakes of Omega-3 fatty acids and depression and bipolar disorder, on both the individual and population levels. Even rats exhibit some link between anhedonia and unhealthy diets. The slight decline in IQ in several Western nations over the last few decades—the reversal of the famous “Flynn effect,” seemingly linked to environmental and not genetic factors—may also be tied to worsening health and nutrition.

One remarkable physiological difference comes in our jaws. Human jaw shrinkage, commonly linked to the Neolithic agricultural revolution, has accelerated rapidly in the last two centuries with the advent of ultra-soft processed foods. Because so much of the modern diet is defined by softness—or by simply being liquid, from sugary drinks to protein shakes—humans spend much less time and effort chewing than they used to, leaving the masticatory muscles severely underutilized. Underused jaw muscles produce smaller jaws, and with them a host of formerly rare orthodontic problems—malocclusions, like overbites and teeth crowding, and non-eruption of wisdom teeth—which necessitate medical intervention throughout the early years of life. This change in chewing affects jaws, but it also affects the broader skull. Compared to pre-industrial populations, modern people tend to have smaller jaws, smaller teeth, and smaller faces more generally.

But perhaps the most worrying sign has to do with fertility. In the West, sperm counts have been in severe decline for decades; the number of men with less-than-normal levels of sperm has increased significantly, along with the number of men with such low sperm levels that they require fertility treatment to reproduce. It is clear that more couples will have difficulty having children. Studies have repeatedly linked diet to worsening fertility problems in men; endocrine disruptors, like phthalates—commonly found in the packaging of store-bought food—also appear to play a major role, along with obesity and deficiencies in nutrients like iron or folic acid.

Many of these changes are still shrouded in mystery: the precise etiologies remain unclear or have yet to be fully explained. We do not know what exactly is going on with our bodies, or what these ongoing changes will mean. But since the underlying pathology continues unchecked, the grim truth is that, for developed and “developing” countries alike, the strange and malign health trends of the last few decades are likely only to worsen.

Approaches and Exits

Some—those with the means to afford it—can cushion their exposure. You can find the healthiest American populations today on elite college campuses or in upper-class enclaves like Nantucket or Loudoun County. America’s rich have, at great expense, acquired a degree of relative immunity from the health disaster. Wealthy neighborhoods are dotted with health-food stores, “slow food” restaurants, and farmers’ markets; the new elite diets, like paleo and veganism, have, for all their cloying moralism, a similar function of shielding those who can afford them. Modern people can still have the type of non-poisonous diet their ancestors of the year 1850 might have eaten; it just entails a sharp upfront cost.

The simple fact that whole, organic food is expensive creates a class divide in health that is severe enough to be visible in the geography of American cities: the disparity in life expectancy between the Roxbury and Beacon Hill neighborhoods of Boston, for instance, is now about as large as the gap between El Salvador and Finland—with the poorest sub-neighborhoods of Roxbury, like Egleston Square, reporting life expectancies lower than those of Haiti or Liberia.

But this metastasizing crisis will not spare the health-conscious: even people who avoid the modern diet still bear the costs, direct and indirect, of living in unhealthy societies. Because contemporary Western populations are still so vulnerable to mass death events like COVID-19 due to their general unhealthiness, and because pandemics will only become more common on account of industrial agriculture and globalization, such costs are only likely to grow.

Governments might adopt increasingly desperate maneuvers to protect against these biological threats, but they are unlikely to touch the things making people so vulnerable to them in the first place. Bill Gates’ book How to Prevent the Next Pandemic—essentially an outline of how the Gates Foundation will use its endowment in the coming years—spends 304 pages on ideas like “get better at detecting outbreaks early” or “find new treatments fast,” with only a single mention of obesity and not a word on the autoimmune diseases which make people more vulnerable to pandemic disease. Improving overall population health to reduce risk does not merit his attention.

Such approaches will only manage diseases, chronic and pandemic, without solving the malignancies from which they stem. Bad nutrition creates damage at a scale that cannot be fully remedied through post facto intervention: the public health and medical frameworks which respond to problems after they emerge are much less effective than simply avoiding those problems in the first place. These approaches essentially operate within the modern dietary paradigm, disciplined into a symbiotic relationship with it. They create societies with more resources dedicated to medicine—more drugs, more treatments—but not necessarily ones with a greater degree of population health. They present not an exit, but a trap.

Actually escaping the health trap requires moving from managing poor health to preventing it: an immense reform of how societies grow and consume food, and by extension how they structure and regulate human behavior. Logistically—not to mention politically, culturally, and socially—this is a daunting challenge. It requires not just weaning populations off highly addictive diets, but attempting to recreate and even improve the healthier agricultural economies that were decimated by twentieth-century centralization. But it is not impossible: America, as a continent, has the natural resources to transform its food system and make it the nutritional basis of a healthy population. Even smaller polities like premodern Hawaii could sustain large, healthy populations on whole foods; nothing physically prevents this from happening again.

This is a civilizational question—broad matters of agriculture and food always are, as Price recognized. But the reality is that no current institutions have the capacity or incentive to do much of anything about the diet question: the food industry has tremendous power, and most of the medical establishment is more focused on post facto intervention and selling drugs than on general population health. The problem would have to be solved by a state that is willing to reshape the modern diet.

This would be a stark departure from the current framework—which dictates that diets, falling under questions of “lifestyle,” should be left to fast food companies and advertising agencies. But it would, if anything, be a return to the historical norm. States have always been responsible for food; the modern diet, rather than the natural outcome of autonomous “market forces,” was in many ways the creation of conscious government choices. This begins with the British state’s pursuit of imperial “free trade” in food but also includes more recent shifts: modernizing Mexican elites pushing Rockefeller Foundation agronomists to prioritize breeding wheat over traditional maize, or Earl Butz, Richard Nixon’s agriculture secretary, supercharging American corn production, eventually creating a corn surplus so large that it spurred the mass production of high-fructose corn syrup.

The modern diet was the creation of conscious decisions; the same could be true for a health-restoring successor. Ecological and social projects at this scale have been attempted, and accomplished, before—in the U.S., clean air, wildlife protection, and the current diet and agricultural regime itself are all examples. On the practical level, restoring population health would demand a vast recovery of lost agricultural knowledge, the cultivation of new cadres of agronomists and ethnobotanists, and the sketching of what sustainable, biome-specific permacultures might look like and how they could be scaled up with new techniques. On the theoretical level, it would require a new state paradigm in which the regeneration of a healthy populace—and not just the management of its illnesses for profit or stability—is treated as a matter of species-level biosecurity, and afforded the priority that such a problem demands.

David Oks is a Palladium correspondent who covers health, society, and international affairs. A graduate of Oxford University, he lives in New York and Europe. You can follow him at @davideoks.