There was a time, at the start of the 20th century, when the field of public health was stronger and more ambitious. A mixed group of physicians, scientists, industrialists, and social activists all saw themselves “as part of this giant social-reform effort that was going to transform the health of the nation,” David Rosner, a public-health historian at Columbia University, told me. They were united by a simple yet radical notion: that some people were more susceptible to disease because of social problems. And they worked to address those foundational ills—dilapidated neighborhoods, crowded housing, unsafe working conditions, poor sanitation—with a “moral certainty regarding the need to act,” Rosner and his colleagues wrote in a 2010 paper.
A century later, public health has succeeded marvelously by some measures, lengthening life spans and bringing many diseases to heel. But when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.
By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health. By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals. But these underdog narratives don’t capture the full story of how public health’s strength faded. In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice. By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.
♦♦♦
Public health’s turning point, according to several historical accounts, came after the discovery that infectious illnesses are the work of microbes. Germ theory offered a seductive new vision for defeating disease: Although the old public health “sought the sources of infectious disease in the surroundings of man; the new finds them in man himself,” wrote Hibbert Hill in The New Public Health in 1913. Or, as William Thompson Sedgwick, a bacteriologist and a former president of the American Public Health Association (APHA), put it, “Before 1880 we knew nothing; after 1890 we knew it all.”
This revolution in thinking gave public health license to be less revolutionary. Many practitioners no longer felt compelled to deal with sticky, sweeping problems such as poverty, inequity, and racial segregation (or to consider their own role in maintaining the status quo). “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.” Public-health leaders even mocked their predecessors’ efforts at social reform, which they saw as inefficient and misguided. Some dismissively billed the impressive work of the sanitarian movement, which had essentially plumbed entire cities, as “a matter of pipes.”
As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,” Amy Fairchild, a historian and the dean of the College of Public Health at Ohio State University, told me. Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals, wrote the historian Elizabeth Fee.
Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research. Public-health practitioners thought that by cleaving to the same paradigm, “they could solidify and extend their authority and bring public health up to the same level of esteem and power that medicine was beginning to enjoy,” Fairchild told me.
Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education. Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects. That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,” Fairchild told me.
After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism. Anti-communist sentiment made advocating for social reforms hard—even dangerous—while consumerism fostered the belief that everyone had access to the good life. Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.
Even public health began to treat people as if they lived in a social vacuum. Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.
This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products. Tobacco companies donated to public-health schools at Duke University and other institutions. The lead industry funded lead research at Johns Hopkins and Harvard universities. In this era, Rosner said, “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”
In the late 20th century, some public-health leaders began calling for a change. In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.” Some of that change happened: AIDS activists forced the field to regain part of its crusading spirit, while a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.
But, as COVID has revealed, the legacy of the past century has yet to release its hold on public health. The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.” Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May, after announcing that the vaccinated could abandon indoor masking. “Human behavior in this pandemic hasn’t served us very well,” she said this month.
If anything, the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th centuries: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons. They can’t access vaccines if they have no nearby pharmacies, no public transportation, or no relationships with primary-care providers. They can’t benefit from effective new drugs if they have no insurance. In earlier incarnations, public health might have been in the thick of these problems, but in its current state, it lacks the resources, mandate, and sometimes even willingness to address them.
♦♦♦
Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,” wrote Lawrence Gostin, of Georgetown University, in 2008. “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,” he argued.
But others assert that public health’s attempts at being apolitical push it further toward irrelevance. In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values. Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”
This tension has come up over and over again in my reporting. Although the medical establishment has remained an eager and influential participant in policy, public health has become easier than ever to silence. It need not continue in that vein. “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”
Public-health professionals sometimes contend that grand societal problems are beyond the remit of their field. Housing is an urban-planning issue. Poverty is a human-rights issue. The argument goes that “it’s not the job of public health to be leading the revolution,” Goldberg said. But he and others disagree. That attitude emerged because public health moved away from advocacy, and because the professionalization of higher education splintered it off from social work, sociology, and other disciplines. These fragmented fields can more easily treat everyone’s problems as someone else’s problem.
The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department. Chronically overstretched workers who can barely deal with STDs or opioid addiction can’t be expected to tackle poverty and racism—but they don’t have to. What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal? In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.
“We need to re-create alliances with others and help them to understand that what they are doing is public health,” Fairchild said. The field in the late 19th century was not a narrow scientific endeavor but one that stretched across much of society. Those same broad networks and wide ambitions are necessary now to deal with the problems that truly define the public’s health.
Ed Yong is a science journalist who reports for The Atlantic. He is based in Washington, DC.