The Problem with Medical History in the Age of COVID-19

Agnes Arnold-Forster and Caitjan Gainty

The pandemic has prompted a proliferation of newspaper articles, think-pieces, and other public writing on the history of medicine. Some have been quite thoughtful, offering new perspectives on the past and present of science, technology, and healthcare, and making radical suggestions for the post-coronavirus future.

Others, however, have indulged some of our worst instincts about how best to understand medicine’s history. These straightforward, triumphalist accounts of uninterrupted progress are the last bastion of the “great white men doing great things” narrative that has been so rightly called out and dismissed in other areas of popular and academic historical writing.

A recent example came in a New York Times article by Steven Johnson, one part of his “multiplatform ‘Extra Life’ project”, which, as its website explains, traces the “revolution in medicine and public health” that has given us so much “extra life.” Johnson capitalizes on our pandemic moment, opening with contrasting accounts of life expectancy during the 1918 influenza pandemic and the COVID-19 pandemic. The account of the intervening “revolution” is familiar: a story of the genius and generosity of a select set of public health and medical heroes whose work has doubled our life expectancy over the last century.

The argument is carefully curated to avoid the complexity of our global health community. It focuses instead on two intertwined strands of healthcare history that have attracted particular attention in the global north: the history of infectious disease control, a critical topic at the beginning of the pandemic as we became accustomed to life in lockdown, and the production of a SARS-CoV-2 vaccine, which has prompted much talk of the pandemic’s end. More critical medical histories of both allow us to correct the triumphalist accounts with which the news media has recently been awash.

Infectious Disease

While the story of infectious disease control stretches back centuries, a critical and often overlooked moment in this history came in the 1950s, when doubts about simplistic renderings of the relationship between humans and infectious diseases emerged. In 1955, the incoming president of the American Association of Immunologists, Thomas Magill, laid out what he viewed as the “simple” but overlooked truth of infectious disease: that for the most part, the incidence of these illnesses had decreased on its own, before any of the medical interventions credited with their demise had appeared. Magill used a graph showing pneumonia death rates in New York State from 1850 to 1930 as evidence. “It is clear,” he wrote, “that the rapid decline in pneumonia death-rates began in New York State before the turn of the century and many years before the ‘miracle’ drugs were known.” Magill’s conclusion was that the war with infectious disease was not a war at all – or at least not one that medicine could win – but instead a process of acclimation in which medications played little to no role.

A black and white photo of a white man with short cropped hair, wearing a bowtie.
Thomas Pleines Magill (1903–1999). (Courtesy American Association of Immunologists)

Since the 1950s, medicine’s efficacy has improved, but the notion that neither antibiotics nor vaccinations significantly altered or curtailed the trajectory of infectious disease never went away. Indeed, these ideas reemerged during the COVID-19 pandemic. Some people initially believed that the best approach to controlling infection was to do nothing: to reach the all-important herd immunity “naturally,” a theory notably put into practice in Sweden.

Another critical strand of this mid-century conversation made itself clear in the decades immediately following the observations of Magill and like-minded others. The demographer Thomas McKeown attributed the long-running decline of tuberculosis over the course of the nineteenth century not to clinical advances, but to broader changes in infrastructure and nutrition. This tendency toward medical nihilism also appeared in the work of the now-celebrated champion of evidence-based medicine, Archie Cochrane. In his 1972 call to arms against ineffective healthcare, Effectiveness and Efficiency, Cochrane observed that, “in comparison with the recuperative powers of the human body,” the effectiveness of therapeutic intervention came a distant second.

Later, more critical authors went even further. Perhaps none were as outspoken as the prominent social critic Ivan Illich, who suggested that the problem of medical ineffectiveness was just the tip of the iceberg. Injecting a new term into the critical parlance of the day, Illich decried medicine as “iatrogenic” – far from improving health, medicine was actively damaging it. At the clinical level, this came in the form of ineffective or toxic treatments and medical error. At the social and cultural levels, iatrogenesis took the form of medicine’s increasing encroachment on all aspects of everyday life.

These critiques were especially useful as a corrective to the grandiose, superficial, and sometimes dubious claims that medicine was making for itself. But these correctives seem to have dissipated. During the current pandemic, our replacement of this deeply complex narrative of infectious disease control with a triumphalist account of laboratory- and hospital-based medicine has led us both to underestimate the utility of lockdowns and to overestimate the much-touted ability of pharmaceutical intervention, especially vaccines, to release us from this current plague.

Vaccines

Despite what we have all heard, repeatedly and consistently in what the media has dubbed the pandemic’s waning days, vaccines are no panacea. They are a key component of public health’s arsenal, but they are not the neutral, benign objects that we generally perceive them to be. They have been, and remain, political objects that connect the problematic ethics of pharmaceutical companies to the checkered history of institutional racism and sexism embedded in medicine’s clinical and research practices, to the troubled politics of American healthcare, and even to the place of the United States in the world.

Indeed, vaccination programs were a key geopolitical tool in US Cold War efforts to gain a toehold in strategic areas before Communist rivals could make headway. The United States revisited this rather troubled legacy in its 2011 CIA-backed effort – thinly veiled as a hepatitis B vaccination campaign – to confirm Osama bin Laden’s presence at his compound in Abbottabad, Pakistan. The whole affair made conspiracy theorists seem as though they were historically astute political savants.

Black and white poster featuring faces with and without smallpox marks.
Poster created prior to 1979 promoting the importance of smallpox vaccination. (Courtesy Wikimedia)

Even vaccination’s quintessential success story – the WHO’s smallpox eradication campaign in the 1960s and ’70s – is not the clear-cut victory it seems. Though the campaign was finally successful, it was so costly, difficult, and labor-intensive that those running it recommended locally led, infrastructurally focused public health interventions going forward. It is notable that although other campaigns have been launched since, only rinderpest, a disease affecting cattle, has joined smallpox on the list of eradicated diseases.

By contrast, the rhetorical appeal of vaccination has only grown, at least in the media mainstream. In part, this reflects the preference for quick medical fixes so frequently lauded since the 1930s, the same obsession with “miracle drugs” that Magill bemoaned. But it also reflects the entangled trajectories of national security and global health since 9/11, when pathogens joined al Qaeda in the category of nonstate aggressors. For all of their inefficiency in the global control of infectious disease, vaccination campaigns embodied the counterterrorism credo of quick, targeted strikes.

Vaccines do not, of course, hold universal appeal. People worldwide raise serious and, given the problematic legacy of global vaccination programs, sometimes quite legitimate objections to vaccination. However, anti-vaccination voices are louder than they are numerous, and by and large, we have embraced vaccination as the best, and perhaps the only, way to get ourselves out of this crisis. Vaccines will help, undoubtedly, but we need to stop treating vaccines as the obvious or only answer and start acknowledging the real and difficult questions their legacies pose.

Conclusion

Johnson is not wrong to say that the global average life expectancy has more than doubled since 1920. Yet it only took until the 1930s for medical practitioners to take credit for the gains already seen. Vaccinations, miracle cures, and all of the other fruits of medicine’s collective “inventive genius,” made available by the “unparalleled generosity” of “men of science,” were uniquely responsible for Americans’ extended lifespans.[1] This was what they said then. And it is still what we say now. This view of medicine has settled into the mainstream and become embedded as a piece of intrinsic cultural knowledge.

Johnson neatly plays on our addiction to these triumphalist tales in his retelling of the story of how some (but by no means all) of us came to live longer. In near unison with his fellow boosters of medical prowess from the 1930s, Johnson suggests that the story of extra life is “progress in its usual form,” replete with the “brilliant ideas and collaborations,” the “epic achievements” and “visionary thinking” of a group of heroes who selflessly gave of themselves.

The desire to avoid writing bad history is not the only reason to be wary of these triumphalist accounts. They do a great disservice by obscuring more critical understandings of how medicine and healthcare function now. When medicine is viewed as the glorious result of the greatest scientific minds, it becomes difficult to imagine that we have any standing to acknowledge its shortcomings or any scope to make real change. When it is revealed, on the other hand, to be a practice that is shaped by society and culture and steeped in the industrial and capitalist logics that guide all the other areas of our lives, critique and change become not only viable, but something of a civic duty.

Notes

    1. Medical Care for the American People: The Final Report of the Committee on the Costs of Medical Care, adopted October 31, 1932, 3. See also Caitjan Gainty, The Product of Medicine (forthcoming).

Featured image caption: A scientist in a white lab coat and gloves injects liquid into a vial in a lab setting. (Courtesy PIXNIO)

Agnes Arnold-Forster is a researcher in the Social Studies of Medicine Department at McGill University. Her first book, The Cancer Problem, was published by Oxford University Press in January 2021. She is co-PI on the Wellcome Trust–funded project, Healthy Scepticism.

Caitjan Gainty is a historian of twentieth-century medicine and technology at King's College London. Together they run the Wellcome Trust–funded Healthy Scepticism project, which examines the role of medicine's critics and detractors, its dispossessed and antagonists, in the constitution of its contemporary form.

