Making Malaria History

Recently, global headlines celebrated the news that the World Health Organization (WHO) had recommended the RTS,S vaccine for use against malaria. While the headlines claimed this was “groundbreaking,” a “major milestone,” and a “historic day,” it didn’t take long for a note of caution to creep in.[1] The vaccine is less effective than many had hoped, and the requirement for four doses poses a significant public health challenge.

Missing from the media reports and WHO statements, however, was any sense of the history of the disease itself or of attempts to control it. The desire to eradicate malaria, and the methods used, are intimately connected with colonial medicine and the imperial goal of developing the tropics. The appetite of the public, and of organizations like WHO, has long been for single solutions like vaccines. Alongside the discussions of solutions, there is almost no acknowledgment of the central role of development activities in creating the problem of malaria as we see it today. As long as our plans remain uninformed by history, and fail to acknowledge the broader causes of malaria infection, there is little hope of achieving malaria eradication and relieving the burden of the 229 million estimated infections seen globally every year.

Famously, the first effective malaria prophylaxis was quinine. Europeans first encountered powdered cinchona bark through indigenous healers in South America, who used it to treat fevers, and it was initially adopted as a treatment for malaria. When quinine was isolated from cinchona bark by French researchers Pierre Joseph Pelletier and Joseph Caventou in 1820, it was found to also work as a preventative measure and so was adopted by the British army for use on their troops in 1848. The British government spent large sums importing the bark from South America, until the British introduced it to the soil of their imperial jewel, India, in 1861.

A U.S. Public Health Service poster advocates doses of quinine to prevent malaria. (Courtesy Library of Congress)

The ability to grow their own supply of quinine was hugely significant for Britain. Quinine enabled colonialism, allowing agents of empire to successfully control malarial areas for the first time, leading to a series of expeditions inland to claim vast swaths of Africa. Quinine was also a result of the travel, trade, and knowledge networks of colonialism; the experimental plantations that filled Indian hillsides were a living testament to the success of British expansion. The circumstances surrounding the adoption and cultivation of quinine to control malaria show us how both the disease and the attempts to control it were, in Rohan Deb Roy’s terms, “co-constituted” alongside the British Empire itself.[2] As the British Empire grew through the remaining decades of the nineteenth century, malaria prophylaxis enabled imperial expansion, and, as part of the burgeoning field of tropical medicine, it also provided its own moral justification.

In 1905, Professor Ronald Ross gave a lecture in St George’s Hall in Liverpool laying out the successes of the new discipline of Tropical Medicine. Ross had spent 25 years working in the British Indian Medical Service and had recently been awarded the Nobel Prize in Physiology or Medicine for his work proving that mosquitoes spread malaria. Alongside Ross’s report on scientific achievements, he provided his vision for the British imperial enterprise. The neoclassical St George’s Hall, with its columns, porticos, and Grecian-style sculptures, was a fitting venue for his talk. Ross drew a clear line of triumphant progress for his audience, beginning with the cultures of ancient Greece and Rome, right up to the work on the prevention of tropical diseases. According to Ross, diseases like malaria, elephantiasis, sleeping sickness, and smallpox were the “great enemies of civilization” in British colonies and were “really the principal bar to progress” in “uncivilised” tropical Africa.[3] For Ross and his colleagues, the perceived inherent filth of the tropics combined with the absence of civilized hygiene led to disease, and disease was the enemy of progress. Practitioners like Ross imbued the experience and treatment of disease in the colonies with ideas about civilization, moral superiority, and racial hierarchies.[4] The prevention of malaria thus took on a noble and moralistic character in the eyes of colonialists.

The ideas of Ross and his colleagues, such as the perception of malaria as a barrier to development, carried on through the first decades of the twentieth century. Malaria was still seen as a disease that was “natural” to many places in Africa and Asia. As a result, many schemes to eradicate malaria today focus primarily on mosquitoes, rather than the conditions that make humans vulnerable to the disease. In contrast, Randall Packard, a historian of medicine who has written extensively on malaria, argues that wider societal forces significantly affect malaria prevalence. Packard demonstrated the impact of societal change in southern Africa on the presence of malaria in the early twentieth century, complicating the view that malaria in Africa was inherent and unchanging.[5] Packard’s study of malaria epidemics from 1912 to 1945 in Swaziland showed that social changes resulted in a higher burden of malaria in the Swazi population. Colonialism, changing labor and food production practices, and the introduction of new restrictive laws had a major effect on the health and sickness of the Swazi people. Malaria prevalence changed from a seasonal issue to a series of repeating epidemics. The very activities that colonialists saw as developing a country were, in fact, increasing malaria outbreaks. Packard has also shown that anti-malaria campaigns launched in South Africa in the 1950s led to the further disenfranchisement of black farmers, who had made use of their acquired resistance to malaria to successfully farm in areas that white farmers avoided because of the disease.[6] The relationship between development and malaria is a complex one, and efforts to eradicate malaria do not always have the intended effect.

A poster from the Malaria Eradication Programme, c. 1960. (Courtesy U.S. National Library of Medicine)

The WHO Global Malaria Eradication Program (GMEP), launched in 1955, also had unintended consequences and failed to achieve its goals. When the program ended in 1969, it was accepted that eradication in many endemic countries was not achievable with the resources available at the time. The GMEP was designed along similar lines to those envisaged by Ross and other early practitioners of Tropical Medicine – a large-scale campaign of “vector control,” or the killing of mosquitoes through insecticides and other means.[7] Historical analysis shows that these kinds of campaigns can achieve success but are hard to sustain. Relying almost entirely on DDT spraying, the GMEP depicted other measures, including insecticide-treated nets, as unnecessary.[8] As a result, mosquitoes worldwide developed resistance to DDT, and measures we now know to be successful and effective were left out of the campaign altogether. Following the withdrawal of funding for malaria prevention in many areas, some locations saw increased transmission, due to reduced acquired protection in their populations. As late as 1999, the WHO launched the program “Roll Back Malaria,” with the main slogan “roll back malaria, roll in development.” The continuing rhetoric of a simple causal relationship between malaria and underdevelopment ignores the abundant scholarship complicating this relationship. WHO announcements about the introduction of the vaccine show us that they hope to “reinvigorate” the fight against malaria in some countries, in spite of historical evidence that single interventions, like vaccines, rarely have that power.

It is essential that policies and programs to control or eradicate malaria take into account the history of the disease and the broader socioeconomic factors that impact infection. The “human ecology of malaria,” to borrow Packard’s term, is hardly a new idea.[9] Field malariologists in the nineteenth century had insight into the wider context of malaria infection, commenting on poverty, migration, and nutrition. Today, malaria is recognized as symptomatic of global inequities in resource allocation.[10] However, these insights have rarely been included in the design of efforts to reduce or eradicate malaria.[11] Arguably, the lack of engagement in “historical reflection and biosocial analysis” led to the failure of previous malaria eradication efforts like the GMEP.[12] When knowledge silos result in failure to pass information from one discipline to another, we fail to grapple with the complex history of attempts to control and eradicate malaria. In doing so, we risk repeating failure in the present.

Notes

  1. “WHO Hails ‘Historic Day’ as It Recommends Malaria Vaccine,” The BMJ, accessed 1 December 2021, https://www.bmj.com/content/375/bmj.n2455; The Lancet Infectious Diseases, “Malaria Vaccination: A Major Milestone,” Lancet Infectious Diseases 19, no. 6 (June 2019): 559, https://doi.org/10.1016/S1473-3099(19)30222-1.
  2. Rohan Deb Roy, Malarial Subjects: Empire, Medicine and Nonhumans in British India, 1820–1909, Science in History (Cambridge University Press, 2017), 3.
  3. Ronald Ross, “The Progress of Tropical Medicine,” Journal of the Royal African Society 4, no. 1 (1905): 272.
  4. Pratik Chakrabarti, “Moral Geographies of Tropical Bacteriology,” in Bacteriology in British India: Laboratory Medicine and the Tropics, Rochester Studies in Medical History v. 22 (University of Rochester Press, 2012), 61.
  5. Randall M. Packard, “Maize, Cattle and Mosquitoes: The Political Economy of Malaria Epidemics in Colonial Swaziland,” Journal of African History 25, no. 2 (1984): 189–212.
  6. Randall M. Packard, “‘Roll Back Malaria, Roll in Development?’ Reassessing the Economic Burden of Malaria,” Population and Development Review 35, no. 1 (March 2009): 53–87, https://doi.org/10.1111/j.1728-4457.2009.00261.x.
  7. José A. Nájera, Matiana González-Silva, and Pedro L. Alonso, “Some Lessons for the Future from the Global Malaria Eradication Programme (1955–1969),” PLoS Medicine 8, no. 1 (January 25, 2011): e1000412, https://doi.org/10.1371/journal.pmed.1000412, 1.
  8. Nájera et al., 3.
  9. Randall M. Packard, The Making of a Tropical Disease: A Short History of Malaria, Johns Hopkins Biographies of Disease (Johns Hopkins University Press, 2007), 10.
  10. Leeanne Stratton, Marie S. O’Neill, Margaret E. Kruk, and Michelle L. Bell, “The Persistent Problem of Malaria: Addressing the Fundamental Causes of a Global Killer,” Social Science & Medicine 67, no. 5 (September 2008): 855, https://doi.org/10.1016/j.socscimed.2008.05.013.
  11. Stratton et al., 855.
  12. Jeremy Greene et al., “Colonial Medicine and Its Legacies,” in Reimagining Global Health, ed. Paul Farmer et al. (University of California Press, 2019), 33–73, quotation 33, https://doi.org/10.1525/9780520954632-005.

Eleanor Shaw is a PhD student at the University of Manchester where she teaches on the history of global health and is writing her thesis on the history of medical journals. Before starting her PhD, Eleanor worked in international development with a specific focus on maternal health.