I am both a historian of medicine and a practicing physician. This sometimes throws into sharp relief how different medicine is today from what it was even 100 years ago. One of the guiding principles in my research and teaching is called “relativism” — trying to understand people and ideas in the past on their own terms rather than assuming that they were wrong and we are right about how bodies work and what counts as good medicine. This is a counter-cultural position to take in a medical and scientific culture that prizes the newest findings most. Recently I have been thinking and writing about dietetics, or the use of food to obtain and maintain health. I traveled around Germany, reading and taking notes on a wide variety of cookbooks, professional journals, women’s magazines, and public health exhibits. Yet I learned a poignant lesson about medical relativism and the history of dietetics not in an archive or in the hospital, but in my own home as the spouse and primary caregiver of a cancer patient.
While writing my dissertation about the development of “the new(er) science of nutrition” in the early twentieth century, I noticed how the definition of “good nutrition” changed from the 1890s to the 1930s.1 The “new” science of nutrition in the 1850s and 1860s was the idea that food consists of macronutrients (protein, carbohydrates, fats) that can be counted in calories; this is what scholars call quantitative nutrition. They often describe “newer” nutrition as the change in the early 1900s from quantitative to qualitative standards of dietary healthfulness that appreciate trace micronutrients like vitamins and minerals.
Reading the literature, you could get the sense that dietary advice consisted of quantitative recommendations before about 1910-1920 and qualitative recommendations after that. Eventually, I realized that “good nutrition” in the late-nineteenth century already incorporated both quantitative and qualitative characteristics. Moreover, even before the discovery of vitamins around 1911, quantitative and qualitative nutrition were equally scientific. What changed was the definition of a qualitatively healthy diet.
In the 1890s, the foundations of the quantitative science of nutrition were the high-protein, calorie-counting rules associated (in Germany) with physiologists Carl von Voit (1831-1908) and Max Rubner (1854-1932). Meanwhile, qualitatively good nutrition meant regular, sit-down meals at a well-appointed table with pleasant company, a starter of soup or bouillon to stimulate the gastric juices, and thorough (but discreet) mastication to aid digestion.
Nutritionists in the 1800s emphasized table manners not because their science wasn’t as advanced as ours in the 2000s, but because they understood nerves as mediators between the outside and the inside of the body. Therefore it seemed perfectly reasonable that a disorganized external environment led to poor digestion internally. That may not be strictly true by current scientific standards, but it made sense in a time and place that worried about the effects of modernization on the nervous system.2
By the mid-1920s, nutritional advice itself had modernized and would be familiar to many people today: sit down as a family to enjoy a varied diet that includes both animal and plant products, but not too much of either. (Notice the classed assumption that families could and would eat together.) Quantitative guidelines like getting enough protein, not consuming too many calories, and eating regular meals without too much snacking in between are still the same. Attention to raw or minimally processed fruits and vegetables, fresh dairy products, and whole grains distinguished the new(er) qualitative nutrition.
These recommendations had once been tenets of alternative medicine and were adopted by mainstream experts after the scientific discovery of vitamins in the 1910s. “Vital-amines” were only the best-known part of the maturing field of nutritional biochemistry, which broke down food and physiology into smaller and smaller parts. This kind of “reductionism” to the lowest common denominator is still common in science today.
Preparing food and caring for the sick have long been understood as “women’s work,” but historians of women’s and gender history have worked hard to show that it is too simplistic to divide society into male domains like “the laboratory” or “the clinic” and female ones like “the kitchen” or “the sick room.” So I probably should not have been surprised to find little distinction between the advice doctors gave other doctors and the advice domestic-science experts gave homemakers.
I read with interest a little handbook published in 1889 by Germany’s most famous cookbook author, Hedwig Heyl (1850-1934). A Berlin housewife who ran her husband’s factory after he died, raised five children, and made a career for herself as an educator, organizer, and local politician, Heyl offered recipes for feeding the sick “at the behest of a number of friendly physicians.”3
Her advice included attention to such details as ensuring the patient’s physical comfort with pillows and bolsters before feeding him, serving courses one at a time so as not to overwhelm him, and the presentation of the food: “The dishes used for serving must be suited for small portions, because the daintiness of what is served increases the appetite of the patient, and besides, the food cools off too quickly in large containers.”4 “How quaint,” I thought with mild humor, condescension replacing my usual relativism. Daintiness seemed rather silly and unscientific to me. This judgment floated back to me four and a half years later, as I prepared trays for my husband while he underwent chemotherapy for the tumor in his chest.
The diagnosis came as a shock, of course, but understanding administrators and attending physicians made it possible for me to take time off from my clinical duties as a fourth-year medical student to nurse him. Every three weeks he spent several days in the hospital, hooked up to a pump that delivered both poison and a minimum of sustenance in the form of IV fluid. Then he came home, got sick, and was too miserable to eat or drink, to get out of bed, or even to turn on the light.
I lectured, begged, and cajoled him to eat a little, or at least to drink something. And I found myself preparing tableaus to entice him to eat: cheese and crackers stacked just so, tea in a heart-shaped mug, a folded cloth napkin, strawberries cut into little fans that he probably couldn’t see to appreciate in the preternatural darkness required by a bad reaction to one of the three chemo drugs. On that tray was the desperation of trying to keep him strong enough to weather his illness and hydrated enough to stay out of the emergency room. It carried the hope that if I just made the food offering dainty enough, he could or would overcome the nausea, the metallic taste, the lack of smell, and the mouth sores to consume some nourishment.
According to my twenty-first-century medical training, my actions and motivations as a caregiver were an understandable but unproven approach to the gastrointestinal side effects of chemotherapy. Online forums abound with homemade recipes that someone found palatable in the worst of circumstances. I imagine oncologists smile and nod when their patients ask about this or that one, and encourage them to see if it works for them. Not that they have much to offer themselves: if my husband had continued to lose weight, I could have asked his doctor to prescribe an appetite stimulant like Marinol. Although such orexigenic drugs do increase appetite and weight gain, they do not help patients live longer.5 At most, they allow families and practitioners to feel as though they are “doing something” for patients.
By contrast, according to leading late-nineteenth-century opinions such as Hedwig Heyl’s, I was realizing the best in domestic dietetics, rooted in a combination of common sense and what were once cutting-edge advances in physiology. Finding myself (re)enacting a particular kind of conventional yet thoroughly scientific women’s work, I gained a renewed appreciation for the older qualitative nutrition and the emotional labor it required. My personal experience in the sick room as neither doctor nor historian showed me that it was not as silly or as unscientific as I had initially assumed.
1. Elmer Verner McCollum, The Newer Knowledge of Nutrition: The Use of Food for the Preservation of Vitality and Health (New York: Macmillan, 1918); Kenneth J. Carpenter, “A Short History of Nutritional Science: Part 3 (1912-1944),” Journal of Nutrition 133 (2003): 3023-32.
2. For a quick overview, try Sam Halliday on “The Nineteenth-Century Nervous System” in Chapter 4 of Science and Technology in the Age of Hawthorne, Melville, Twain, and James: Thinking and Writing Electricity (New York: Palgrave, 2007). For longer discussions, read Laura Otis, Networking: Communicating with Bodies and Machines in the Nineteenth Century (Ann Arbor, MI: University of Michigan Press, 2001); or David G. Schuster, Neurasthenic Nation: America’s Search for Health, Happiness, and Comfort, 1869-1920 (New Brunswick: Rutgers University Press, 2011).
3. Hedwig Heyl, Die Krankenkost (Berlin: Verlag von Carl Habel, 1889), 1.
4. Ibid., 5.
5. V. Ruiz García, E. López-Briz, R. Carbonell Sanchis, et al., “Megestrol Acetate for Treatment of Anorexia-Cachexia Syndrome,” Cochrane Database of Systematic Reviews (2013): CD004310.