A Healthy Dose of Skepticism
The FDA is on a mission to redefine healthy, and they “want to get it right.” This undertaking stems in part from ongoing criticism of the FDA’s nearly twenty-year-old, fat-phobic labeling regulations, in which absurdities abound. For instance, low-fat toaster pastries, composed predominantly of unpronounceable ingredients worthy of a chemistry exam, often meet the requirements for being labeled “healthy”; but neither salmon, avocados, nor almonds do. Too much “bad” fat.
It has been well documented how some of these regulations have resulted from the ongoing influence of industry lobbyists. It appears, for instance, that our fear of fat (and avoidance of avocados) over the last five decades was generously funded by the sugar industry. To be fair, governmental agencies like the FDA that regulate foods, supplements, and drugs must walk a fine line between freedom of speech and consumer protection. On one hand, companies have a right to market their products with glowing superlatives and research studies, even if intentionally misleading. It’s a free country (and market), and people should be able to sell and buy products at their own discretion. On the other hand, at various points since the Food and Drug Act of 1906, government has thankfully created and enforced rules against adulterated food and drugs, false promises of cures, and products that are actually dangerous.
Is more unbiased and transparent science the solution to a better definition of “healthy”? In response to the FDA’s outdated (and fundamentally misguided) definition of healthy, and their long history of succumbing to industry lobbying, the founder of the company behind Kind snack bars, Daniel Lubetzky, has launched an initiative called Feed the Truth. This initiative “aims to improve public health by making truth, transparency and integrity the foremost values in today’s food system.” Given the often skewed (but not necessarily wrong) results of industry-funded nutrition research, we might wonder how far we can trust this program. Yet its mission and organization seem entirely laudable, employing independent experts to “elevate reputable science” and ensure that “science overrules special interests.” Motives aside, the presumption is clear enough: science is finding the “truth” about nutrition, but it’s being obscured by industry.
What should give us pause, however, is not the potential for hidden agendas. After all, regardless of how unbiased research may strive to be, we can find correlation, if not causation, in just about anything related to food. More importantly, we must also consider the potential downsides of crafting an “official” definition of healthy in the first place. Is it possible that our faith in nutritional science as the key to healthy bodies has actually made us less healthy? The history of nutrition and dietary advice shows that the hubris of thinking something like “healthy” can emerge impartially from truly “scientific” research is precisely why we are always wrong about it.
To begin, there are the myriad questions about what exactly “healthy” on a label applies to. Is it the presence or lack of certain nutrients? Do preservatives or other “artificial” ingredients matter? Won’t food manufacturers, given their research labs, simply find ways to follow the letter of the law, substituting one dubious ingredient for another? Do healthy labels, even on generally innocuous products like Kind bars, encourage eating habits that are generally unhealthy in the long run? Are manufacturers and consumers even speaking the same language? When it comes to labeling, we continue to employ different definitions of food according to our own agendas. Isn’t a perfectly formulated shake the healthiest “food” of all?
To put definitions of healthy on solid empirical ground, we must turn to quantification, such as evaluating the nutritional profile of a generically defined (and sometimes ridiculously small) serving size. Recommended Dietary Allowances, for instance, supposedly help us get “adequate” amounts of helpful vitamins, minerals, and macronutrients sufficient for “nearly all healthy people” (whoever they are), and not too much of the bad stuff (like salmon). Needless to say, it’s entirely impractical if not impossible to keep track of exactly how much (in grams!) one ingests of these things — a mystery that the dietary supplement industry uses to great effect. Yet these numbers define “healthy” for an incredible variety of human genetics, metabolisms, body types, and lifestyles. Historically, they have led to repeated failures in assessing malnutrition that nonetheless shaped nutritional policy for decades. Because it is convenient, we still use Body Mass Index (BMI) to define obesity, despite its many failings.
Perhaps more perniciously, the insistence on objective science as the deliverer of nutritional truth has led to increasingly specialized and highly focused studies about how our bodies interact with food. Since the early eighteenth century, learned physicians have argued that their dietary expertise derived from their knowledge of hidden bodily mechanisms. The Scottish physician George Cheyne in the 1740s, for instance, gained his patients’ trust because his knowledge of the invisible plumbing of our bodies enabled him to prescribe sound dietary regimens. In the second half of the nineteenth century, calories and macronutrients became the dominant way of understanding dietary health, a paradigm that still haunts food labels (and dieters) today. Throughout the first half of the twentieth century, vitamins and trace minerals provided even more specificity to maintaining a sound diet. More recently (to mention just two examples) gut flora and microbiomes have become promising avenues for understanding how a whole world of microorganisms affects our health, and how some germs actually keep us healthy.
Of course nutritional research has been tremendously useful: the discoveries that vitamin C prevents scurvy and iodine prevents goiter (to mention only two well-known examples) were crucial advances in public health. Yet because of our increasingly microscopic research and the necessarily imperfect ways it gets translated into dietary advice and onto labels, we are continually (if inadvertently) taught to think that maintaining a healthy diet is terribly complicated. This is not a hard sell. Good luck parsing an original research publication unless you’ve got a PhD in a related research field like physiology or biochemistry. Since we can’t understand these articles, naturally we rely on experts, like Cheyne, who at least appear to have the requisite training; and we rely on a variety of media, including labels, to communicate this knowledge to us. We know that food labeling matters: would you rather eat lean finely textured beef or pink slime?
In addition to problems created by inappropriate definitions and quantifications of healthy, the mere act of defining healthy (whether via a label or an accepted numeric range) has historically entailed numerous social consequences far outside the purview of scientific research. In particular, it creates an unnecessary dichotomy between perceptions of healthy and unhealthy. Do we then alienate if not discriminate against those who can’t, because of access or means, buy certain foods, or who have bodies that don’t fit the target demographic?
For instance, history shows us that virtually all dietary reforms — all legitimately pursuing better health through the latest nutrition research — come with an implicit morality of what is “healthy.” So much is clear from the earliest days of nutritional science as it developed in the later nineteenth century, and as the “newer nutrition” took hold in the 1920s. Early nutrition experts like Wilbur Atwater and Ellen Richards, for instance, advocated for a scientific and economical approach to diet based on the latest “objective” research, yet it was no coincidence that what they considered to be a healthy diet closely mimicked what the professionalizing white American middle class was eating. Luckily, most immigrants with culinary traditions that did not embrace these ideals ignored their advice. Such dietary moralizing in the name of health has persisted throughout the twentieth century, spanning John Harvey Kellogg (of “Corn Flakes” fame), the organic movement of the 1970s (and more recent revivals), and the even more recent locavore movement.
While the history of nutrition can seem like a revolving door of dietary villains and superheroes, the one constant is the eternal optimism that nutritional truth is close at hand — and that we can indeed get it right. Indeed, new research into food interactions with the body, environmental factors, and genetic differences continues to fuel hopes that the right answer is just around the corner — that we’ve almost solved the labyrinth. Yet the informative history of nutrition and the dietary advice derived from it (nicely illustrated in this visual history of food guides) provide ample examples of the futility of pursuing a universal definition of healthy. In fact, it suggests we’re in the wrong maze altogether.
Ultimately, getting healthier is not a question of elevating the “best” science or making it more transparent. There will always be competing and contradictory research; it will inevitably be imperfectly interpreted, translated, and shaped by intellectual, social, and cultural biases. That’s awfully hard to explain on a box. Perhaps the search for a definition of “healthy” should draw not on science but on history. In that case, the most honest label for encouraging healthiness would suggest paying critical attention to ingredient lists and disregarding health claims.