Thursday, November 19, 2015

The Internet is for Xenophobia

Last weekend's tragic terrorist attacks in Paris have naturally garnered a great deal of attention and scrutiny from news outlets. More significantly, the connection between the heinous violence and emigration from Syria has created a particularly raucous debate on social media.

First off, a Constitutional issue: Upon reading that various governors do not want to allow Syrian refugees into their states, my first thought was "Okay, so?" Certainly they're aware that the federal government dictates the immigration policy for the entire country, right? Thankfully, New York's Governor Andrew Cuomo is aware of this and pointed out the same, whilst voicing his support for a continued tradition of immigration into this country.

His reference to the Statue of Liberty may seem cliché, but it's relevant. With the exception of Native Americans, everyone's ethnic lineage in the U.S. can be traced back to other countries, many within only a few generations. It's disappointing, then, to see such knee-jerk xenophobic reactions to the mere idea that we as a nation offer refuge to people from a country torn by civil war. The Paris attacks understandably strike a nerve with those living in U.S. cities, who fear similar attacks on our soil. But the notion that the terrorism in France is inextricably connected to those seeking refuge from Syria has been rebutted by the revelation that many of those responsible were actually radicalized Belgian and French nationals. Here in the U.S., more deaths have resulted from "homegrown" right-wing extremist attacks than from Jihadist terrorism.

That said, everyone should note that those in favor of accepting Syrian refugees are not necessarily suggesting that we simply let anyone waltz into the country without any investigatory screening. Gov. Cuomo himself expressed that immigration should only continue so long as authorities are able to thoroughly vet potential refugees. In fact, there already exist several layers of vetting for those seeking asylum, beginning with the United Nations High Commissioner for Refugees and then various U.S. federal agencies and departments. Nonetheless, it's disgustingly intolerant to suggest that religion or ethnicity serve as screening criteria. While some acts of terrorism have been carried out in the name of Islam or by adherents to the same, dangerous radicals represent a tiny proportion of the world's Islamic population. Instead, counterterrorism experts use forensic interviewing techniques to gather background information, comparing refugees' answers to one another and to existing documentation. In addition, priority is given to vulnerable populations such as single mothers, orphaned children, and special-needs individuals.

As Gov. Cuomo sought to remind us, this country was founded by immigrants, and has accepted waves and waves of people fleeing various hardships in their home countries - the Irish potato famine of the mid-1800s, poor economic conditions in Japan in the late 1800s, persecution of Jews in Eastern Europe in the early-to-mid-1900s, and the Islamic fundamentalist revolution in Iran in the 1970s, to name just a few. Not having been alive during any of these eras, I have only secondhand accounts of the political and cultural climate of those times. I'm sure that those whose families had been in this country for more than a generation feared that new immigrants might encroach on their jobs or other economic opportunities, and that certain groups were inherently fearful of others. Human beings are naturally wary of the unfamiliar, which unfortunately includes people from other ethnic and cultural backgrounds.

This tenor is magnified now that we live in an age of constant information. Perpetual access to news media has the potential to make us more informed, but it also leads to the spread of misinformation and unfounded fear. For example, certain headlines, politicians' and pundits' quotes, and popular Facebook posts make it sound as though the United States offered to accept Syrian refugees in direct response to the Paris attacks. In fact, the aforementioned policy geared towards refugees has been in place for several years, and (as also mentioned above) prioritizes asylum for those who are most in need of help. Up-to-the-minute reporting has also created a climate in which news outlets rush to report stories as they break. Most regrettably, this often leads to sensationalist headlines which spread incorrect information before it can be corrected. In this particular instance, some reports initially declared that one of the Paris attackers was a Syrian refugee; later it was clarified that the individual was carrying a forged passport.

Finally, the news media tend to make anything a politician says into a short quote or sound bite that can be taken out of context and interpreted in a variety of ways by the lay public. The ongoing Presidential primary season has provided an unnecessary platform for overly simplified grandstanding by candidates who see an opportunity to look "tough on terror". Immigration is a serious and complex area of policy. The answers are not as simple as "build a wall to keep out the Mexicans" or "don't let in any Muslims"; nor does support of refugees equate with reckless and unfettered immigration. It's unfortunate that people are able to skim a few headlines or short blurbs and buy into such overly simplified rhetoric.

If I could ask anything of my fellow Americans right now, it would be that we react with compassion first. Xenophobia and isolationism will only breed more hatred and fear. Relatedly, I'd urge everyone to remain skeptical about what you hear on the news and read on Facebook. The loudest voices are not always right, nor as self-assured as they may seem. Investigate further and give both sides of an issue their due. Even if you're firmly against a certain policy or issue stance, we'll be a better society if people are educated about the issues behind their views, instead of merely parroting overly simplified quotes from talking heads.

Thursday, November 12, 2015

Tip of the Xmas-berg

By now, worldwide news coverage has been devoted to backlash over Starbucks' unveiling of plain red cups for the winter holiday season. Here in New York, Roosevelt Field Mall rolled out a new winter-themed display featuring glaciers and snowmen instead of Santa and reindeer, then bowed to consumer pressure to restore Christmas-themed elements.

These instances of public outcry seem gross overreactions, to say the least. Though it changed the design, Starbucks continued its annual custom of offering a different cup - not to mention specialty drinks! - come November. People's outrage would seem somewhat more justified had the company withdrawn any and all merchandise geared towards the upcoming holiday season. Similarly, the mall endeavored to offer a seasonally themed display; its management could very well have decided to forgo any special decorations altogether.

As a lawyer, I find it amusing to hear people frame these arguments in terms of their "right" to religious freedom. Equating the removal of quasi-religious symbols with an infringement of rights is absurd for several reasons. For one, privately held businesses have no legal obligation to offer religiously celebratory decor. The First Amendment reads: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof". Moreover, the amendment plainly provides just as much protection of religion as it does from religion. That is to say that people are equally justified in being offended by the presence of religious symbols as by their absence.

Finally, it should go without saying that these pseudo-controversies involving holiday decor are not what the drafters of the Constitution had in mind when they wrote the First Amendment. In recent years, numerous issues involving religion have arisen which bear actual import to a functioning society.

Individual humans are certainly entitled to their reverence for a make-believe gift-giving fat guy in a red suit; the Constitution does not require that every mall in America provide such a man for photo opportunities. People who are offended by the lack of overt Christmas displays don't need to shop at such establishments. Or they can simply use the mall to fulfill its intended function as a place of retail, and engage in festive/seasonal/religious practices in their homes and churches. And if people care that much about the cup their coffee is served in, Starbucks invites them to bring their own - to the tune of a ten-cent discount. Furthermore, the coffee chain continues to sell various Christmas-themed merchandise, including an Advent Calendar, stuffed Santa bears, candy cane mugs, ornament tumblers, and mug ornaments. If the company were really as "anti-Christian" as some critics have asserted, wouldn't these offerings have been pulled as well?

In the interest of full disclosure, this blogger is not of the Christian faith, and partakes in celebrations of Christmas only to the extent that she is invited to other people's homes to exchange gifts and drink egg nog. From this perspective, it is easy to roll one's eyes at the disappointment associated with the above events. True religious persecution still exists in numerous parts of the world, and in our own nation discrimination often inhibits the practice of certain faiths. In light of these circumstances, it's disheartening to see so much attention paid and energy devoted to superficial matters.

But if it means shorter lines at Starbucks (whose coffee this blogger would drink out of an old shoe, if necessary), maybe it's not such a bad thing. 

Friday, May 1, 2015

To freeze, or not to freeze...

For any parent - stay-at-home, working, or something in between - issues of work-life balance loom large. This can be especially true for women, who may feel pressured towards "traditional" duties of homemaking and child care while simultaneously yearning to pursue a career as modern society encourages. As everyone tries to find their individual recipe for harmonizing employment with domestic life, discussions about the family-vs.-career dilemma loom large in the media as well. 

A few weeks ago, I came across an opinion piece by CNN's Kyra Phillips, which lauds the advent of egg-freezing and other reproductive technology as empowering career-focused women. The overall message of her essay is inarguable - scientific medical advancements make it easier for women to bear children later in life, enabling them to use their 20s and 30s to advance their workplace goals. However, she displays an unnecessary level of condescension towards others who've decided to start families at the expense of promotions. Further, she insinuates that any woman who practices work-life balance is personally responsible for holding back workplace equality for women in general.

Contrary to her opinion, some women welcome the chance to bear children in their late 20s or early 30s, and enjoy taking time away from work in order to raise those kids. Others need to work to support their households, but strive tirelessly to make time for their families instead of devoting every possible moment to their jobs. This blogger refuses to view such choices as "blowing it" with respect to an individual career, or as somehow stunting the progress of women's careers at large.


There are plenty of good reasons to freeze one's eggs, utilize other means of assisted reproductive technology, or simply decide not to have children at all. But any such decision involves reasoning unique to the individual. The world is best served when people make choices that fit their specific circumstances, instead of simply running to freeze their eggs because some TV reporter is concerned about setbacks to feminism in the workplace. In this blogger's opinion, the most feminist path a woman can choose is her own - whether or not it includes the corporate ladder.

Wednesday, April 1, 2015

Clean Cooking Crusade

This is the third in a series of articles about the food industry and its impact on our lives.

For most of my life I’ve been trying to eat healthy, and I think I’ve done a good job of avoiding certain pitfalls. In food pyramid order, I mostly stick to whole grains instead of refined flour, I eat fresh fruits and vegetables daily, my portions of meat tend to be small, I aim for low-fat dairy, and my family rarely consumes fast food or sweets. But, there’s always room for improvement, especially when it comes to minimizing processed foods. In eschewing traditional “diets” as a recipe for deprivation, one of the experts in Hungry for Change advises simply to begin adding good foods to your diet, since doing so “eventually crowds the bad stuff out.”

For me, this has equated to using up the “bad stuff” in my refrigerator and pantry, and figuring out what to replace them with.

  • No more canned soups, with their high sodium content and long list of chemical preservatives. I actually like making my own soup; the challenge will be finding more recipes and time to make large batches.
  • Goodbye, condiments. Mayonnaise, mustard, ketchup, and Asian-style marinating sauce all contain various cocktails of hydrogenated oils, chemical preservatives, HFCS, and the like. Hummus is probably safe, but I’m wondering if it’s okay to stick with store-bought or if I should try making my own.
  • Replacing meat on weekdays with beans and nuts. Hummus and nuts are just as tasty on salad as chicken (please tell me hummus is safe).  
  • Replacing mass-produced meat on weekends with cage-free, grass-fed, sustainable choices. This honestly might be a matter of budget, and researching which brands are truly an improvement when it comes to health and sustainability.
  • Snack foods. It’s enough that I have to choose quinoa over processed pasta. The idea of never having a cookie or cracker again feels excruciating. Finding healthy alternatives for snack foods may simply consist of eating even more fruits and vegetables as snacks.
  • Caffeine, my first love (sorry, Hubs). The struggle to eliminate caffeinated beverages from my life is ongoing (and probably deserves its own article).

Of course, I need to reconcile my own eating goals with my family’s. My lovely husband is lucky enough to have the same metabolism he did when we first began dating – at age 17. This has enabled him to keep eating ice cream sandwiches, chips, and soda as we enter our thirties. He is also blessed with self-control, especially when it comes to portioning less healthy foods. Whereas I still eat walnuts directly out of the bag until I’m full (or bored), he places a handful of chips onto a plate and puts the bag away. Years before my healthier-eating experiment began, he’d banned gallon-tub ice cream in the home because he realized how easy it was to eat too much at once. Buying individually wrapped ice cream sandwiches (Klondike Oreos are his favorite) has done the trick as far as allowing him to eat one per day. It also prevents me from eating any at all, since I can’t fudge the nutritional info of exactly one serving anymore. (One cup of ice cream? That’s like two scoops, right?) Hubs also abhors condiments, the use of which has been shown to encourage overeating (e.g., we keep eating chips until the dip runs out). This has made him amenable to some of the changes – namely the switch from Asian marinades to a lemon or lime juice-based garlic rub for chicken. The incorporation of quinoa into some meals has also gone over well ever since I figured out the perfect combination of garlic, onion, and herbs to sauté it in.

I do realize I’m lucky to have a few advantages in this endeavor: 1) I’ve always enjoyed cooking my own meals, and 2) right now, my schedule and budget allow me to go to the grocery store and prepare said meals. After last week’s visit to our local produce market, I realized I had bought enough fruits and vegetables for a week for the same price as one restaurant meal. That alone is reason to celebrate taking a step in the healthiest direction.

Once home from the store, I happened upon this article. (The lovely reporter went to college with my husband). The story features Congresswoman Debbie Wasserman Schultz (D-FL) and her own quest for healthier cooking and eating. The well-known Congress member and Democratic National Committee (DNC) chairwoman describes how her breast cancer diagnosis and treatment forced her to examine her subpar eating habits and to replace them with a more conscientious “clean cooking” diet. She acknowledges that doing so entailed a big commitment, calling it “a lifestyle” unto itself and admitting that she’s stayed up into the wee hours of the night in order to prepare her healthy meals.

In the accompanying video, Schultz lauds the clean cooking community of Instagram for inspiring and supporting her efforts. I find it amusing that someone who works alongside 534 other elected representatives (not to mention support staff and other Congressional personnel) had to cultivate like-minded foodie comrades online. Since then, she has shared photos of her culinary creations with the world and attempted to spread the word on clean cooking among her colleagues.

Unfortunately, her predicament as a healthy food pioneer is a reflection of American society at large. Most adults in this country give in to the ease of take-out meals and prepared foods while juggling busy work and family schedules. Life can be a whirlwind of stress no matter one’s circumstances, and a lot of folks feel it is impossible to go out and buy groceries and then turn them into healthy meals. But the tradeoff of time-saving foods often comes back to haunt us in the form of chronic disease requiring medical intervention. I hope that more people follow the Congresswoman’s lead, sacrificing at least a little bit of their week to shop for groceries and cook at home. If she can manage it, we can all at least give it a try, can't we?



Friday, March 6, 2015

Prescription for disaster

It's hard to turn on the TV or open a magazine without seeing an advertisement for pharmaceutical products of one type or another. It's even harder not to be amused by the litany of warnings ("Don't take this medication if you take nitrates for chest pain, if your name begins with the letter F, if you cannot sit or stand for thirty minutes…") and possible side effects ("including but not limited to headache, temporary memory loss while staring at waterfalls, dry mouth…"). As an attorney representing plaintiffs in pharmaceutical products liability cases, my job has included reading package inserts that go into even more detail about the potential dangers of a medication and its indications for use. I've also pored over hundreds of pages of materials describing the process of clinical trials and Food and Drug Administration (FDA) review prior to approval. If anything, it made me wish that the drug companies had to be more thorough in their TV and print ads. 


In fact, that used to be the case. Direct-to-consumer advertising has been legal in the U.S. since 1985, but was initially restricted to print media due to the stringency of FDA regulations regarding content. Back then, an ad was required to convey every risk detailed in the drug's packaging, as well as a "fair balance" of information concerning both these risks and the benefits. Television commercials became popular in 1997, when the regulations were relaxed to require merely an "adequate provision" of the "major risks." Over the next ten years, pharmaceutical advertising budgets exploded, topping $5 billion in 2006 and 2007. 


Accompanying this boom has been the seeming invention of diseases purely for drug marketing purposes. One of my favorite health documentaries, Forks Over Knives, includes discussion of the ever-marketed erectile dysfunction (ED); namely, that ED itself is not a medical disorder but rather a symptom of other health problems, including heart disease and diabetes. Dr. Terry Mason, a Chicago public health official, suggests that instead of using medication simply to improve sexual function, ED sufferers should change their diets to alleviate the underlying maladies and improve overall health, including "below the equator".


On other occasions, pharmaceutical companies have marketed their products for completely different disorders than the ones they were intended to treat, and unlike the discovery of penicillin, it’s no happy accident. The New York Times reported on Shire Pharmaceuticals' recent efforts to promote awareness of binge-eating disorder. The company recently received FDA approval to market one of its preexisting medications in advance of releasing a new drug to treat the disorder, which was officially recognized by the American Psychiatric Association in 2013. The drug, Vyvanse, had previously been approved to treat attention deficit hyperactivity disorder (ADHD); as a result, it was approved for its new indication without FDA advisory committee input. Nonetheless, medical professionals remain concerned that the drug - a powerful amphetamine - will be overprescribed and misused. And with good reason, considering the FDA charged Shire with improperly promoting Vyvanse and Adderall by downplaying their addictiveness and touting questionable benefits.


This cycle of activity gives rise to suspicion about the merits of identifying every quirky behavior as a disease or disorder. Then again, far be it from me to pass judgment on the legitimacy and seriousness of a medical condition, since I'm not a doctor.

But that's just it - hawking pharmaceuticals directly to the people undermines the ability of medical professionals to treat our ailments with a variety of tools. It's human nature to look for the easiest way to accomplish a goal, and pharmaceutical drugs easily present themselves as a quick-fix for numerous conditions. One psychiatrist expressed to the Times his concern that, as effective as Vyvanse may be, talk therapy and other treatments have been the subject of more research. Nonetheless, until there's big money in marketing visits to a therapist, drug companies will continue to tell us that pills are the answer for all of our problems... and what those problems are in the first place.

Given my prior experience, I have a bit more to say about the pharmaceutical industry and its influence. For instance, I've been to FDA advisory committee meetings, and can't say for sure that holding one would have made any difference in the decision to expand approved uses of Vyvanse. For now, though, I'd love to hear your thoughts on this new development. Am I unnecessarily concerned about the perils of pharmaceutical advertising? Should we simply be grateful that pharmaceuticals can help us manage our medical problems? Sound off in the comments, or email me at nylawmom@gmail.com.

Monday, February 23, 2015

Big Brother is watching...what you eat!

This is the second in a series of articles about the food industry and its impact on society.

As part of my recent quest to eat healthier foods, I began consuming (with my eyes and ears) a few videos on healthful eating. Hungry for Change emphasizes how eating processed and refined foods deprives us of necessary nutrients while overfilling our bodies with more calories than we need for our mostly sedentary lifestyles. Meanwhile, it goes on, crash diet culture tells us that avoiding certain foods for a short time will allow us to lose weight, which usually doesn’t work. The panel of contributors advocates instead that we transition to a healthy lifestyle including mostly unprocessed whole foods, sufficient rest and exercise, and generally surrounding oneself with positivity.

Forks Over Knives explains how food is responsible for the poor health of many Americans, most of whom then seek cures via pharmaceutical drugs and surgeries. The doctors featured have found themselves shunned to the fringes of the medical community despite their patients’ ability to overcome serious illnesses by eating whole-food, plant-based diets.

I also watched a few TED Talks compiled as Netflix’s “Chew on This” series, feeling utterly informed and empowered by Jamie Oliver’s quest to improve nutrition in schools and surrounding communities, Dan Barber’s concerns about sustainable fish, Mark Bittman’s lamentation over current societal norms when it comes to food, and Graham Hill’s strategy of weekday vegetarianism.

One central theme of particular import to this blog is the notion of unhealthy eating as an overarching societal problem. On the one hand, hearing the numbers – how many Americans suffer from obesity, heart disease, diabetes, high blood pressure; how much meat we eat per year; how much of the rain forest has been destroyed because of how much meat we eat per year – gave me an overwhelming sense of guilt. But almost immediately, my skepticism kicked in and I grew indignant, wary that I was being led into some kind of trap. “That’s just what they want me to feel! Why should I stop eating meat and dairy in moderation because so many other people overindulge and need quadruple bypass surgery?” For my own reasons, I eventually settled on trying to eat healthier. Moreover, personal quests for healthy eating aside, I came to realize that the collective problem of lousy nutrition and our failing health is exactly what concerns me as a consumer protection attorney.

Numerous factors have conspired to result in our present state of eating as a society. As I discussed previously, the food industry has relied on corn as its staple input due to its abundance and the low cost of using it to make a variety of products. One of the documentary food experts went so far as to say that “we’re not eating food anymore; we’re eating ‘food-like products.’” In his TED Talk, Mark Bittman discussed early-20th century cuisine, wherein dishes didn’t have “ingredients” because you were simply eating one food at a time. Contrast that with today’s supermarkets, their shelves stocked with boxed foods containing any number of nutritional (not to mention chemical) components.

Because they’re made on a massive scale with many synthesized ingredients, packaged creations are relatively inexpensive to produce. This makes them cheaper to buy than fresh and unadulterated fruits, vegetables, grains, meats, etc. Additionally, processed foods are heavily marketed to intended customers, and sold in portions and quantities that encourage mass consumption. In conversations it’s typical for people to talk about being “addicted to” certain foods, describing how many cases they have in their basement or the size of the drink they had at lunch.

Likely as a result, the last several decades have seen our food culture infiltrated by numerous “diet” or “light” products, marketed as healthful alternatives to original versions of processed foods. “Eat this and you’ll be healthier!” the products seem to say. But is that really true? To make a low-fat version of certain processed foods, the manufacturer simply swaps the fat content for more sugar, making the fare less filling and less nutritious. Other products are advertised as low-sugar, but contain artificial sweeteners. These chemically-constructed sugar substitutes have been shown to alter the way the brain processes sweetness and therefore wreak havoc on one’s ability to feel satiated. (I’ve written about this before, here.) In addition, the branding of something as “diet” or “light” has the psychological effect of giving an eater license to consume as much of something as he or she wants. I grew up in a diet-soda, fat-free yogurt, light bread, sugar substitute (Equal was our brand) household. On one hand, my parents were both trying to watch their weight and therefore sending a good message about being healthy and taking care of oneself. On the other hand, I wish that I’d been taught to eat healthy portions of original/regular/full-fat versions of processed foods, alongside the wholesome fresh foods we kept in the fridge.

The proliferation of light and diet foods is part of a cultural movement wherein certain components of food have been demonized. Jerry Seinfeld has a bit about people’s bewilderment with grocery store ingredient panels: “There’s fat! In it! It’s going to be in me!” But where did we as a society get this idea that fat – and carbohydrates, and cholesterol, and sugar – is bad?

My experience as a pharmaceutical products liability attorney left me wondering about the role the Food and Drug Administration (FDA) plays in influencing the food industry. What I didn’t realize is that a more complicated array of government agencies and committees gathers and disseminates information about diet and nutrition. Just this week, the New York Times reported recent findings of the Dietary Guidelines Advisory Committee (DGAC). The committee has met every five years since 1985, issuing recommendations to the Department of Health and Human Services (HHS) and the Department of Agriculture (USDA). These agencies then issue dietary guidelines which are used as the basis for public school lunch programs and national food assistance programs.

Subcommittee topics at the recent meeting touched on current trends in food and nutrient intakes, dietary patterns and health outcomes, the impact of physical activity, and food sustainability. The committee’s recommendations included reduced consumption of added sugars, as well as limiting salt and saturated fats. Further, social media was set ablaze by the DGAC’s suggestion that dietary cholesterol isn’t as harmful as once thought. This shunning of cholesterol – which began in the 1980s – is partly responsible for decreased consumption of red meat, eggs, and butter. In turn, people increased their intake of grains and other processed foods. Despite our collective avoidance of the fatty foods, problems like heart disease and obesity have only increased; hence the reconsideration. In a related sign of progress, the DGAC suggested deemphasizing individual nutrients in favor of an overall healthy pattern of eating.

Past committee activity has been clouded by the appointment of members with ties to key sectors of the food industry. At least one legal expert has asserted that the USDA shouldn’t have any nutritional advisory influence at all, since its primary objective is to support and promote the national agriculture industry. While HHS is capable of developing appropriate dietary guidelines, he goes on to suggest that the Centers for Disease Control and Prevention (CDC) would be better suited to collaborate in formulating the recommendations. In addition, stricter limitations are needed to prevent conflicts of interest amongst committee members – even if they only stand to benefit indirectly.

Until these reforms are implemented, it seems that we’re cautioned to take the HHS/USDA findings with a grain of hypertension-inducing salt.


Monday, January 26, 2015

Food Policy Post #1

It feels like every few months there’s a news item about a food ingredient or additive that isn’t what it seems, or has some property newly revealed to be controversial.

The offending ingredients are many and varied. Last summer, word got out that a certain brand of yogurt was giving its strawberry-flavored products a nice red color by using a dye made from insect parts. Predictably, the teaser description on every news broadcast preyed directly on the sentiments of the ever nutrition-conscious public by saying something to the effect of “Coming up, are there bugs in your yogurt?! Find out after this break!” Many of the headlines covering the issue were similarly sensationalistic, proclaiming that the company “[uses] insects to make yogurt.”

Of course, the actual segments/articles reporting on the issue backed off a bit in conveying the more complete (and much less noxious-sounding) details: the company uses carmine, a dye made from the bodies of cochineal insects, in certain of its yogurt offerings.

I’ll admit that my initial reaction was something akin to “Ew, really?!” But then I wondered whether such revulsion is warranted. We eat all sorts of other animals and animal by-products, but so-called “civilized” Western society has made it taboo to consume anything smaller than a tennis ball if it used to crawl, fly, or slither around. I’m definitely not in favor of New York City cockroaches being added to food for extra crunch. But the story got me thinking about the health implications of the “bug-dye” as compared to other synthetic additives.

A quick glance at any nutrition label reveals certain ingredients which sound like actual food, and others that seem more fitting for the shelf of some mad scientist’s windowless basement laboratory. In what came as something of a surprise to me, a lot of these chemical-y sounding substances are actually derived from corn. How bad can something be if it’s made from a vegetable? Pretty bad, it turns out, considering how extensively these inputs are processed and adulterated before they’re used in making numerous foods.

The fact that a food or food input consists of substances found in nature doesn’t necessarily make it ideal for human ingestion. A good portion of Michael Pollan’s book The Omnivore’s Dilemma discusses how corn is the basis for most commercially-produced foods, but “as far as our bodies are concerned... it’s still a biological novelty.”

The most controversial corn-derived food additive of recent years is, of course, high-fructose corn syrup (HFCS). Critics are quick to point out the correlation between the advent of the substance and the meteoric rise in its consumption by the average American (from zero to 60 pounds per year).

In combating the criticisms directed towards HFCS, the corn lobby points out that the product “contains no artificial or synthetic ingredients or color additives and meets the FDA requirements for use of the term ‘natural.’” While this may initially seem like an endorsement of sorts, the FDA has never actually enacted a formal definition for “natural” foods. This has less to do with a lack of oversight than with the subjectivity of such a term. After soliciting public comment on the issue in 1993 and receiving widely differing suggestions for formulating a definition, the agency opted to avoid rulemaking and simply reiterated the status quo. Namely, the FDA deemed “natural” to mean “that nothing artificial or synthetic (including all color additives regardless of the source) has been included in, or has been added to, a food that would not normally be expected to be in the food.”

The result is the following disclaimer appearing in the “Food basics” area of the agency’s website:
"From a food science perspective, it is difficult to define a food product that is 'natural' because the food has probably been processed and is no longer the product of the earth. That said, FDA has not developed a definition for use of the term natural or its derivatives. However, the agency has not objected to the use of the term if the food does not contain added color, artificial flavors, or synthetic substances."
This loose definition has most recently come under scrutiny as pertains to the use of genetically modified organisms (GMOs). These are usually plant-based ingredients that have been bioengineered at a molecular level to achieve a desired outcome or characteristic. Food producers argue that use of GMOs does not preclude the use of “natural” labeling, since such ingredients fall outside the above prohibitions; concerned citizens counter that genetically modified ingredients have been altered from their natural state in a way that requires disclosure. So far, the FDA has punted on this issue as well, citing the absence of data indicating that GMOs or bioengineered ingredients pose a threat to food safety. The agency has developed draft guidance, recommending mere voluntary labeling for such products.

Several California plaintiffs have sued General Mills, alleging deceptive marketing of certain products marketed as “100% natural” despite the presence of genetically modified ingredients. General Mills has countered that the subjective definition of “natural” undercuts the plausibility of the plaintiffs’ claims. The company argued that the labeling is simply marketing puffery, and not actionable since a reasonable consumer would obviously be aware that granola bars are processed in a factory and are not found in nature. However, the judge found that the “100% natural” claim is not merely a general assertion, and could lead a reasonable consumer to believe that the products contain entirely natural ingredients. The opinion references a Federal Trade Commission directive warning marketers that they must be able to substantiate representations of “natural” product composition.

Nonetheless, most regulation and case law also recognize that context matters. For example, marketing of “premium all-natural flavors” has been ruled a generalized assertion, insufficient to support a claim of deception regarding a product’s health benefits. In a suit involving the supposed natural composition of ice cream desserts, the court held that “no reasonable consumer is likely to think that ‘Original Vanilla’ refers to a natural ingredient.”

On a somewhat more troubling note, certain substances are apparently FDA-approved additives despite outright knowledge that they cause negative health effects. In The Omnivore’s Dilemma, Pollan describes how a popular fast-food meal item contains small amounts of an anti-foaming agent (which keeps bubbles from forming during frying) known to be a carcinogen. The same item also contains a preservative which is derived from butane and deadly in slightly larger amounts.

From my experience in pharmaceutical products liability, I’m aware that the FDA tolerates a certain threshold of side effects and adverse events when approving the production and sale of drugs and medical devices. That said, pharmaceuticals are utilized under the supervision of a doctor by individuals experiencing a specific health problem - ideally one which poses greater risks than those presented by the medical remedy. Food, by contrast, is something we all need on a regular basis. Hence, to me, it seems slightly unreasonable to allow even trace amounts of anything known to be harmful in larger quantities.

Regardless, the food industry is a capitalist enterprise, and thus its participants are concerned with costs first and our health second. HFCS, for example, is cheaper to produce and use than cane sugar, which has allowed beverage makers to produce more soda at a lower cost, encouraging consumption. Thus, even if HFCS isn’t necessarily any “worse” than other types of sugar, its advent has contributed to increased intake of processed, high-calorie drinks and the corresponding increase in negative health outcomes associated with the same. As Pollan’s investigation reveals, large-scale commercial food processing entails breaking foods down into their component parts and reassembling them in a manner that encourages consumption in large quantities. The consequence of this scheme is the increased incidence of Type-II diabetes, obesity, and other effects of habitual overeating.

The backlash against HFCS has also given way to inquiries into the effects of artificial sweeteners. Another news item from the last several months declared that beverages sweetened with chemicals like aspartame, sucralose, and saccharin may actually increase the risk of Type-II diabetes, obesity, cardiovascular disease, and other metabolic dysfunction.

The underlying 2013 article, published on July 10 in Trends in Endocrinology and Metabolism, is a review of various studies involving artificial sweeteners spanning the past several decades. One such study implicates “a type of cognitive process in which knowledge that an artificially sweetened beverage that is perceived to be ‘healthy’ grants permission to over consume other ‘non-healthy’ foods” - i.e., people who consume artificial sweeteners wind up overcompensating for calories saved by consuming foods that are less nutritious overall. Other studies indicate that regularly ingesting artificially-sweetened foods alters the brain’s ability to connect sweet tastes with caloric intake, which leads people to overeat because they don’t feel sated by reasonable portions.

The most recent development in the evolution of sweeteners involves low- or no-calorie products which promote their origination in nature as a testament to their health and purity. Some are derived from stevia, a South American plant. Interestingly, the “Food basics” section of the FDA’s website notes that neither whole-leaf stevia nor extracts from the plant have been approved for use in the United States, due to concerns about the sweetener’s effect on the reproductive, cardiovascular, and renal systems, and on blood sugar control. (Considering that the FDA allows trace amounts of carcinogens in our fast food, these concerns must be serious.) One brand advertises stevia as the main ingredient in its “natural” zero-calorie sweetener. Meanwhile, its website reveals that erythritol, a type of substance known as a “sugar alcohol”, is actually what gives the product its “bulk.” Sugar alcohols occur naturally in fruits and other plants, though they obviously must be added to any processed foods. They are lauded for their lesser impact on blood sugar levels as compared with ordinary sugar, the result of their passage through the body without absorption. Their chemical structure also means they don’t contribute to tooth decay.

One drawback to these chemicals is digestive side effects when consumed in excess. In addition, diabetics need to avoid being lured into a false sense of security, since sugar alcohols do influence blood sugar, albeit to a lesser extent than “regular” sugar. Among this family of chemicals (which also includes xylitol, sorbitol, and maltitol), erythritol contains the fewest calories, exerts the least influence on blood sugar, and is thought to cause the mildest digestive side effects.

On the one hand, ingesting a substance that simply passes through the body without really being absorbed or processed sounds just fine, since something of that nature seemingly wouldn’t cause any negative effects. On the other hand, though, nutrition science suggests that our bodies function best when they take in unadulterated “whole foods,” making the ingestion of these chemicals seem decidedly unnatural. Add to this the fact that these next-generation artificial sweeteners are still in development with unknown long-term effects, and it’s enough to make me skeptical.

Lest this discussion touch only on additives designed to make us overeat artificial or highly-processed cuisine, the food industry has also taken to “fortifying” certain products with added ingredients such as vitamins, minerals... and even other foods. Pollan writes of a food industry article titled “Getting More Fruits and Vegetables into Food.” “I had thought fruits and vegetables were already foods,” he observes, “and so didn’t need to be gotten into them.”

This practice strikes me as less noxious than adding artificial chemicals or chemically-altered ingredients to foods, but it too goes against the whole-food philosophy. It actually reminds me of a scene in Ocean’s Eleven (2001), in which Brad Pitt’s character, Rusty Ryan, approaches Carl Reiner’s character, Saul Bloom, at a Florida greyhound track. The two have not seen each other for some time, and while they’re catching up, Saul takes out and begins peeling an orange.

Rusty: What’s with the orange?
Saul: My doctor says I need vitamins.
Rusty: So why don’t you take vitamins?
Saul: You come here to give me a physical?

Rusty has a point: if Saul needs more vitamins, the easiest way to get them would be in pill form. However, my own experience has been that doctors are moving away from the “everyone should take a multivitamin” mantra and towards one that encourages a balanced diet of unprocessed, nutrient-laden foods. Hence, I’ve always believed that Saul’s doctor felt the same way and recommended the orange as a source of vitamin C. A personal trainer at my gym also pointed out to me that the most commercially popular vitamins aren’t worth taking. His rule of thumb is that if you can grind the pill up into a fine powder, its contents are too processed to be absorbed by the body. He recommended sticking with a grainier-looking pill... preferably one that smells bad.

Thus, amidst all of this manipulation and confusion about our food, the burden has been placed on the consumer to read ingredient lists and avoid any undesirable foods at their own initiative. While this works with products containing known allergens or other flagged substances – peanuts, shellfish, phenylalanine, etc. – a larger issue is that many people simply don’t pay attention to labels generally, taking for granted that every component of every food approved for sale has been sufficiently tested, vetted, and otherwise confirmed for safety.

Going beyond this, toward an assessment of whether these ingredients deemed “safe” for consumption are ideal for us, requires a level of research and attention that most of us simply don’t have time or patience for. I was only marginally cognizant of these concerns prior to becoming pregnant, but there’s nothing like the news that you’re growing another human being to magnify the importance of what, exactly, is going into your body.

Some choices really felt like a pick-your-poison dilemma. Foods with preservatives in them sounded unappealing because of my instinct to avoid extra chemicals in food, whereas fear of listeria poisoning made me nervous about eating the local, organic, and therefore dirt-encrusted lettuce I decided to bring home one day on a whim. A diagnosis of gestational diabetes also pushed me toward the dreaded artificial sweeteners, which I would have avoided but for the need to limit real-sugar intake. At other times, I found myself avoiding sweeter foods (regardless of how they got that way) in favor of more savory, fat-based choices. The knowledge that the apple was “better” for me in the long-term (less processed, no cholesterol, no fat) than the cheese was little consolation against the reality that fruit would elevate my blood sugar and count against my daily carb allotment.

My skepticism about the regulation of commercially available foods notwithstanding, I’ve generally found it easy enough to rationalize that any products on supermarket shelves must meet some minimum standard of health and safety. Up until now, I’ve fallen into the typical consumer behavior of making my grocery store selections based on what’s on the front of the box, or after a quick glance at the fat/calorie content, without scrutinizing the full list of ingredients. But, now that my child is eating solid foods, I’m especially cognizant of reports advising against consumption of preservatives, pesticides, and trans fats. I recall growing up in a culture and household which espoused the idea that diet soda was "better" for us, even though it turns out society had only limited insight into the safety of artificial sweeteners. And I wonder what we're eating now - and feeding to our children - that we'll later learn is extremely harmful.

The neurotic part of my brain feels that the only safe option is to avoid all commercially processed foods, or at least those with unpronounceable or mysterious ingredients. It being a new year and all, I’ve decided to really scrutinize the labels of foods – that is, read the packaging and avoid buying anything that seems overly processed. Do any readers have similar goals? Or other concerns about diet and nutrition? Since there's definitely more to say about food and policy, I look forward to reporting back (and replying to any comments!) very soon.