If your brand has anything to do with food, the last place you want it to appear is the Food Poison Journal. But that’s exactly where Chipotle Mexican Grill recently found itself, prominently placed in a headline confirming its most recent E. coli outbreak.
Thirty-nine people in Washington and Oregon came down with E. coli O26 after eating at a Chipotle restaurant in late October. Twelve were hospitalized. The source of the outbreak has yet to be verified, but experts suspect tomatoes. Chipotle shuttered 43 stores and tossed all remaining ingredients into the trash, patting itself on the back for its “abundance of caution.”
The problem with this response, though, is that Chipotle—whose defining creed is “food with integrity”—has assured consumers that an “abundance of caution” was integral to its mission from the start. Chipotle’s much-touted cautionary approach has underscored such definitive moves as banning genetically modified organisms and supporting locally sourced produce. Thus the “fast casual” alternative has been able to transform a burrito—as one of its advertisements proclaims—into a “food-culture changing cylinder of deliciousness.”
Read more here.
As a writer who covers animal issues, I routinely get alerts from public relations firms seeking ink on the case du jour of animal abuse. These press releases typically detail horrific instances of cruelty—piglets being flung to the ground and tossed into the trash and the like. Earlier this month, though, I was brought up short by an unexpected subject line in an email from one of these doomsday firms.
It read: “Did a Monkey Pick Coconuts for Your Coconut Water?”
The gist of the story is that macaques—nimble little monkeys—are evidently being bred and trained throughout Southeast Asia to scurry up trees, scamper across limbs, reach their tiny hands into clusters of leaves, pluck off bunches of coconuts, and deliver the goods to their human caretakers, who then manufacture and sell a variety of products, including coconut water, pulp, and milk.
As you’d expect, the monkeys excel at their job. Males typically retrieve upwards of 1,600 coconuts a day; females about 600. This is in sharp contrast to humans who, with our comparatively poor climbing skills, can harvest around 80. If the phrase “exponentially increased labor productivity” leaps to mind, you’re probably not alone. But the press release took a darker turn. It called the arrangement “monkey slavery.”
Monkey slavery? Seems a bit extreme. . . . Read more here.
A version of this piece ran in the New York Times in 2006.
This time of the year, the windows of America are beginning to be dotted with carefully carved jack-o’-lanterns, but in a week or so, the streets will be splotched with pumpkin guts. Orange gourds will fly from car windows, fall from apartment balconies, career like cannon fire from the arms of pranksters craving the odd satisfaction of that dull thud.
There are, to be sure, more productive ways to deploy a Halloween pumpkin. Post-holiday, composting is a noble option. A pumpkin grower in Wisconsin once turned a 500-pound Atlantic Giant into a boat.
But what we almost certainly won’t do is eat it. First cultivated more than 10,000 years ago in Mexico, the Cucurbitaceae were mainstays of the Native American diet. If for no other reason than its status as one of America’s oldest cultivated crops, an honest pumpkin deserves our reverence.
The current batches that will soon litter the pavement, however, are for the most part irreverent fabrications, cheap replicas inflated for the carving knife. Food in name only, they’re a culinary trick without the treat. For those of us who value America’s culinary past, smashing a generic pumpkin is more of a moral obligation than an act of vandalism.
During the colonial era, the pumpkin was just one squash among dozens, a vine-ripening vegetable unmarked by a distinctive color, size or shape. Native Americans grew it to be boiled, roasted and baked. They routinely prepared pumpkin pancakes, pumpkin porridge, pumpkin stew and even pumpkin jerky.
Europeans readily incorporated the pumpkin into their own diet. Peter Kalm, a Swede visiting colonial America, wrote approvingly about “pumpkins of several kinds, oblong, round, flat or compressed, crook-necked, small, etc.” He noted in his journal — on, coincidentally, Oct. 31, 1749 — how Europeans living in America cut them through the middle, take out the seeds, put the halves together again, and roast them in an oven, adding that “some butter is put in while they are warm.”
Sounds tasty. But one would be ill advised to follow Kalm’s recipe with the pumpkins now grown on commercial farms. The most popular pumpkins today are grown to be porch décor rather than pie filling. They dominate the industry because of their durability, uniform size (about 15 pounds), orange color, wart-less texture and oval shape. Chances are good that the specimen you’re displaying goes by the name of Trick or Treat, Magic Lantern, or Jumpin’ Jack. Chances are equally good that its flesh is bitter and stringy.
In contrast, pumpkins grown in the 19th and early 20th centuries — the hybridized descendants of those cultivated by Native Americans — were soft, rich and buttery. They came in numerous colors, shapes and sizes and were destined for the roasting pan.
The Tennessee Sweet Potato pumpkin looked more like a pear than a modern pumpkin and, as its name implies, was baked and eaten like the sweet potato. The Winter Luxury Pie pumpkin, first introduced in 1893, became so popular for pies that it posed a fresh challenge to the canned stuff. These pumpkin varieties, and scores of others, were once the most flavorful vegetables in the American diet.
Fortunately, the edible pumpkin is not completely lost. While akin to endangered species, heirloom seeds are only a few mouse clicks and a credit card number away. By growing heirloom pumpkins, you can have your jack-o’-lantern and eat it too. More immediately, you can search out heirloom pumpkins at some farmers’ markets.
Of course, advocating a shift in any holiday tradition seems like a futile exercise in a nation that (perhaps because we’re so young) takes its traditions rather seriously. But it’s not as if there’s much of a Halloween tradition to violate. Halloween is relatively new to America. The Irish brought the holiday to the United States in the 1840’s (and used turnips as jack-o’-lanterns). But Halloween didn’t become profitable enough for commercial growers to produce decorative pumpkins until the suburbanized 1950’s.
Edible pumpkins were driven near extinction in the early 1970’s when a farmer named Jack Howden started to mass-produce a firm, deep orange, rotund pumpkin endowed with thick vines to create a fat handle to hold while carving. The $5 billion a year industry that developed around Howden’s inedible creation is, historically speaking, still in its infancy.
And thus the “tradition” is ripe for improvement. Next year, let’s do something not so different. Let’s replace a fake pumpkin with a real one. The face you carve into it might be more distorted, and it might cost a bit more, but there will finally be a credible reason not to smash the thing at the end of the evening. And most important, as Peter Kalm observed back in 1749, we could once again split it open, roast it, add butter and remind ourselves that some traditions — like cultivating vegetables to eat — should never be destroyed.
Last month the United States Department of Agriculture and Environmental Protection Agency agreed to establish the “first ever national food waste reduction goal.” The program is notable not only for its ambition—it aims to reduce food waste 50 percent by 2030—but also for the diversity of its participants. An array of churches, corporations, charitable organizations, and local governments has been asked to play a role. The plan, anodyne though it may be, will surely get the lion’s share of (dull) media attention.
But the one relevant group that’s been overlooked has the most to offer when it comes to reducing food waste: freegans. Freegans encourage eating food sourced from various waste streams pouring from the cracks of an excessively abundant food system. They’re scrappy scavengers who frequent grocery store alleyways, restaurant dumpsters, uncleared food-court tables, and anywhere else that yields a free meal and keeps freegan cash out of Big Food coffers—which kind of explains why the USDA and EPA aren’t terribly impressed. Freegans, who root their lifestyle in 1960s Berkeley-ish activism, package themselves as a subversive social movement.
Precisely what kind of movement—anarchist? socialist? punk?—is difficult to say. The freegan manifesto, as it were, reads as if it were written by a precocious if rant-prone high-schooler. It describes freeganism as a “withdrawal from the consumer death culture,” observes that “working sucks!,” condemns “the all oppressive dollar,” and implores us not to sacrifice “humanity to the evil demon of wage slavery.” Couching the generic dumpster dive in this rhetorically shrill language, a “stick-it-to-the-man” posture that supports an “anti-consumeristic ethic of eating,” the freegan manifesto might inspire angrier souls to thrust a fist skyward. But, for the sober-minded reformer, it threatens to condemn the movement to a kind of self-imposed solipsism. This is, after all, America.
Still, we cannot afford to dismiss freegans. . .
My last three columns have explored philosophical defenses for eating animals. I’ve done this from the perspectives of utilitarianism, animal rights, and contractualism. My intention with this series has been, in part, to reiterate how difficult it can be to justify eating animals, but also to defuse the off-putting “total abstinence” dictum inherent in the vegan ideology. There is, after all, almost certainly moral space for consuming animals.
But it’s not necessarily an easy space to find. It’s often neither consistent with the way we currently source meat nor tolerant of a business-as-usual approach to agriculture. It may require radical behavioral changes as well as structural shifts that are pragmatically beyond our control. Ironically, given the current configuration of our food system, these changes may be so difficult to achieve that choosing veganism by default could prove to be the easier option.
That said, there appear to be legitimate ways to eat meat, ways that are consistent with the ethical principles that we rely on to guide us through life, and ways that the future’s food architects might consider accommodating.
Wendell Berry has famously declared eating to be “an agricultural act.” This phrase has become a rallying cry for an agrarian reform movement that seeks to know the sources of our food supply. But, perhaps even more so, eating is also an ecological act, an elemental behavior that extends beyond the local farm and the farmers’ market to the endlessly interrelated biotic community. It is from this latter perspective—deep ecology—that I want to suggest a fourth philosophical defense for eating meat.
Read about it here.
Perdue, the fourth-largest chicken company in the United States, is a giant among giants in the agribusiness world. Recently, it purchased Natural Food Holdings, which owns Niman Ranch, a niche meat producer known for its comparatively impressive welfare and sustainability standards.
News of Niman’s acquisition was generally greeted with the big media equivalent of a shrug, but I think it warrants a stronger, more appropriate reaction: Panic.
Niman was never perfect—its founder, Bill Niman, left the company when it outgrew his small-farm vision. But still, its 700-plus farmers working in 28 states maintain relatively close ties to the landscape, the animals they raise, and even the company that continues to set and enforce its standards of production.
To think that Niman farmers will be able to maintain these meaningful connections under Perdue stretches plausibility to the breaking point. Yet the New York Times’ brief report on the Niman purchase does just this. It suggests that the Perdue acquisition is evidence that Big Ag is finally embracing the gentler logic of small-scale, alternative agriculture. On the topic of animal welfare, it quoted (without offering a counterpoint) Jim Perdue as saying, “I think [Niman] can bring us a lot of new ideas.”
Please. Perdue’s entire corporate history is one of rejecting Niman’s new ideas. . . . Read more.
A recent study found that ants offer a better form of agricultural insect control than chemical insecticides. If indeed true, this finding would appear to be excellent news for the prospect of veganic farming. I therefore find the idea very exciting.
With predatory insects, farmers could grow plants for people to eat without exterminating other insects with toxic chemicals—something that’s routinely done today, even in organic agriculture. The only catch here is that we’d have to breed and deploy insects such as ants to do the dirty work that the chemicals once did. We’d have to, in essence, set one insect species up to slaughter another.
This drawback is only a drawback, of course, if we are inclined to grant insects status as sentient beings. If we do that, we are under a clear obligation to treat insects with the same moral consideration as pigs, cows, and chickens.
As such, we could not condone an arrangement whereby insects are, for all intents and purposes, domesticated in order to serve us as forced armies in the vegetable patch and fruit orchard. True, the slaughter would be sort of natural, but still, we’d be in the position of rigging slaughter to serve human interests, something that animal rights activists typically find anathema.
Fortunately, there’s little convincing evidence that insects are sentient. I thus see this recent finding as yet another reminder of why we should not grant insects sentient status. The prospect of doing so undermines the more achievable goals that animal advocates are trying to enact for animals we know for sure to be sentient and deserving of moral consideration.
In response to my last post, several readers have pointed out the prevalence of animal products in everyday consumer goods, as well as our myriad indirect associations with animal exploitation. My response? Aside from “thank you,” it’s this:
This reality you have duly highlighted, after all, only further supports the larger case that I’m making with the beef-fat-fuel example.*
And that case is this: given the ubiquity of animal products in the world around us, as well as the numerous ways in which our voluntary activities harm or kill animals, veganism as currently understood is less a clear moral baseline than a circumscribed choice to avoid animal products in relatively easy and accessible contexts.
Of course, this is not to say that we shouldn’t avoid those animal products in those relatively easy situations, or that we shouldn’t strive to do so in the harder cases as well. It’s only to say that if we engage in actions such as driving or flying—things we could give up but won’t because it would seriously put a crimp in our lives—we are, technically speaking, violating the spirit of veganism.
Now, one could say that the point here isn’t to be perfect but to do the best we can, always striving to be better, always recognizing the challenges posed by reality, always working toward the ideal. Well, amen to that!
But we have to recognize that this kind of approach to ameliorative social change closely associates veganism with religious belief, and that association makes it harder for vegan advocates to impose their agenda on others. (Plus, I think what vegans want—a recognition of the fundamental moral standing of sentient animals—transcends religion.)
In any case, just to clarify: it seems as if some readers are under the impression that I’m looking for an excuse to throw off the gloves of morality, gleefully poke holes in veganism, and eat meat. Not so.
So not so.
I’m just asking questions about the term vegan itself, the term we use to make sense of our moral regard for sentient animals, and questioning whether there is a better way to encapsulate the vegan ideology: a way that is more inclusive, less alienating, less cultish-seeming, and more tolerant of various personal processes.
That’s all that’s happening here.
*Which, in a basic way, is different from, say, leather seats on an airplane, or animal products in tires, in the sense that a plane is not reupholstered every time it takes off, and the tires on a bike are rarely changed, whereas fuel is an ongoing resource demand. I think this is a matter of degree with qualitative implications.
The fact that commercial airlines are preparing to use beef fat to help fuel aircraft is the kind of news that sends the eco-razzi into celebratory whirligigs.
It hardly matters that we’re looking at yet another meaningless example of “reduce, reuse, recycle” pomp to mask deeper problems that demand more systemic and radical solutions. It hardly matters that using beef fat (beef being one of the most ecologically damaging products on earth) to subsidize flying (flying being one of the most ecologically damaging services on earth) is like robbing Peter to pay Paul. At the end of the day it’s just another lovely, feel-good case of reducing waste, an act whose evidently inherent virtue makes the media go all loopy while obscuring the underlying, scolding question of why we rely so heavily on these goods and services (beef, flying) in the first place.
But that’s all high horse talk. Down in the streets vegans have a new and difficult question to ask themselves: will vegans fly in planes fueled by the animals we claim to do everything in our power not to exploit? I couldn’t help but notice an ominous dearth of commentary on this heavily covered media issue in the vegan blogosphere, though I can certainly understand the reticence. The prospect of every major airline supplementing fossil fuel with beefy bio-diesel is a real one, and if that possibility comes to fruition, vegans face yet another case of a terribly convenient aspect of first-world life—flying—that, while hardly necessary to existence, is something we’ll most likely never give up. Vegans, in other words, will routinely participate in yet another activity that harms animals when, realistically albeit very inconveniently, they could avoid it but won’t.
As a result, they will further gut the meaning of vegan from within.
In 2013-2014 I flew 35 times to locations where I preached (in part) the ecological virtues of not eating meat. Absurd, of course, that I was flying hither and yon to do this, but what if my mile-high experience had been powered by beef? Well, I’d have to be the first person to laugh my ass off at myself.
Readers, pipe up. What to do about beef-powered planes?
Fact: driving a car kills animals.
This killing is not necessarily intentional. But, because we know that killing insects, squirrels, chipmunks, deer, birds, and so on is inevitable, the killing cannot be called completely unintentional either. It is the collateral damage of getting from point A to point B, a reluctant form of animal sacrifice we allow in order to take journeys that add immensely to the quality of human life.
I have noted elsewhere that driving presents the vegan with a conundrum, and this proposition has been met with considerable resistance. So allow me to think out loud on this.
I believe driving presents a conundrum because vegans aim to avoid exploiting animals whenever they possibly can. The decisions to not eat them, wear them, or exploit them for research or entertainment offer the most obvious ways of fulfilling this larger mission. Vegans I know do these things admirably well and, without doubt, they are making the world a better place for animals.
But the avoidance of eating, wearing, or exploiting animals for research or entertainment is veganism’s low-hanging fruit. It’s relatively easy, or at least something most of us can realistically do right now and right away. The fact that only about 1 to 3 percent of Americans do it is sort of distressing, but still, it can be done with little preparation or alteration to one’s way of life.
But driving? For obvious reasons, driving is much, much harder to avoid. But let’s face it: it can be avoided. Many people, in fact, radically alter their lives to avoid driving. I can sit here and assure you that I will not do this. But, fact is, I could. Fact is, my consideration of animal welfare does not extend far enough for me to make that sacrifice. Any vegan who drives must, I would venture, have to agree with this difficult admission.
The common response to this conundrum has been to stretch the definition of veganism to include the idea of doing what’s “pragmatically possible.” Not eating animals is pragmatically possible, it is said. To stop driving is not.
This move, however, doesn’t really work, if for no other reason than the fact that “pragmatic” introduces a big gray area hiding a slippery slope. Giving up driving might not be pragmatic for you, but for the next person, giving up the chicken soup that grandma makes every Christmas Eve isn’t pragmatic, either. Being ostracized from your family over not eating a meal that is going to be made either way is not pragmatic. Pragmatism, in essence, is inherently relative. Nobody can place limits on what it is.
To the extent that driving forces vegans into a reliance on pragmatism, it forces us to acknowledge that, in reality, a less clear distinction separates the vegan from the non-vegan than is popularly thought. For example, a vegan who does not eat meat but drives every day will kill more animals than the non-vegan who never drives but eats grandma’s chicken once a year to preserve familial harmony.
That’s a tough thing to acknowledge. But we must. So, perhaps instead of thinking about the world as composed of vegans and non-vegans, we might consider thinking about the world as full of people who exist on a continuum of causing harm to animals. The closer we move toward not harming animals, the better. But the fact is, even those who aim to radically reduce their impact on animal suffering—by not eating, wearing, or exploiting animals for entertainment and research—still harm animals through decisions that they can avoid but don’t.
Trying to cover up that reality with the label “vegan” may do nothing to help the animals we harm.