How Graphic Should War Coverage Be? Ctd

Last week we ran an Urtak poll of Dish readers regarding the thread and the questions it raises. Below are the results from the roughly 1,000 readers who responded (blue means yes, orange means no):

[Poll results screenshots]

But that support dropped a little when it came to graphic images of kids:

[Poll results screenshot]

And although readers overwhelmingly support the idea of posting graphic images, they seemed to be sensitive to the impact those images have on the small minority of readers who oppose it:

[Poll results screenshot]

Understandably, the aversion to seeing graphic images of kids was higher among readers who have children (22% aversion) compared to those who do not (14% aversion). Review all of the poll results here. Another reader continues the thread:

I am a little surprised that most of your readers either object to or insist on war photography for similar reasons – confronting the reality of a war that we take part in but don’t generally have to look at. It’s surprising to me that no one is really questioning the idea that the photography is in some way allowing us to access this reality. It’s not. It’s a picture. One of the common complaints in theories about photography is that looking at these images actually desensitizes the viewer by making them complicit with the act of photography, which is by its nature an act of non-intervention.

It makes the documenting of the event seem to be more important in some contexts than the event itself. Think of photographs of starving children, where the photographer presumably could feed the child but takes a picture instead, arguing that if the world sees the starving child, more children could be saved than the one day of extra life this photographer could provide by feeding. But it also turns the act of viewing the photograph into a feeling that you have done something important by confronting something horrible – when in fact the viewer hasn’t done anything at all, hasn’t even confronted something horrible, has just looked at a picture.

And then there is the argument that says basically the shock of seeing carnage like this wears off, and any sadness or horror the viewer feels on being confronted with the image takes the place of any action or more critical thought that might be engendered by another way of presenting the facts of war. I can’t for the life of me remember where I read it, but I think it’s true that the magazines that first published photos of starving children in the ’80s sparked a lot of donations – but not really many after that. The photograph insists on the grinding reality of what it portrays, which suggests that these horrors always exist in the world with or without our participation while also only asking of its viewers that they look at them. All you have to do is view them and you have done your part in recognizing the horrors of war. Now, pat yourself on the back and be sure you look at tomorrow’s pictures from some other bloody conflict.

I don’t really have an opinion about whether or not you post pictures. They make me sad but they don’t give me nightmares. And reading about the conflicts also makes me sad. But this isn’t a simple question of whether or not your readers have a moral obligation to see this stuff. It’s a lot more complicated than that, and being aware of those complications can help your viewers steer clear of those traps.

The Jindal Bubble Bursts

Bobby Jindal’s home-state popularity has plummeted and his tax reform plan is dead. Barro’s postmortem:

Advocating the replacement of state income taxes with sales taxes remains a cottage industry for conservative think tanks around the country. They have had a partial success this year in Kansas, where the legislature paid for a modest reduction in the state income tax by making a “temporary” sales tax increase permanent. But conservatives have had less success than they expect with this agenda because they haven’t admitted to themselves the tradeoff it involves. Sales taxes do indeed appear to be better for the economy than income taxes. But they are also much more regressive, so a shift toward sales tax buys economic growth at the expense of greater inequality. This fact, which Jindal was not prepared to rebut, was a key reason his plan died.

Chait’s view:

Jindal was attempting to enact a state-level version of the Ryan approach, but in a context that left him unable to use the Ryan-style obfuscations that are necessary to hide the fact that it’s a gigantic exercise in upward redistribution of wealth.

Silver sees Jindal’s troubles as a reminder that “it can be difficult for a candidate to serve in an executive role and to position himself for national office at the same time”:

There was a series of active governors nominated by the parties between 1988 and 2000 (Michael S. Dukakis, Bill Clinton and George W. Bush). But there have been none since. Mr. Romney was a former governor but was almost six years removed from office at the time of his nomination and talked little about his executive record in either the primaries or the general election.

As the parties have become more nationalized, demanding greater ideological fealty from their candidates, sitting governors may face an unpleasant choice: work to preserve their standing among their constituents and alienate national party leaders, or pursue a national agenda at the price of their home-state popularity.

Police Work Isn’t That Dangerous

Balko points out that “last year was the safest year for cops since the early 1960s” and that “a cop today is about as likely to be murdered on the job as someone who merely resides in about half of the country’s 75 largest cities”:

In researching my forthcoming book, I interviewed lots of police officers, police administrators, criminologists and others connected to the field of law enforcement. There was a consensus among these people that constantly telling cops how dangerous their jobs are is affecting their mindset. It reinforces the soldier mentality already relentlessly drummed into cops’ heads by politicians’ habit of declaring “war” on things. Browse the online bulletin boards at sites like PoliceOne (where users must be credentialed law enforcement to comment), and you’ll see a lot of hostility toward everyone who isn’t in law enforcement, as well as various versions of the sentiment “I’ll do whatever I need to get home safe at night.” That’s a mantra that speaks more to self-preservation than public service.

When cops are told that every day on the job could be their last, that every morning they say goodbye to their families could be the last time they see their kids, that everyone they encounter is someone who could possibly kill them, it isn’t difficult to see how they might start to see the people they serve as an enemy. Again, in truth, the average cop has no more reason to see the people he interacts with day to day as a threat to his safety than does the average resident of St. Louis or Los Angeles or Nashville, where I live.

The Other Other White Meat

Cuy Dinner

Guinea pig?

Most recent scientific work on guinea pigs and diet concerns what the animals eat (as a proxy for humans), not having them eaten (by those same humans). But research from the 1980s finds guinea pig is good for you, with more protein and less fat than flesh from pigs, cows, sheep, or chickens. Guinea pig is also good for Mother Earth—you don’t need a Ponderosa-sized spread to raise them, and they convert their feed into edible protein twice as efficiently as a cow.

Update from a reader with insights as to why the meal isn’t more popular in American restaurants:

My father is from Peru and I’ve eaten home-cooked guinea pig while up in the Andes visiting family. Unlike what you can get away with when serving chicken, guinea pig – to taste at the very least “good” – must be cooked to order. This fact would disqualify most restaurants from serving it, I assume. It’s often pan-fried whole in a chili paste/sauce in a wok-like vessel, and there is no thick portion of meat. I would not want to eat a whole guinea pig that had been sitting under a heat lamp for three hours. It’s different with a fried chicken leg or breast, right? This is probably why (at least as of 10 years ago, when I last visited) almost all restaurants in Peru won’t serve it on the menu.

(Image by Flickr user Erin & Camera)

The NRA’s Unlikely Role Model, Ctd

A reader writes:

The idea of arms being limited to “well-regulated militias” in the Constitution, as your reader claims his father taught him, shows a complete lack of understanding of both the document and the history of its writing. The (presumably sarcastic) remark that the Court has “let any unstable jerk be a militia, and let him regulate himself” is actually the entire point of the Second Amendment. The right to bear arms was an essential right in 1787 because of the risk of Indian incursion, not as a bulwark against tyranny or a defense against foreign invasion. The concept of a “well-regulated militia” as it concerned pre-Revolutionary America actually would often have been little more than a posse of all available homesteaders organized to defend the area or march on a nearby settlement.

In this context, each individual actually does represent his own militia, because he may be the only one available to defend an isolated farm. The idea that the Founders intended weapons to be limited exclusively to organized military formations is preposterous; with few exceptions, such formal organizations did not exist. In many frontier regions, there would have been no defense available to settlers if the Amendment was read as many now propose.

This absolutely does not mean that gun control is unconstitutional, however. The inclusion of the phrase “well-regulated militias” in the Amendment was not an accident, even if it is frequently misread today.

The Founders intended weapons to be readily available to the extent that Americans would be ABLE to form a “well-regulated militia” to defend against incursion. This is the appropriate reading of the Amendment. Gun control should therefore be based, as it was in 1934 and 1994, on a determination of what constitutes a necessary weapon to enable the formation of a militia. I personally think that hunting rifles and pistols probably constitute an adequate complement of weaponry for a militia (in a country with a standing army), but others may disagree. Tragedies like Newtown should be occasions to reevaluate our definitions.

Instead people on the right defend the right to bear all weapons as a defense against Obamacare or something while people on the left, like your reader, dismiss the right to bear arms entirely on the grounds that militias don’t exist anymore.

Is the Second Amendment outdated? Probably, but I don’t see a sizable movement to repeal it anywhere. Instead of talking past each other we should be debating which types of weapons are defensive and which are not. This is what Obama has tried to do in the past few months, to his credit.

Another reader:

You published correspondence from a reader who wrote: “Well, the NRA finally found a Court that was willing to ignore the word ‘militia’ and the concept of ‘well-regulated’ – overturning 230 years of jurisprudence.” This statement is way off-base. The NRA was not behind either of the cases that went to the Supreme Court resulting in the finding that the Second Amendment protects an individual right.

In fact, the NRA tried every imaginable approach to keep the case that eventually became Heller from going forward. They filed a copycat suit with other plaintiffs and named additional defendants who brought much more power and influence to the opposing side. They tried to have their suit consolidated with the original suit, with their attorney named as counsel. They also tried to get their allies in Congress to enact legislation that would moot the case. Only after those bringing the suit had prevailed at the appellate level (four years after the original filing) did they finally simply stay quiet while the rest of the litigation proceeded. Similarly, they did not finance or back the McDonald case, which applied the right established in Heller to the states.

When the Heller case was decided, it brought the Supreme Court’s jurisprudence into line with the consensus of the majority of Constitutional Law scholars, including liberal Constitutional Law scholars. Your correspondent should read the amicus brief filed by Jack Balkin of Yale Law and others – whom no one could possibly describe as anything other than liberal – in the McDonald case.

The History Of The Everyday

Francesca Mari zooms in on it:

Microhistory arose largely in Europe and the United States in the 1970s and 1980s in reaction not only to the top-down historical narratives common to political history but also to the increasingly quantitative ones of social history. Microhistorians argued that the generalizations of capital “H” Great Man History distorted the truth of how most individuals actually lower-case lived and therefore advocated telling the stories of what one practitioner called “the normal exception”: the interesting small player who could stand in for the average person and, as a result, offer a unique angle overlooked by elite texts and master narratives. Although at first a European phenomenon—perhaps best exemplified by Carlo Ginzburg’s 1976 The Cheese and the Worms: The Cosmos of a Sixteenth-Century Miller—microhistory also found its advocates on this side of the Atlantic: Robert Darnton’s The Great Cat Massacre and Other Episodes in French Cultural History, for example, and Laurel Ulrich’s Pulitzer Prize–winning A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785–1812.

Mari focuses on the work of Jill Lepore:

Although one of her essays shows how a biography won Andrew Jackson the presidency and set a presidential precedent, another shows how Thomas Paine, who motivated the American Revolution with his pamphlet Common Sense, died alone in Greenwich Village shortly after being turned away from the poll booth, his nails curled over his toes like claws. Perhaps such instructives are why Lepore doesn’t primarily write for writing’s sake in the form of finely-crafted fiction. She writes to redeem a set of truths about the past for consumption by the normal exception, an audience of ordinary lives, readers who want history that sounds like them: “Who tells the story,” Lepore says in her preface to The Story of America, “like who writes the laws and who wages the wars, is always part of that struggle”—a struggle over who’s included and who’s excluded, which is to say, over who is empowered.

When Animal Rights Goes Wrong, Ctd

Like Marc Champion, Steve Chapman argues that the ban on horse slaughterhouses has done more harm than good:

When 17 state veterinarians were polled on horse welfare, all said it’s gotten worse since the slaughter ban. According to the National Association of Counties, the number of abandoned horses has risen — just as opponents warned it would. If an owner can’t sell the horse for a decent sum and lacks the money to have it euthanized, he may leave it somewhere to meet death by starvation, disease or predators. … Rescue operations would be a more congenial answer, but they can’t do enough. They currently care for only about 6,000 horses nationwide, and most are at capacity. They couldn’t possibly accommodate the 166,000 shipped for slaughter each year. Those unwanted animals have to go somewhere.

Update from a reader who discusses a factor brought up during the Dish thread on why Americans don’t eat horse meat:

I am a small animal veterinarian who has been watching the horse slaughter debate since the US facilities were closed. One issue that people never seem to address is the fact that horses in the US are not subject to the same medication restrictions as animals raised for meat.

There are strict limitations on drugs that can be used in cattle, poultry, or small ruminants (i.e. goats) due to the fact that the drug residues in the meat end up being consumed by people. Horses exist in this strange in-between world where they are treated as companion animals while they are being used for riding, racing, etc., but then treated as meat animals when they go for slaughter. If horse slaughter advocates want to pursue re-opening US slaughter facilities, they ought to also advocate strict limitations on what medications horses can be given to protect the humans or zoo animals who eventually eat those horses – which in turn may have huge quality of life impacts on those horses treated as companions. Thanks for bringing attention to the issue!

The End Of The Gallery Show?

Jerry Saltz recently sounded a death knell:

These days, the art world is large and spread out, happening everywhere at once. A shrinking fraction of galleries’ business is done when collectors come to a show. Selling happens year-round, at art fairs, auctions, biennials, and big exhibitions, as well as online via JPEG files and even via collector apps. Gallery shows are now just another cog in the global wheel. Many dealers admit that some of their collectors never set foot in their actual physical spaces.

The beloved linchpin of my viewing life is playing a diminished role in the life of art. And I fear that my knowledge of art—and along with it the self-knowledge that comes from looking at art—is shrinking.

Artists and dealers are as passionate as ever about creating good shows, but fewer and fewer people are actually seeing them. Chelsea galleries used to hum with activity; now they’re often eerily empty. Sometimes I’m nearly alone. Even on some weekends, galleries are quiet, and that’s never been true in my 30 years here. (There are exceptions, such as Gagosian’s current blockbuster Basquiat survey.) Fewer ideas are being exchanged, fewer aesthetic arguments initiated. I can’t turn to the woman next to me and ask what she thinks, because there’s nobody there.

Sarah Nardi nods:

Warhol said that an artist is someone who produces things that people don’t need to have. That’s hard to argue with. I need water. I need to exhale after I inhale. But I don’t need to have a painting on my wall. No one does. But a sense of connection, I believe, is something we all need.

And art serves as a way to form those connections—to ourselves, to each other, and to the world. We need galleries for that. We need thoughtfully curated shows, the juxtaposition of wildly different points of view, and—most importantly—access to living, breathing, working artists. We need to see them standing next to their work, enjoying a well-deserved glass of wine.

So let the champagne crowd toss back their bubbly and move on. And let the faceless collectors appraise pixelated shadows projected on the virtual wall. Galleries serve a larger purpose than sales. Beyond their role as a place where work is sold, they are physical spaces that help foster a culture of real, meaningful connection. They are, as Saltz writes, “social spaces, collective seances, and campfires where anyone can gather.”

(Photo by Celine Nadeau)

Some Like It Really Really Hot


There’s an arms race afoot to grow the hottest pepper in the world. Lessley Anderson tasted a top contender, the Moruga Scorpion:

[U]pon popping the Scorpion into my mouth, the tip of my tongue feels like it’s being jabbed by a hundred needles and there’s a heavy burn rolling toward my tonsils. My salivary glands are in overdrive, drool gushes into my mouth and my nose is running. This all from eating a piece the size of a sesame seed.

The Moruga Scorpion, in fact, is at the same Scoville heat unit (SHU) level as police-grade pepper spray. Yet the market is hungry for the superhots, “with hot sauce just behind social gaming and solar panel manufacturing as one of our fastest growing industries”:

There’s a serious commercial advantage to being the official grower of the official hottest pepper in the world. Superhot seeds aren’t commercially available from large seed companies, so heat freaks wishing to grow their own have to buy them online from small suppliers like Duffy and Currie. Being able to market yourself as the record holder is great advertising. Hot sauce makers, who generally contract with one main grower, sell more sauce with a world-famous chile on the label.

“The Superhot peppers are an extremely valuable commodity,” says Dave DeWitt, an author and chile expert who runs the industry’s biggest event, The National Fiery Foods and Barbecue Show in Albuquerque, New Mexico. A typical Scorpion pepper pod at a farmers’ market will go for one dollar, notes DeWitt. “Think if you had an acre of these things; think how much money you could generate. Behind marijuana, they have the potential to become the second- or third-highest yielding crop per acre monetarily.”

For all of the superhot lovers out there, DeWitt has 1,001 Best Hot and Spicy Recipes. (And speaking of marijuana, he also put out Growing Medical Marijuana: Securely and Legally.)

(Photo by edenpictures)

Thatcher On AIDS: No Reagan

On a topic I touched upon here, Harold Pollack, not a big fan of the Tory prime minister, concedes that “Thatcher-era British policies provided a damning contrast to the Reagan and George H. W. Bush administrations”:

The Thatcher government responded rather effectively and humanely to the HIV/AIDS crisis. Embracing harm reduction measures such as syringe exchange and methadone maintenance, it saved thousands of lives. Indeed the words “harm reduction,” anathema to American drug control policy until the Obama administration, were official watchwords of British drug policy. As Alex Wodak and Leah McLeod summarize this history:

By 1986 the Scottish Home and Health Department concluded that ‘the gravity of the problem is such that on balance the containment of the spread of the virus is a higher priority in management than the prevention of drug misuse,’ and recommended accordingly that ‘on balance, the prevention of spread should take priority over any perceived risk of increased drug use.’ This approach was strengthened by the influential UK Advisory Committee on the Misuse of Drugs asserting in 1988 that ‘the spread of HIV is a greater danger to individual and public health than drug misuse…accordingly, services that aim to minimize HIV risk behaviour by all available means should take precedence in development plans.’