Comical Racism, Ctd

A reader defends those critical of casting an African-American actor as a “white” superhero:

While there are no doubt some genuine racists incensed by this just because Fox has cast a black man as Johnny Storm, I suspect most of the pushback really comes down to two closely related phenomena in comics fandom. First – and this you see in pretty much all fandoms – is a major concern with things staying the way they were. Fans get attached to things and worry that minor changes are going to mean more drastic changes in the future. I would compare this, for instance, to the concern raised last year about whether the next Doctor would be black or a woman. It wasn’t so much that people were opposed to a black Doctor or a woman Doctor, as that they (especially original series fans) feared that it was a sign that too much had changed since “the good old days.”

Second, comic fans have adopted something of a “once bitten, twice shy” approach to most adaptations.

With one notable exception, the movie studios tend to treat comics adaptations as mindless franchises, good only for making money off of young men and teenage boys who don’t really know any better. Even series that start out strong have quickly devolved into self-parody and incoherent silliness. While the poster boy of this trend is probably the ’90s Batman franchise (especially the execrable Batman and Robin), Fox – the studio that currently owns the rights to the Fantastic Four – has gotten a huge amount of flak from fans for its treatment of the X-Men (the third X-Men film, Last Stand, is loathed by X-Men fans, and the less said about X-Men Origins: Wolverine, the better) and for its atrocious adaptation of Alan Moore’s League of Extraordinary Gentlemen. And that’s leaving aside the fact that Fox’s earlier Fantastic Four films were completely underwhelming.

When you combine these two factors – conservatism of fandom and distrust of adaptations – it’s easy to see why comic fans are suspicious over what they see as significant changes to the characters they love. It’s telling that so much of the criticism has focused on the Johnny Storm/Sue Storm Richards relationship. It isn’t that fans don’t understand that interracial adoption is a thing; it’s that they are seriously worried (and, honestly, have every reason to worry) that the film’s producers simply don’t care that the two are siblings, and are going to write the relationship out of the film. Or, worse, that an idiot writer will just look at the last names and think Sue and husband Reed Richards are siblings.

Other reader responses seem to bear that analysis out:

My problem isn’t with a black actor playing Johnny Storm. My problem is with Johnny Storm having a full-blooded sister who is white, which stretches credulity even for a made-up comic book world.

Another:

I wouldn’t want a white actor playing Luke Cage, just like I think George Clooney would be ill-suited to play Martin Luther King, Jr.

Another reader directs us to an essay by Hashim R. Hathaway, who identifies himself as a man of color as well as a fanboy. He writes:

Here’s why I can’t get behind Jordan as Johnny Storm: it’s not enough. In fact, it’s so “not enough” because what some, like Louis [Falcetti], would argue as a positive step only serves to highlight just how far away we are. Kate Mara was cast as Sue Storm. Are they stepbrother and sister? Are they adopted siblings? Does it matter? In one way, it shouldn’t matter at all, but in another way, the fact that it even has to be explained shows how diversity can be forced, because suddenly we have a token black guy on a team of white superheroes. … Johnny Storm being black means there’s some hitherto-unnecessary explaining to do in various other parts of the story.

The reader adds:

Hathaway basically argues for more black characters being brought to the screen, rather than changing the race of existing characters then calling it a win for diversity. Thankfully, we will be seeing The Falcon in the new Captain America movie, and Luke Cage will be getting a Netflix series soon. As a whole, Marvel has a great record on diversity, recently introducing a team of superheroes in Young Avengers who are almost all queer (but you don’t find that out until the final issue). We just need to see some of that diversity make its way to the silver screen.

Mall Of The Wild

Amy Merrick considers the ironies of Cabela’s, the big-box chain for outdoorsmen:

Today, Cabela’s has $3.5 billion in annual revenue. Its 50 stores look like enormous log cabins, and inside of them hundreds of taxidermied animals grapple in lifelike poses—grizzlies rear up, rams stand atop plaster mountains like figures on a wedding cake. Cabela’s exhibits have been described as “natural-history museums,” and its stores are billed as tourist destinations. When a Cabela’s opened this past October, in Waco, Texas, people lined up outside for hours in the rain, starting at 8:00 A.M. on the previous day.

The growth of Cabela’s reflects Americans’ odd relationship with the outdoors: we mythologize it even as we pave it over. To accommodate their bulk and the crowds that they attract, Cabela’s stores are often built next to interstates and surrounded by giant parking lots. Generally, the only wildlife in sight are the crows picking over the litter. Some of the newest branches are on the edges of cities—Denver, Austin—that epitomize sprawl. In Greenville, South Carolina, where Cabela’s plans to open on a congested retail strip in April, other retailers are worried that traffic jams will scare away their customers.

(Photo of a Cabela’s display in Reno, Nevada, by Flickr user RenoTahoe)

When A Low IQ Can Save Your Life

The Supreme Court will soon determine how mentally disabled a person must be to be considered ineligible for capital punishment:

A 2002 Supreme Court ruling already bars the execution of people with an intellectual disability. But [convicted murderer Freddie Lee] Hall’s lawyers are expected to argue that many US states assess mental ability using outdated measures that take little or no account of current scientific research on the subject. Florida, in particular, is one of 10 states in which anyone with an IQ score above a certain number, usually 70, is automatically considered to be intellectually competent and is therefore eligible for the death penalty. Psychologists contend that IQ tests are not precise enough to draw such a ‘bright line.’ Hall’s IQ scores range from 60 to 80, and many states would not consider him for the death penalty. …

There is a great deal resting on how the Supreme Court decides this case.

According to one estimate, as many as 20 percent of the more than 3,100 people on death row in the United States may have some level of intellectual disability. So a decision in Hall’s favor could lead to hundreds of appeals, says Nancy Haydt, an attorney in Santa Barbara, California, who is compiling a database of pleas relating to intellectual disabilities in death-penalty cases.

But many mental-health specialists hope that the court will rule more broadly. In briefs filed in the case in December, professional organizations, including the American Psychological Association (APA) and the American Association on Intellectual and Developmental Disabilities (AAIDD), advocated for the court to set a new legal standard that reflects current research on intelligence. IQ tests were never designed to assess the criminal mind, psychologists argue. They say that the modern definition of intelligence – which includes the ability to learn and solve problems, relate to other people and function in society – is much more relevant.

Apathetic Atheism vs New Atheism, Ctd

A reader objects to Thomas Wells:

I don’t know what kind of bubble he’s living in, but he’s completely dismissing (or outright missing) the social context of religion. Wells wrote:

I see no more reason to describe myself as an atheist, than as an afairieist, ahomeopathist, etc. To put it another way, my non-belief is apathetic: the nonexistence of God/Gods is a matter of great insignificance to me.

Well, I do see a reason: there are not billions of people who identify as members of “fairieist” or “homeopathist” institutions. Like the other examples in his list – UFOs, crop circles, ghosts – they are regarded by our society as the harmless delusions of an insignificant minority. Religion is categorically different.

I am an atheist, but religion is important to me because it is important to virtually everyone I interact with in my community; because it is important to the people who write my laws and run my country; because the faith-based method of forming beliefs and making decisions affects me and people I care about. There’s no corresponding phenomenon of astrologist families disowning their gay sons and daughters, or Santa Claus extremists flying planes into buildings. Come on.

Another adds:

The “So what?” from that Thomas Wells excerpt angered me, as the reason God’s nonexistence matters more than that of fairies or UFOs is quite clearly the fact that nobody’s trying to legislate based on the latter. But clicking through and reading Wells’ full essay reveals that he addresses that point by distinguishing secularism as a broader (and, to him, more laudable) category than atheism. So, anger abated.

But reading his piece got me thinking about when and why I moved from his casual brand of non-belief to feeling a need to call myself an atheist and to tell others (politely!). I can remember the moment clearly.

It was a December, early in the Clinton presidency, and I was watching some Xmas special broadcast from D.C. on television. Hillary Clinton was onstage, reciting from a prompter, saying something about her faith in Jesus. Her line reading was so stilted, and her “faith” so transparently part of a political as well as a literal script, I turned to my wife and said, “I’ve never believed in God, but now I’m an atheist.”

In that moment, the dispassionate silence Wells advocates and I’d always practiced struck me as deeply misguided – not because I thought it was my duty to change the mind of anyone who had beliefs I lacked, but because so many of those people took it for granted there was only one way one could or should think about God that expressing faith in Jesus had somehow become a requirement of running our country. For me, saying “I’m an atheist” is an important reminder that believing in God is not a requirement of citizenship.

But a third reader agrees with Wells’ criticism of the New Atheists and offers an analogy:

I deeply dislike soccer. I think it is boring. I think the culture is bizarre and unappealing. I don’t like it every four years when the World Cup gets everyone excited for a couple of weeks; I don’t like it when my friends play FIFA on Xbox; I don’t like it when I watch it with my Latina girlfriend and her family. I didn’t like it when I lived in Amsterdam and my friends would go out to watch Ajax matches. I have tried over and over to give it a chance.

And I just don’t care. I am “asoccer.” Maybe I will give soccer another chance. I mean, spectacle is exciting in general, and I like to think I take interest in my friends’ and family’s interests. And, getting to my point, I will definitely give theism many more chances. But I am without God. I am an atheist. I don’t care.

Mr. Wells’s article spoke to a lot of emotions I have felt about atheism/theism, and, even more so, about the New Atheists. Why do these people so intent on their own atheism spend so much time talking about God? They sound like jilted lovers dishing dirt on how they’re “better off anyway.” Get over it. Move on. These folks have chosen to define themselves in opposition to something, and in the process necessarily must legitimize the very thing they are disowning. It’s just loud madness, and it’s become the face of atheism.

I want to admit to my friends and family that I am an atheist, but the New Atheists are making it harder. They’re contrarians. They’re dickheads. Imagine if I brought up my dislike of soccer to my friends, not just every time soccer came up, but every time sports came up. After the Super Bowl. Every Wiffle ball game. Every time a coworker caught a falling pen.

I don’t know, maybe the analogy is breaking down, but I don’t understand why someone would force themselves to see the world through the very thing they detest. The great thing about not liking soccer is that I don’t have to think about soccer. The great thing about being an atheist is that I don’t have to think about God.

Another steps back:

You know, I’ve seen readers criticize the Dish with statements like “I never read your Sunday posts” because of all the God talk. They should maybe reconsider, because every Sunday you seem to have plenty of atheist talk as well. And I really appreciate both.

The Best Of The Dish Today

Five posts: the Tea Party tiger and its growing hunger; the terrifying campaign to find and jail gays in Uganda; felfies – farmer selfies; a Bill Kristol self-parody; and adding design to the surveillance state.

The most popular post of the day was Erick Erickson Has A Point (dissents here); followed by The Tiger Gets Hungrier.

See you in the morning.

America And The Lessons Of The 21st Century

[Re-posted from earlier today]

Whenever I read the blog-posts of Walter Russell Mead, or the fulminations of Leon Wieseltier, or the sputtering harrumphs of John McCain, or the neoconservative or liberal internationalist horrified respect for Vladimir Putin’s Great Game of geo-politics, I find myself on the other side of a vast, intellectual chasm. I’ve tried at times in my head to engage the arguments, and have consistently found myself at a loss. It’s not the arguments for this or that military or diplomatic intervention that stump me. It’s the entire premise behind them.

At times, I’ve put it down to a very different response to the Iraq War. Many of these voices decrying Obama’s restraint in the face of evil or distant conflict have not drawn the same lessons I did from that defining episode in American foreign policy in the 21st Century. I saw that war as an almost text-book refutation of the logic behind US intervention in this century. The Iraqis were not the equivalent of Poles in 1989. They were deeply conflicted about US intervention and Western liberalism and came to despise the occupying power. The US was not the exemplar of liberal democracy that it was in the Cold War. It was a belligerent state, initiating Israel-style pre-emptive wars, and using torture as its primary intelligence-gathering weapon. Its military did not defeat an enemy without firing a shot, as with the end-game of the Soviet Union; it failed to defeat an enemy while unloading every piece of military “shock and awe” upon it. A paradigm was shattered for me – and shattered by plain reality. A realist is a neocon mugged by history.

But the more I ponder this, the more I wonder if it isn’t also rooted in an entirely different response to the end of the Cold War as well. I remember vividly some early disputes at TNR as the Soviet Union collapsed. Marty, in particular, was just as vigilant toward the new Russian government as he had been toward the Communist one (and that was during the elysian years of Yeltsin’s doomed, democratic experiment). I found this utterly baffling. For me, the fight against Communism was not a fight against Russia. They were separate entities; one was a global, expansionist ideology, capable of intervening across the planet; the other was a ruined but still proud regional power. Indeed, if we were to prevent a return to Communist norms, I believed we should expect Russia to flex some muscles in its area of influence, for the Orthodox church to return as some sort of cultural unifier, for Russia to return to, well, being Russia, with some measure of self-respect. Magnanimity in victory was a Churchillian lesson I took to heart. I hoped – and hope – for more, but came to understand much more vividly what I had unforgivably forgotten after 9/11 – the perils of trying to force democratic advance in alien cultures and polities.

For me, the end of the Cold War was a blessed permission to return to “normal”. And “normal” meant a defense of national interests and no countervailing ideological crusade of the kind the Communist world demanded. In time, it seemed to me, the basic and intuitive foreign policy for the US would return to what it had been before the global ideological warfare against totalitarianism from the 1940s to the 1980s. The US would become again an engaged ally, a protector of global peace, but would return to the blessed state of existing between two vast oceans and two friendly neighbors. The idea of global hegemony – so alien to the vision of the Founders – would not appeal for long, at least outside the Jacksonian South. As Islamist terror traumatized us on 9/11, however, I reverted almost reflexively to the Cold War mindset – as did large numbers of Americans. It was the rubric we understood; and defining Islamism as the new totalitarianism helped dispel what then appeared as the delusions of the 1990s, when peace and prosperity seemed to indicate an “end of history”, in Frank Fukuyama’s grossly misunderstood and still brilliantly incisive essay.

From the perspective of 2014, however, the delusions seem to have been far more profound in the first decade of the 21st Century than in the last decade of the 20th. The conflation of Islamism with Communism was far too glib – not least because the former was clearly a reactionary response to modernity, while the latter claimed to be modernity’s logical future; and the latter commanded a vast military machine, while the former had a bunch of religious nutcases with boxcutters. And the attempt to use neo-colonial military force to fight Islamism was clearly doomed to produce yet more Islamists – as the Iraq and Afghanistan interventions proved definitively. More to the point, Americans now understand this in ways that many in the elite don’t:

[Chart: polling on US military intervention abroad, 2002–2014]

From 93 percent to 48 percent is quite some shift in 12 years. Sometimes, when I’m criticized for changing my mind on the wars against Islamist terror, as if I were some kind of incoherent madman, I can’t help chuckling. I can see why it might seem fickle or unprincipled or irrational if I were alone. But when there’s a stampede among Americans on exactly the same lines, I’m clearly not an outlier. People changed their minds because the facts forced them to. The polling on the Iraq war is even worse:

[Chart: polling on the Iraq War]

Again, the shifts are dramatic: from 23 percent believing the war was a mistake at the start to 53 percent ten years later. A third of Americans reversed themselves.

Now, you could argue that these results are just a function of obvious failures in war-planning, or war-fatigue, or the recession. But I think that condescends too much to American public opinion. In thinking about intervention, America no longer has some global ideological rival to counter, and in so far as America does (i.e. Islamist terrorism), there is a widespread understanding that our military attempts to stymie it have been costly failures at best and cripplingly counter-productive at worst. This is a real shift, and I can only recommend Ross Douthat’s latest blog-post that explicates it further.

What Ross is arguing is that Sam Huntington was right to see the persistence of conflict and warfare where the civilizations meet and clash:

Crises keep irrupting along the rough borders that Huntington sketched — where what he called the “Orthodox” world overlaps with the West (the Balkans, the Ukraine), along Islam’s so-called “bloody borders” (from central Africa to Central Asia), in Latin American resistance (in Venezuela’s Chavismo, Bolivia’s ethno-socialism, and the like) to North American-style neoliberalism, and to a lesser extent in the long-simmering Sino-Japanese tensions in North Asia.

But these conflicts mean less to us, because so much less is at stake than it once was. And the reason for that is that Fukuyama was also right:

Huntington’s partial vindication hasn’t actually disproven Fukuyama’s point, because all of these conflicts are still taking place in the shadow of a kind of liberal hegemony, and none of them have the kind of global relevance or ideological import that the conflicts of the 19th and 20th century did. Radical Islam is essentially an anti-modern protest, not a real alternative … China’s meritocratic-authoritarian model has a long way to go to prove itself as anything except a repressive Sino-specific kludge … Chavismo and similar experiments struggle to maintain even domestic legitimacy … and what Huntington called the Western model is still the only real aspiring world-civilization, with enemies aplenty, yes, but also influence and admirers in every corner of the globe.

Ross’ is a really elegant overview of the real, historical context for the deployment of American military and even diplomatic and economic power in this century. Military power can achieve less and, more importantly, its success or failure in any specific context matters less. Obama’s restraint is not weakness; and trying to match Putin’s militarist bluster is a mug’s game. The most important thing the president says about Iran is not the need to prevent its developing an operational nuclear weapon, but the articulation of the truth that, compared with America’s global enemies in the past, Iran is a puny power, a failing economy, and a bankrupt ideology. Yes, it can still do damage, and we should restrain it as best we can. But a full-scale war to disarm it? The costs would so vastly exceed the tiny benefits that you really have to be stuck in 1984 to contemplate it. We had the first iteration of this debate in 2008 – between the 20th Century nostalgic, John McCain, and the 21st Century realist, Barack Obama. And Obama won. And won again.

That verdict has not yet changed. And for good reason. What lies ahead is simply the task of resisting some primal impulse to return to the Cold War mindset. Which is what, I’d argue, Chuck Hagel’s plans for slimming the military are really all about.

(Photos: Getty Images)

Down On Dead Poets Society

Kevin J.H. Dettmar loathes the film. As an English professor, he’s especially unimpressed with the antics of Mr. Keating, the boisterous poetry teacher played by Robin Williams:

For all his talk about students “finding their own voice,” however, Keating actually allows his students very little opportunity for original thought. It’s a freedom that’s often preached but never realized. A graphic example is presented in one of the film’s iconic moments, when that zany Mr. Keating with his “unorthodox” teaching methods suddenly leaps up onto his desk. Why?

“I stand on my desk to remind myself that we must constantly look at things in a different way,” he helpfully declaims. How bold: He’s standing perhaps 2½ feet off the ground. Ralph Waldo Emerson, in his essay “Nature,” had made the same point rather more radically, suggesting that one “Turn the eyes upside down, by looking at the landscape through your legs.”

Keating then has the boys march up to the front, of course, and one-by-one and two-by-two they mount his desk and they too “look at things in a different way”—exactly the different way that he has. After each has experienced this “small alteration in [his] local position” (Emerson), he steps or leaps off the desk, as if a lemming off a cliff: Keating’s warning, “Don’t just walk off the edge like lemmings!,” unfortunately only serves to underscore the horrible irony of this unintended dramatic metaphor. Even when the students reprise this desktop posture at the film’s close, in a gesture of schoolboy disobedience (or perhaps obedience to Keating), we realize that while the boys are marching to the beat of a different drum, it’s Keating’s drum. Or they’re dancing to his pipes.

Jason Bailey sighs:

[Dettmar’s] catalog of woes and complaints reads less like criticism and more like a therapy session. “I think I hate Dead Poets Society for the same reason that Robyn, a physician assistant, hates House,” Dettmar writes, “because its portrayal of my profession is both misleading and deeply seductive.” Ah, yes, a movie that gets someone’s profession wrong, how novel. They say that you can’t watch a movie that was filmed in your house because you’ll never be able to get past how they changed the wallpaper; likewise, every time a film of note is set within a particular profession, out come the armchair experts to tell you how they get it all wrong, as if that actually makes any difference to the casual viewer, merely looking for a good story or an emotional experience.

The point is, it doesn’t matter one iota to the general movie-going public if Citizen Kane gets the newspaper world right, or L.A. Confidential is an accurate representation of police work, or Tootsie nails the day-to-day workings of a daytime drama. And though it might matter very much to (respectively) a newspaper editor, a cop, or a soap actor, that doesn’t mean we have to listen to them, and when Dettmar bemoans Dead Poets’ lack of representation for “the thrilling intellectual work of real analysis,” you just want to gently take him aside and explain to him how drama and movies work.

Update from a reader:

I think Dettmar misses the mark in his critique of Dead Poets Society. The Mr. Keating character is taking his students through a process of changing their conformist ways. They are students of some wealth and social stature (mostly) in a 1959 boarding school where their careers are already decided for them – at least that is what the movie conveys. This is set up to be as conformist a setting as can be for the student characters. Now, the scene Dettmar laments is where everyone changes their current perspective to Keating’s on the desk. So what? It’s just one step along the process of breaking a mold. Why does Dettmar not point to a later scene when the students are out walking in the courtyard? In that scene they all fall in step while walking, and Keating forces them to each walk their own way. One character, Dalton, takes it upon himself to take the point further and not walk at all (YouTube link here). This, to me, is the scene where Keating really gives his students the forum for “finding their own voice,” as Dettmar put it, and not just repeating his voice. All part of the process.

The Facebook Empire

Matt Buchanan wonders if it can hold together:

When you first sign up for WhatsApp, you’re informed about the pricing model: the first year of service is free; each year after that costs a dollar. A post offering an explanation of why the company charges a nominal fee instead of showing advertising or requiring you to hand over basic demographic information reads like an anti-Facebook manifesto, beginning with a quote from “Fight Club”: “Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need.” The post continues, “These days companies know literally everything about you, your friends, your interests, and they use it all to sell ads”—an apt description of WhatsApp’s new owner. After a few more cutting remarks about advertising, such as “no one jumps up from a nap and runs to see an advertisement,” the page concludes, “Your data isn’t even in the picture. We are simply not interested in any of it.”

Facebook, despite being driven primarily by demographics and advertising, is adamant that it won’t change WhatsApp, an indication of just how overarching the company wants to be: in the pursuit of its next billion users, it is now willing to tolerate a highly discordant new product, like a vast empire that contains many competing nations. Empires always fall; the question now is how big Facebook’s can get before it does.

Earlier Dish on the WhatsApp purchase here and here.

Faces Of The Day

Guinness Book of World Records, Largest Collection of Video Game Memorabilia

Brett Martin exhibits a pair of Mario retail displays in his “Video Game Memorabilia Museum” at his home in Littleton, Colorado on February 25, 2014. His collection, with 8,030 pieces, has been recognized by the Guinness Book of World Records as the largest collection of video game memorabilia. Martin says half of his collection is connected to the Mario game series. By Craig F. Walker/The Denver Post/Getty Images.