Author: Andrew Sullivan
Nanny State Watch
Bloomberg’s departure as mayor hasn’t made New York nannyism any less lame:
Essentially, according to the [NYC Hospitality] Alliance, restaurateurs are prohibited from “selling, serving, delivering or offering to patrons an unlimited number of drinks during any set period of time for a fixed price.” And yes, this law also includes your favorite club’s ladies night where the drink specials let you party off all those TPS reports you filed so someone else could throw them away. Club drink specials are breaking the law too, and I’m guessing this will ruin all New Year’s Eve party planning. How else will restaurants entice us to hunker down with them for four hours? Boo.
Apparently the only drink specials that are legal are two-for-ones and discounts that aren’t more than half price. So, bring your flask to brunch, ladies, because Eater thinks a crackdown is on the horizon and shit’s about to get real.
Update from a reader:
That post is totally misleading. First of all, this isn’t a new law, it is in the news because a NYC hospitality industry group has brought attention to an existing law that isn’t being enforced, but could possibly be enforced if the city/state wanted to.
Second of all, it’s a state law through the NYS Liquor Authority (SLA), not something done at the city level like Bloomberg’s bans on large sodas. Finally, even if enforced, this puts New York State at about the middle of the pack in terms of state-level happy hour laws. According to this summary of happy hour laws (which may be somewhat dated), over half of U.S. states have some kind of ban on happy hour promotions. Of those that do, New York is among the most permissive states, with Massachusetts, Rhode Island, Tennessee, and Virginia as some of the most restrictive states.
Don’t get me wrong; enforcing the ban on unlimited drinks in NYC would totally ruin my favorite way to waste a weekend afternoon, but let’s be clear about who is really responsible for these rules and hope that they go on unenforced.
Talking Trash
In an interview, Adam Minter, who grew up working in a scrapyard, describes “a couple of messages that I really wanted to get across” in Junkyard Planet:
Number one, that the recycling business is not a niche, and not just responsible for “green products”. In the US you get greeting cards made from 50% recycled content, and I think those kinds of products tend to make people think it’s a niche, specialised industry. But it’s much bigger than that. Pretty much anything you can buy is affected in its pricing by scrap, and it quite likely has scrap in it. Every automobile, for example, is to some degree made from scrap materials, such as the engine block, if it’s made from aluminium, or the plastic of the bumpers. So I wanted people to realise that recycling is more than just a blue and green bin in their pantry.
The second message, and it’s a very personal message for me, is that there’s dignity in this kind of work. The images that we see of scrap workers, on television for instance, are often of exploited masses in China, Africa, India. There’s more to it than that. These are real, three dimensional people, and there is dignity and entrepreneurship in the work that they do. It’s often an opportunity to lift themselves up from a lower standing.
Previous Dish on Minter’s book here.
(Hat tip: The Browser)
Comical Racism, Ctd
A reader defends those critical of casting an African-American actor as a “white” superhero:
While there are no doubt some genuine racists incensed by this just because Fox has cast a black man as Johnny Storm, I suspect most of the pushback really comes down to two closely related phenomena in comics fandom. First – and this you see in pretty much all fandoms – is a major concern with things staying the way they were. Fans get attached to things and worry that minor changes are going to mean more drastic changes in the future. I would compare this, for instance, to the concern raised last year about whether the next Doctor would be black or a woman. It wasn’t so much that people were opposed to a black Doctor or woman Doctor, as that they (especially original series fans) feared that it was a sign that too much had changed since “the good old days.”
Second, comic fans have adopted something of a “once bitten, twice shy” approach to most adaptations.
With one notable exception, the movie studios tend to treat comics adaptations as mindless franchises, good only for making money off of young men and teenage boys who don’t really know any better. Even series that start out strong have quickly devolved into self-parody and incoherent silliness. While the poster boy of this trend is probably the ’90s Batman franchise (especially the execrable Batman and Robin), Fox – the studio that currently owns the rights to the Fantastic Four – has gotten a huge amount of flak from fans for its treatment of the X-Men (the third X-Men film, Last Stand, is loathed by X-Men fans, and the less said about X-Men Origins: Wolverine, the better) and for its atrocious adaptation of Alan Moore’s League of Extraordinary Gentlemen. And that’s leaving aside the fact that Fox’s earlier Fantastic Four films were completely underwhelming.
When you combine these two factors – conservatism of fandom and distrust of adaptations – it’s easy to see why comic fans are suspicious over what they see as significant changes to the characters they love. It’s telling that so much of the criticism has focused on the Johnny Storm/Sue Storm Richardson relationship. It isn’t that fans don’t understand that interracial adoption is a thing; it’s that they are seriously worried (and, honestly, have every reason to worry) that the film’s producers simply don’t care that the two are siblings, and are going to write the relationship out of the film. Or, worse, that an idiot writer will just look at the last names and think Sue and husband Reed Richards are siblings.
Other reader responses seem to bear that analysis out:
My problem isn’t with a black actor playing Johnny Storm. My problem is with Johnny Storm having a full-blooded sister who is white, which stretches credulity even for a made-up comic book world.
Another:
I wouldn’t want a white actor playing Luke Cage, just like I think George Clooney would be ill-suited to play Martin Luther King, Jr.
Another reader directs us to an essay by Hashim R. Hathaway, who identifies himself as a man of color as well as a fanboy. He writes:
Here’s why I can’t get behind Jordan as Johnny Storm: it’s not enough. In fact, it’s so “not enough” because what some, like Louis [Falcetti], would argue as a positive step only serves to highlight just how far away we are. Kate Mara was cast as Sue Storm. Are they stepbrother and sister? Are they adopted siblings? Does it matter? In one way, it shouldn’t matter at all, but in another way, the fact that it even has to be explained shows how diversity can be forced, because suddenly we have a token black guy on a team of white superheroes. … Johnny Storm being black means there’s some hitherto-unnecessary explaining to do in various other parts of the story.
The reader adds:
Hathaway basically argues for more black characters being brought to the screen, rather than changing the race of existing characters then calling it a win for diversity. Thankfully, we will be seeing The Falcon in the new Captain America movie, and Luke Cage will be getting a Netflix series soon. As a whole, Marvel has a great record on diversity, recently introducing a team of superheroes in Young Avengers who are almost all queer (but you don’t find that out until the final issue). We just need to see some of that diversity make its way to the silver screen.
Mall Of The Wild
Amy Merrick considers the ironies of Cabela’s, the big-box chain for outdoorsmen:
Today, Cabela’s has $3.5 billion in annual revenue. Its 50 stores look like enormous log cabins, and inside of them hundreds of taxidermied animals grapple in lifelike poses—grizzlies rear up, rams stand atop plaster mountains like figures on a wedding cake. Cabela’s exhibits have been described as “natural-history museums,” and its stores are billed as tourist destinations. When a Cabela’s opened this past October, in Waco, Texas, people lined up outside for hours in the rain, starting at 8:00 A.M. on the previous day.
The growth of Cabela’s reflects Americans’ odd relationship with the outdoors: we mythologize it even as we pave it over. To accommodate their bulk and the crowds that they attract, Cabela’s stores are often built next to interstates and surrounded by giant parking lots. Generally, the only wildlife in sight are the crows picking over the litter. Some of the newest branches are on the edges of cities—Denver, Austin—that epitomize sprawl. In Greenville, South Carolina, where Cabela’s plans to open on a congested retail strip in April, other retailers are worried that traffic jams will scare away their customers.
(Photo of a Cabela’s display in Reno, Nevada, by Flickr user RenoTahoe)
When A Low IQ Can Save Your Life
The Supreme Court will soon determine how mentally disabled a person must be to be considered ineligible for capital punishment:
A 2002 Supreme Court ruling already bars the execution of people with an intellectual disability. But [convicted murderer Freddie Lee] Hall’s lawyers are expected to argue that many US states assess mental ability using outdated measures that take little or no account of current scientific research on the subject. Florida, in particular, is one of 10 states in which anyone with an IQ score above a certain number, usually 70, is automatically considered to be intellectually competent and is therefore eligible for the death penalty. Psychologists contend that IQ tests are not precise enough to draw such a ‘bright line.’ Hall’s IQ scores range from 60 to 80, and many states would not consider him for the death penalty. …
There is a great deal resting on how the Supreme Court decides this case.
According to one estimate, as many as 20 percent of the more than 3,100 people on death row in the United States may have some level of intellectual disability. So a decision in Hall’s favor could lead to hundreds of appeals, says Nancy Haydt, an attorney in Santa Barbara, California, who is compiling a database of pleas relating to intellectual disabilities in death-penalty cases.
But many mental-health specialists hope that the court will rule more broadly. In briefs filed in the case in December, professional organizations, including the American Psychological Association (APA) and the American Association on Intellectual and Developmental Disabilities (AAIDD), advocated for the court to set a new legal standard that reflects current research on intelligence. IQ tests were never designed to assess the criminal mind, psychologists argue. They say that the modern definition of intelligence – which includes the ability to learn and solve problems, relate to other people and function in society – is much more relevant.
The View From Your Window
Apathetic Atheism vs New Atheism, Ctd
A reader objects to Thomas Wells:
I don’t know what kind of bubble he’s living in, but he’s completely dismissing (or outright missing) the social context of religion. Wells wrote:
I see no more reason to describe myself as an atheist, than as an afairieist, ahomeopathist, etc. To put it another way, my non-belief is apathetic: the nonexistence of God/Gods is a matter of great insignificance to me.
Well, I do see a reason: there are not billions of people who identify as members of “fairieist” or “homeopathist” institutions. Like the other examples in his list – UFOs, crop circles, ghosts – they are regarded by our society as the harmless delusions of an insignificant minority. Religion is categorically different.
I am an atheist, but religion is important to me because it is important to virtually everyone I interact with in my community; because it is important to the people who write my laws and run my country; because the faith-based method of forming beliefs and making decisions affects me and people I care about. There’s no corresponding phenomenon of astrologist families disowning their gay sons and daughters, or Santa Claus extremists flying planes into buildings. Come on.
Another adds:
The “So what?” from that Thomas Wells excerpt angered me, as the reason God’s nonexistence matters more than that of fairies or UFOs is quite clearly the fact that nobody’s trying to legislate based on the latter. But clicking through and reading Wells’ full essay reveals that he addresses that point by distinguishing secularism as a broader (and, to him, more laudable) category than atheism. So, anger abated.
But reading his piece got me thinking about when and why I moved from his casual brand of non-belief to feeling a need to call myself an atheist and to tell others (politely!). I can remember the moment clearly.
It was a December, early in the Clinton presidency, and I was watching some Xmas special broadcast from D.C. on television. Hillary Clinton was onstage, reciting from a prompter, saying something about her faith in Jesus. Her line reading was so stilted, and her “faith” so transparently part of a political as well as a literal script, I turned to my wife and said, “I’ve never believed in God, but now I’m an atheist.”
In that moment, the dispassionate silence Wells advocates and I’d always practiced struck me as deeply misguided – not because I thought it was my duty to change the mind of anyone who had beliefs I lacked, but because so many of those people took it for granted there was only one way one could or should think about God that expressing faith in Jesus had somehow become a requirement of running our country. For me, saying “I’m an atheist” is an important reminder that believing in God is not a requirement of citizenship.
But a third reader agrees with Wells’ criticism of the New Atheists and offers an analogy:
I deeply dislike soccer. I think it is boring. I think the culture is bizarre and unappealing. I don’t like it every four years when the World Cup gets everyone excited for a couple of weeks; I don’t like it when my friends play FIFA on Xbox; I don’t like it when I watch it with my Latina girlfriend and her family. I didn’t like it when I lived in Amsterdam and my friends would go out to watch Ajax matches. I have tried over and over to give it a chance.
And I just don’t care. I am “asoccer.” Maybe I will give soccer another chance. I mean, spectacle is exciting in general, and I like to think I take interest in my friends’ and family’s interests. And, getting to my point, I will definitely give theism many more chances. But I am without God. I am an atheist. I don’t care.
Mr. Wells’s article spoke to a lot of emotions I have felt about atheism/theism, and, even more so, about the New Atheists. Why do these people so intent on their own atheism spend so much time talking about God? They sound like jilted lovers dishing dirt on how they’re “better off anyway.” Get over it. Move on. These folks have chosen to define themselves in opposition to something, and in the process necessarily must legitimize the very thing they are disowning. It’s just loud madness, and it’s become the face of atheism.
I want to admit to my friends and family that I am an atheist, but the New Atheists are making it harder. They’re contrarians. They’re dickheads. Imagine if I brought up my dislike of soccer to my friends, not just every time soccer came up, but every time sports came up. After the Super Bowl. Every Wiffle ball game. Every time a coworker caught a falling pen.
I don’t know, maybe the analogy is breaking down, but I don’t understand why someone would force themselves to see the world through the very thing they detest. The great thing about not liking soccer is that I don’t have to think about soccer. The great thing about being an atheist is that I don’t have to think about God.
Another steps back:
You know, I’ve seen readers criticize the Dish with statements like “I never read your Sunday posts” because of all the God talk. They should maybe reconsider, because every Sunday you seem to have plenty of atheist talk as well. And I really appreciate both.
The Best Of The Dish Today
.@GovBrewer: veto of #SB1062 is right.
— Mitt Romney (@MittRomney) February 25, 2014
Five posts: the Tea Party tiger and its growing hunger; the terrifying campaign to find and jail gays in Uganda; felfies – farmer selfies; a Bill Kristol self-parody; and adding design to the surveillance state.
The most popular post of the day was Erick Erickson Has A Point (dissents here); followed by The Tiger Gets Hungrier.
See you in the morning.
America And The Lessons Of The 21st Century
[Re-posted from earlier today]
Whenever I read the blog-posts of Walter Russell Mead, or the fulminations of Leon Wieseltier, or the sputtering harrumphs of John McCain, or the neoconservative or liberal internationalist horrified respect for Vladimir Putin’s Great Game of geo-politics, I find myself on the other side of a vast, intellectual chasm. I’ve tried at times in my head to engage the arguments, and have consistently found myself at a loss. It’s not the arguments for this or that military or diplomatic intervention that stump me. It’s the entire premise behind them.
At times, I’ve put it down to a very different response to the Iraq War. Many of these voices decrying Obama’s restraint in the face of evil or distant conflict have not drawn the same lessons I did from that defining episode in American foreign policy in the 21st Century. I saw that war as an almost text-book refutation of the logic behind US intervention in this century. The Iraqis were not the equivalent of Poles in 1989. They were deeply conflicted about US intervention and Western liberalism and came to despise the occupying power. The US was not the exemplar of liberal democracy that it was in the Cold War. It was a belligerent state, initiating Israel-style pre-emptive wars, and using torture as its primary intelligence-gathering weapon. Its military did not defeat an enemy without firing a shot, as with the end-game of the Soviet Union; it failed to defeat an enemy while unloading every piece of military “shock and awe” upon it. A paradigm was shattered for me – and shattered by plain reality. A realist is a neocon mugged by history.
But the more I ponder this, the more I wonder if it isn’t also rooted in an entirely different response to the end of the Cold War as well. I remember vividly some early disputes at TNR as the Soviet Union collapsed. Marty, in particular, was just as vigilant toward the new Russian government as he had been toward the Communist one (and that was during the elysian years of Yeltsin’s doomed, democratic experiment). I found this utterly baffling. For me, the fight against Communism was not a fight against Russia. They were separate entities; one was a global, expansionist ideology, capable of intervening across the planet; the other was a ruined but still proud regional power. Indeed, if we were to prevent a return to Communist norms, I believed we should expect Russia to flex some muscles in its area of influence, for the Orthodox church to return as some sort of cultural unifier, for Russia to return to, well, being Russia, with some measure of self-respect. Magnanimity in victory was a Churchillian lesson I took to heart. I hoped – and hope – for more, but came to understand much more vividly what I had unforgivably forgotten after 9/11 – the perils of trying to force democratic advance in alien cultures and polities.
For me, the end of the Cold War was a blessed permission to return to “normal”. And “normal” meant a defense of national interests and no countervailing ideological crusade of the kind the Communist world demanded. In time, it seems to me that the basic and intuitive foreign policy for the US would return to what it had been before the global ideological warfare against totalitarianism from the 1940s to the 1980s. The US would become again an engaged ally, a protector of global peace, but would return to the blessed state of existing between two vast oceans and two friendly neighbors. The idea of global hegemony – so alien to the vision of the Founders – would not appeal for long, at least outside the Jacksonian South. As Islamist terror traumatized us on 9/11, however, I reverted almost reflexively to the Cold War mindset – as did large numbers of Americans. It was the rubric we understood; and defining Islamism as the new totalitarianism helped dispel what then appeared as the delusions of the 1990s, when peace and prosperity seemed to indicate an “end of history”, in Frank Fukuyama’s grossly misunderstood and still brilliantly incisive essay.
From the perspective of 2014, however, the delusions seem to have been far more profound in the first decade of the 21st Century than in the last decade of the 20th. The conflation of Islamism with Communism was far too glib – not least because the former was clearly a reactionary response to modernity, while the latter claimed to be modernity’s logical future; and the latter commanded a vast military machine, while the former had a bunch of religious nutcases with boxcutters. And the attempt to use neo-colonial military force to fight Islamism was clearly doomed to produce yet more Islamists – as the Iraq and Afghanistan interventions proved definitively. More to the point, Americans now understand this in ways that many in the elite don’t:
From 93 percent to 48 percent is quite some shift in 12 years. Sometimes, when I’m criticized for changing my mind on the wars against Islamist terror, as if I were some kind of incoherent madman, I can’t help chuckling. I can see why it might seem fickle or unprincipled or irrational if I were alone. But when there’s a stampede among Americans on exactly the same lines, I’m clearly not an outlier. People changed their minds because the facts forced them to. The polling on the Iraq war is even worse:
Again, the shifts are dramatic: from 23 percent believing the war was a mistake at the start to 53 percent ten years later. A third of Americans reversed themselves.
Now, you could argue that these results are just a function of obvious failures in war-planning, or war-fatigue, or the recession. But I think that condescends too much to American public opinion. In thinking about intervention, America no longer has some global ideological rival to counter, and in so far as America does (i.e. Islamist terrorism), there is a widespread understanding that our military attempts to stymie it have been costly failures at best and cripplingly counter-productive at worst. This is a real shift, and I can only recommend Ross Douthat’s latest blog-post that explicates it further.
What Ross is arguing is that Sam Huntington was right to see the persistence of conflict and warfare where the civilizations meet and clash:
Crises keep irrupting along the rough borders that Huntington sketched — where what he called the “Orthodox” world overlaps with the West (the Balkans, the Ukraine), along Islam’s so-called “bloody borders” (from central Africa to Central Asia), in Latin American resistance (in Venezuela’s Chavismo, Bolivia’s ethno-socialism, and the like) to North American-style neoliberalism, and to a lesser extent in the long-simmering Sino-Japanese tensions in North Asia.
But these conflicts mean less to us, because so much less is at stake than it once was. And the reason for that is that Fukuyama was also right:
Huntington’s partial vindication hasn’t actually disproven Fukuyama’s point, because all of these conflicts are still taking place in the shadow of a kind of liberal hegemony, and none of them have the kind of global relevance or ideological import that the conflicts of the 19th and 20th century did. Radical Islam is essentially an anti-modern protest, not a real alternative … China’s meritocratic-authoritarian model has a long way to go to prove itself as anything except a repressive Sino-specific kludge … Chavismo and similar experiments struggle to maintain even domestic legitimacy … and what Huntington called the Western model is still the only real aspiring world-civilization, with enemies aplenty, yes, but also influence and admirers in every corner of the globe.
Ross’ is a really elegant overview of the real, historical context for the deployment of American military and even diplomatic and economic power in this century. Military power can achieve less and, more importantly, its success or failure in any specific context matters less. Obama’s restraint is not weakness; and trying to match Putin’s militarist bluster is a mug’s game. The most important thing the president says about Iran is not the need to prevent its developing an operational nuclear weapon, but the articulation of the truth that, compared with America’s global enemies in the past, Iran is a puny power, a failing economy, and a bankrupt ideology. Yes it can still do damage, and we should do our best to restrain it as best we can. But a full-scale war to disarm it? The costs would so vastly exceed the tiny benefits that you really have to be stuck in 1984 to contemplate it. We had the first iteration of this debate in 2008 – between the 20th Century nostalgic, John McCain, and the 21st Century realist, Barack Obama. And Obama won. And won again.
That verdict has not yet changed. And for good reason. What lies ahead is simply the task of resisting some primal impulse to return to the Cold War mindset. Which is what, I’d argue, Chuck Hagel’s plans for slimming the military are really all about.
(Photos: Getty Images)