Louis C.K.’s “Great Mystery”

by Matthew Sitman

In an NPR interview last week, Louis C.K. unpacked the comments he made during his opening monologue when he hosted SNL in March, critical of some atheists’ certainty that God doesn’t exist:

[S]omething I’ve learned over the years is that when you talk about religion, you want to talk to religious people. Even if you’re talking about something that’s contrary religiously or provocative, a religious audience is a better audience for that. If you talk to a bunch of cool atheists in leather and suede, you know, sucking on their vape sticks or whatever they’re doing, they’re not going to get it because they don’t even think about God. It’s not even on their radar, you know? So they’re – but if you tell religious people, I don’t know if there’s a God, I don’t think there’s a heaven, where’s God’s ex-wife, these things, they have a connection to it that means something. …

I feel like the math in my head tells me that we’re just – that everything is just science and randomness and patterns but the main thing I feel is that it’s a great mystery. I feel like I need to be humbled before the mysteries of life. I have no idea what’s caused all of this.

In response, Chris Stedman questions C.K.’s understanding of what atheism entails:

This isn’t the first time C.K. has offered confusing statements about atheism; in a 2011 Reddit Q&A he said both “I’m not an atheist” and “I don’t ‘Believe in god,’” and suggested that he does not consider himself an atheist because he doesn’t know for sure that there is no God.

It seems that a lot of this confusion boils down to differing definitions of atheism. If atheism means knowing beyond a shadow of a doubt that there are no gods, then C.K. is right. But that definition of atheism doesn’t fit most of the atheists I know. In fact, it runs up against something many atheists value: Doubt.

As an atheist, I never want to be too certain about what I believe. I strive to continually test and retest my assumptions, comparing them against new information and data as I encounter it. My atheism is curious, reflecting both a willingness to be wrong and a constant desire to learn.

So let’s clear the air: Being an atheist does not require absolute certainty. It doesn’t mean you rule out the possibility of divine or supernatural entities existing. Instead, it is the position that such a possibility is unlikely, and that the case for God hasn’t been adequately made yet.

Do Religious “Mutts” Miss Out?

by Jessie Roberts

In an interview about his new novel, To Rise Again at a Decent Hour, Joshua Ferris suggests they do. He reflects on how an indistinct religious upbringing shaped his writing:

[G]rowing up I looked in on Catholicism as a non-Catholic, and I looked in on Judaism as a non-Jew. I was an outsider, this mutt-y white kid who had no tradition or belief. I wanted a religious community for myself, probably because I didn’t have one. If I’d had one, I probably would have spurned it.

To Rise Again at a Decent Hour starts from the question of whether there’s a kind of private language and intimacy to religion that the mutt-y white guys like me are missing out on. And to some extent, I’m also thinking about the question of whether as a writer there’s something I’ve missed out on. When you’re an American novelist in 2014, at a point when Philip Roth has had a kind of apotheosis—has ascended to heaven even though he’s still on earth—you realize the extraordinary richness he found in Judaism. I didn’t grow up within that richness. I simply didn’t have it. It cuts both ways, of course. There are writers who happen to be Jewish who get labeled as “Jewish writers” and would much rather be just writers. And here I am, lamenting the fact that I’m not a Jew! But religion offers a writer a tradition both to be nurtured in and to fight against, and that nurturing and that conflict can produce great literature. Roth was given a lifetime of material from the fights he picked with Judaism—with the generation of Jews that raised him, with the generation that excoriated him, and finally with the generation that celebrated him. Whereas I got a few potluck dinners and some basement training in Noah.

In another interview, Ferris expands on his attitudes toward religion:

Is there a God-shaped hole in your own life?

Yes. The general impression of Americans is that we’re all believers. But there are quite a few of us, obviously, across the vast swath of this crazy country who don’t have a God. I think of myself, like my character in the book, as a “non-practising atheist”. At the witching hour there is a hell of a lot to say for divine comfort. I feel excluded from that. There are some aspects of religion – its community, its certainty – that I long for.

Part of your plot takes the character back, through a faked online identity, to the Old Testament. What was the experience of immersing yourself in those verses?

Well, I gained a lot of respect for the Bible. I never knew it first-hand before. The best stories there remove all inessentials, and what you’re left with is something extremely efficient. It’s almost like a divinely inspired Hemingway writing in those parts.

Listen to another recent interview with Ferris here.

Should Christians Ditch The Devil?

by Matthew Sitman

Timothy Tutt makes the case for doing so, arguing that Satan is “clearly a theological construct found in many cultures and dolled up differently over time”:

In the earliest traditions of Hebrew scripture, both good and evil were God’s domain. Satan appeared in the books of Job and Zechariah. At first, he worked for God. In the Book of Job, the devil functions as something of a chief-of-staff, checking up on God’s lower-level employee. (Some readers may be confused by this and think of the devil as a tricky tempter right from the get-go, as in the Garden of Eden. Remember, even though the Book of Genesis is printed first in the Bible, it was not written first.)

As centuries went by, Satan became testier and more independent. He began to oppose God with accusations that tempt humans. The devil moved from God’s employee to less-than-loyal opposition. This move was, in part, because Persian thinking seeped into Judaism. For roughly four centuries (from about 700–300 BCE), Persia ruled a huge chunk of land from the Indus River to Greece, including what is now Palestine and Israel. Even after the Persian Empire declined, its thinking remained. Persian philosophy saw the world as a struggle between good and bad.

By the time that Christianity grew out of Judaism, the devil was a full-fledged bad boy, the enemy of God and humans alike.

In a follow-up article, Tutt finds belief in a literal, personal Satan doesn’t fit with his understanding of Christianity:

Satan works well if you’re into fear and punishment. But that’s not what Christianity is about.

Christianity is about grace and love. And grace and love are like poker. They require taking risks and gambling.

If someone hits you on one cheek, let him or her hit you on the other, Jesus said. In a religion of rules, the cheek-slapper would be punished. In a faith of grace, the cheek-slapper just might be so overwhelmed by your gentleness that he or she gives up cheek-slapping. (Of course, that might not happen. That’s the risky grace of the gospel. It’s a lot like holding a straight in your hands and hoping your opponent only has three of a kind.) …

Let me be clear: Evil exists in the world. We must wrestle with that. But old constructs need to be tossed. The Christian church clung to a flat-earth cosmology far too long. Preachers used the Bible to defend slavery. Luther and then other Reformers changed theological views of communion from transubstantiation to consubstantiation to symbolism. Each move required risk.

Recent Dish on the devil here.

How Not To Read The Bible

by Matthew Sitman

From ages 17 to 23, Jessica Misener was a born-again Christian. And then she went to graduate school at Yale, learned a bit of Hebrew and Greek, delved into studying Scripture, and eventually lost her faith, which “hinged almost solely on believing the Bible to be the literal, inspired word of God”:

More and more, I realized that the Bible was a flawed, messy, deeply human book — and that in treating it as an unimpeachable guidebook for life in the 21st century, many conservative Christians were basing their entire worldviews on a text that, in my opinion, wasn’t that much different from any other historical collection of letters and stories. I was forced to confront the fact that I’d converted into a pre-fab worldview: one hatched largely in recent American history from Jonathan Edwards and the theology of the Great Awakening, and one that “family values” politics has buoyed through modern decades.

This was something the evangelical students in my program at Yale talked about often: the behemoth of doubt that sets in as your airtight hermeneutic of scripture is drained from the bottom. Christians from other traditions didn’t have it so bad…We evangelicals, with our infallible view of scripture ripped from our hands, were left gasping for air. If you crumple and toss out a literal reading of the Bible, then what does it mean to talk about Jesus literally dying for your sins?

There are places in Misener’s essay that elicit empathy and interest, especially her descriptions of what faith did in her life – how she liked who she was as a Christian, and how she misses the meaning that religion offered. But I find it utterly baffling to assume – as she implicitly does – that our options for reading the Bible amount to a choice between the evangelical belief in Scripture’s “inerrancy” and the view that it’s little more than an unreliable jumble of tall tales and fables.

This is all the more curious given that, by her own admission, the evangelical position she once held is something of a modern innovation, and that most Christian traditions outside of evangelicalism, such as the Roman Catholic, Anglican, and Orthodox ones, to say nothing of more liberal strains of Protestantism, hold different and often more nuanced and complex understandings of the Bible. In fact, out of fairness to my evangelical friends, I’d even say that within conservative evangelical theological circles you can find approaches to the Bible that uphold inerrancy without reducing it to a simplistic literalism. Misener doesn’t seem to show any interest in any of these alternatives. Which is fine, as far as it goes, but it’s worth emphasizing that what she describes as a kind of personal revelation – the Bible is “messy,” and doesn’t really hold up well when read literally – is something that countless theologians and thinkers throughout Church history have affirmed, commented on, and tried to understand. Dreher is on the same page:

It’s a false choice to say that either Scripture is 100 percent infallible in a literal sense, or that none of it is reliable. It’s rather that Scripture requires an authoritative interpretive community, which is the Church. When are we free to read Scripture as a metaphor, and when must we accept it literally? Both [Roman and Orthodox] churches have answers to this, but they aren’t simple answers, and they aren’t strictly binding. You can find Orthodox Christians who believe that Genesis is literally true, and must be affirmed as such, and you can find Orthodox Christians who believe that Genesis is a “true myth” — that is, a symbolic story, like parables, through which God reveals foundational truths about Creation that are beyond the comprehension of us finite creatures (that’s what I believe, for the record).

Now maybe, ultimately, Misener wouldn’t find these alternative ways of approaching the Bible persuasive either. Many don’t, and I sincerely respect them. Or perhaps, after being burned by evangelicalism, Misener just wanted nothing to do with religion – as someone who grew up in a rather severe fundamentalist church, I’m sympathetic to that impulse. There are good reasons to be an agnostic or atheist, and even those of us who continue to be attached to Christianity, which I am, grapple with doubt, uncertainty, and dark nights of the soul. I have to say, though, that the intellectual bankruptcy of certain forms of American evangelicalism strikes me as problematic grounds for jettisoning Christianity.

But most of all, Misener’s essay points to the sad state of so much American religious life, especially the messages delivered by too many Christian churches. She makes clear that, at times, she still feels “a wave of something truly ineffable, a surreal flutter in my soul that the world was vast and overwhelming and rich and meaningful and also not really fucking meaningful at all.” That’s something most of us have felt, I’d guess, whether believer or not. It’s a pity that the brittle, ahistorical, and ultimately untenable evangelicalism she was peddled convinced her that those feelings are alien to Christianity, that faith demands the silencing of doubt and uncertainty. It’s a shame that too many Christian churches present the Bible in such a way that, when an earnest young person encounters the historical-critical approach to it, the result is shock and perplexity. It’s lamentable that more churches aren’t places where such difficulties can be worked through, where you feel welcome even if you are far from having what you believe figured out. Pope Francis has said that the Church should be a “hospital for sinners,” which is to say a refuge for all of us who struggle in all kinds of ways, profound doubt included. Misener’s story is testimony to how far Christians have to go to make the Pope’s words a reality.

“I Know What You Did Last Sunday”

by Jessie Roberts

That’s the clever title of a new study that reveals Americans exaggerate how often they go to church:

The study, by the Public Religion Research Institute, used an intriguing method to try to measure exaggeration: It asked the same set of questions in telephone interviews, and in an online survey, and compared the results. Researchers say that online surveys, with their lack of human questioners, significantly reduce “social desirability bias” in polling — the tendency of people to exaggerate behaviors that they think will impress others. In this study, the group that took the online surveys reported much lower levels of worship attendance than those interviewed by telephone.

Jessica Schulberg elaborates on the study’s sneaky methodology:

Why the difference in results?

Talking to another human, even an anonymous one, can cause respondents to exaggerate the truth. “It is a rather unconscious cognitive bias,” says John R. Shook, a professor in the Science and the Public EdM online program at the University at Buffalo. “Even if you talk to a live human voice, someone you will never see, someone in Zaire, the brain rationalizes and tries to present itself in the most positive light possible. You can’t help it.”

But it seems that when it comes to worship attendance, liberals are more inclined to exaggerate than conservatives. PRRI found that over the phone, only 27 percent of self-identified liberals admitted that religion is not important to them; the number jumped to 40 percent of liberals who responded to an online questionnaire. Conservatives were much more consistent: Only 4 percent of telephone respondents and 6 percent of online respondents said the same.

Last month, Brandon Ambrosino looked at an earlier study that suggested people overreport their actual religious practices. He pondered their motives:

According to [researcher Philip] Brenner, overreporting Muslims and Christians are not maliciously lying on surveys — they’re mishearing the question. Here’s Brenner:

“Like the overreporting of church attendance in North America, the overreporting of prayer in the Muslim world is strongly associated with the individual’s sense of what is central to his or her self-concept. The respondent interprets the conventional survey question about prayer pragmatically rather than semantically, allowing the question to become one about the respondent’s identity, rather than actual behavior.”

In other words, when a religious person is asked, “Do you do religious stuff?” the question she actually hears is, “Are you the kind of person who does religious stuff?” If [Brenner’s] research is any indication, lots of people in North America and the Middle East perceive themselves as the kind of people who do religious stuff — even when they’re not actually doing anything.

The Comedians Of Conservatism

by Jessie Roberts

Frank Rich looks beyond Dennis Miller to Greg Gutfeld, “a signature personality on two daily Fox News shows: The giggly The Five (in late afternoon) and the fiercer Red Eye (scheduled by Roger Ailes in the stunt time slot of 3 a.m.)”:

Gutfeld is more of a wisecrack artist than a comedian and, like Miller and other comics on the right, is careful to label himself a libertarian, so damaged is the conservative brand. But if you listen to Gutfeld on Fox or read his recent best-selling manifesto Not Cool, he seems much more of a standard-issue conservative and, in keeping with that, older than he actually is (49). His targets are the usual shopworn suspects, some of whom are so far removed from the main arena of 21st-century liberalism that comic complaints about them are deadly on arrival: Rachel Carson, Yoko Ono, Hurricane Carter, Howard Zinn, Mumia Abu-Jamal, Oliver Stone, and even Dan Quayle’s old fictional bête noire, Murphy Brown. In Not Cool, Sean Penn gets 18 references, and even Robert Redford merits nine. Like much of the right, Gutfeld can’t stop fighting battles from the 1960s that are increasingly baffling to post-boomer audiences. It’s as if the clock stopped with the Vietnam War. …

If there’s one universal rule of comedy, it is, as Gutfeld himself has said, that “it’s hard to be funny without being truthful.”

But when he jokes that politically correct Americans are relabeling Fort Hood terrorism “workplace violence” and that they would rather use the term “unlicensed pharmacists” than “drug dealers,” he seems to lack any firsthand knowledge of conversation as practiced on the ground in present-day America. His examples of p.c. speech sound instead like the typically outrageous anomalies unearthed by Fox News. He needs to get out of the studio and meet some young people.

Rich reviews Jeff Dunham – whose act features the puppet “Achmed the Dead Terrorist” – more favorably:

The gifted ventriloquist Jeff Dunham, as commercially successful a conservative comedian as there is (and one of the most successful touring comedians in the country, period), is best known for Achmed the Dead Terrorist, a puppet given to one-liners like “Where are all the virgins that bin Laden promised me?” Achmed can be funny, not least because he is a goofy, not hectoring, comic creation. And Dunham has a worthy comic nemesis in terrorism, much as Mel Brooks found in Hitler. The trouble with this material is its inevitable shelf life as 9/11 and its ensuing wars keep receding into the rearview mirror of American memory. There’s a reason why the playwright George S. Kaufman long ago said that “satire is what closes on Saturday night.”

Upwardly Immobile

by Jonah Shepp

Gentrification may be the talk of the town (no matter which town), but Richard Florida highlights new research showing that most urban neighborhoods that were poor 40 years ago are still poor today. The study “compared neighborhood-level poverty rates in the country’s 51 largest metro areas in 1970 and 2010.” It found that “very few high-poverty neighborhoods in 1970 dramatically reversed their fortunes over the next four decades”:

Entrenched poverty was just about the most constant thing about these neighborhoods. By 2010, fully two-thirds of these poor neighborhoods, 750 tracts in all, were still beset by chronic and concentrated poverty. Overall, their populations shrank 40 percent over those forty years, as many of those who were able to move out did. On the other hand, only a small fraction of neighborhoods had turned around in a way that approximates what we call gentrification. Just 105 tracts, or about 10 percent, saw their poverty rates fall below 15 percent, meaning a smaller proportion of their residents lived in poverty than in the nation as a whole. The populations of these tracts grew by about 30 percent over this same period.

But wait, it gets worse:

The authors traced the fate of what they call “fallen star” neighborhoods – tracts that had below-average poverty rates in 1970 (less than 15 percent), but more than 30 percent of their residents living below the poverty line by 2010. More than 1,200 of these tracts shifted from low to high poverty during this time, contributing to an overall increase in the number of neighborhoods of concentrated poverty. Today, 10.7 million Americans live in 3,100 extremely poor neighborhoods in and around America’s largest city centers.

In other words, for every single gentrified neighborhood, 12 once-stable neighborhoods have slipped into concentrated disadvantage.

Translating World To Word

by Jessie Roberts

Kathryn Schulz raves about Geoff Dyer’s writing prowess, calling him “one of our greatest living critics” and “one of our most original writers—always out there beyond literary Mach 1, breaking the how-things-usually-sound barrier”:

[T]he essential fact about Dyer’s nonfiction is that it works beautifully when it shouldn’t work at all. Some of that work gets done at the level of the sentence, where Dyer excels. Listen to him on a hot day in Algiers: “Even the ants out on the balcony drag a little sidecar of shadow.” On Roman ruins in Libya: “All around were the vestiges of nouns—columns, stones, trees. No verbs remained.” On a saxophone solo by John Coltrane: “It’s pretty and then dangerous as he reaches so high the sky blues into the darkness of space before reentering, everything burning up around him.”

What’s going on in these sentences is the fundamental business of nonfiction: the translation, at once exact and surprising, of world to word. Writers weight that ratio of exactitude and surprise differently; you can stay close or reach further, out toward the risky and weird. Dyer reaches. You can see it in those precise but strange sidecars, in that startling grammar of ruin, and finally in the sax solo, where, like Coltrane, he pushes so hard on his medium that it threatens to break. Note the word blues, pulling three times its weight—noun, adjective, verb, so much pivoting around it that all the referents go briefly haywire and it seems like the solo is still rising and what’s falling is the sky. And note, too, how the sentence itself is pretty and then dangerous: dangerous because it starts out too pretty (“pretty” is a pretty word; “so high the sky” is Hallmark stuff); beautiful because it ends in so much danger.

Schulz goes on to praise Another Great Day at Sea, Dyer’s new travelogue of two weeks aboard the USS George H.W. Bush. John McAlley reviews the book (somewhat spoiling its ending in the last paragraph):

Dyer’s tour of the boat (that’s right: boat, not ship) is as closely monitored as an F-18 sortie, even though it’s a relatively stress-free time on the Bush: October 2011, more than a year after President Obama announced the end of America’s combat mission in Iraq. Once Dyer inures himself to the ’round-the-clock “crash and thunder” of the in-transit jets and the “aftertaste of the big meats” served in the mess, he’s at ease to report on the daily encounters prearranged for him. Each brief chapter gives us a peek into another nook and cranny of the carrier’s teeming underworld, or the above-deck “island,” “the bridge and assorted flight-ops rooms rising in a stack from one side of the deck: an island on the island of the carrier.” …

For all the snap and snark in his prose, Dyer can’t tamp down his generosity of spirit forever. This unbeliever — in faith, in wayward military action, in bad food and the snorting of bath salts, even in mourning the death of his parents — ends the breezy Another Great Day at Sea with stunning economy and emotional force, and in the most unexpected way. He says a prayer for the men and women of the USS George H.W. Bush — and for all of us at sea.

Subscribers to The New Yorker can read an excerpt of the book here.

A New York State Of Rhyme

by Matthew Sitman

This year marks the fiftieth anniversary of the publication of Frank O’Hara’s Lunch Poems, which Micah Mattix lauds as a love letter to New York City:

With its references to Park Avenue, Times Square, Pennsylvania Station, liver sausage sandwiches, the Five Spot, the Seagram Building, the opening of the American Folk Art Museum, and much more, it is a very New York book. O’Hara walks around the buzzing city, buys “a chocolate malted” or “a little Verlaine,” remembers a friend’s birthday, and talks to the Puerto Rican cabbies before rushing back to his desk at the MoMA with a copy of Reverdy’s poems in his pocket. Born in Baltimore and raised in Grafton, Massachusetts, O’Hara moved to New York in 1951 and stayed until his untimely death in 1966. The city offered freedom, possibility, movement, all of which O’Hara associated with life. “I can’t even enjoy a blade of grass,” he once wrote, “unless I know there’s a subway handy.” It also offered him a community of fellow outcasts, poets, and artists who became, as Lytle Shaw notes, a surrogate family.

Lunch Poems is still popular with New Yorkers today: In 2012, when the Leonard Lopate Show asked listeners to vote on 10 objects that “best tell New York’s story,” it came in at number six—just above the Brooklyn Bridge.

Even more, Mattix finds the collection has “an appeal that reaches beyond the time and place it was written,” remaining popular, in part, because of the way O’Hara’s language resonates with our own:

Casual, sardonic, funny, and full of pop-culture references, Lunch Poems has all the brevity, informality, irony, and at times chatty pointlessness of modern discourse without having been influenced by it. The volume has never gone out of print, in part because O’Hara expresses himself in the same way modern Americans do: Like many of us, he tries to overcome the absurdity and loneliness of modern life by addressing an audience of anonymous others.

O’Hara’s Lunch Poems—like Facebook posts or tweets—shares, saves, and re-creates the poet’s experience of the world. He addresses others in order to combat a sense of loneliness, sharing his gossipy, sometimes snarky take on modern life, his unfiltered enthusiasm, and his boredom in a direct, conversational tone. In short, Lunch Poems, while 50 years old, is very much a 21st-century book.

Last spring the Dish featured one of O’Hara’s poems here.

A Cure For Aging?

by Jessie Roberts

Richard Walker, a scientist who specializes in aging research, believes the key to ending human aging lies in a rare disease known as “syndrome X.” Babies afflicted with the condition grow older but not bigger, remaining physically “marked by what seems to be a permanent state of infancy.” Here’s an excerpt from Virginia Hughes’s arresting account of Walker’s quest for immortality:

Brooke’s body seemed to be developing not as a coordinated unit, [Walker] wrote, but rather as a collection of individual, out-of-sync parts. He used her feeding problems as a primary example. To feed normally, an infant must use mouth muscles to create suction, jaw muscles to open and close the mouth, and the tongue to move the food to the back of the throat. If these systems weren’t coordinated properly in Brooke, it could explain why she had such trouble feeding. Her motor development had gone similarly awry: she didn’t learn to sit up until she was six years old and never learned to walk. “She is not simply ‘frozen in time’,” Walker wrote. “Her development is continuing, albeit in a disorganised fashion.”

The big question remained: why was Brooke developmentally disorganised? It wasn’t nutritional and it wasn’t hormonal. The answer had to be in her genes. Walker suspected that she carried a glitch in a gene (or a set of genes, or some kind of complex genetic program) that directed healthy development. There must be some mechanism, after all, that allows us to develop from a single cell to a system of trillions of cells. This genetic program, Walker reasoned, would have two main functions: it would initiate and drive dramatic changes throughout the organism, and it would also coordinate these changes into a cohesive unit.

Ageing, he thought, comes about because this developmental program, this constant change, never turns off. From birth until puberty, change is crucial: we need it to grow and mature. After we’ve matured, however, our adult bodies don’t need change, but rather maintenance. “If you’ve built the perfect house, you would want to stop adding bricks at a certain point,” Walker says. “When you’ve built a perfect body, you’d want to stop screwing around with it. But that’s not how evolution works.” Because natural selection cannot influence traits that show up after we have passed on our genes, we never evolved a “stop switch” for development, Walker says. So we keep adding bricks to the house. At first this doesn’t cause much damage – a sagging roof here, a broken window there. But eventually the foundation can’t sustain the additions, and the house topples. This, Walker says, is ageing.

Brooke was special because she seemed to have been born with a stop switch. The media were fascinated by her case. Walker appeared with the Greenberg family on television several times and explained why he was so interested in Brooke’s genes. “This is an opportunity for us to answer the question ‘Why are we mortal?’” he said on Good Morning America. “If we’re right, we’ve got the golden ring.”