When The Bride And Groom Forgo God

Annalise Fonza, a former United Methodist clergywoman who now identifies as an atheist, reveals the unexpected trouble she encountered after performing a wedding in Atlanta, Georgia, as a licensed humanist chaplain:

A few days after the wedding, the bride called to inform me that they were at the Fulton County Probate Courthouse where a clerk had a question about my credentials. After several minutes of troubling conversations with the clerk and her supervisor, I was told that Judge James Brock had refused to sign the certificate. “Who are you?” the clerk asked me in reference to the title “Humanist Celebrant,” which I was required to provide on the license. Obviously, the clerk and the rest of the Fulton County Probate Court had never heard the term, nor had they heard of the Humanist Society. After a few more phone calls, I was informed that it was Judge Brock’s final decision to deny the certificate and, hence, it was clear that the couple would have to be married again.

The couple decided to get married again that same day, this time in the Fulton County Probate Courthouse, but after a few weeks of wrangling, the judge did confirm that Fonza was qualified to officiate weddings in Georgia. What she takes away from the episode:

The truth of the matter is that anyone who openly identifies as I do must expect public scrutiny and possible rejection. People in the United States still discriminate against atheists, even though more and more people are using the word “atheist” to self-identify. In other words, just because one uses the term openly and proudly, doesn’t mean he or she will be accepted without question or won’t face rejection. In addition, the religious bigotry and social entitlement here in the South is so pronounced—by people of every color and background. Many, including African Americans, openly discriminate against or exclude other black people from social and professional circles when they learn that those others are atheists. …

My first time navigating the courts in Atlanta, Georgia, as a humanist celebrant was far from loving, and it taught me the value of being good and godless in a world that is dominated by systems and people who put their faith in gods, but who don’t seem to have the slightest idea of what it means to be loving and accepting of anyone who doesn’t adhere to their religious beliefs.

A Constitution Neither Living Nor Dead

Reviewing Richard Epstein’s The Classical Liberal Constitution: The Uncertain Quest for Limited Government, Stephen Rohde describes how the libertarian-leaning law professor offers an approach to constitutional interpretation “which ‘starts from the twin pillars of private property and limited government’ and emphasizes federalism, separation of powers, and economic liberties.” Despite such concerns, Epstein is no originalist:

Epstein takes “originalists,” such as Justice Antonin Scalia, to task for their insistence that all constitutional issues must be based on the text of the Constitution as publicly understood in 1791. Sounding very much like the “living Constitution” proponents he’s about to criticize, Epstein points out that conservative originalists cannot remain faithful to the text because “the Constitution is written in broad bold strokes, which at some points confer vast powers on government and at others impose major limitations on their exercise.”

He rejects the originalists’ “cramped mode of interpretation which, ironically, is not faithful to the dominant interpretive norms of the Founding period,” because “in no legal system at any time could the question of construction be reduced to a search for original public meaning of terms that are found in the constitutional text.” Epstein also observes that the originalist view “offers no basis for the implication of additional constitutional terms that are dependent on either government structure or the nature of individual rights.” Consequently, “a bare text raises more questions than it answers,” which makes it imperative “to isolate the general theory that animates the text — usually the protection of personal autonomy, liberty, and property.”

In an earlier review, however, Cass Sunstein reminded us that progressives are the real enemies of Epstein’s vision:

To give content to what he calls (controversially) “the classical liberal vision,” Epstein offers a foil, the villain of the piece, which he calls the “modern progressive” or “social democratic” approach. He identifies that approach with the 1930s, when, he urges, policymakers jettisoned “the traditional safeguards against excessive state power.” As a matter of law, their ill-advised, and constitutionally illegitimate, reforms became possible for two reasons. First, the progressives saw ambiguity in the constitutional text, thus licensing those reforms. Second, the progressives insisted that unelected judges should recede in favor of We the People, acting through elected representatives. Epstein does not deny that the Constitution is sometimes ambiguous. Importantly, he acknowledges that “our basic conception of the proper scope of government action will, and should, influence the resolution of key interpretive disputes.” But he emphasizes that a “detailed textual analysis” has priority over our preferred views about political theory.

Much of his book consists of comprehensive and exceptionally detailed accounts of how constitutional provisions ought to be understood. In many places his discussion is highly technical, but in some important respects (of course not in all) you can take it as a careful and sophisticated guide to Tea Party constitutionalism.

(Image of first page of the US Constitution via Wikimedia Commons)

Learning From Lolita

Grace Boey explores the morality of a number of forms of sexual deviancy, using Vladimir Nabokov’s novel Lolita – in which the thirtysomething professor Humbert Humbert narrates his sexual relationship with a 12-year-old girl named Dolores – as a case study of our “unresolved thoughts and feelings” about perversion. She especially notices how many readers of the book condemn pedophilia while harboring complicated feelings about Humbert:

[I]t’s easy for readers to experience spurts of sympathy for Humbert. Dolores, after all, was the one who initiated the affair, and she was hardly unacquainted with sex when she did it, having lost her virginity at a summer camp to a young man named Charlie Holmes. Humbert paints a portrait of himself as an unwitting accomplice to the entire affair, seduced by a manipulative nymphet who was all too good at getting what she wanted. He appears, as well, to fit the criteria for pathological pedophilia or hebephilia, having exclusively been attracted to young girls from the start. Humbert, it seems, was set up for depravity from the start. It is completely natural to pity anyone set up for desires that are so strong, yet so utterly wrong and harmful.

Yet make no mistake:

despite his efforts to portray otherwise, Humbert knew exactly what he was doing when he acted upon his desires, and had numerous opportunities to remove himself from the situation long before his temptress made her advances. And the fact that she advanced upon him is of no consequence: she was still a child with a child’s mentality (as he himself observed numerous times). As the party with more experience and more power, Humbert should have known better than to succumb—and it is here that the immorality of Lolita lies.

As the novel progresses, it becomes sadly obvious that young Dolores was not fully cognizant of her best interests when she initiated the affair. Along the way she becomes irreparably damaged by their relationship: she learns to view sex as a transaction, and enters a confusing cycle of resentment and dependence on Humbert that characterizes so many real-life cases of child sexual abuse. Dolores, in the end, escapes to land up broke and pregnant in a clapboard shack, dying at the age of 17 after a stillbirth on Christmas day.

(Video: Scene from Stanley Kubrick’s 1962 film adaptation)

Saving Face

When Richard Norris was 22, he shot himself in the face. In 2012, he received a full facial transplant from a recently deceased 21-year-old man. In a fascinating profile, Jeanne Marie Laskas describes meeting Norris:

He seems nervous. His hands tremble, bringing constant sips of water to his mouth. His lips can’t quite grip the bottle, so each sip is more a little pour. He fights a constant drool with the help of a towel. His new face is a marvel nonetheless. It’s a new face. Wide and open, the cheekbones of an Irishman and the wrinkle-free complexion of a college kid. It’s difficult to reconcile the youthful face with the body of a man nearly 40. I am trying not to stare. I am trying to stop looking for the seams, where the new connects to the old, the eyelids, the neck, the scar in front of his ears. I am trying to stop thinking about his beard, which isn’t really his beard, except now it is, and it grows. I’m distracted by a thousand little thoughts like these. Coupled with his lack of facial expression—a solid, largely unmoving veneer—in all these ways the barrier to getting to know Richard feels to me immediately and appreciably steep.

Norris now takes medicine “to keep his immune system at 50 percent strength—just enough to ward off common viral and bacterial infections but not so strong that his own antibodies attack his new face”:

“He’s not supposed to smoke,” his mom says. He can’t get sunburn. He can’t get a cold. He can’t drink. He can’t fall and risk injury. He can’t afford to tax his immune system at all. Even a cut could trigger rejection. It starts as a blotchy rash; it means his body is winning the fight to reject the transplant, and Richard has to be flown to the hospital to receive rounds of emergency drugs intravenously. Uncontrollable rejection would mean an almost certain death; the only things left of Richard’s old face are his eyes and the back of his throat. Everything else is now gone for good. “I have to keep watch that his face doesn’t go yellow,” his mom says. “He’s had two rejections so far.”

“I’ll leave you two talking,” Richard says, and he heads outside for a smoke.

“The Supreme Art Form”

Roger Scruton nominates opera, hailing it as “not so much a representation of human life as a redemption of it. For dramatic music can rescue our feelings from their randomness, and vindicate our immortal longings in the face of chaos and decay”:

The complaint was already made in Monteverdi’s Venice that singing detracts from the realism of the stage. The verismo of Verdi was a response to this complaint, an attempt to tie the melodic moment to the particular person in a believable situation — and no one can doubt his success in this. But Wagner had another and more persuasive response to those who dismissed his operas as mere fairytales. By lifting everything — character, setting, emotion and gesture — into the imagined space of music, he believed, we achieve another and higher kind of realism. Words and music develop together, and the purpose of both is drama.

Opera conceived as a sequence of arias, loosely joined by recitative, thereafter disappeared. Even Italian composers quietly adopted the Wagnerian ideal, so that by the time of Puccini it was universally accepted that operas should be through-composed, each act working towards its climax by largely musical means, with the musical material constantly reworked in accordance with the logic of the drama. …

Many people have an opera buried within them: so at least I believe. For the inner life is essentially operatic. It sings to itself in many voices, and we strive in our dreams and meditations to bring those voices into line, to turn discord to concord, and conflict to resolution. Precisely because the characters in opera sing their passions, we sense that these passions are really cosmic forces, whose scope is far greater than the mere individuals who represent them. Through opera our inner life is summoned from hidden regions and resolved before us on the stage.

(Video: Overture of Verdi’s La Traviata performed at The Royal Opera House in London in 1994)

Are Religious Stories Bad For Kids?

Mark Joseph Stern warns religious parents, “All that talk of snake-inspired subterfuge, planet-cleansing floods, and apocalyptic horsemen might hamper kids’ ability to differentiate between fantasy and reality – or even to think critically”:

That’s the implication of two recent studies published in Cognitive Science in which researchers attempted to gauge perceptions of reality in religious and secular children. (The religious children were all from Christian families, from a variety of denominations.)

In one study, the researchers read realistic stories and fantasy tales to the kids. Some of the fantasy tales featured familiar biblical events – like the parting of the Red Sea – but with non-biblical characters. (In the retelling of the Red Sea story, Moses was called John.) Others featured non-biblical but clearly magical events – the parting of a mountain, for instance—as well as non-biblical characters. … Every child believed that the protagonist of the realistic stories was a real person. But when asked about the stories featuring biblically inspired or non-biblical but magical events, the children disagreed. Children raised with religion thought the protagonists of the miraculous stories were real people, and they seemed to interpret the narratives – both biblical and magical – as true accounts.

Luke Malone throws cold water on Stern’s interpretation of the studies:

[Study author Kathleen Corriveau] stresses that this needn’t be seen as strictly negative. “In no way should the findings of this study point to any sort of deficit in one group or the other,” she says. “Indeed, in some instances, the ability to suspend disbelief could be viewed as a benefit. For example, when exposed to counterintuitive phenomena—such as modern physics—a suspension of disbelief might assist in learning.”

Eliyahu Federman adds, “This study proves a benefit of religion, not a detriment, because research shows how imaginative and fictional thinking, fantasy play, aid in the cognitive development of children. Raising children with fantastical religious tales is not bad after all.” Meanwhile, Brandon Ambrosino looks at the findings in light of Justin L. Barrett’s book Born Believers, which argues that “kids are born with a tendency toward thinking that there is some sort of supernatural agent behind this order”:

Or, as he put it to me over the phone, “children have a number of natural dispositions to religious beliefs of various sorts.” And while he believes that these dispositions can “certainly be overridden by certain kinds of cultural and educational environments,” he thinks the research shows that a child’s cognitive “playing field is tilted toward religious beliefs.”

A new study out this month, however, pushes against Barrett’s conclusion. Published in the July issue of Cognitive Science, the article presents findings that seem to show that children’s beliefs in the supernatural are the result of their education. Further, argue the researchers, “exposure to religious ideas has a powerful impact on children’s differentiation between reality and fiction.” In other words, said Kathleen Corriveau, one of the study’s co-authors, the study found that childhood exposure to religious ideas may influence children’s “conception of what could actually happen.” She also told me her research suggests that Barrett’s Born Believers thesis is wrong — that children don’t possess an “innate bias” toward religious belief.

What Do The “Spiritual But Not Religious” Really Believe?

In her new book, Belief without Borders: Inside the Minds of the Spiritual but not Religious, Linda A. Mercadante attempts to find out. Kristin Aune runs down the essentials of “SBNRs”:

[Mercadante] explores their thoughts on transcendence, human nature, community and afterlife and finds that they don’t believe in an interventionist or personal God (if “God” exists, they think God is part of creation, not separate from it). As for human nature, they don’t see themselves as sinners needing salvation, but as “inherently good” selves needing freedom and choice so that their “purity, even divinity” can shine.

This focus on the self affects their view of community. “Many interviewees did much more than just ‘question authority’,” Mercadante says. “Instead, they relocated it within, relativized it to each person, and detached it from any particular spirituality community.” Some belonged to recovery groups such as Alcoholics Anonymous, but none had a longstanding affiliation with a spiritual community. This makes it hard for them to sustain shared group beliefs or behaviour, and Mercadante thinks it impedes their ability to benefit society.

On life after death, SBNRs share ground with Hindu beliefs, reflecting what Colin Campbell calls “the Easternization of the West”. Most believe in reincarnation and “karma, endless opportunities, inevitable progress, expanding consciousness, and the very American ideal of free will and personal choice”. Their optimism is clear: reincarnations will be better, not worse, than their previous life. Actions have consequences, but only positive ones.

In a recent column on the subject, Mark Oppenheimer depicts Mercadante as pushing back against claims that SBNR thinking leans shallow and unserious, noting that “she makes the case that spiritual people can be quite deep theologically” (NYT):

An ordained Presbyterian minister whose father was Catholic and whose mother was Jewish, Dr. Mercadante went through a spiritual but not religious period of her own — although she now attends a Mennonite church. For her project, she … found that these spiritual people also thought about death, the afterlife and other profound subjects.

For example, “they reject heaven and hell, but they do believe in an afterlife,” Dr. Mercadante said recently. “In some ways, they would fit O.K. in a progressive Christian context.” Because they dislike institutions, the spiritual but not religious also recoil from the deities such institutions are built around. “They may like Jesus, he might be their guru, he might be one of their many bodhisattvas, but Jesus as God is not on their radar screen,” Dr. Mercadante said.

While she was writing the book in 2012, Mercadante gave an interview in which she addressed the role of stereotypes involved in these discussions:

I think it does come, in part, from portrayals of conservative Christianity in the media, as some kind of hegemonic, monolithic authority. This whole thing is fraught with stereotypes. Most people don’t take the time to listen to each other, to ask questions. There are terrible misconceptions on both sides, as to what Christians are, and what SBNRs are. SBNRs see “religion” as the external structure and the dogma, whereas “spirituality” is the individual’s personal experiences of transcendence. That definition really is not an accurate portrayal of religion or of spirituality. Nevertheless, the majority of my interviewees insist that spirituality is the personal center and quest for an individual, whereas religion is something external, rule-ridden and institutional. In their thinking, religion is nothing more than a dispensable shell.

A Unified Theory Of Bob Dylan’s Weirdness

Bill Wyman homes in on Dylan’s artistic process, which shows that the singer-songwriter “doesn’t trust mediation or planning” and aims to “strip away everything that stands between Bob Dylan’s art and his audience.” He finds that this approach seems to explain both Dylan’s relentless touring and the astonishing inconsistency of his increasingly self-produced studio albums, both of which offer – for better or worse – Dylan just being Dylan. About that touring:

In Chronicles, Dylan details, with seeming frankness, the aimlessness that brought him to a slough of despond at the end of the ’80s. He may have been facing what all rock stars who survive face, which is how to grow old gracefully in a medium cruelly tied to youthfulness. He resolved to get out and play his songs—and went back on the road in 1988 with a small, seldom-changing backing ensemble, with whom he delved into his back pages, including many songs he’d never played live before.

Here’s the odd thing—26 years on, he hasn’t stopped. He’s been playing about 100 shows annually ever since, growling through a set of songs old and new with a small band. It’s an endeavor that for a good chunk of each year keeps him on a private bus and, in the U.S. at least, in relatively crummy hotel and motel rooms. … The shows at first may have been a tonic, but over time they revealed themselves to be a panacea. It must have struck Dylan: How could he look foolish if he just kept doing the same thing? If he were an artist, he would continue to create and show his art publicly. If he were a celebrity, he would appear in public. And if he were a seer, a prophet, or even a god, well, he would let folks pay and see for themselves how mortal such figures actually were.

Wyman’s conclusion about understanding Dylan:

If Bob Dylan is a question, maybe this is the answer. Given the chance, Dylan will give the audience his art, unadulterated, as he creates it, and nothing more. He believes it’s a corruption of his art to be directed by someone else’s sensibility. In its own weird way, isn’t this one sacred connection between artist and audience? It might be nicer if he did things differently. It might be more palatable, more commercially successful. (He might be somewhere by now.) This is what ties together his signal creations, his ongoing shows, and even the wretched albums of the ’80s and ’90s; what he does might be sublime and ineffable or yet also coarse and unsuccessful; it is what it is, defined by where it comes from, not what it should be. Even his remoteness is a by-product; it’s what he deserves after having given his all. Call the work art, call it crap, call it Spanish boots of Spanish leather, but in the end it’s the creation of an artist who defies us to ask for something more.

Recent Dish on Dylan here, here, and here.

I See London, I See France …

Morwenna Ferrier finds that fewer French women are sunbathing topless. Is American media culture to blame?

Alice Pfeiffer, a 29-year-old Anglo-French journalist (who, incidentally, does sunbathe topless in Biarritz, Guéthary, Monaco and surfing resort Hossegor), thinks the decline is inextricably linked to social media: “Young women in their 20s do it less because they are aware that … you can end up topless on your own Facebook wall.”

Pfeiffer blames “pop-porn culture – Miley Cyrus to American Apparel, ie aggressive naked imagery of young girls” – for the shift in perception of going topless. “Globalisation and Americanisation of women’s portrayal and sexiness in France has pushed away gentle (and generally harmless) French eroticism towards porno, frontal, hyper-sexualised consciousness,” she says. “Nudist, beach-like freedom is not what it used to be … breasts no longer feel innocent or temporarily asexual.”

The Germans, apparently, are the most likely to go nude at the beach. Rebecca Schuman wishes Americans would follow suit:

[A]s frantic as Americans get about the public dirty-pillows-baring of nubile young women, even self-professed progressives seem to balk at the free flaunting of a diverse array of bodies, i.e. nudity that “nobody wants to see”: older bodies, overweight bodies, scarred bodies, bodies who dare to have birthed children and remain unashamed about it. Consider Jezebel’s Kelly Faircloth, who just last week scolded the entire Speedo-wearing world—on a site that prides itself on body acceptance.

You’ll never see a German shocked at the sight of a rotund 65-year-old man with his Schwanz und Eier semi-clad by a Speedo or totally nackt; young Germans frolicking bare-breasted by the pool will receive at most a blasé once-over from their male companions. Of course, if your religion (or, like me, your infernal pallor) requires modesty at the beach, then by all means wear whatever you want. But for those whose prudery (and dislike of seeing others) comes not from necessity but conditioning? Maybe a little “free body culture” wouldn’t hurt.

Aural Sex

Autonomous Sensory Meridian Response, or ASMR for short, refers to the pleasant tingles some people get from certain sensations, particularly whispers and other soft sounds. The little-understood phenomenon has not attracted much scientific research, but has spawned a sizable community on YouTube, where a search for “ASMR” results in 2.8 million hits. Jordan Pearson takes a look at this subculture and the wildly popular videos, like the one above, that its members produce and consume:

ASMR as an internet phenomenon took off in 2010, when a Reddit thread asking if anyone else had ever experienced it went viral, and thousands of people realized they weren’t the only ones who’d noticed the pleasant and foreign feeling. An internet subculture of roleplay videos meant to evoke the sensation has since taken off. Tingle-seekers—lots of them—watch videos delivering agreed-upon triggers like soft whispers in order to feel what devotees vaguely describe as “brain orgasms” or pleasant tingles, though there really isn’t any word in the English language to accurately describe the strange sensation.

Many people have started making these videos themselves—gaining hundreds of thousands of YouTube subscribers along the way—and often with a twist: elaborate roleplaying with a weirdly maternal bent. “The most popular roleplay requests are the ones that involve a lot of what I call ‘personal attention.’ An example of that would be, if you go to the eye doctor, for instance, they’re going to be very close to you,” Ally Maque, an ASMR YouTube personality with over one hundred thousand subscribers, told me. …

“I think the ASMR movement, demanding eye contact and prolonged attention, has sort of an undercurrent of optimism and care in the videos themselves that’s really nice. It’s hopeful,” Nitin Ahuja, a doctor and academic who published a recent paper on the topic, told me. “That’s really interesting to see against a backdrop of cynicism about technology wholesale.” Whether the popularity of ASMR videos that express caring and otherwise loving sentiments is a good or a bad thing, broadly speaking, is beside the point and probably a little unfair to the people who enjoy them, Ahuja said.