The Ways Guns Kill People

A reader writes:

I’m disappointed that you only put up the numbers from accidental gun deaths. It seems a bit disingenuous, as the number of non-accidental car deaths, pool deaths, etc., are, of course, dramatically lower. In 2010 the FBI recorded 12,996 homicides. Of those, 8,775 were committed with guns. That compares to 1,704 with knives, the next closest, 540 with blunt objects, and 11 with poison. Even if you would argue that, of those killed by guns, many would have been killed with another weapon, it’s hard to see how that would directly play out. How many drive-by knifings can you have? How many people can get hit by crossfire from a baseball bat?

How about suicide? In 2010, we had 19,392 gun suicides. Not so many with cars. And for those who would argue that guns don’t matter when it comes to suicide (i.e. people will kill themselves regardless of what tools they have to accomplish the deed), multiple studies have proven that access to guns dramatically raises the risk of a successful suicide attempt.

But if you want to stick with just accidental deaths, as you’ve done, let’s contextualize it a bit. From 2005-2010, almost 3,800 people in the US died from unintentional shootings. 1,300 of those were under the age of 25. 31% of those shootings could have been prevented by the addition of two devices: a child-proof safety lock and a loading indicator. And 8% of those shootings (that’s 304) were carried out by shots fired from children under the age of six. How many accidental road deaths are caused by drivers who are under the age of six?

So, yes, lots of stuff can kill you. No surprise. But in the US, we’re at a much higher risk of death by firearm because of the lobbying efforts of the industry whose product is designed to kill.

Another reader, from the other side of the debate, quotes Waldman:

On one hand, there are over 300 million of us, so only one in 500,000 Americans is killed every year because his numbskull cousin said “Hey Bert, is this thing loaded?” before pulling the trigger. You can see that as a small number. The other way to look at it is that each and every day, an American or two loses his or her life this way. In countries with sane gun laws, that 606 number is somewhere closer to zero.

That sentence encapsulates what I hate about the anti-gun crowd.

While Waldman is ahead of the game in that he at least admits that, at .5% of all accidental deaths, accidental gun deaths are a pretty low priority, he goes on to say that we should eliminate all personal gun ownership to take care of it anyway. Why does this bother me? Well, because it says that he doesn’t value my desire to own a gun to the point where he would take my gun to solve a problem he just admitted was insignificant. So by extension, what I want is even less significant than this insignificant issue.

Look, as a responsible gun owner I want to reduce the number of gun deaths, and there are many ways of doing this, from requiring guns to be locked up when not in use so that minors cannot accidentally shoot somebody, to universal background checks to at least make it difficult for criminals to get their hands on guns.  The problem is that it is difficult to work with somebody who puts such a low value on something that you value that they see no reason why anybody would even want what you want.

If you want to know why it is so easy for the NRA to sell the idea that some people want to take your guns away, look no farther than Paul Waldman (and Obama, Bloomberg, Feinstein and others), who on one hand say they don’t want to take your guns while making statements that make it clear they don’t value you having one.

Update from a reader:

Your reader wrote, “While Waldman is ahead of the game in that he at least admits that at .5% of all accidental deaths make accidental gun deaths a pretty low priority, he goes on to say that we should eliminate all personal gun ownership to take care of it anyway.” I read the Waldman article you posted, and it mentions no such thing. I started reading other articles Mr. Waldman has written and it’s clear he favors more gun control laws (expanded background checks, limits on amounts of ammunition which can be purchased, are two examples I found), but nowhere have I seen a claim to eliminate all personal gun ownership – and he certainly doesn’t “go on to say” that in the linked article.

In the article he does mention other countries with “sane gun laws.” Few countries totally ban the ownership of guns. There is some chance that Mr. Waldman is speaking of Japan, which does come close to forbidding ownership, but, for example, most of Europe allows private gun ownership. It’s really hard to conclude that elimination of all gun rights is what Mr. Waldman means by the phrase.

Your reader seems to equate any talk of “sane gun laws” with a prohibition of ownership, but goes on to advocate for laws such as “requiring guns to be locked up when not in use so that minors cannot accidentally shoot somebody, to universal background checks to at least make it difficult for criminals to get their hands on guns” which are actually to the left of the Toomey-Manchin bill, which the NRA fought so hard against. The reader seems to have more in common with Mr. Waldman than the NRA, but sees his ally as the enemy.

(Chart based on data compiled by the Centers for Disease Control and Prevention, via Leon Neyfakh)

The Data-Driven Life


Ed Finn argues that “when we start depending on our computers to explain how and why things happened, we’ve started to outsource not just the talking points but the narrative itself”:

The idea that a computer might know you better than you know yourself  may sound preposterous, but take stock of your life for a moment. How many years of credit card transactions, emails, Facebook likes, and digital photographs are sitting on some company’s servers right now, feeding algorithms about your preferences and habits? What would your first move be if you were in a new city and lost your smartphone? I think mine would be to borrow someone else’s smartphone and then get Google to help me rewire the missing circuits of my digital self. …

But of course we’re not surrendering our iPhones or our cloud-based storage anytime soon, and many have begun to embrace the notion of the algorithmically examined life. Lifelogging pioneers have been at it for decades, recording and curating countless aspects of their own daily existences and then mining that data for new insights, often quite beautifully. Stephen Wolfram crunched years of data on his work habits to establish a sense of his professional rhythms far more detailed (and, in some cases, mysterious) than a human reading of his calendar or email account could offer. His reflections on the process are instructive:

He describes lifelogging as “an adjunct to my personal memory,” and a way “to be able to do automatic computational history—explaining how and why things happened.” We may not always be ready to hear what those things are. At least one Facebook user was served an ad encouraging him to come out as gay—a secret he never shared on the service and had divulged to only one friend. As our digital selves become more nuanced and complete, reconciling them with the “real” self will become harder. Researchers can already correlate particular tendencies in Internet browsing history with symptoms of depression—how long before a computer (or a school administrator, boss, or parent prompted by the machine) is the first to inform someone they may be depressed?

On a related note, Nick Diakopoulos urges reporters to focus more on algorithms:

He wants reporters to learn how to report on algorithms — to investigate them, to critique them — whether by interacting with the technology itself or by talking to the people who design them. Ultimately, writes Diakopoulos in his new white paper, “Algorithmic Accountability Reporting: On the Investigation of Black Boxes,” he wants algorithms to become a beat:

We’re living in a world now where algorithms adjudicate more and more consequential decisions in our lives. It’s not just search engines either; it’s everything from online review systems to educational evaluations, the operation of markets to how political campaigns are run, and even how social services like welfare and public safety are managed. Algorithms, driven by vast troves of data, are the new power brokers in society.

(Image of Wolfram’s chart based on data from his digital pedometer via Stephen Wolfram)

Apathetic Atheism vs New Atheism, Ctd

A reader speaks up for “church-going atheists”:

We are many.  Just not all of us are open about it!  I don’t like the term “atheist” (being freighted with Dawkins anti-theism); I prefer the term “non-believer”.  I passed through my ex-Catholic/angry-atheist phase to a post-religion phase where I value what we have in common more than I care about what separates us. I go to church because 1) I married a woman of deep faith, and 2) because we found our way to a community that welcomes both of us, when she was effectively driven out of her cradle Catholicism by the horrors of California’s Prop 8.  In fact, I was lobbying her to become Episcopalian for years, as that seems the logical place for a Vatican II-style Catholic with progressive views of church and justice.

Mutual respect makes our inter-faith relationship work.  My wife’s service in church (she’s now the Head Verger of an Episcopal cathedral) is a major part of her life, and I love her completely, so of course I support her wholeheartedly.  And she respects who and where I am as well. I too was raised Catholic, so the ancient rhythm of the liturgy is familiar, and the music is simply amazing (thankfully she went “nosebleed high” when she swam the Thames).  I guess I am essentially a cultural Trinitarian sacramentalist Christian, even if I don’t believe per se.  So I’ll gladly do Episcopal calisthenics on a Sunday, though I don’t pray, sing, or take communion, because that would be disrespectful of the community.

As the members of my church say, “Whoever you are, wherever you are on the journey of faith, you are welcome here.” Kinda restores my faith in Christianity.

Another reader:

Reading your most recent post on Apatheism, I thought I’d relate the following story of how politics have made this outspoken atheist into a staunch defender of religious freedom.

I’m what you might call a “movement atheist.” I go to cons. I write for a well-known skeptical website. I am 100% for the complete separation of church and state. But in the last year I have found myself in the rather unexpected position of loudly and publicly advocating for the right of Muslim women to wear the hijab, among other public expressions of religious values.

You see, I live in Quebec, where the separatist government’s proposed “Charter of Values” would ban public sector workers – which here includes all university and hospital employees in addition to your standard public servants and primary/secondary educators – from wearing any “ostentatious” religious symbols. This includes not only the hijab, but also the turban and the kippah for observant Sikhs and Jews.

The ban does not, however, extend to employees of state-funded Catholic parochial schools, which receive substantial government funding, or to the giant cross in the National Assembly, which is part of Quebec’s “cultural heritage.”

The brazen xenophobia of the whole endeavour is utterly repellent to me, especially given the province’s worrying history of anti-Semitism, but the proposed charter is unfortunately quite popular with the general electorate outside of Montreal (where nearly all the affected populations live and work) and may in fact lead to the PQ winning a majority government in the upcoming election and forcing it through.

Should the measure pass, many committed atheists like myself plan nonetheless to wear banned religious articles in solidarity with our colleagues of faith. Needless to say, as an atheist activist, this is just about the last thing on earth I would ever have expected to do, but racist politics make very strange bedfellows.

Read the whole discussion thread here.

When Jesus Was A Hippie

With the release of Son of God, a new movie about the life of Jesus, Edward J. Blum and Paul Harvey remember that “Jesus films have not always been so serious, and they have not always been directed toward particular segments of the Christian community”:

Neither Godspell nor Jesus Christ Superstar endeavored to convert unbelievers to Christianity. If they had a particular audience in mind, it was those who loved the theater. Both originated as stage productions. Tim Rice, the writer of Jesus Christ Superstar, explained later that he was fascinated with Jesus as a human and not as a divine figure. Following the theological controversy of the 1960s that had some leading religious thinkers emphasizing the actions of humans and not the powers of God, Rice tapped into this humanistic concern. In some respects, Godspell and Jesus Christ Superstar wanted to have their communion bread and eat it too. They followed the Jesus People movement by repackaging Jesus in the idiom of cool, but also indulged in the new theologies that wanted to dispense with his divinity.

Blum and Harvey go on to ponder why, then, “major motion pictures about Jesus returned to their serious ways”:

So what was unique about the 1970s? One main difference was that evangelicals and Catholics had yet to form tight political bonds and had yet to become a powerful niche market. By the time of [Mel] Gibson’s Passion of the Christ, an entire industry of the devout had been created. It boasted singers like Amy Grant, athletes like NBA star David Robinson, restaurants like Chick-fil-A, and painters like Thomas Kinkade. Those markets, along with conservative media outlets, made Son of God and The Passion not only possible, but lucrative. In fact, a main sponsor of The Bible miniseries was the dating site Christian Mingle.

Recent Dish on God and Hollywood here.

(Video: Clip from Jesus Christ Superstar, 1973)

Understanding Shylock

In a review of David Nirenberg’s Anti-Judaism, Michael Walzer observes that the author “insists, rightly, that real Jews have remarkably little to do with anti-Judaism.” Elaborating on how “imaginary Jews” came to capture the minds of thinkers who had little experience with actual Jewish people, Walzer draws on Nirenberg’s discussion of Shakespeare’s The Merchant of Venice “to illustrate the difference between his anti-Judaism and the anti-Semitism that is the subject of more conventional, but equally depressing, histories”:

Shylock himself is the classic Jew: he hates Christians and desires to tyrannize over them; he loves money, more than his own daughter; he is a creature of law rather than of love. He isn’t, indeed, a clever Jew; in his attempt to use the law against his Christian enemy, he is unintelligent and inept. (A modern commentator, Kenneth Gross, asks: “What could [he] have been thinking?”) But in every other way, he is stereotypical, and so he merits the defeat and humiliation he receives—which are meant to delight the Elizabethan audience. …

Nirenberg’s question [is]: What put so many Jews (like Shylock or Marlowe’s Jew of Malta) on the new London stage, in “a city that had sheltered fewer ‘real Jews’ than perhaps any other major one in Europe”? His answer—I can’t reproduce his long and nuanced discussion—is that London was becoming a city of merchants, hence a “Jewish” city, and Shakespeare’s play is a creative response to that development, an effort to address the allegedly Judaizing features of all commercial relationships, and then to save the Christian merchants by distinguishing them from an extreme version of the Jew.

But the distinction is open to question, and so the point of the play is best summed up when Portia asks, “Which is the merchant here, and which the Jew?” The play is about law and property, contracts, oaths, pledges, and promises. Shylock is the Jew of the gospels: “I stand here for law.” But he is defeated by a better lawyer and a more literal reading of the law: Portia out-Jews the Jew—which is surely an ironical version of Christian supersession.

So Shakespeare understands the arrival of modern commerce with the help of Judaism, though he knew no Jews and had never read a page of the Talmud. He knew the Bible, though, as Shylock’s speech about Jacob multiplying Laban’s sheep (Act 1, scene 3; Genesis 30) makes clear. And Paul and the gospels were a central part of his intellectual inheritance. Shylock emerges from those latter texts, much like, though the lineage is more complicated, Burke’s “Jew brokers” and Marx’s “emancipated Jews.” The line is continuous.

Previous Dish on Nirenberg’s book here and here.

(Image of Shylock by László Mednyánszky, circa 1900, via Wikimedia Commons)

Returning To The Cosmos, Ctd

Laura Pearson recommends the revamped Cosmos TV series, which premieres tonight with Neil deGrasse Tyson stepping in for Carl Sagan:

Popular astrophysicist Neil deGrasse Tyson hosts this time around, but the plot of the 13-part series remains largely out of this world, exploring the farthest reaches of the knowable universe and the origins of life on Earth. Spoiler alert for anyone who missed Sagan’s groundbreaking Cosmos: A Personal Voyage, the instructive 1977 short film Powers of Ten, or every science class ever: The universe is old. Mind-blowingly old and vast. To explore it is to better understand ourselves.

That’s why watching Cosmos: A Spacetime Odyssey is a matter of necessity, and what distinguishes it from what’s usually on TV—Hell’s Kitchen or Kitchen Nightmares or Hotel Hell or My Cat From Hell or whatever. Whereas the vast nebulae of reality shows entice viewers with the notion that ordinary people (and cats) can access instant fame, Cosmos shows reality as it applies to everyone, famous or not.

Matt Zoller Seitz sees plenty to admire about the new version:

The show deploys blockbuster-quality visual effects, triumphant orchestral music by Alan Silvestri (who scored Robert Zemeckis’s Contact, which is based on Sagan’s novel), and cartoon interludes rendered in the handmade style of a graphic novel to make the Big Bang, the rise of Copernican astronomy, and the basic principles of evolution seem as Hollywood-magnificent as the latest Marvel opus. But its greatest special effect is the laid-back charisma of its host, Neil deGrasse Tyson. Affable, plainspoken, and charming without seeming full of himself, Tyson might have seemed like Sagan’s obvious successor even if the master hadn’t mentored him as a teen (a tale movingly recounted in the first episode’s final moments). He lacks Sagan’s Vulcan-like vibe but compensates with a self-deprecating average-guyness that often morphs, delightfully, into Danny Ocean cool. (It’s hard to imagine Sagan getting away with donning shades to watch the Big Bang, as Tyson does here.)

Willa Paskin describes a segment about the 16th-century friar Giordano Bruno, who was burned at the stake for holding heretical scientific beliefs, and pushes back on Seitz’s suggestion that the new Cosmos is anti-faith:

Writing in New York magazine, Matt Zoller Seitz interpreted this segment of Cosmos as “painting organized religion as an irrelevant and intellectually discredited means of understanding factual reality” and as part of the show’s larger “pushback against faith’s encroachments on the intellectual terrain of science.” (This is particularly in contrast to the sort of echt-spirituality and new-ageism that hovered around the original Cosmos. Sagan himself was agnostic.) Organized religion certainly comes in for it, but I think this segment is up to something more gentle than declaring war on blinkered anti-science evangelists. Cosmos is offering viewers a way to reconcile science and faith: Don’t let your god be too small.

In an interview, Tyson discusses how to think about the new show:

If you only think of “Cosmos” as a science documentary, then the natural obvious question would be, “Well, it’s been 35 years. What has changed?” However, “Cosmos” wasn’t only that, and it wasn’t even mostly that. “Cosmos” is mostly “Why does science matter to you? Why should you care about science? Why should society care about what scientists say? How can you empower your own destiny by becoming scientifically literate?” …

So yeah, since then, we’ve discovered nearly 1,000 planets orbiting other stars, and Europa, one of Jupiter’s moons, has an ocean underneath it. There are other experiments that have gotten us closer to understanding the Big Bang. And, of course, the politics are different. Back then, we were steeped in a Cold War and using weapons that were imagined by scientists and used to hold the world hostage.

So, the climate was different than it is now, but there are other prevailing concerns that we have: What is our effect on the environment? Will we be good shepherds of this Earth as we go forward? Do we know enough to be good shepherds of this Earth? Do we understand the risk of asteroids that could render us extinct? These are broad questions, and “Cosmos” takes some element of science and shows you why it is way more relevant to your life than you ever previously imagined.

Another interview with Tyson is here. Previous Dish on the new Cosmos here.

If Plato Were Still Around

Rebecca Newberger Goldstein’s Plato at the Googleplex recreates the philosopher’s dialogues for contemporary times. David Auerbach reviews the book:

“Plato at the 92nd Street Y” pits him against the Chua-ish “Warrior Mother” Sophie Zee, discussing Republic’s hypothetical “city of pigs” and testing out in the Myers-Briggs typology as an INTJ (just like yours truly); “Plato on Cable News” has him exchanging blows with a bloviating Bill O’Reilly clone named Roy McCoy. And yes, here is Plato at the Googleplex, debating an engineer over the possibility of crowdsourcing ethics, as well as wryly comparing its communal environment to the training of young philosopher-kings in the Republic. (I used to work for Google, and believe me, we had it way better than Plato’s ascetic Guardians.)

By alternating between these new “Platonic dialogues” and a serious chronicle of Plato’s life and philosophy, Goldstein makes a plea for the continuing importance of philosophy as Plato (427–347 B.C.) conceived it, and for the enduring relevance of Plato’s contributions.

Elizabeth Toohey weighs in:

[T]here’s something moving, if also faintly depressing, about reading Plato inserted into 21st-century Western culture.

It brings to light the gap between what might have been and what is – and what we appear to be moving toward. Consider, for instance, Plato’s musing, “I do not think that rulers should be able to own substantive private property, for substantive property will immediately make them citizens of the city of the rich, with its own special interests to protect.”  It doesn’t take much imagination to deduce what Plato would have to say about PACs.

Nick Romeo interviewed Goldstein:

[Q] How do you think Plato would respond to the cultural dominance of television and cinema?

[A] He’d be very alarmed. You can’t help but think immediately of his allegory of the cave in The Republic: The lowest form of consciousness is that of prisoners in the dark staring at images. He would be quite in despair. He would think that we were enchanted by the lowest form of thinking. To think is to be active; passivity is the death of the mind.

But he wouldn’t oppose all of contemporary culture. Just this past week I started tweeting as Plato; I guess I’m not ready to stop impersonating him. And this kind of thing, the dialogue that happens on blogs and social media, I think he would be into it. I think he’d be intrigued by that aspect of popular culture: the verbal and the written exchanges.

The fMRI Of The Beholder?


Roger Scruton frowns on “scientism,” which he describes as involving “the use of scientific forms and categories in order to give the appearance of science to unscientific ways of thinking.” The arts and humanities, he insists, lie beyond the reach of the empirical:

Art critics have a discipline, and it is one that involves reasoning and judgment. It is not a science, and what it describes forms no part of the physical world, which does not contain Olympia or anything else you see in Manet’s painting. Yet someone who thought that art criticism is therefore deficient and ought to be replaced by the study of pigments would surely be missing the point. There are forms of human understanding that can be neither reduced to science nor enhanced by it.

Here is where the neurothugs step in, to declare that, of course, the science of pixels won’t explain pictures, since pictures are in the eye of the beholder. But there is also such a thing as the fMRI of the beholder, and this does contain the secret of the image in the frame. Since understanding a picture is a matter of seeing it in a certain way — in such a way as to grasp its visual aspect, and the meaning which that aspect has for beings like us — then we should be examining the neural pathways involved in seeing aspects, and the connections that link those pathways to judgments of meaning.

But what, exactly, would such a study show?

Suppose we have achieved a perfect decipherment of the pathways involved in seeing an aspect and in stabilizing it in the mind of the observer. This is not a judgment of criticism, and while it might enable us to predict that the normal observer will, on confronting Titian’s picture, see a naked woman lying on a couch and looking at him, it will say nothing in answer to the critic who says: Yes, but that is not all that there is, and indeed you must see that this woman is not naked at all, but rather unclothed, that her body, as Anne Hollander shows so convincingly in Seeing Through Clothes, has the texture and the movement of the clothes she has removed, and that those eyes do not look at you but look through you, dreaming of someone you are not.

Critics don’t tell us how we do, with normal equipment, see things, but how we ought to see them, and their account of the meaning of a picture is also a recommendation, which we obey by making a free choice of our own. Neuroscience, then, remains only a science: it cannot rise to the level of intentional understanding, where meaning is created through our own voluntary acts. Hence we should not be surprised at the dreariness of neuroaesthetics, and its inability to cast light on the nature or meaning of works of art.

(Image of Olympia by Manet, 1863, via Wikimedia Commons)

Sister Lives

Casey N. Cep reviews Abbie Reese’s book Dedicated to God: An Oral History of Cloistered Nuns, which documents the lives of the Poor Clare Colettine nuns of Rockford, Illinois:

What does it mean to be called to the religious life? Even the most articulate of these women cannot find the precise words to explain how she came to understand her vocation. The youngest nun says, “I’m sure anyone who falls in love, they look back and say, ‘Oh, remember how we met? Or he showed his love?’ It’s the same, how God has shown his personal love.”

But how does one fall in love? These women are no more capable of explaining their love of the holy than we are of understanding the reasons two human beings are attracted to each other, and yet they try. One sister compares it to God “playing hide-and-seek,” drawing her to the religious life, but leaving her unsure of where to go. Like any love, there is struggle, not only with which of the various religious orders to join but how to live once there; it is not desperation which brings these women to the cloister but desire.

Nic Grosso reviewed the book in January:

While I had hoped to find greater insight into this order of cloistered nuns’ monastic practices, ceremonies, and sources of personal inspiration as they have so much to overcome …, Reese does do an excellent job of presenting the nuns as individuals. They are not fetishized or turned into fringe caricatures with clichéd beliefs. Even when she has a chance to poke a hole in their convictions with contradicting opinions held by fellow nuns, she does not dispel their faith. Instead she withholds judgment, allowing room for the flexibility of their personal beliefs. Each nun gets the chance to express herself as she continues to explore and understand herself in her journey inwards and towards God. “Several nuns volunteered, in the course of the oral history interviews, that outsiders label their life as a form of escapism. They took pains to point out that religious life is not a rejection of the world or its inhabitants; the enclosure is, in a sense, a form of embracing humanity, a calling to, not a running from.”

The Joy Of Lent

Will Willimon riffs on a conversation he once had with a woman who told him, “I’m so glad next week is Ash Wednesday”:

Glad for Ash Wednesday? I pressed for more. She responded, “You don’t know me that well, but I was the victim of sexual abuse by a relative when I was a young teenager. Spent years in therapy trying to get over it. Pop-spirituality and feel-good religion were just no help to me. That’s why I’m glad that we are coming to that time of the year when the church makes us put all the injustice, sin, blood and guilt on the altar and forces us to look at it and let God deal with it.”

Rejoice. It’s Lent. This is when the poor, old, bumbling church courageously reminds us of the joy of letting go of our illusions about ourselves. We offer our lives not to a God with high standards of conduct, but to a God who loves us as we are and forgives the worst in us.

My favorite theologian, Karl Barth, said that “only Christians sin.” He meant that only Christians know the joy of a God who forgives and thus can be frank about their sin. There is a sense in which awareness of God’s grace comes before, and not after, true and honest repentance. The person who doesn’t know a gracious God can never be truly honest about sin.

On Ash Wednesday this past week, Nadia Bolz-Weber argued that people who find the first day of Lent depressing “totally don’t get it”:

[I]t’s a refreshing thing we and Christians all over the world do [on Ash Wednesday]. We gather to remind each other of the truth. To remind each other of our mortality.  We tell each other the inescapable truth that we are dust and to dust we shall return.  It’s downright audacious that amidst our societal anxiety about impermanence we just blurt out the truth as if it’s not offensive.  But the thing about blurting out this kind of truth about ourselves … is that after you do it … you can finally exhale.  It’s like the moment when you stop having to spiritually hold your stomach in.

Calling himself “a bastard” and “complete shit,” Giles Fraser admits he takes comfort in reckoning with himself during Lent:

[T]he language of sin and death – both, in Christian theology, the gift of Adam and thus a constituent part of the human condition – are, I think, much more compassionate ways of looking at human beings than the alternative doctrines of continual self-improvement.

This is counter-intuitive, I know. To use the language of sin sounds all terribly judgmental. But as the wonderful novelist Marilynne Robinson puts it in The Death of Adam: Essays on Modern Thought: “The belief that we are all sinners gives us excellent grounds for forgiveness and self-forgiveness, and is kindlier than any expectation that we might be saints, even while it affirms that standards all of us fail to attain.”

Meanwhile, Alice Robb recommends that observers resist broadcasting their abstention intentions:

What are the effects of sharing your goals on Twitter? It’s often assumed that the social pressure of announcing your intentions will compel you to follow through, but recent research suggests that it might actually backfire—and not just by irritating your friends and followers.

For a 2010 paper in the journal Psychological Science, a team of psychologists led by New York University’s Peter Gollwitzer looked at how students’ behavior changed when they shared their goals with the psychologists or with their peers. For the first experiment, Gollwitzer and his team recruited 49 students training to be psychologists. They were told they were participating in a study on the motivation of first-year psychology students, and were asked to write down two goals relating to their coursework. Some expressed an intention to take reading assignments more seriously, for instance, or to get to grips with statistics. For half the students—those assigned to the “social reality” condition—the psychologist conducting the experiment read the students’ intentions back to them. Members of the “no-social reality” condition, on the other hand, were told that the page on which they recorded their intentions was included by mistake and would be thrown away. One week later, the students were brought back to the lab and asked to list the days on which they’d acted in accordance with their stated goals. On average, the “no-social reality” group kept their resolutions on more days than the “social reality” group.