The Misery Of Miscarriage

Ariel Levy shares her experience:

On five or six occasions, I ran into mothers who had heard what had happened, and they took one look at me and burst into tears. (Once, this happened with a man.) Within a week, the apartment we were supposed to move into with the baby fell through. Within three, my marriage had shattered. I started lactating. I continued bleeding. I cried ferociously and without warning—in bed, in the middle of meetings, sitting on the subway. It seemed to me that grief was leaking out of me from every orifice.

I could not keep the story of what had happened in Mongolia inside my mouth. I went to buy clothes that would fit my big body but that didn’t have bands of stretchy maternity elastic to accommodate a baby who wasn’t there. I heard myself tell a horrified saleswoman, “I don’t know what size I am, because I just had a baby. He died, but the good news is, now I’m fat.” Well-meaning women would tell me, “I had a miscarriage, too,” and I would reply, with unnerving intensity, “He was alive.” I had given birth, however briefly, to another human being, and it seemed crucial that people understand this. Often, after I told them, I tried to get them to look at the picture of the baby on my phone.

A New Normal For The Novel

Tim Parks is tired of the traditional structure of the novel, calling it “a way of seeing that is bound to produce states of profound disappointment for those who subscribe to it”:

The variety of stories told in the novel is indeed remarkable, but the tendency to reinforce in the reader the habit of projecting his or her life as a meaningful story, a narrative that will very likely become a trap, leading to inevitable disappointment followed by the much-prized (and I suspect overrated) wisdom of maturity, is nigh on universal. Likewise, and intrinsic to this approach, is the invitation to shift our attention away from the moment, away from any real savoring of present experience, toward the past that brought us to this point and the future that will likely result. The present is allowed to have significance only in so far as it constitutes a position in a story line. Intellect, analysis, and calculation are privileged over sense and immediate perception; the whole mind is pushed toward the unceasing construction of meaning, of narrative intelligibility, of underlying structure, without which life is assumed to be unimaginable or unbearable.

Sam Sacks strikes back:

Parks isn’t talking only about mediocre novels when he invokes the tyranny of tradition. By his way of thinking, anyone who uses elements of conventional forms has done so out of either unthinking habit or unwilling necessity. But this is untrue.

For many, if not most, writers, things like plot, character development, and catharsis are not narrative fallbacks but dynamic tools that give shape to the stories they’re passionate to tell or develop ideas that are uppermost on their minds. The art of storytelling is ancient, but it is a flighty kind of world view that automatically equates oldness with staleness. Missing from Parks’s essay is the recognition that talent transmutes tradition. Gifted writers can make accustomed methods feel as new and vital as a work explicitly devoted to structural innovation. In both cases, the object is the same: form is used in the service of artistic vision.

Andrew Gallix points out that “fiction fatigue” has been around since the realist movement of Balzac and Flaubert:

[T]he realist novel attempted to dissolve whatever smacked of literariness. As Alain Robbe-Grillet pointed out in his nouveau roman heyday, serious writers always “believe they are realists”, and “literary revolutions” are all made “in the name of realism”. Whenever a given mode of writing becomes “a vulgar recipe, an academic mannerism which its followers respect out of routine or laziness, without even questioning its necessity, then it is indeed a return to the real which constitutes the arraignment of the dead formulas and the search for new forms capable of continuing the effort”.

Robbe-Grillet accused the Balzacian novel of propagating an outdated, anthropocentric worldview. Its rounded characters were an expression of triumphant bourgeois individualism; its lifelike plots mirrored readers’ “ready-made idea of reality”. Such works were designed to convey the impression of a stable, “entirely decipherable universe”, and the novelist’s task was, precisely, to do the deciphering; to unearth “the hidden soul of things”. For his part, the nouveau romancier was convinced that the “discovery of reality” through literature would only continue if these “outworn forms” were jettisoned, along with “the old myths of ‘depth’” that supported them. In the new novel he called for, the presence of the world – “neither significant nor absurd” – prevails over any attempt to project meaning on to it. Reality is no longer a given, but a taken; something that each novel must create anew. As a result, the primacy of substance over style is reversed. Style is what “constitutes reality” in such a novel, which ultimately “expresses nothing but itself”.

Our Moral Brains

Philosopher Thomas Nagel reviews Joshua Greene’s new book Moral Tribes, in which he advocates adopting a single “metamorality” based not on divergent philosophies and moral codes but a common ethics discovered through psychology and neurology:

Greene wants to persuade us that moral psychology is more fundamental than moral philosophy. Most moral philosophies, he maintains, are misguided attempts to interpret our moral intuitions in particular cases as apprehensions of the truth about how we ought to live and what we ought to do, with the aim of discovering the underlying principles that determine that truth. In fact, Greene believes, all our intuitions are just manifestations of the operation of our dual-process brains, functioning either instinctively or more reflectively. He endorses one moral position, utilitarianism, not as the truth (he professes to be agnostic on whether there is such a thing as moral truth) but rather as a method of evaluation that we can all understand, and that holds out hope of providing a common currency of value less divisive than the morality of individual rights and communal obligations. “None of us is truly impartial, but everyone feels the pull of impartiality as a moral ideal.”

While we cannot get rid of our automatic settings, Greene says we should try to transcend them—and if we do, we cannot expect the universal principles that we adopt to “feel right.” Utilitarianism has counterintuitive consequences, but we arrive at it by recognizing that happiness matters to everyone, and that objectively no one matters more than anyone else, even though subjectively we are each especially important to ourselves. This is an example of what he calls “kicking away the ladder,” or forming moral values that are opposed to the evolutionary forces that originally gave rise to morality. Yet Greene cannot seem to make up his mind as to whether utilitarianism trumps individual rights in some more objective sense.

Last month, Robert Wright sparred with Greene in this Bloggingheads clip, elaborating on his suggestion that a problem with Greene’s argument is “overestimating the role played by divergent values.” Bob unpacked one of Greene’s examples:

The Israeli-Palestinian conflict is at its root a conflict between two peoples who think they’re entitled to the same piece of land.

When they argue about this, they don’t generally posit different ethical principles of land ownership. They posit different versions of history—different views on how many Arabs were living in Palestine before 1948, on who started the fighting that resulted in many Arabs leaving the area, on which side did which horrible things to the other side first, and so on. It’s not clear why these arguments over facts would change if both groups were godless utilitarians.

In fact, Greene’s own book suggests they wouldn’t. Notwithstanding its central argument, it includes lots of evidence that often the source of human conflict isn’t different moral systems but rather a kind of naturally unbalanced perspective. He cites a study in which Israelis and Arabs watched the same media coverage of the 1982 Beirut massacre and both groups concluded that the coverage was biased against their side. Any suspicion that this discrepancy was grounded in distinctive Jewish or Arab or Muslim values is deflated by another finding he cites, from the classic 1954 study in which Princeton and Dartmouth students, after watching a particularly rough Princeton-Dartmouth football game, reached sharply different conclusions about which side had played dirtier.

Was the problem here a yawning gap between the value systems prevailing at Princeton and Dartmouth in the 1950s? Maybe a mint-julep-versus-gin-and-tonic thing? No, the problem was that both groups consisted of human beings. As such, they suffered from a deep bias—a tendency to overestimate their team’s virtue, magnify their grievances, and do the reverse with their rivals. This bias seems to have been built into our species by natural selection—at least, that’s the consensus among evolutionary psychologists.

(Photo from Zach Klein)

Chart Of The Day

Lisa Wade points out that “the percent of teenagers that have had intercourse has been dropping consistently over the last 20 years”:

[D]espite the fact that young people are more likely than earlier generations to engage in oral sex before initiating penile-vaginal intercourse (especially fellatio), they continue to take intercourse very seriously. This may be, in part, because men are becoming more like women in this regard. Men’s numbers have dropped much more sharply. In addition, for the first time the CDC study found that boys’ #2 reason for not having engaged in intercourse was that they were waiting for the right person. Men cited this reason 29% of the time, compared to 19% for girls. For both boys and girls, the #1 reason is that it’s against their religion (41% of girls and 31% of boys). Concerns about pregnancy come in third.

It’s So Personal, Ctd

A reader writes:

You guys ran a series of personal abortion stories at one point. New York Magazine has done something similar with poignant stories coupled with stark photography. Some of the comments are even more poignant than the articles.

Here’s Dana from Colorado:

After Dr. Tiller was killed, I watched the man I didn’t know would become my doctor talking on the news, rubbed my belly, and wondered how anyone could possibly have a late-term abortion. A month later, I understood. During the 29-week ultrasound, the ventricles in the brain were enlarged. There aren’t adjectives to describe how I felt when we learned a few weeks later her neurological system wasn’t formed. It’s not that I didn’t want an imperfect child; even if we had all the interventions, she’d have seizures 70 percent of the time, never suck or breathe.

And Rachel from West Virginia:

I have schizoaffective disorder.

I’m fine on my meds, but I was scared I might hurt a child like my parents hurt me. When I started understanding my family’s history of mental illness, my husband and I said, “Okay, let’s stop the cycle of abuse and not have kids.” When I found out I was pregnant, I just started sobbing. The doctor slipped me some cards for clinics in different states. She couldn’t be pro-choice publicly—we live in a very religious area in West Virginia—but she knew that I couldn’t keep taking my meds during a pregnancy. …

“Wonderful” is a weird word to use, but inside the clinic was wonderful. There was a sensation of finally being able to breathe.

Face Of The Day

Just the two of us, Klaus Pichler, 2013

Lauren Davis highlights a compelling portrait series:

Klaus Pichler’s photo series Just the Two of Us explores the dual identity of the costume and the person who wears it by taking the subjects out of the conventions and carnivals where we typically encounter them and showing them in their homes. Pichler photographs not just cosplayers who dress up as characters from genre fiction, but also other costume enthusiasts: furries, medieval reenactors, people who participate in Krampus and Carnival traditions. The photographs are fascinating, not just for their frequent juxtapositions of the fantastical and the mundane, but also because they invite us to speculate on the lives of these masked men and women—including what prompts them to choose their particular mask.

Read the artist’s project statement here.

(Photo courtesy of Klaus Pichler/Anzenberger Agency)

Obamacare’s Grisly, Slow Start

The ACA enrollment numbers for October were released this afternoon:

A little more than 106,000 people selected insurance plans during the new health-care law’s first month of open enrollment, the Obama administration announced Wednesday. Approximately 27,000 of those sign-ups came from 36 states where the federal government is running the exchange, which has been beset with technical difficulties. The additional 79,000 came through the 15 marketplaces run by states and the District of Columbia.

Josh Marshall puts the Healthcare.gov numbers in context:

I believe roughly a third of the country’s population is in states with exchanges (14 states, including California and New York). Mainly the state sites have been working well, in tech terms. And that gives you a sense of the full impact of the botched launch of healthcare.gov. The functioning websites were able to sign up three times as many people from only one-third of the country’s population.
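
As a back-of-the-envelope check on Marshall’s point, take the October figures above (27,000 federal sign-ups, 79,000 state) and his rough one-third population split:

\[
\frac{79{,}000}{27{,}000} \approx 2.9, \qquad \frac{79{,}000 \div \tfrac{1}{3}}{27{,}000 \div \tfrac{2}{3}} \approx 5.9
\]

Adjusted for the share of the population each system serves, the state exchanges outperformed Healthcare.gov by closer to a factor of six than three.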

Robert Laszewski chastises the administration:

The audacity of this administration to continue telling people to keep going back to the website and the call center when they knew full well that only 25 people per day per state were making it through the gauntlet that is Healthcare.gov is startling.
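
Laszewski’s figure squares with the numbers above, assuming the 27,000 federal sign-ups were spread evenly across the 36 states over roughly 30 days:

\[
\frac{27{,}000}{36 \times 30} = 25 \ \text{sign-ups per state per day}
\]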

On the bright side, Chait notes that almost 400,000 people enrolled in Medicaid:

Why have so many more people enrolled in Medicaid than the exchanges? Because it’s a much easier decision. If you’re below a certain income threshold, and you don’t live in one of the states run by politicians with a sociopathic indifference to the basic human needs of their most vulnerable citizens, then you have one option for subsidized health care: Medicaid. Nobody loves being on Medicaid, but it’s better than nothing, so you enroll.

Sarah Kliff and Ezra Klein list nine facts about the numbers. Among them:

It’s not quite right to say 106,856 have actually bought a plan. The White House is also counting people who have placed a plan in their shopping cart but haven’t yet paid.

Another important detail:

Lots of shoppers on HealthCare.gov are turning to paper applications. Thirty-three percent of sign-ups in the federal exchange have been done offline. In states, that number stands at just 3 percent.

Jonathan Cohn thinks it’s too early to draw any firm conclusions about the law:

The October numbers are low, which was to be expected given the website problems and the tendency of people not to buy insurance right away. But what matters isn’t the figures for October or even November. It’s December and the months that follow—particularly into next year, as the prospect of paying fines for uninsurance starts to hit people in the face. “It’s too early to say anything useful,” says Jonathan Gruber, professor of economics at MIT. “And, really, I don’t think we can draw any significant conclusions about effectiveness of the law until March, because any firm conclusion requires the effects of the individual mandate to be felt.”

Tats For Technophiles?

A futuristic project from Google is in the works: electronic skin tattoos. Last week, Google’s Motorola Mobility applied for a patent on coupling such tattoos to mobile devices:

The patent, titled “Coupling an Electronic Skin Tattoo to a Mobile Communication Device,” explains that the device would sit on the user’s neck and serve as a supplemental phone microphone. With its close proximity to a user’s mouth, the tattoo would cut down calls’ background noise and produce clearer audio. Equipped with a transceiver, the device would allow for wireless communication with a paired mobile device; this means voice commands on your phone would get even easier, without the need to press a “talk” button before speaking.

The tattoo can also be used as a lie detector. The patent says the device will have a “galvanic skin response detector to detect skin resistance of a user” and explains that a user who is nervous or telling a lie might have a different skin-related response than someone who is confident and telling the truth. … The oddest part of the patent: The word “remove” doesn’t show up at all in the document.

Jim Edwards clarifies: “The tattoo isn’t permanent — it’s applied to a sticky substance on the skin.” Derek Mead surveys reaction to the idea:

At TechCrunch, Chris Velazco writes that “before you start freaking out at the mental visual of a tattoo artist weaving electronic components into your neck flesh, know that Motorola has a history of playing fast and loose with its interpretation of the word ‘tattoo.'” Alexis Madrigal called it Google’s “creepiest patent yet.” Steve Dent at Engadget huffs, “Okay, where to start with this one?”

We’re collectively at a strange point in our relationship with technology: Many of us have come to rely on gadgets so much that we’re simply not the same without them, and yet we also don’t admit we have such a techno-dependence. The integration of man and machine is a central tenet of futurism, and yet as folks like Kevin Warwick have argued, the tight integration of technology into our daily lives means we’re already there.

Jason Bittel is skeptical about the lie-detector component:

By detecting skin resistance (read: sweat), Google’s skin tattoo may be able to determine whether its wearer is “nervous or engaging in speaking falsehoods.” Since I can’t imagine why the Sam Hill someone would opt to wear such incriminating technology, I think we have to assume the tattoo would either be applied clandestinely or against the wearer’s will. Let’s not even get into the dubious quality of truth-telling based on “skin resistance” alone. … One thing’s for sure: this is unlikely to help Google’s ongoing issues with privacy and consumer trust.

(Image from Google’s patent application)