Or Just Write

by Graeme Wood

The Guardian's rules for writing fiction, a compendium of advice from writers, include platitudes (Andrew Motion: "Think with your senses as well as your brain"), morsels of wit (Elmore Leonard: "If it sounds like writing, I rewrite it"), and sound practical advice (Zadie Smith: "Work on a computer that is disconnected from the internet").  Most of all, the rules sound haughty and dismissive, which is about what you should expect when you ask skilled craftsmen to reduce their craft to a few simple rules.

The great Atlantic correspondent Jeffrey Tayler, a writer of nonfiction, recently meditated on the question of how one becomes a writer. He settled on a much more onerous approach:

The question for me was not, then, how does one read to write, but how does one read to live? I conceived early on the conviction that one should lead one’s life as if one were the protagonist of an epic novel, with the outcome predetermined and chapter after chapter of edifying, traumatic and exhilarating events to be suffered through. Since the end is known in advance, one must try to experience as much as possible in the brief time allotted.

The protagonist of “The Death of Ivan Il’ich” died moaning, in agony, overcome with the realization that he had wasted his days on earth following social conventions. He lacked l’esprit frondeur, and he paid for it. Conventions now are hardly less pervasive than they were in Tolstoy’s day; we’re pressured to start a career, build our résumé, earn a certain amount of money, and so forth. But remember: None of us gets out of here alive. So don’t fear risks. Rebel. Be bold, try hard, and embrace adversity; let both success and failure provide you with unique material for your writing, let them give you a life different enough to be worth writing about. 

A Little Watergate Blogging

by Jonathan Bernstein

I posted earlier about the Reagan-era David Broder column that Ezra Klein used this morning to make good points about Barack Obama.  Broder said:

One measure of that transition was last week's Gallup Poll showing Reagan trailing two leading Democrats in trial heats for the 1984 election. Former vice president Walter Mondale had a 52-40 percent lead, and Sen. John Glenn of Ohio had a 54-39 advantage.

Such leads for opposition candidates are extremely rare at this stage of the cycle when all presidents, including Reagan, enjoy an aura of authority.

Well, I don't know how rare it is, which brings up a worthwhile historical point.  One of the things that people find puzzling about Watergate is: why did Nixon do it?  After all, Richard Nixon defeated George McGovern in one of the largest landslides in history; surely there was no reason for him to take the huge risks entailed in criminal activity targeting the woeful Democrats.

As you've no doubt guessed by now, the answer is that Nixon's reelection was hardly a sure bet a little earlier in the cycle.  The 1970 midterms went badly for Nixon, leading to plenty of speculation that he would be a one-term president. Here's a fun column from May 1971, referring to an eight-point lead that Ed Muskie held over Nixon at that point.  And even as late as mid-January 1972, Muskie was still tied with Nixon.  That's when Muskie looked like the odds-on favorite to win the Democratic nomination.  When Muskie faltered (don't forget – it was at least in part because the Nixon campaign undermined him), Nixon and his operatives didn't really believe that they could be so lucky as to face McGovern, at least not until very late in the process.  They expected to face either Hubert Humphrey, who of course had lost a very narrow race in 1968, or Ted Kennedy — until late in the game, Nixon didn't understand the brand-new, reformed Democratic presidential nomination process, and believed that McGovern might still be a stalking horse for Kennedy (see Fred Emery's great history, Watergate).  McGovern only looked like the clear nominee after winning the California primary on June 6 (and even then, Humphrey fought on to a test vote at the convention, so it wasn't a certainty even after California).  The first attempted break-in at the Watergate offices of the DNC had been on May 26.

So: the operations that became Watergate began when Nixon's re-election prospects looked dicey indeed; the early operations may have played a part in getting the Democrats to choose their weakest candidate; and the full operation was set into place before the political landscape of November 1972 was really clear.  There was, too, some bureaucratic inertia.  Once the switch was turned on, people were hired, and budgets were approved (although no one ever did take credit for approving the budget), it moved along with a momentum of its own.  But I'm not sure that's necessary to explain the quest for information about the Democrats: while it was happening, Nixon was not yet sure he would face and destroy McGovern.

Yes, Nixon probably should have seen that whatever it was he wanted from the Democratic National Committee and from other sources wasn't worth the risk.  But that's because it was risky to do illegal things, not because he had the election sewn up early.

Goldberg On New Media

by Andrew Sullivan

Jeffrey Goldberg, for one, hails his corporate overlords. But Jeffrey Goldberg doesn't know that much about new media. I know that sounds odd, given that he is a blogger at what was once one of the savvier new media websites, but there you have it. One of the many reasons I don't engage his blog more frequently on matters relating to new media is that he's not very knowledgeable about the dynamics of blogging as a form, or about the radical new democracy of online personal voices which renders institutional authority and corporate branding so exhausted and old-school. This might be because these issues don't interest him. But his endorsement of this almighty mess could not be put more eloquently than this:

(Sorry, I can't seem to load his page right now, seriously — I'll put in the link as soon as I can).

Linking on the web. Who needs it?

After an avalanche of reader email, I've asked the suits for an end to the font mess, restoration of the latest posts of the other bloggers on the right (you don't read Goldberg because he's so goddamn sexy, after all), a fix for the 'continue reading' function so that you don't have to find out where you are when you click through, a restitution of the Dish search function, a return of a functional sitemeter – a transparency feature of the Dish since almost its inception – and a redesign of the header so it doesn't seem so fused with the blog.

Some vacation. See you next week unless the redesign strikes again.

Political Stage Parents

by Chris Bodenner

Michelle Cottle sees a new wrinkle in the ever-blurring line between politics and celebrity:

Accompanying this trend, I suspect, will be the fading of the perceived downsides of political kid-dom. Once upon a time, pretty much all pols agonized about the lack of privacy their families would suffer because of their profession. Today, we have Scott Brown touting his daughters’ extreme dateability during his victory speech and calling for Ayla to once more shake her groove thing for Simon Cowell. And why not? If our reality TV/YouTube culture has taught us anything, it is not to fear exposure. …

Projecting out a few years, how long before politicians have to stop citing their kids as the reason they won’t pursue higher office—and we instead see kids lobbying their parents to run as a way to boost the kids’ public image? After all, without all that grotesque personal invasion that Sarah Palin whined about in ’08, Bristol would be just another [teen-mom] statistic …

Ayla Brown clearly has talent, and I hope she has a great career, but I can’t watch her music video without thinking of this classic from David Brent:

“Playing Chicken With Unemployment Benefits”

by Patrick Appel

McArdle whacks Jim Bunning for blocking an unemployment benefits extension:

Even if you think the government needs a plan to get its house in order, why on earth is Bunning making a stand on this issue?  It's political poison–even the Republican base knows people who are out of work.  It's terrible economic policy–suddenly cutting off the taps would have nasty knock-on effects on the economy.  And while it's a lot of money, it's one of the few government programs that pretty much unequivocally improve the net welfare of the American people.  If Bunning wants to hold up something, how about finding some useless defense appropriations to complain about?

The Reagan/O’Neill Era

by Jonathan Bernstein

Ezra Klein starts the week off with a great find — a David Broder column following the 1982 midterm elections in which Broder wrote Reagan's premature obituary:

What we are witnessing this January is not the midpoint in the Reagan presidency, but its phase-out. "Reaganism," it is becoming increasingly clear, was a one-year phenomenon, lasting from his nomination in the summer of 1980 to the passage of his first budget and tax bills in the summer of 1981. What has been occurring ever since is an accelerating retreat from Reaganism, a process in which he is more spectator than leader.

Klein's points about how this applies to Obama are all well-taken.  However, on Reagan, he says: "The column…holds up fairly well in its criticism of Reagan's leadership style, if not in its prediction of Reagan's influence on the country's politics." 

I disagree: I think Broder was absolutely correct.  In domestic policy, Reagan was pretty much finished with attempts at radical change after 1981.  And in foreign policy, Reagan never really departed in practice from the same Cold War realism and containment that Truman and all his successors shared.

The new status quo wasn't as much "Reaganism" as it was stalemate — what some think of as a Reagan era stretching out from 1980 is, in my view, more properly thought of as the era of both Ronald Reagan and Tip O'Neill, with neither liberals nor conservatives often able to control enough of the levers of government to actually implement much of what they wanted, but both able to block the initiatives of the other.   Seen in that way, the two paradigmatic initiatives after 1981 were the Clinton health care plan and the Bush social security plan, both of which failed to even come close to reaching the floor of either chamber of Congress despite unified party government.  And the key confrontation was the government shutdown of the winter of 1995-1996, which taught a new generation of Republicans what Reagan had learned in 1982: that the American people might say that they want a smaller government, but that actual cuts in programs are almost always unpopular.  But neither, after 1980, could liberals expand government significantly.  Stalemate.

Now, any time one starts characterizing large chunks of history one invites trouble.  Not everything from 1982 through 2008 was stalemate (just to pick one: the Americans with Disabilities Act was a major, and I think universally thought to be successful, piece of legislation). Conservatives had their triumphs, too, most notably by dominating new selections to the courts.  But the biggest conservative triumphs were about stopping liberals from getting what they wanted, not in successfully enacting the conservative platform (which helps to explain why conservatives still tended to express themselves, even during the best days of the Bush presidency, as if they were an oppressed minority, something that liberals found quite puzzling). 

Of course, what everyone is interested in now is whether that era, however one describes it, has ended.  But that's another story, for another post.

ElBaradei’s Waiting Game

by Graeme Wood

Last week Andrew noted the triumphant return to Egypt of ex-IAEA chief Mohamed ElBaradei, widely believed to be the most promising political alternative to a continuation of the Mubarak clan's three decades in power.  Presidential elections are next year, and ElBaradei has already attracted a large youth following. 

Now in his Cairo villa, ElBaradei rarely talks to journalists, spending his time watching his Facebook group grow and playing a sly political game:

"I would only run if people coalesce around me," said ElBaradei, who has spent years based abroad.

Speaking at his home on Cairo's outskirts, he said he could keep up momentum for his call for change "from wherever I am".

[…]

Asked about his view on a possible bid by Gamal Mubarak, ElBaradei said: "I believe the current system is not free and fair, and, whoever would run under the current system, I would make it very clear that this is not democracy as it should be."

[…]

Asked if his supporters should take to the streets, ElBaradei said: "People are talking about all sorts of things and they might go to civil disobedience if there is no change."

The Redesign, Ctd

by Patrick Appel

A reader writes:

I was a Content Manager at America Online back in the mid to late 1990s. Although I had a variety of roles, at the end of my tenure my primary job was to manage one of AOL's 18 content channels. This was the period just *before* AOL tore down the walls separating it from the rest of the Internet, and *before* the much maligned merger with Time Warner. It was very much AOL's heyday.

Our model within these channels? To aggregate material by subject matter into a series of always-updating, headline-driven content areas.

The goal? Through a series of redesigns and reiterations, to make the AOL channels – rather than the partners who provided the content within each channel – the primary point of loyalty for our members.

The result? A mishmash of genericized content that diluted the very thing that had made us so successful – the uniquely identifiable voices that, along with basic features such as email and chat, had brought people flocking to the service in the first place.

As just one example, ask the guys at the Motley Fool, one of the commercial Internet's first true success stories, how it all worked out for them.

I'm sorry to have to say this, but Goldberg's description of the new site ("a thorough reimagining of what a magazine's website could be") could not possibly be more wrong.

What they've done to you, TNC, and the rest isn't new at all. It's AOL circa 1998. I realize that's the Internet's Stone Age, a time no doubt well beyond the memory of most of the people who put this design together, but… that should underscore the point, right?

You guys are repeating one of the mistakes that I will always believe killed AOL. I have no reason to think anyone there will take my advice – the Senior VPs at AOL ignored me when I fought against this very same model, and they were paying me for my opinion! – but here it is:

Know your strengths. They are your Voices. Don't bury them. Don't integrate them under brand names and channels. Make them louder. And clearer. You should be working to bring them front and center. Instead you are pushing them to the back, putting more distance between them and your readers. That is, in a word, insane.

People don't want a series of headlines. They can get that elsewhere. They want personality. They want community.  They want names and faces they can identify and bond with.

The age of nameless, faceless "editors" is over. It has been over for quite some time, even if many don't yet realize it. People accepted it when the market provided them with no alternative, but as you both well know the moment alternatives became possible they flocked to them in droves.

And most importantly: for the love of all that is holy, please stop trying to "re-imagine" the magazine. That's an entirely backward looking enterprise. Be the entirely new thing that you ALREADY are. Or, I'm beginning to fear, were.

From the very beginning, the Atlantic was about the voices it contained, and not about the package that they came bound in. Somehow, in the move to the web, this magazine kept that tradition alive. Unlike most of its competitors it found success by combining what it had always been – strong voices in long-form articles provided at a more thoughtful pace – with the Internet's greatest innovation – strong voices in short-form updates provided in real-time to a community of not-always likeminded souls. I always assumed that this near perfect mix of 20th and 21st century publishing models was the result of some very forward thinking management. With the last redesign I began to doubt that. With this newest one, I'm on the verge of concluding it was all just dumb luck.

Maybe I'm wrong. Maybe this model will work this time. Maybe the way forward is to borrow a failed model from the past. But, well….