Political Stage Parents

by Chris Bodenner

Michelle Cottle sees a new wrinkle in the ever-blurring line between politics and celebrity:

Accompanying this trend, I suspect, will be the fading of the perceived downsides of political kid-dom. Once upon a time, pretty much all pols agonized about the lack of privacy their families would suffer because of their profession. Today, we have Scott Brown touting his daughters’ extreme dateability during his victory speech and calling for Ayla to once more shake her groove thing for Simon Cowell. And why not? If our reality TV/YouTube culture has taught us anything, it is not to fear exposure. …

Projecting out a few years, how long before politicians have to stop citing their kids as the reason they won’t pursue higher office—and we instead see kids lobbying their parents to run as a way to boost the kids’ public image? After all, without all that grotesque personal invasion that Sarah Palin whined about in ’08, Bristol would be just another [teen-mom] statistic …

Ayla Brown clearly has talent, and I hope she has a great career, but I can’t watch her music video without thinking of this classic from David Brent:

Playing Chicken With Unemployment Benefits

by Patrick Appel

McArdle whacks Jim Bunning for blocking an unemployment benefits extension:

Even if you think the government needs a plan to get its house in order, why on earth is Bunning making a stand on this issue?  It's political poison–even the Republican base knows people who are out of work.  It's terrible economic policy–suddenly cutting off the taps would have nasty knock-on effects on the economy.  And while it's a lot of money, it's one of the few government programs that pretty much unequivocally improve the net welfare of the American people.  If Bunning wants to hold up something, how about finding some useless defense appropriations to complain about?

The Reagan/O’Neill Era

by Jonathan Bernstein

Ezra Klein starts the week off with a great find — a David Broder column following the 1982 midterm elections in which Broder wrote Reagan's premature obituary:

What we are witnessing this January is not the midpoint in the Reagan presidency, but its phase-out. "Reaganism," it is becoming increasingly clear, was a one-year phenomenon, lasting from his nomination in the summer of 1980 to the passage of his first budget and tax bills in the summer of 1981. What has been occurring ever since is an accelerating retreat from Reaganism, a process in which he is more spectator than leader.

Klein's points about how this applies to Obama are all well-taken.  However, on Reagan, he says: "The column…holds up fairly well in its criticism of Reagan's leadership style, if not in its prediction of Reagan's influence on the country's politics." 

I disagree: I think Broder was absolutely correct.  In domestic policy, Reagan was pretty much finished with attempts at radical change after 1981.  And in foreign policy, Reagan never really departed in practice from the same Cold War realism and containment that Truman and all his successors shared.

The new status quo wasn't as much "Reaganism" as it was stalemate — what some think of as a Reagan era stretching out from 1980 is, in my view, more properly thought of as the era of both Ronald Reagan and Tip O'Neill, with neither liberals nor conservatives often able to control enough of the levers of government to actually implement much of what they want, but both able to block the initiatives of the other.   Seen in that way, the two paradigmatic initiatives after 1981 were the Clinton health care plan and the Bush social security plan, both of which failed to even come close to reaching the floor of either chamber of Congress despite unified party government.  And the key confrontation was the government shutdown of the winter of 1995-1996, which convinced a new generation of Republicans what Reagan learned in 1982: that the American people might say that they want a smaller government, but that actual cuts in programs are almost always unpopular.  But neither, after 1980, could liberals expand government significantly.  Stalemate.

Now, any time one starts characterizing large chunks of history one invites trouble.  Not everything from 1982 through 2008 was stalemate (just to pick one: the Americans with Disabilities Act was a major, and I think universally thought to be successful, piece of legislation). Conservatives had their triumphs, too, most notably by dominating new selections to the courts.  But the biggest conservative triumphs were about stopping liberals from getting what they wanted, not in successfully enacting the conservative platform (which helps to explain why conservatives still tended to express themselves, even during the best days of the Bush presidency, as if they were an oppressed minority, something that liberals found quite puzzling). 

Of course, what everyone is interested in now is whether that era, however one describes it, has ended.  But that's another story, for another post.

ElBaradei’s Waiting Game

by Graeme Wood

Last week Andrew noted the triumphant return to Egypt of ex-IAEA chief Mohamed ElBaradei, widely believed to be the most promising political alternative to a continuation of the Mubarak clan's three decades in power.  Presidential elections are next year, and ElBaradei has already attracted a large youth following. 

Now in his Cairo villa, ElBaradei rarely talks to journalists, spending his time watching his Facebook group grow and playing a sly political game:

"I would only run if people coalesce around me," said ElBaradei, who has spent years based abroad.

Speaking at his home on Cairo's outskirts, he said he could keep up momentum for his call for change "from wherever I am".

[…]

Asked about his view on a possible bid by Gamal Mubarak, ElBaradei said: "I believe the current system is not free and fair, and, whoever would run under the current system, I would make it very clear that this is not democracy as it should be."

[…]

Asked if his supporters should take to the streets, ElBaradei said: "People are talking about all sorts of things and they might go to civil disobedience if there is no change."

The Redesign, Ctd

by Patrick Appel

A reader writes:

I was a Content Manager at America Online back in the mid to late 1990s. Although I had a variety of roles, at the end of my tenure my primary job was to manage one of AOL's 18 content channels. This was the period just *before* AOL tore down the walls separating it from the rest of the Internet, and *before* the much maligned merger with Time Warner. It was very much AOL's heyday.

Our model within these channels? To aggregate material by subject matter into a series of always-updating, headline-driven content areas.

The goal? Through a series of redesigns and reiterations, to make the AOL channels – rather than the partners who provided the content within each channel – the primary point of loyalty for our members.

The result? A mishmash of genericized content that diluted the very thing that had made us so successful – the uniquely identifiable voices that, along with basic features such as email and chat, had brought people flocking to the service in the first place.

As just one example, ask the guys at the Motley Fool, one of the commercial Internet's first true success stories, how it all worked out for them.

I'm sorry to have to say this, but Goldberg's description of the new site ("a thorough reimagining of what a magazine's website could be") could not possibly be more wrong.

What they've done to you, TNC, and the rest isn't new at all. It's AOL circa 1998. I realize that's the Internet's Stone Age, a time no doubt well beyond the memory of most of the people who put this design together, but… that should underscore the point, right?

You guys are repeating one of the mistakes that I will always believe killed AOL. I have no reason to think anyone there will take my advice – the Senior VPs at AOL ignored me when I fought against this very same model, and they were paying me for my opinion! – but here it is:

Know your strengths. They are your Voices. Don't bury them. Don't integrate them under brand names and channels. Make them louder. And clearer. You should be working to bring them front and center. Instead you are pushing them to the back, putting more distance between them and your readers. That is, in a word, insane.

People don't want a series of headlines. They can get that elsewhere. They want personality. They want community.  They want names and faces they can identify and bond with.

The age of nameless, faceless "editors" is over. It has been over for quite some time, even if many don't yet realize it. People accepted it when the market provided them with no alternative, but as you both well know the moment alternatives became possible they flocked to them in droves.

And most importantly: for the love of all that is holy, please stop trying to "re-imagine" the magazine. That's an entirely backward-looking enterprise. Be the entirely new thing that you ALREADY are. Or, I'm beginning to fear, were.

From the very beginning, the Atlantic was about the voices it contained, and not about the package that they came bound in. Somehow, in the move to the web, this magazine kept that tradition alive. Unlike most of its competitors it found success by combining what it had always been – strong voices in long-form articles provided at a more thoughtful pace – with the Internet's greatest innovation – strong voices in short-form updates provided in real-time to a community of not-always-like-minded souls. I always assumed that this near-perfect mix of 20th and 21st century publishing models was the result of some very forward-thinking management. With the last redesign I began to doubt that. With this newest one, I'm on the verge of concluding it was all just dumb luck.

Maybe I'm wrong. Maybe this model will work this time. Maybe the way forward is to borrow a failed model from the past. But, well….

How Can We Keep Religion Out Of Politics?

by Patrick Appel

Freddie responds to this Dish reader. A part of Freddie's rebuttal that I find convincing:

Let's talk tactics, shall we? This emailer with the terrible reading comprehension and I have as a first goal the same thing, which is keeping religious conviction out of politics, science and medicine. The history of the world teaches us that this is best accomplished not through atheism but through religious moderation. This is something many atheists must come to grips with if they are ever going to grow up: religious moderates do a far better job of opposing extremists than atheists do.

Look, aside from all of the "American theocracy" hysterics, this country does quite a good job of keeping the secular and the religious separate. There is much work to be done, but this is not Saudi Arabia; it is not Yemen. And why? Not because of atheism, but because of moderate religious people who have worked to divide theology from governance for centuries. When people express incredulity at the idea that people can be both practicing and religious and yet function in a secular society, I wonder what world they live in. Here on Planet Earth, in America, you interact with such people every day. They seem to have no trouble with it whatsoever.

Look to the Muslim world. Indonesia is the largest Muslim country in the world. It has a significant Muslim majority. And yet it also has significant Christian, Hindu and Buddhist minorities that live quite unmolested. Women wear pants, work in public, vote, hold office. Why? Not because some tide of atheism swept through Indonesia, but because of religious moderates embracing Enlightenment values and liberal democracy. I assure you, the large majority of these people are devout. They simply see no conflict between their religious devotion and their participation in civic life.

Another forceful point:

Which do you think is easier? To convince someone who has religious faith to totally abandon that identity? Or to convince them of the righteousness of dividing it from political life? Elementary human psychology teaches me that the more you attack the fundamental basis for someone's worldview, the more likely you are to earn violent pushback as a result.

The full post is here. Freddie writes that my reader's "strawmanning is so intense I'm shocked Patrick Appel posted it." I agree with Freddie that this reader badly misread his post, and I should have said as much at the time of posting. I certainly don't endorse this reader calling Freddie's thinking "childishly bad." Andrew Stuttaford, who largely concurred with Freddie, wrote that an "important qualification [to Freddie's post is]  that I do spend quite a bit of time pondering the implications of religious belief (to start with, there’s that whole rise of militant Islam business to think about)". Given that both my reader and Stuttaford responded in this fashion, I thought it prudent to air the argument in order to clarify the debate.

A Mechanical Mozart, Ctd

by Patrick Appel

A reader writes:

As a classical concert pianist, I was depressed to read your posting about computer-generated classical music “every bit as good as the originals.” For starters, it wouldn’t hurt to be suspicious of qualitative aesthetic statements posing as scientific assertions — Which originals? Good in what sense? According to whom?

I took a listen to EMI’s “Symphony in the Style of Mozart” and “Sonata in the Style of Beethoven.” Give me a break. Did you ask any professional musician friends to weigh in on this story before you linked to it? The Beethoven is a pale (and obvious) rip-off of the “Moonlight” Sonata, with the harmonies reconfigured. The Mozart is an inspiration-free classical style exercise, scales and harmonies in place, but absolutely none of the delicate synthesis of opposites and quirks that characterize Mozart. EMI (and Cope) are rearranging deck chairs on the ships of giants. It so happens that some aspects of the High Classical Style (1770-1820ish) lend themselves well to algorithm and mathematical understanding, especially in hindsight; but of course not all, and that last bit…well…it’s kind of important. To say that these are “as good as the originals” is silly, like saying an EZ Bake Oven is just as good as a Viking Range.

Unfortunately, we know stoves better than we know music, as a whole. It doesn’t prove anything that some audience members cannot tell the difference between EMI and Mozart: the type of thinking about Bach, Mozart, and Beethoven that is required to appreciate their real inspirations, their profound shocks, their revolutions in tones is something many audience members haven’t reached, in the same way (for instance) that many people–myself included–cannot fathom all the nuances of health care policy. They recognize a certain sound and stylishness, in the same way many people respond to superficial talking points.

The Blitstein article you link to is full of dubious and inaccurate music-writing. For instance, talking about the nature of the composing process:

"…the most likely explanation, Cope believes, is that music comes from other works composers have heard, which they slice and dice subconsciously and piece together in novel ways. How else could a style like classical music last over three or four centuries?"

Well, the first bit of this quote is simply Musicology 101. Really, composers take things they have heard and rework them in novel ways?!? This is not a revelation, or a belief; it is just an obvious fact of any kind of writing (music, text) or any creative process at all. And the second bit of this quote— “a style like classical music”—made me want to stick a fork in my brain, made it obvious that the writer is (to put it politely) not knowledgeable about what we call classical music (for want of a better term). What style are you talking about? The style of Mozart? Or the style of Tchaikovsky? Or Josquin? We’re talking about hundreds of years of humans notating music, the massive Western heritage of humans communicating through music; it is not remotely a single style, it is an incredible evolving kaleidoscope of style and thought, even though it all basically gets dumped in the same section in the modern record store (such as it is).

Let’s concede the point that the human notion of originality has some element of cult in it, and some portion of what we call the creative can be easily simulated by the modern computer. It is fascinating to see what elements of creativity computers can ape, or hasten. But, Daily Dish, don’t aid and abet journalists who want to dupe us with naive broad-brush statements; why not have a real musical expert talk about EMI or Emily Howell? I notice a conspicuous absence of fellow musicologists or musicians in the Blitstein article, people who have devoted their lives to studying Mozart, Beethoven, etc. How come he didn’t talk to (for instance) Charles Rosen or Joseph Kerman or Richard Taruskin? I think the answer is obvious. The writer is nervous that the “hook” of the article (The Computer Mozart!!!) would be revealed as the total BS that it is.