Time To Clean House

Ezra notes that, thanks to filibuster reform, Obama can now fire people:

[T]he constant use of the filibuster against political appointments made it extraordinarily difficult for the White House to fire anyone because they didn’t know whether they’d be able to appoint a replacement — or, if they could appoint a replacement, who Republicans would actually accept. And the more political controversy there was around an issue the more dangerous a personnel change became.

This became a standard excuse for why no one was losing their job over the HealthCare.gov debacle: Firing any of the appointees in charge would just trigger a disastrous confirmation process that would leave the agency rudderless and chaotic for months — and possibly for the rest of Obama’s term.

Simultaneously, the rules change makes it far easier to hire new people. The confirmation process had become so difficult — and, because of that, the vetting process so intense — that top prospects routinely turned down presidential entreaties.

Doctor Who?

The Guardian has a great interactive guide to all the doctors over the last fifty years, if you need a refresher before the imminent 50th Anniversary episode.

It would be hard for me to express how powerful this television show was in my fledgling imagination. I never missed an episode, and got really hooked in the Patrick Troughton years. Yes, that Doctor’s male assistant, Jamie, was one of my first crushes. I just wanted to be him in some inchoate way that would eventually evolve into sexual and emotional attraction. Maybe it was the kilt that did it. There are also episodes in the Pertwee years which I can still close my eyes and remember vividly – The Green Death was my favorite (with giant maggots threatening Surrey!). But Tom Baker and Sarah Jane Smith became the iconic Doctor and his assistant for me – and for lots of Who-nerds growing up with me at the same time. Baker in some ways created the character in its fullest, final incarnation: querulous, curious, funny, fearless and yet also all-powerful. (You can watch his debut above). Then as I went to college (no TV there) and to America, I lost touch, except for occasional winces at what I thought were gimmicky and tawdry attempts at revivals in the late 1980s and 1990s.

But it lived on in my dreams. Almost every anxiety dream I ever had in my teens was situated solidly in the Doctor’s world. I was never the Doctor – always his assistant – in these dreams. I knew my place and was just glad to be around him. The dreams invariably took the shape of a classic Doctor Who scenario – walking gingerly down strange, dark corridors waiting for some scary monster to jump out and grab me. Of particular note were the Cybermen, who for me exceeded the Daleks in their inhuman, soulless scariness. The explanation for this, I realized later, was that in the episodes in my youth, the Daleks couldn’t get up the stairs to my bedroom at night. They were on wheels! The Cybermen? They could knock down the front door and march up the stairs in an instant.

Even in my twenties, these dreams persisted. They were, I think, both about longing to exist in another kind of world; and also about seeing as the ultimate authority this quirky, non-violent, superior intelligence made Time Lord flesh in the Doctor. I realize now that this is a very English idea of inter-galactic power – all brain and very little brawn, and deeply moral, even when confronting the purest of evil.

The Doctor never kills, unless by accident; his enemies are always defined by their love of violence, their lack of a moral compass and what the English would regard as an absence of that Orwell virtue “decency”. You knew that aliens were evil because they kidnapped hostages, killed indiscriminately, and could even use torture at times. The Doctor always won by virtue of never sinking to their level – and laughing uproariously when they tried to intimidate him. The show is just as moralizing as Star Trek at times – but free from most sentimentality and filtered through irony. The Doctor was so often kidding.

I see the new and brilliant re-imagination of the TV show – with its huge global success – as my generation’s tribute to the spark of imagination that lit up our youth. I’m as old as Doctor Who – the show, not the Time Lord. I was born in the same year and grew up with the Doctors. The show was as familiar to me as weekly mass as a kid. Even now, there is something immensely comforting about watching it – recalling those old plots that occasionally resurface, or the feeling I had seeing the same monsters at 50 that I did at 5. A television show has this power in a way a single book never can. It unfolded before me as my own life did – and I lived a small part of it in thrall to those vistas of time and space that the Doctor invited me into, and where he taught me never to be afraid or humorless. Or cruel.

Terminally Uninsured

Rachel Pearson shares stories from working at St. Vincent’s, a free clinic in Texas:

There’s a popular myth that the uninsured—in Texas, that’s 25 percent of us—can always get medical care through emergency rooms. Ted Cruz has argued that it is “much cheaper to provide emergency care than it is to expand Medicaid,” and Rick Perry has claimed that Texans prefer the ER system. The myth is based on a 1986 federal law called the Emergency Medical Treatment and Labor Act (EMTALA), which states that hospitals with emergency rooms have to accept and stabilize patients who are in labor or who have an acute medical condition that threatens life or limb. That word “stabilize” is key: Hospital ERs don’t have to treat you. They just have to patch you up to the point where you’re not actively dying. Also, hospitals charge for ER care, and usually send patients to collections when they cannot pay.

My patient went to the ER, but didn’t get treatment. Although he was obviously sick, it wasn’t an emergency that threatened life or limb. He came back to St. Vincent’s, where I went through my routine: conversation, vital signs, physical exam. We laughed a lot, even though we both knew it was a bad situation.

One night, a friend called to say that my patient was in the hospital. He’d finally gotten so anemic that he couldn’t catch his breath, and the University of Texas Medical Branch (UTMB), where I am a student, took him in. My friend emailed me the results of his CT scans: There was cancer in his kidney, his liver and his lungs. It must have been spreading over the weeks that he’d been coming into St. Vincent’s. …

UTMB sent him to hospice, and he died at home a few months later. I read his obituary in the Galveston County Daily News.

The Fallout From The Nuclear Option: Reax

(Chart of filibuster use over time)

Yesterday, Ezra Klein applauded the Senate for reforming the filibuster:

With gun control dead, immigration reform on life support and bitter disagreement between the House and Senate proving the norm, it looked like the 113th Congress would be notably inconsequential. Today, it became notably consequential. It has changed how all congresses to come will work. Indeed, this might prove to be one of the most significant congresses in modern times. Today, the political system changed its rules to work more smoothly in an age of sharply polarized parties. If American politics is to avoid collapsing into complete dysfunction in the years to come, more changes like this one will likely be needed.

Fallows’s related thoughts:

Whichever party controls the government has to be able to govern. Our checks-and-balances system, crafted in the demographic and political realities of 18th-century, 13-state, slave/free, Eastern Seaboard America and in many ways showing its age, did not ever contemplate a permanent blocking minority in the Senate as one of the regular “checks.” You can look it up. Minority protection is an important part of our overall Constitutional balance, but not in proceedings of the Senate. Outside a named class of special circumstances — impeachment, treaties, veto-overrides, etc — the Senate, whose all-states-equal formula already over-represents regional minorities, was intended to run as a majority-rule operation.

Drum passes along the chart above:

As you can see, their use rose steadily through the 80s and then leveled off starting around 1990. Democrats mainly kept things pretty stable throughout the Bush administration, increasing the number of filibusters only in his last two years when a change of power seemed imminent. When Obama took over, however, there was no honeymoon, not even for a minute. Republicans went into full-bore filibuster mode the day he took office, and they’ve kept it up ever since. For all practical purposes, anything more controversial than renaming a post office has required 60 votes during the entire Obama presidency.

Binder considers what will change:

The most likely effect will be felt when the president’s party controls the Senate. Before today’s change, presidents (typically, although not always) chose nominees with an eye to whether the nominees could secure 60 votes for cloture. With only a majority required to bring the Senate to a confirmation vote, there will be little incentive for presidents to consult. I think the biggest potential effect will be visible with appointments to the federal bench. Delays in selecting nominees in theory should go down, to the extent that these delays have been caused by an administration’s inability to secure the advance consent of home state senators. Home state senators from the president’s party will no doubt still wield influence (their votes are still needed), but the rule change today shifts important leverage to the president in selecting nominees for lifetime appointments to the bench. That said, when the opposition party controls the Senate, presidents’ leeway will be trimmed: A Senate majority of the opposite party must still approve the president’s choices.

Dickerson notes other effects:

Whoever is ultimately at fault for the rule change—the Democrats who forced it or the Republicans who blocked the nominations requiring the new rules—the result is that the minority will have less power. That means elections will matter even more than they did before. Every Republican campaign now has more incentive to fight harder to win the six seats needed to take back the Senate in the 2014 election. Presidential elections now mean more, too. Reid’s move will secure Obama’s legacy because the new nominees to the appeals court will be in a position to protect his achievements. Moderate senators will hold more power. Democratic Sens. Mark Pryor and Joe Manchin voted against the rule change. In the future, in a closely divided Senate, they are the kind of senators who will be the key vote to give or deny the majority their nominee.

Hertzberg hopes that the Senate will now do away with the filibuster altogether:

That would make government, and government policymaking, more coherent—and, more important, more accountable. Action, not gridlock, would be the norm. Over time, I believe, the new normal would be to the advantage of those among us who want our government to do its part to solve the nation’s problems and advance the well-being of the nation’s people—not those who want our government to fail, or simply to disappear. And to the advantage of those who want our government to be truly democratic, with a small “d”—even if that sometimes means with a capital “R.”

Steinglass also wants to kill the filibuster:

Democrats are still in control of the Senate for the next 14 months, at least. To use those 14 months to best advantage, they should have gotten rid of the filibuster entirely yesterday. Had they gotten rid of the filibuster in 2009, Obamacare would be a less dangerously kludgey law; they might have gotten a cap-and-trade energy law to fight climate change, immigration reform, a jobs bill, a budget compromise instead of the sequester. The filibuster has done Democrats a world of harm over the past five years, and by the time they lose the Senate, they won’t have it anyway. Apart from its partisan disadvantage to Democrats, it’s a horrible institution with no moral or practical legitimacy, “an anti-democratic monster” as Emily Bazelon puts it. There’s no reason to let it die a slow death. Democrats should finish the job. They could start on Monday.

Koger doubts that the filibuster will be abolished completely in the short term:

Since the Republicans have a majority in the House, the Democrats do not gain a lot by eliminating the filibuster altogether. And the House illustrates the pitfalls of majority party rule: limited debate, heavily censored amendments, and agenda-setting power centralized in the majority party leadership.

Adler argues that filibustering a SCOTUS nominee is now out of the question:

According to Reid, the rules change only applies to executive branch and lower court judicial nominations, and does not apply to Supreme Court nominations. That idea is a joke, as Levin’s comments imply. While there are colorable arguments as to why the Senate might want to have different rules for executive branch and judicial nominations, the line drawn by Senator Reid will not stand. As several Republicans threatened before the vote, if and when there is a Republican majority in the Senate again, there will be no filibuster of a President’s Supreme Court nominees.

Philip Klein thinks the change will be good for Republicans:

Though the rule’s changes will allow Obama to install a lot of liberal judges and executive officials in the near term, in the long run, ending the judicial filibuster will benefit conservatives. The reason is that liberals are simply much better at demonizing conservative judicial appointees, even those with sterling credentials. In many cases, this has prompted Republican presidents to choose “stealth” nominees who end up taking an expansive view of the Constitution once they’re on the bench. Scrapping the 60-vote threshold will make it easier for a future conservative president to choose judges who believe that the Constitution granted limited powers to the federal government.

Kilgore has his eyes open:

I have to say I prefer bad government to dysfunctional government. Perhaps without the fallback measure of the filibuster, the shape of the Supreme Court and of constitutional protections can become an open instead of a submerged issue in Senate and presidential elections. And if the nuclear option is eventually extended to legislative filibusters, perhaps we’ll obtain more coherent policies, and more accountable government, regardless of who wins elections.

Daniel McCarthy worries about stability:

In theory you might now get more intensely partisan nominees and greater polarization of the federal bench, as well as cabinets that are even less aware of the need to build more-than-majority support for policies that affect everyone. American government has generally not been a thing of bare majorities, which tend to be unstable, and the political problems of Obamacare—quite apart from its policy and technological ones—are a product of a short-lived Democratic majority trying to make a great change without securing even grudging support from the other side.

The Economist looks at why this change happened now:

[B]ecause so many appointees have been blocked there are lots of positions to fill. The payoff from changing the rules now is therefore larger than it would normally be. … The success rate for nominees under the second George Bush was 91%. Under Barack Obama it has fallen to 76%. Those numbers probably give a flattering sense of how the process works because they do not reflect the amount of time nominees have had to wait before confirmation: a judge blocked for two years who eventually gets approved goes into the success column.

Emily Bazelon’s bottom line:

The fight for bipartisan normalcy has already been lost. The majority leader merely sounded the death knell. There will be lots of loud lamenting at the wake that follows. Don’t be fooled. If the Republicans were in the Democrats’ position, they’d have done the same thing months ago. Now Millett, Wilkins, and Pillard can take their seats on the bench. And soon the funeral speeches will end, and the next phase of life in the Senate will begin.

There That Day

(Image: a young witness to the Kennedy assassination)

Jeff Franzen was six years old when he witnessed the assassination of JFK:

It was a crystal clear day and the sun was over our shoulder. Then the car came down the hill toward us and I heard the three pops. I assumed it was firecrackers—that made sense to me at a parade. I was looking at the car with the president in it and—after the pops—I saw what I thought was confetti. It was the shot that caused the president’s head to explode. My Mom cried out, “Oh, my God.” So I’m watching, I hear the bangs, see what I think was confetti, hear my mom yelling, and I realized something was very wrong. Then the car slowed down and this guy was running up to it; it was the Secret Service agent. Mrs. Kennedy was climbing out the back of the car, the guy jumped on it and was hanging on as the driver punched it and the car took off.

Then it was bedlam, people running all around, motorcycle police like bees flying all over the place. One crashed his motorcycle right in front of us. My Dad, like JFK, had served in the South Pacific during World War II and had enough awareness of what was going on to grab my mother and me and put us on the ground with his body on top of both of us. I later learned we had been directly in the line of fire.

Franzen is highlighted in the above image. Bob Schieffer was a young reporter who stumbled into an odd job that day:

Every phone in the newsroom was ringing. I answered one and a woman said, “Is there anyone who can give me a ride to Dallas?” I couldn’t believe it. “Lady,” I replied, “We’re not running a taxi service here. And besides, the President has been shot!” And she said, “Yes, I heard it on the radio. I think my son is the one they’ve arrested.” It was Lee Harvey Oswald’s mother. …

There she was, standing on the curb—a small woman with gray hair and large horn-rimmed glasses, carrying a little blue travel bag. I got in the back seat with her; [editor] Bill [Foster] drove, and I interviewed her along the way.

Mrs. Oswald didn’t talk much about her son, or about the president’s death. She was hysterical, and, I later came to believe, deranged. Some of the things she said on that drive were so outrageous that I didn’t put them in my story. During the entire drive, she did not mention the death of the President once. Her only reference to the incident while we were in the car was, “Do they think he did it?”

Merriman Smith was a UPI reporter riding in the press car and saw the whole thing:

When we cleared the same curve we could see where we were heading — Parkland Hospital, a large brick structure to the left of the arterial highway. We skidded around a sharp left turn and spilled out of the pool car as it entered the hospital driveway.

I ran to the side of the bubble-top. The president was face-down on the back seat. Mrs. Kennedy made a cradle of her arms around the President’s head and bent over him as if she were whispering to him. Governor Connally was on his back on the floor of the car, his head and shoulders resting on the arm of his wife, Nellie, who kept shaking her head and shaking with dry sobs. Blood oozed from the front of the governor’s suit. I could not see the president’s wound. But I could see blood spattered around the interior of the rear seat and a dark stain spreading down the right side of the president’s dark gray suit.

From the telephone car, I had radioed the Dallas bureau of UPI that three shots had been fired at the Kennedy motorcade. Seeing the bloody scene in the rear of the car at the hospital entrance, I knew I had to get to a telephone immediately. Clint Hill, the Secret Service agent in charge of the detail assigned to Mrs. Kennedy, was leaning over into the rear of the car. “How badly was he hit, Clint?” I asked.

“He’s dead,” Hill replied curtly.

A Dish reader’s eyewitness account here.

Mike Allen, Busted, Ctd

Pareene tackles the purveyor of Playbook:

Allen considers press releases from organizations he covers to be plainly newsworthy in their own right and therefore worthy of passing on to his readers because he’s engaged in trade journalism — reporting on a sector for that sector, not for a general audience. You wouldn’t expect Ad Age to suddenly become adversarial toward the advertising industry, would you?

I imagine most other Politico journalists don’t consider what they do “trade journalism” for the business lobbying industry. But that’s what Allen and his bosses consider good political journalism — it informs members of the Washington power elite what other members are up to. Politico didn’t cooperate with Wemple’s piece, and it definitely won’t respond to it, because there’s absolutely no ethical issue involved here, in Politico’s eyes. You’re not Allen’s audience! If you want actual reporting, read one of the rags aimed at people who don’t wield power on Capitol Hill or K Street.

It’s not strictly trade journalism, of course, because a trade journalist might quietly think that the industry she’s covering is immoral or full of hypocrites and villains. That’s decidedly not the case with Allen, who is notoriously chummy with his sources. Allen believes, very deeply, in the business elite consensus. He just doesn’t consider that to be a “political position” as we would define it. This is what annoys and confounds people about Allen and his place in political journalism. He doesn’t act remotely like a real journalist, because he’s not. He’s an advocate journalist masquerading, poorly, as an old-fashioned political reporter. And, yes, during election season he does (very bad) conventional political analysis, but his primary gig is sincere advocate for the interests of a certain class of major corporate lobbyists and their allies. He never tries to hide his sympathies, he just for some reason feels compelled to retain the “objectivity” mask of a traditional big media reporter.

Earlier Dish on Allen here and here.

Email Of The Day

(Image: Jack and Jackie Kennedy)

John F. Kennedy was shot 50 years ago today. A long-time reader writes:

I thought you guys might be interested in posting this picture of JFK I took in Dallas when I was 13 years old, and the family story surrounding my taking it.

I was born in Dallas into a large extended family of Irish and French Catholics, the third of six kids. My mother and her two sisters were born and raised in Kansas but had been Dallasites since the WWII years. They decided to keep us kids out of school so we could go and see the president. We kids were in Catholic parochial schools, but none of the Catholic schools were let out that day and no one was encouraged to see the Kennedys. This was Dallas, after all: not Kennedy country. Our families belonged to the minority of Democrats, even among Dallas Catholics, who supported Kennedy. Our family political affiliations went back to FDR and the New Deal.

My mother and two aunts picked a spot along Lemmon Avenue, half-way between the airport and downtown. We arrived early. Aunt Terry had the great idea to make a sign, asking the President to stop and shake our hands. We went across the street to buy a roll of butcher paper. We spent the next half hour crafting our sign: “PLEASE STOP and SHAKE OUR HANDS, JFK and LBJ in ‘64”. It may have been the end of the cultural 1950s but my mother and aunts were not “out of it” (Ha!). They figured the political plug would help our cause.

It did.

As the motorcade came by, we could see John Kennedy reading our sign. To our great thrill, he beckoned us to come out to the limousine and shake his hand. He must have told the driver to stop because the whole motorcade came to a quick halt. We dropped the sign and ran for the car. The meet-up lasted less than a minute before too many others were streaming in behind us.

John Kennedy was beaming. Jackie turned and smiled her big beautiful smile. My mother, aunts and most of the older kids got in a quick handshake and hello before Secret Service agents ushered us and the gathering crowd back.

I didn’t have a chance to shake Kennedy’s hand. I’d asked if I could bring along our Kodak 8mm Brownie Automatic home movie camera. I took pictures of Air Force One and Two, pictures of our sign, and of the motorcade. And then, out in the street, in the press of the crowd against the limousine, I impetuously held up the camera a few feet away from the Kennedys. I took only a few seconds of film before a Secret Service agent gently used his arm to have me lower the camera. We went back to the curb, dizzy with excitement – some of which I got on film too.

Thirty minutes later, just after I’d finished telling my 8th grade classmates about this great adventure, Sister Augustine, our teacher, was called out of the classroom. She returned moments later with a very sad, grim look on her face, and a small B&W TV to watch the breaking news.

My family was so upset and heartbroken – upset too with the attitudes of too many Dallasites – we drove to Austin to spend that long, draining weekend and funeral day at my grandmother’s house.

Here are two stills from the 8mm footage I took that day. They show what Life Magazine copy writers in the 1983 commemorative issue, which used the pictures, called “A Family Memento” and “a last glimpse of a radiant President.” The close-up is likely the last, or one of the very last, close-up photos of Jack Kennedy before he was killed. I thought Dish readers might like to see it and hear about what it was like for some of us that day in Dallas 50 years ago.

Revolution In The Classroom

The immense changes in the Arab world over the past few years have sparked new battles over the content of history textbooks in the region:

In July, when President Muhammed Morsi was ousted, calls rose to remove him from history tomes. The education minister refused, but decided that Hassan al-Banna, the Brotherhood’s founder, should not be featured. There have been disputes over lesser known figures too. In September, a photograph of an Arab spring icon in a second grade Egyptian Arabic language book sparked fierce debate. Khaled Said, who was beaten to death in 2011 after being taken into police custody, is considered a trigger for the revolution. But some police and some parents objected, claiming he was a junkie and a poor role model for children.

Post-revolutionary Libya in 2011 took a sledgehammer to its old history curriculum, a bizarre mix of leader-worship, xenophobia and feverish Arab nationalism.

Libyan children now pore over every battle in their civil war and the gory details of the death of Colonel Muammar Qaddafi, their former ruler. … But history remains a thorny and highly politicised subject. In Libya, locals continue to battle over facts. The eastern cities of Baida and Benghazi are locked in a dispute over which of them sparked the revolution. Both Misrata militia and some Benghazi fighters claim to have shot Qaddafi in October 2011. So history is a risky business for those charged with recording it. Faraj Najem, a Libyan historian, was criticised for his recent effort to expand the definition of martyrs to include those who were killed by NATO while fighting for Qaddafi. He says his endeavour may be premature: “I’m writing on shifting sands,” he sighs.

Where Do Unicorns Come From?

Annalee Newitz looks to Chris Lavers’s book A Natural History of Unicorns for the answer:

As Lavers explains, the original Hebrew text of the Old Testament mentions an animal called a “reem.” When scholars tried to translate this word into Greek, they were flummoxed. They had no idea what this “reem” was. They knew it was big, and it had horns, and that it obviously wasn’t a goat. (Goats are mentioned elsewhere in the Bible.) So they translated it as “monoceros,” meaning “one-horn.” Then, when the Greek Bible was translated into Latin, the word became “unicornus.” And that word, translated into English, is unicorn.

Early in the 20th century, when scholars cracked the code on ancient cuneiform script, they finally learned what that mysterious reem really was. In these ancient texts, written around the time when the Hebrew Bible was being penned, there are many references to an animal called a rimu. Like the biblical reem, the rimu was enormous, strong, and had horns. That animal was an ox. So all of those references to unicorns in the Bible? Those are actually to an ox. Which, if you read the actual sections of the Bible, makes a lot more sense.

But for nearly 1500 years, Christians believed in the unicorn version of things. The unicorn came to symbolize Christ, its horn the cross, and its tribulations during the hunt were like Christ’s tribulations on earth. Interestingly, the idea that unicorns were attracted to virgins comes from a pagan source. A Latin book called the Physiologus, probably written in the second century CE, mentions that a unicorn can only be caught when it lays its head down in a virgin’s lap. Christian analysts seized on this idea, suggesting that this was symbolic of how Christ came into the world – with the help of a virgin. Keeping all of this in mind, it’s easy to understand what those 16th-century unicorn tapestries are all about.

(Image of a tapestry fragment from The Mystic Capture of the Unicorn, c. 1500, via Wikimedia Commons)

The Bigger The Brain, The Better?

Not really:

People have long been tempted to link brain size and cognition. The intuitive notion that a “big brain” means “more intelligent” was first threatened some time ago, when we discovered animals with larger brains than ours: elephants and whales. Sure as we were of humankind’s superior intelligence, we still felt the need to prevail, so we gamely parried: Perhaps it is the brain size relative to body size that makes our brains the biggest. Though humans come out well there, too, this measure is biased toward birds and other small animals that have relatively large brains for their bodies. After more deliberation, scientists finally offered up the so-called “encephalization quotient”: brain size relative to the expected brain size in related taxa. On top: humans. Phew.

Consider, though, the strange case of that growing child.

Every infant’s brain develops through a period of synaptogenesis – wanton proliferation of synapses, which are the connections between neurons – in the first year or so of life. But one could argue that it is when this intense brain growth ends that the real growth of the child qua individual begins. The next phase of brain development occurs in large part through an increase in synaptic pruning: paring of those connections that are not useful for perceiving, considering or understanding the world the child is facing. In this sense, it’s by downsizing that an individual’s brain is born.