Redacted Drawings

Janet Hamlin began working as a sketch artist at the Guantanamo Bay hearings in 2006. Her drawings – collected in a new book, Sketching Guantanamo: Court Sketches of the Military Tribunals 2006–2013 – form one of the only visual records of the trials, since cameras and video recorders were not permitted in court. An excerpt from the book:

Guantanamo tribunals differed from the other court drawings I’ve done. For instance, there were faces I was not allowed to draw, and each drawing could not leave the courtroom until a Pentagon official reviewed it. He would examine the art, occasionally have me erase some of the details, then sign and stamp the art once approved. Then I carried the sketches back, uploaded them to the media pool with descriptions, grabbed lunch, and got back for the afternoon session, going through three levels of security every time we entered or left the court area, always with an escort. Time is precious.

In a review of the book, Jillian Steinhauer emphasizes the unusual conditions under which Hamlin sketched:

When it comes to the larger of the two Guantanamo courtrooms, where Khalid Sheikh Mohammed and the other 9/11 conspirators are being tried, Hamlin isn’t even allowed inside.

She and other journalists must sit in a separate room at the back, separated from the proceedings by three panes of thick, soundproof glass. Not only that, but sound from the trial is broadcast into the room on a 40-second delay, to allow an officer who sits near the judge to censor classified material, if need be. In Hamlin’s words, “What we are seeing looks like a badly dubbed movie.”

Given these circumstances, Hamlin’s animated, colorful, and detailed sketches constitute a nearly heroic effort. The fact that she’s been able to give us thoughtful close-ups of everyone from Canadian Omar Khadr, the youngest convicted war criminal in modern history, to Khalid Sheikh Mohammed attests to her perseverance and skill as an artist. Her portraits and scenes breathe life — including that most familiar quality of human existence, tedium — into an operation shrouded in political secrecy. When Hamlin draws Khadr’s clenched, angular shoulders and his tired, weary face, he becomes a real person. Mohammed appears in the sketches alternately impassioned, disinterested, and, when he shows up one day in a camouflage vest, menacing — until we read the accompanying caption, in which Hamlin explains that the item “is a hunting vest and can be found at Sears.”

(Caption via NYRB: “Khalid Sheikh Mohammed wore a camouflage vest to court. Army Col. James L. Pohl, the judge, is shown in back with a court security officer at his left. Both Pohl and the security officer have buttons to mute, with white noise, testimony they suspect may be classified; the security officer also reviews my sketches before releasing them, October 17, 2012.” More drawings from Hamlin here.)

The Damage Done By Drones, Ctd

Responding to two new reports on America’s drone war, Benjamin Wittes admits that “it is impossible for a modestly-moral person to read these reports without something approaching nausea” and that “they thus raise serious questions about the way at least those drone strikes they cover took place.” But he still criticizes various aspects of the reports:

[T]he reports—particularly the Amnesty report—have a way of conflating legitimate targeting which may produce civilian collateral damage with horrible errors that simply should not happen. The most glaring example of this is Amnesty’s treatment of the June 4, 2012 strike that killed Abu Yahya Al-Libi, a senior Al Qaeda leader. According to the Amnesty report, an initial drone strike killed five people and injured four others (the report does not say whether any were civilians). A group of 12 people, including both local residents and foreigners “whom villagers said were Arabs and Central Asians who were likely to be members of al-Qa’ida” showed up “to assist victims.” Al-Libi was “overseeing the rescue efforts” and was killed in the second strike, along with between 9 and 15 other people, including six local tribesmen who “as far as Amnesty International could determine, had come only to assist victims.” In other words, six tribesmen were killed working alongside a group of Al Qaeda operatives under a senior Al Qaeda official.

Amnesty considers this strike a potential “war crime” both because it constituted an attack on civilian rescuers and, quite amazingly, because Al-Libi may not have been directly participating in hostilities at the time of the strike. (I’m really not making this up. See pp. 29-30.) In other words, Amnesty’s position is that it may be a war crime to target a senior Al Qaeda leader when he’s doing something other than plotting attacks—if, that is, it’s lawful to target him at all. There are many serious issues these reports raise; this kind of overreach undermines them all.

I agree. Unintended collateral civilian casualties are not war crimes, and never have been. But the moral equation shifts, it seems to me, when the belligerent stops truly seeing these casualties as deeply morally troubling. This is particularly true when it comes to the antiseptic feel of drone warfare, where human beings can be seen simply as distant statistics. There comes a point at which indifference to civilian casualties veers toward a war crime. That was my problem with the Israelis’ pulverization of Gaza in 2009. They did not seem particularly agonized by it at all, despite the huge imbalance of fatalities on each side of that conflict. With that kind of technological power, restraint is even more essential if we are not to lose our soul.

The way in which the Obama administration began to scale down drone warfare in the face of growing evidence of such casualties suggests to me a mindset attempting to avoid the worst aspects of such a war – not surrendering to it. But it’s a blurry line, and we need to remain extremely vigilant about it for moral and strategic reasons. Multiple civilian deaths do not, after all, help the case against al Qaeda in Pakistan.

(Photo by Getty)

The Hidden Homeless

Ian Frazier reports on a troubling trend:

[T]here are far more homeless people in [New York City] today than there have been since “modern homelessness” (as experts refer to it) began, back in the nineteen-seventies. Most New Yorkers I talk to do not know this. They say they thought there were fewer homeless people than before, because they see fewer of them. In fact, during the twelve years of the Bloomberg administration, the number of homeless people has gone through the roof they do not have.

There are now two hundred and thirty-six homeless shelters in the city. Imagine Yankee Stadium almost four-fifths full of homeless families; about eighteen thousand adults in families in New York City were homeless as of January, 2013, and more than twenty-one thousand children. The [Coalition for the Homeless] says that during Bloomberg’s twelve years the number of homeless families went up by seventy-three per cent. One child out of every hundred children in the city is homeless.

The number of homeless single adults is up, too, but more of them are in programs than used to be, and some have taken to living underground, in subway tunnels and other places out of sight. Homeless individuals who do frequent the streets may have a philosophical streak they share with passersby, and of course they sometimes panhandle. Homeless families, by contrast, have fewer problems of mental illness and substance abuse, and they mostly stay off the street. If you are living on the street and you have children, they are more likely to be taken away and put in foster care. When homeless families are on the street or on public transportation, they are usually trying to get somewhere. If you see a young woman with big, wheeled suitcases and several children wearing backpacks on a train bound for some far subway stop, they could be homeless. Homeless families usually don’t engage with other passengers, and they seldom panhandle.

(Photo: An empty lot is viewed on January 25, 2012 in New York City. Homeless activist group Picture the Homeless will release findings from a new study conducted with Hunter College in which a count of vacant buildings and lots in 20 of NYC’s 59 community boards was conducted. The findings, which will be released tomorrow in full, found that vacant properties in New York City could house every homeless person and more. By Spencer Platt/Getty Images)

Ask Elyn Saks Anything

[Updated with many new questions submitted by readers]

From her Wiki:

Elyn Saks is an expert in mental health law and a MacArthur Foundation Fellowship winner, which she used to create the Saks Institute for Mental Health Law, Policy, and Ethics. She is also Associate Dean and Orrin B. Evans Professor of Law, Psychology, and Psychiatry and the Behavioral Sciences at the University of Southern California Gould Law School. Saks lives with schizophrenia and has written about her experience with the illness in her award-winning, best-selling autobiography, The Center Cannot Hold: My Journey Through Madness, published by Hyperion Books in 2007.

You can also watch her TED Talk about living with mental illness here.

What should we ask Elyn? Let us know via the survey below (if you are reading on a mobile device, click here):


Fluent Fans

Na’vi, a language invented for the alien population of Pandora in the 2009 film Avatar, has really taken off:

The Na’vi tongue was invented by Paul Frommer, a clean-cut linguist and professor emeritus of management communication from California. Na’vi is melodious and fast flowing, composed of unusual syntax and consonant clusters that sound beautiful and exotic to an Anglophone ear. It is one of many so-called constructed languages, or conlangs: a man-made language authored for a purpose. From world peace, as in the case of perhaps the most widely known conlang, Esperanto, to expanding our capacity for logic, like the unwieldy Loglan, constructed languages have captivated us for centuries.

Na’vi is a conlang subtype known as an artlang: It was created with a specific aesthetic goal, as an integral part of a piece of art. Like other famous artlangs (J.R.R. Tolkien’s Elvish, Star Trek’s Klingon, and the languages spoken in the HBO television series Game of Thrones, Dothraki and Valyrian) Na’vi was intended solely for fiction. … Moved to connect themselves in a more tangible way to the utopian world Avatar presented, [fans] had dived into the film and resurfaced with something no longer confined to fiction. Currently, Frommer estimates there are around 100 Na’vi speakers, though other researchers have said there are many more, and the language has grown to a vocabulary of approximately 2,000 words.

Previous Dish on Dothraki here and here, and Esperanto here. Below is a Klingon version of “Never Gonna Give You Up” that just emerged last week – if you can take the hathos (I was mesmerized):

The Silence Of St. Thomas, Ctd

In a review of Denys Turner’s Thomas Aquinas: A Portrait, James R. Kelly looks at the human behind the holy saint:

We learn that Thomas disappointed his ambitious parents when he joined the recently formed Dominicans, who sided with the era’s 99 percent with their off-putting vow of poverty and their street-preaching, that he did grow fat and balding, that he knew no Greek, that he unhesitatingly drew on Arabic sources of Aristotle, that his sermons were “mercifully short,” that he was not scintillating, that he had a plodding deadpan style, that he too thought of himself as a “dumb ox,” that he didn’t complete his Summa Theologiae and that this very incompletion was symptomatic of his non-self-promotional personal and professional style.

Thomas was the smartest person in the room, but he always took the last seat in the last row. So—how did Thomas the Unlikely become the founder of that periodically discarded only to be rediscovered philosophia perennis called Thomism? A deep Catholic sensibility is part of it. Thomas’s philosophy aces the F. Scott Fitzgerald criterion that “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” In his own little paradox, Turner characterizes Thomas’s thought as combining both the Protestant either-or and the Catholic both-and.

Nathaniel Peters notes Thomas’s intellectual humility in his final days:

Is Thomas a saint because he made good arguments, and if not, what is it that he did? Turner’s answer is simple: He fell silent. Thomas’ refusal to finish the Summa speaks volumes about the limits of theology and the magnitude of God. More than that, though, there is a holiness lying behind everything Thomas wrote, precisely in the fact that he lies behind it, not in front. Thomas’ writing is not about Thomas; it is about the truth. His holiness, Turner concludes, “is a theologian’s holiness, the holy teacher invisible otherwise than in the holy teaching itself.” In his writing as in his life, Thomas embraced poverty so that God might be preached all the more.

Previous Dish on Turner’s biography here.

(Detail from Gentile da Fabriano’s Valle Romita Polyptych, circa 1400, via Wikimedia Commons)

“The Best Thing Going For The GOP” Ctd

A reader takes exception to my calling the following quote from Christie on the shutdown “pitch-perfect”:

All you need to do is look about 200 miles south of here to see the mess that Republicans and Democrats have made of our national government and we should haul all their rear-ends to Camden today to see how bipartisanship works and government works together.

This is decidedly NOT pitch-perfect. Well, maybe it is to Republicans. Personally, all I see is more of the insidious false equivalence that allows Republicans to act like lunatics and then avoid accountability by yelling that Democrats are guilty, too. Democrats may be spineless and lame, as a general rule, but to accuse them of being culpable for our current morass anywhere near so much as the Republicans is wildly disingenuous and irresponsible. At best.

Give Christie credit for being more honest and sane than the rest of the Republicans. Yip de doo. But that doesn’t mean he’s doing what would really make him a hero true to his no-bullshit reputation: unequivocally calling out the Republicans for actively trying to destroy the American economy and the ability of the federal government to serve even the most basic functions. Once he does that, I’ll consider voting for him in 2016. Until then, being slightly less crazy than the rest of the GOP doesn’t make him presidential material.

Another reader:

Not one mention of Christie and Obama actually working together after Sandy?  Remember, the GOP primaries will be run by crazy people.  They will savage Christie for actually being friendly with the president.  This is all you will see:

Another elaborates:

The far-right base is sick of the establishment telling them who’s “electable.”  Even if they weren’t swooning over Ted Cruz (who got an 8-minute standing ovation from Republican women in Texas on Monday), 2016 is the year they’re adamant a “true conservative” will be the nominee. And while you may believe Christie is one, the base of the Republican party emphatically does not.  He’s the governor of a northeastern state, he welcomed Obama after Hurricane Sandy, and he just rolled over (in their perception) on marriage equality.  There is no way Christie will ever be the Republican nominee in 2016.

Another turns to Christie’s weight:

I don’t think he’s too fat to be elected president – I agree it augments his populist appeal – but he may well be too fat to successfully run for president. Eighteen months of (first) primary battles with Ted Cruz for the soul of the GOP and (second, if he wins the nomination) winner-take-all presidential campaigning in the late, sweltering summer of 2016 will require enormous physical stamina.  (Ever been in Richmond, Virginia, on Labor Day?  How about Tampa? It’s not pretty.)

Most of our recent candidates have been in tip-top shape for their age: Obama, Romney, Bush, Kerry.  The flabbiest of the last 20 years was probably Al Gore, who was a damn sight fitter than Christie, and, remember, he won the popular vote.  Contrary to the SNL image, Bill Clinton was 15 or 20 pounds overweight at his worst, and he was preternaturally durable mentally and physically.  Same with McCain, who seems to have a titanium constitution despite his prisoner-of-war injuries.  Christie probably also has a deep well of energy despite his size, but since the emergence of the 24-hour electronic news cycle, nobody of his physique has attempted the most grueling task a politician can take on.

Another quotes me:

“The Jacksonian wing of the GOP – think Zell Miller or Dick Cheney – loves a fighter, cheers a brawler, and would swallow whatever disagreements they have with Christie on social issues because of his attitude.”

Like this guy?

Another goes local:

Here is what Chris Christie has done that should “render him unacceptable to a majority of the 2016 electorate”: he killed the proposed Trans-Hudson Passenger Rail Tunnel.

None of this can be reasonably disputed: New Jersey is highly dependent for its economic success on travel into New York.  There are terrible bottlenecks getting New Jersey commuters past the Hudson River.  It is inevitable that another bridge or tunnel has to be built on the Hudson.  The 2008 crash made the cost of building historically low.

Christie killed that project for no good reason other than to stick an ideological thumb in Obama’s eye during an economic crisis.  He said that it would be expensive, which is true, but given that that cost has to be incurred sooner or later it was foolish of him not to seize an unusually opportune moment to build.  Maintaining the Lincoln Tunnel is also very expensive, and presumably Christie would not suggest that New Jersey could save money by closing it.

Christie’s decision was horrifically stupid and destructive, and New Jersey will suffer for a generation as a result of it.  If that is what conservatives call bipartisanship, moderation, and “common sense governing,” thanks but no thanks.

(Photo by Nicholas Roberts/AFP/Getty Images)

What Can A Tech Surge Accomplish?

Over the weekend, the Obama administration announced a “tech surge” in order to fix Healthcare.gov. Timothy B. Lee doubts this will work:

Not only will more programmers not necessarily speed up the development process very much, it could actually make things worse. That’s because existing programmers will need to spend time training the new programmers, assigning them tasks, and coordinating their efforts with the new, larger development team. … [A]dding more people to the HealthCare.gov team may actually lengthen the time it takes to build the HealthCare.gov Web site.

This phenomenon, in which adding people to a late software project makes it later, is known as Brooks’s Law. In the mid-1960s, Fred Brooks managed the project to build OS/360, a complex IBM operating system for mainframes. In 1975, he wrote a famous book called “The Mythical Man-Month” about the experience.

“When schedule slippage is recognized, the natural (and traditional) response is to add manpower,” Brooks wrote. “Like dousing a fire with gasoline, this makes matters worse, much worse.”
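Brooks’s point can be sketched with a toy model (the numbers here are illustrative assumptions, not figures from Brooks): a team’s raw capacity grows linearly with headcount, but the pairwise communication channels it must maintain grow as n(n−1)/2, so past some size each new hire adds more coordination overhead than output.

```python
# Toy illustration of Brooks's Law. Every pair of programmers needs a
# communication channel, so overhead grows quadratically while raw
# output grows only linearly. The 5%-per-channel overhead is an
# illustrative assumption, not an empirical figure.

def effective_capacity(n, overhead_per_channel=0.05):
    """Net output of an n-person team, in 'programmer units'."""
    channels = n * (n - 1) // 2  # every pair must coordinate
    return n - overhead_per_channel * channels

for n in (5, 10, 20, 30):
    print(f"{n:>2} people -> {effective_capacity(n):.2f} units of output")
```

Under these assumptions, net capacity peaks and then declines: a 30-person team produces less than a 20-person one, which is the "gasoline on the fire" effect Brooks describes.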

McArdle makes similar points:

While outsiders may be valuable for small, concrete tasks and a fresh take on particularly tough problems, they can’t just come in and fix everything. If the contractors and HHS managers who built the federal exchange can’t fix it in the next month, then it’s just not going to get fixed in the next month.

Yuval Levin identifies another reason why Silicon Valley’s tech mavens can’t save the site:

I think the idea that Silicon Valley types are going to rescue the bureaucracy confuses two kinds of technical mastery: experimental innovation and consolidated management. Each has its strengths and its weaknesses, but these two visions generally do not play well together. Successful technology firms do a huge amount of trial and error, avoid over-management, and create adaptive knowledge systems that work by learning and are constantly tested against competitors. The federal bureaucracy develops and enforces uniform rules meant to apply technical knowledge it (thinks it) already possesses to a complex and chaotic world to make it simpler and more orderly — to make it do the bidding of policymakers. As Max Weber put it, “bureaucratic administration means fundamentally domination through knowledge.” The maxim of the Internet age is closer to “liberation through knowledge.”

Our Memory (And Selves) Will Belong To The Cloud

In a review of Permanent Present Tense by Suzanne Corkin, Steven Shapin reflects on the life of Corkin’s famous patient, Henry Molaison, whose ability to form new memories was destroyed by a brain operation when he was 27. How Molaison’s sense of memory relates to our own:

In everyday life, we don’t much care whether what we remember is contained between our ears or resides on a piece of paper. We rely on our partners, colleagues, and friends to remind us of obligations; we stick Post-its on computers and fridge doors to cue us to buy milk or have the car serviced; our cell phones ping to announce coming appointments and remember all our phone numbers for us. Increasingly, our memories are distributed across a landscape populated by things and other people, and, in that respect, it’s possible to see Molaison as standing merely at a pathological extreme of memory’s normalcy: after the operation, all of Molaison’s declarative memory lived outside his own body, while only some of ours does.

Shapin imagines a future in which “technology will banish forgetting”:

The Quantified Self movement encourages everyone to follow Silicon Valley utopians in forming a personal digital register of every item of food consumed and every measurable bodily state. A camera worn on the neck of a “lifelogger” records everything seen, and a digital recorder captures everything heard. It’s all there—nothing filtered, nothing lost, nothing distorted by the messiness of internal memory. Wearable computers like Google Glass hold out the promise of still more powerful modes of self-archiving. We shall be as gods, and about ourselves we shall know all things. Technology will banish forgetting, and the stores of undeformed memory will live forever in the cloud, retrievable at will. The name for our remaining problems will be “search”: all we’ll have to do is remember what we’re looking for, master a few tricks for finding it, and, finally, offload the initiation of search onto external prompts that will remind us to remember.

Last Saturday I spent a relatively harmless and hugely enjoyable few hours with some friends and the conversation got a little 3 am college dormy. My friend elaborated an epiphany about eternity.

As the years go by, and our lives are digitally recorded in more and more ways, he argued, there will be digital versions of ourselves – from selfies to web trails, from precise consumer preferences to social networks, from thousands of emails and texts to videos and Facebook likes – that will have more data embedded in them than even the most industrious biographer could have used on the most famous person in the past. We will also come, inevitably, to refer to these digital summaries of ourselves, to remind us of our past, to get digital proof of previous loves or ideas or events or friends. We will therefore need to remember less and less, even as the imprint we make on the world becomes more and more indelible and eternal. We can just look them up, the way we reach for Google when we cannot remember the answer to a trivia question or need to resolve an empirical debate.

Writing a blog every day for thirteen years and counting brings that home rather firmly. So many feelings, thoughts, asides, facts, wishes, errors, home-runs, and massive fails are all there for anyone to see and for me to flinch from. But just as surely, I do not need to remember much of my life any more. It is remembered for me and exists in something we call the cloud. The cloud is eternal. It reaches into the depths of the past and makes it instantly accessible to the present and to any non-apocalyptic future.

The concept of a personal trail that makes a life eternal isn’t new, of course. I think of Emily Dickinson’s vast trove of scribbles or Pascal’s unfinished scraps of paper we now know as the Pensées. But in the past, only a few managed to achieve anything like the record we are currently assembling of our own lives every day. For most, there was the parish register or, if you were really lucky, a mention in government or church records. A gravestone here; a family genealogist there. Far, far more human beings ended their lives with no lasting memory of them remaining among those still living. We were almost all unknown soldiers once.

Now, we are all so known. And the key aspect of the cloud, it seems to me, is not just its powerful storage, but its instant accessibility to anyone with an Internet connection. Just like the seventeenth century poems we publish afresh here on the Dish alongside the written-a-few-minutes-ago blog-posts, the old and the new exist as equals, because they will increasingly become equally available to us. And so time itself disappears as an experience online. It truly is an eternal now, that connects past and future to a perpetual present.

And in some ways, our personal digital records are very much the summation of ourselves. Not our physical, intimate, human selves – the selves we eat with and run with and fall in love with – but our abstracted selves, the conglomeration of every detail, feeling, idea, thought, impulse, and friend that tells the story of me. Long after I am dead, will that not be the most accessible incarnation of me – alongside all the published words I have written in part to live past my physical expiration date? Will I not continue to exist in some form that is available to all of humankind for ever?

Ambitious types in the past performed all sorts of amazing and horrifying things to become immortal, to leave a legacy, to be remembered by history. But increasingly, that form of immortality is available to more and more people in principle. In the future the scale of the recording – the Big Data Of Humanity – will only increase and deepen. In practice, of course, our lives will exist only in so far as others care to find us. And that is how it has always been. But now, more and more of us will never fully die. We will always be available for rebooting.

Is it a purely etymological accident that we call this collective memory a cloud?

Was it also purely by accident that for aeons, humankind looked up past the clouds to see Heaven?