Eve Fairbanks sizes up Mandela’s successors:
People are deeply, deeply disillusioned with the leaders who’ve followed Mandela, both official African National Congress politicians and emotional leaders like Mandela’s offspring. Mandela’s relatives seem to have bucked his example entirely; some have banked millions in mining, an industry against which the apartheid-era ANC railed as the heart of South Africa’s satanic injustice, while others have cashed in with a reality TV show.
The allegations against the politicians in actual office are more troubling. The country’s second democratically elected president, Thabo Mbeki, was bitterly criticized for denying South Africa’s AIDS epidemic. Mbeki’s successor, President Jacob Zuma, was prosecuted for both rape and racketeering; he was acquitted of the former, and the latter charges were dropped on technicalities, but recently a huge scandal around taxpayer-funded upgrades to his massive home dominated the papers until Mandela’s—for Zuma, very propitiously timed—death. Daily, the whole black political class is accused in the media of corruption in the awarding of government contracts and greed in treating itself to swanky vacations and flashy vehicles.
“They were heroes,” one of the students standing beside me on the police line mused grimly, “but then they started buying cars.” As they buy cars, economic growth has slowed, basic education has fallen into disrepair, and inequality has deepened. This fall, The Economist concluded in a cover package pessimistically titled “Cry, the Beloved Country” that South Africa “is on the slide both economically and politically” and that the ANC’s “incompetence and outright corruption are the main causes.”
Applebaum argues that Mandela’s death “should cause South Africans to look critically at the state he helped create and, above all, at the ANC, the party he led”:
Andy Towle has a dizzyingly long list of what are mercifully becoming less newsworthy events. My vote goes to Ted Chalfen, a senior at Fairview High School in Boulder, Colorado, for this integrating, positive commencement speech, which managed to avoid all sense of victimology:
How can anyone not describe this as a moral advance for America?
Sy Hersh has some troubling details about the way the Obama administration explained its intelligence that Assad was solely responsible for the sarin gas attack last August 21. It also downplayed the distinct possibility – aired in its own intelligence – that the al-Nusra Front may have gotten access to the materials.
The always-worth-reading Jack Shafer digs up the long history of passing off advertising as editorial:
Advertisements masquerading as editorial copy date back at least to the late 19th century, when they were called “reading notices,” according to the Encyclopedia of American Journalism. Publishers encouraged the placement of reading notices, which placed brand and company names in news stories without any disclaimers, because it was more profitable than conventional advertisements.
And the critics then sound uncannily like this blog today:
In his memoirs, Washington Gladden wrote about quitting his job as a reporter at the New York Independent in 1874 because he could not convince the paper to stop publishing reading notices and publisher’s notices that looked like news stories. “They seem to be essentially evil, and a weakness to the paper,” Gladden wrote. “My scruple may be a foolish one, but I cannot overcome it.” (Hat tips to Prof. Ronald R. Rodgers for this and several other historical pointers.) Renowned journalist Charles A. Dana despised the form, too, writing in 1895, “Let every advertisement appear as an advertisement; no sailing under false colors.” His sentiment was shared by many colleagues, including Editor and Publisher, which editorialized against them, and Adolph S. Ochs, owner of the New York Times. Idealistic publisher E.W. Scripps, who founded an ad-free newspaper in Chicago in 1911, thought avoiding unlabeled reading notices made “good business sense.”
But reading notices were considered so effective that one 1908 book on advertising devoted an entire chapter to “Puffs, Reading Notices, Want Advertisements, Etc.” The key to writing a good reading notice, author Albert E. Edgar advised, was duplicity. “A reading notice of any kind has a certain amount of value because the public reads them as matters of news and not as items of advertising.”
The core business model of sponsored content is lying to readers in order to whore out more completely to advertisers. Call that what you will – “advanced advertorial techniques” or “partners” – the better the lie, the more effective the ad. The small question is whether media sites whose fundamental goal is deceiving readers eventually render themselves obsolete. As the WSJ’s Gerard Baker puts it:
An advertiser wants to advertise in The Wall Street Journal to be seen and to be associated with a brand like The Wall Street Journal, or The Financial Times or Bloomberg, because those news organizations are respected. If [advertisers] manipulate the digital or print operations of those news organizations, it makes the reader confused as to what is news and what is advertising, and the reader’s trust, the very reason that those advertisers want to advertise in those news organizations, goes away.
Nate Silver gives it eighteen months, tops. I suspect that’s optimistic. By which time no one may be able to tell the difference between an ad and a piece.
Curiosity turned into a major discovery in the art world when inventor Tim Jenison stumbled into proving that the celebrated, hyperrealistic paintings of 17th-century master Johannes Vermeer were achieved not through pure genius but with the aid of the camera obscura, a device that projects a perfect image of its surroundings onto a screen:
[Jenison] traveled to Delft again and again, scouting the places where Vermeer had painted. He learned to read Dutch. He paid for translations of old Latin texts on optics and art. Much later, he did a computer analysis of a high-resolution scan of a Vermeer interior, and discovered “an exponential relationship in the light on the white wall.” The brightness of any surface becomes exponentially less bright the farther it is from a light source—but the unaided human eye doesn’t register that. According to Jenison, the painting he digitally deconstructed shows just such a diminution from light to dark.
But still, exactly how did Vermeer do it? One day, in the bathtub, Jenison had a eureka moment:
A recent Brennan Center report suggests reforming the Justice Assistance Grant (JAG) Program, which doles out money to states:
Current measures inadvertently incentivize unwise policy choices. Federal officials ask states to report the number of arrests, but not whether the crime rate dropped. They measure the amount of cocaine seized, but not whether arrestees were screened for drug addiction. They tally the number of cases prosecuted, but not whether prosecutors reduced the number of petty crime offenders sent to prison. In short, today’s JAG performance measures fail to show whether the programs it funds have achieved “success”: improving public safety without needless social costs.
Scott Lemieux wholeheartedly supports this idea:
Using the power of the purse to reduce incarceration rates would not necessarily find a hostile reception in state capitols. As the Prospect‘s Abby Rapoport reported earlier this year, prison reform is not the sole province of the left. Several states controlled by conservative Republicans—including Texas and Kansas—have enacted salutary prison reforms. Indeed, state legislatures should consider using their own budgets to focus police and prosecutors on crime-reduction goals rather than rewarding incarceration as an end in itself.
San Francisco, the birthplace of street skateboarding, was also the first city to design solutions such as “pig’s ears” – metal flanges added to the corner edges of pavements and low walls to deter skateboarders. These periodic bumps along the edge create a barrier that would send a skateboarder tumbling if they tried to jump and slide along.
Indeed, one of the main criticisms of such design is that it aims to exclude already marginalised populations such as youths or the homeless. Unpleasant design, [PhD student Selena] Savic says, “is there to make things pleasant, but for a very particular audience. So in the general case, it’s pleasant for families, but not pleasant for junkies.”
Preventing rough sleeping is a recurring theme. Any space that someone might lie down in, or even sit too long, is likely to see spikes, railings, stones or bollards added. In the Canadian city of Calgary, authorities covered the ground beneath the Louise Bridge with thousands of bowling ball-sized rocks. This unusual landscaping feature wasn’t for the aesthetic benefit of pedestrians walking along the nearby path, but part of a plan to displace the homeless population that took shelter under the bridge.
(Photo of rocks beneath Calgary’s Louise Bridge by Flickr user anarchitect)
Jerry Saltz urges all but the wealthiest young artists to stay away from MFA programs, which after two years are “hovering near a quarter-million dollars” in cost. And besides, students have to deal with “a lot of bullshit”:
Iffy artist-teachers wield enormous artistic and intellectual influence over students; favors are doled out in power cliques. Zealous theoreticians continue to scare the creativity and opinions out of their third generation of young artists and critics. Too many students make highly derivative work (often like that of their teachers) and no one tells them so. A lot of artists in these programs learn how to talk a good game instead of being honestly self-critical about their own work.
… I’ve taught at institutions across the prestige spectrum. Truthfully? Students who go to high-profile schools get a subtle eighteen-month bump after they graduate, in part because dealers and collectors (oy) see their MFA shows. However, once this short-term advantage dissipates, the artist becomes one in a crowd, with a mountain of debt, and may need to have a full-time job indefinitely to pay it off. There’s no surer way to throw away that early advantage than getting a job that saps their art-making energy.
Saltz also recommends a new essay by artist Coco Fusco, who “pulls back the curtains on the risky business and chancy racket of the Master of Fine Arts degree.”
Joshua David Stein chronicles how the letter E is used less and less on the Internet:
[I]n 2004, Stewart Butterfield and Caterina Fake founded Flickr, a photograph-sharing application, without the standard penultimate E. “The most compelling reason to remove the E,” explained Ms. Fake, “was that we were unable to acquire the domain Flicker.com … The rest of the team were more in favor of other options, such as ‘FlickerIt’ or ‘FlickerUp’ but somehow, through persuasion or arm-twisting, I prevailed.” It was good news for the company but bad news for the letter. A year later, the company was acquired by Yahoo for $35 million.
Soon many startups began jettisoning their Es like toxic assets. In 2009, Grindr, a geosocial network application for gay men, chose to make do without the letter E. Membership quickly swelled. Myriad other brands followed suit, including Blendr, Gathr, Pixlr, Readr, Timr, Viewr, Pushr.
And of course, there is the blogging platform Tumblr, whose launch in 2007 may have marked the true end of E. “There are a variety of reasons why Tumblr contains no E, from branding considerations to environmental factors (fewer letters mean lower power consumption by our servers),” said the company’s editorial director, Christopher Price. “At the end of the day, however, it all comes down to one simple, absolute truth: Tumbler.com looks fucking stupid.”
(Image via Flickr user Double–M)
Charles Kenny proposes that we export the growing ranks of unemployed Americans to other countries that need more workers, pointing to Chile, Hong Kong, Indonesia, Mexico, and South Korea as potential destinations:
According to the State Department, only about 6.3 million U.S. citizens live abroad, or around 2 percent of the domestic population. In relative terms, that’s pathetic. About 5.5 million British people live permanently abroad, almost five times the U.S. level in per capita terms. Maybe they’re trying to escape the lousy weather, but it isn’t like Brits have natural advantages over Americans as travelers. British people are almost as bad at speaking other languages as Americans are, and in terms of haughty isolationism and disdain for foreigners, surely Brits are worse. (I’m allowed to say this — I’m British.) So why shouldn’t America send out some huddled masses for once?
But would these other countries want American workers?
Noting that Qatar “is estimated to have recently spent well in excess of $1 billion on Western art,” James Panero considers the implications:
On the one hand, Qatar’s art initiatives can be seen as a modernizing force, one that could liberalize the tribal attitudes of the country’s native population and pave the way for further political reform. On the other hand, contemporary art may merely serve as a cover for further repressive policies. This artifice of modernism mirrors Qatar’s other contradictory diplomatic positions. An ally to the United States and host of U.S. Central Command, Qatar nevertheless reportedly helped Khalid Sheikh Mohammed escape U.S. capture in the 1990s, may have been paying protection money to al Qaeda, and is currently arming radical Syrian rebels and offering safe haven to Egypt’s Muslim Brotherhood.
The terrible history of Iran demonstrates what can happen when a modernist culture merely overlays a repressive regime. In such circumstances, artists and organizations might profit by spreading modernity, but they are also abetting a compromised state.
(Photo of skyline in Doha, Qatar, as seen from the Museum of Islamic Art, by Mark Pegrum)
One reason, according to Singer, that people are so hesitant to give is that they think their kindness will not matter. This tendency to think that one can’t do very much, or to dismiss all forms of aid as useless, is what researchers call “futility thinking”: What difference can one person even make? Research indicates that money makes people more individualistic and less altruistic. In other words, as societies become wealthier, their citizens become more individualistic and depend less upon one another. Self-interest becomes the norm.
But one antidote against futility thinking is to carefully research charities and organizations—something that Singer’s $100 donation experiment allowed students to do. They were presented with four organizations, asked to research and discuss their merits, and vote on where the $100 should go. They applied the lessons they’d learned in the course: that not all donations are equal, and that some donations have a measurably more positive impact than the same amount donated elsewhere…. They learn that their money will always go much further overseas: that a very small amount of money for an American can be life-saving to someone who is desperately poor. In other words, they learn about the tenets of effective altruism: how to evaluate organizations for transparency and benefits, and figure out which forms of aid are the most cost-effective. This is information that tends to inspire more giving.
I’d guess we have 12-18 months until the population develops herd immunity to ‘viral’ headlines. http://t.co/x2GLKF6K8i
— Nate Silver (@fivethirtyeight) December 7, 2013
The tweet above is a small moment of hope in a sea of upworthiness.
Meanwhile, I cannot get Derek Walcott’s equation of poetry and prayer out of my mind. Nor Jay Parini’s understanding of Jesus as a sacral moment in the meeting between East and West. Alice Quinn gave us, in another sharp take on the power of poetry, the shards of Emily Dickinson’s genius – on little scraps of paper.
Five more: the Zionist reading of Bambi; the mystery of Caravaggio’s Saint Matthew, so dear to Pope Francis; the struggles of the forgotten lower middle class; why high-IQ kids turn out to be more likely to try drugs in later life; and Daniel Mendelsohn on the critical virtues of the emails students write to explain why their paper is late.
See you in the morning.
Mark Shea takes on Fox News, Limbaugh and Drudge in their ignorant, stupid, malevolent attacks on Pope Francis. It’s a rocking rant.
Allow me to recommend Jim Fallows’ latest post on freedom of speech and Max Blumenthal’s grueling book about the extremist elements in contemporary Israel, Goliath. The core point is that, whatever you believe about the arguments of the book, or of its author, it remains a powerfully reported account of actual people currently living in Israel, their attitudes and beliefs. You might imagine that Blumenthal’s selection of racist, extremist elements in Israeli culture obscures a larger truth, as we noted recently. But, even then, it is still a lesser truth that should be engaged, not ignored. He marshals facts. He talks to people directly. The idea that a book that delves into such empirical questions must somehow be repudiated or ignored is deeply illiberal.
[Blumenthal] has found a group of people he identifies as extremists in Israel—extreme in their belief that Arabs have no place in their society, extreme in their hostility especially to recent non-Jewish African refugees, extreme in their seeming rejection of the liberal-democratic vision of Israel’s future. He says: These people are coming, and they’re taking Israeli politics with them. As he put it in a recent interview with Salon, the book is “an unvarnished view of Israel at its most extreme.” Again, the power of his book is not that Blumenthal disagrees with these groups. Obviously he does. It comes from what he shows.
To see for yourself, just watch a few minutes of the video Blumenthal and his associates made a few months ago, about recent anti-African-immigration movements. The narration obviously disapproves of the anti-immigrant activists, but that doesn’t matter. The power of the video comes from letting these people talk, starting a minute or so in.
I don’t know how you can watch the video above without thinking of previous attempts in human history – a “cancer on our body!” – to demonize, persecute and expel marginal minorities in defense of a racially homogeneous country. Period. In a particularly glaring twist, the New York Times commissioned the video, then simply refused to air it.
Now I know I can be tedious about this kind of thing, and one shouldn’t engage a book merely because some want it branded anathema. But nonfiction is at its most urgent when forcing us to confront uncomfortable reality.
You can and should criticize that portrayal of reality for being untrue, or deceptive, or simply false. But not engaging it at all on empirical grounds is a sign of fear, not wisdom. That was my argument for airing “The Bell Curve” a couple of decades ago; it’s my argument for presenting Steve Jimenez’s reporting on the tragedy of Matthew Shepard on the Dish. It’s why my instinctive response to those who want a book ruled out of the discourse is to read it as a human being or air it as an editor.
It’s staggering to me that the New York Times, for example, has not reviewed (even critically) either Goliath or The Book of Matt. Why not? Either because they are cowards or because they genuinely believe that examining arguments that undermine core factions or lobbies – gay or Jewish – is somehow offensive in itself. Neither of these is a good argument. Both sustain denial.
Aeon captions the above short film, “Unusual Choices”:
In this gentle and honest portrait, Buddhist nun Ani Chudrun explains what led her away from a life of celebrity, drugs and materialism to one of reflection, compassion and ritual. Shot at Ani’s home in a forest in Sussex, the film shows how one person can radically transform their life in the search for meaning.
In the midst of a review of Francis Spufford’s Unapologetic, Alan Jacobs relays the story of a summer he spent teaching a course called “Practical Apologetics” to a group of Christian pastors in Africa. When they spurned his advice on how to reach out to Muslims, he realized there was something missing from his suggestions – as Jacobs puts it, there was “a human dimension to this enterprise that I had failed to take into account”:
Would-be apologists cannot think only of the needs of their audience; they must think also of their own limitations. Those limitations may be intellectual: as Sir Thomas Browne wrote in the 17th century, “Every man is not a proper champion for truth, nor fit to take up the gauntlet in the cause of verity. Many from the ignorance of these maxims, and an inconsiderate zeal unto truth, have too rashly charged the troops of error, and remain as trophies unto the enemies of truth.” They overrate their own intellectual capabilities, and embarrass not just themselves but the faith they had planned to defend.
But equally important are emotional or spiritual limitations. “I have found that nothing is more dangerous to one’s own faith than the work of an apologist,” [C.S.] Lewis wrote in 1945, when he was at the apex of his career as defender of the faith. “No doctrine of that faith seems to me so spectral, so unreal as the one that I have just successfully defended in a public debate.” The key word in that second sentence is “successfully”: the greatest spiritual danger presents itself not to the one who has manifestly failed (in Milton’s phrase) “to justify God’s ways to man,” but to the one who succeeds, or thinks he succeeds. And the greatest danger is not even pride: it is the discovery that a doctrine put into cold print, or into one’s own (fallen, fallible) mouth, loses much of its reality and power.
Bronowski saw the purpose of art as being the same as that of science: to give meaning and order to our experience by revealing hidden structure beneath the appearances. … Though he expressed it with characteristic understatement, his was a message of rebellion, a bold attempt to correct one of the great misconceptions of modern times and single-handedly to redirect the great river of intellectual history.
The misconception might—if its perniciousness were generally acknowledged—be called antihumanism: the pattern of ideas that disparages the human species, jeers at its claims to superiority over other species, or to any special entitlement, glories in its cosmic insignificance, and reinterprets its advent as nothing special and its subsequent progress as mostly illusory or fraudulent—and thus in all these senses, denies its ascent. Antihuman ideas would have seemed wicked or insane to the great majority of thinkers since the Enlightenment at least. But during Bronowski’s lifetime, many of them had become mainstream, to the extent of being taken for granted in academic and everyday discourse. So Bronowski’s rebellion is already there in his title: The Ascent of Man. He says that it refers to the “brilliant sequence of cultural peaks”—such as the invention of stone tools, agriculture, cities, and modern science—by which humans have learned how “not to accept the environment but to change it,” thereby improving our lives. And that this progress, despite continual setbacks, has been cumulative for as long as our species has existed.
Nearly 9 million people were mobilized to serve in Britain’s military during World War I. By the time photographer Giles Price started seeking veterans of the war in 2005, there were just 23 left. … His series, “The Old Guard,” features portraits of the last 12 veterans of the war, which broke out 100 years ago next summer. At the project’s start, Price wrote letters to each of the veterans requesting to take their photo. Thus commenced a race against time. “I was 20 minutes from taking one sitter when the home rang me to say he had passed that morning. He was 106,” Price said.
Price built a small studio in each of the homes he visited and used a studio flash to light the portraits. He shot the centenarians looking upward and ahead, in an effort to place less emphasis on their extreme age and more on the pride and dignity they retained over the years. “The gaze was one of reflection, be that the war, long life, or anything that we associate with time and memories,” Price said.