A Career In Busywork

by Patrick Appel

David Graeber believes that “bullshit jobs” are on the rise:

In the year 1930, John Maynard Keynes predicted that, by century’s end, technology would have advanced sufficiently that countries like Great Britain or the United States would have achieved a 15-hour work week. There’s every reason to believe he was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshalled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. Huge swathes of people, in Europe and North America in particular, spend their entire working lives performing tasks they secretly believe do not really need to be performed. The moral and spiritual damage that comes from this situation is profound. It is a scar across our collective soul. Yet virtually no one talks about it.

In response, Ryan Avent claims that “administrative jobs are the modern equivalent of the industrial line worker”:

Disaggregation may make it look meaningless, since many workers end up doing things incredibly far removed from the end points of the process; the days when the iron ore goes in one door and the car rolls out the other are over. But the idea is the same. …

The issue is not that jobs used to have meaning and now they don’t; most jobs in most periods have undoubtedly been staffed by people who would prefer to be doing something else. The issue is that too little of the recent gains from technological advance and economic growth have gone toward giving people the time and resources to enjoy their lives outside work. Early in the industrial era real wages soared and hours worked declined. In the past generation, by contrast, real wages have grown slowly and workweeks haven’t grown shorter.

Assistants Can’t Be Outsourced

by Patrick Appel

Farhad Manjoo hired a personal assistant based in India. It didn’t work out:

To be truly useful, an assistant needs to understand everything about your life and work. An assistant is a confidant. But it’s impossible to develop a deep, trusting relationship with a guy you know only by email—a guy who communicates with you in canned professional-ese, who must be monitored by security cameras to make sure he doesn’t rob you. TasksEveryDay’s assistants were pretty terrible, but even if they’d flawlessly handled every task I’d given them, they wouldn’t have been nearly as good as a local person. That person would cost a lot more. But it’d be worth every penny.

Afraid Of The Enemy Within

by Patrick Appel

Jesse Walker thinks that “Washington is petrified of itself”:

Washington is classifying documents at a remarkable rate. According to a report from the Public Interest Declassification Board last year, one intelligence agency alone classifies the equivalent of about 20 million well-stuffed four-drawer filing cabinets every 18 months. Nearly 5 million federal employees or contractors have access to at least some secret information. Even more have access to information that isn’t classified but might embarrass someone.

That creates a double bind: The more the government trusts someone with sensitive data, the more it has reason to fear that person. Trust breeds mistrust. It’s the sort of situation that might make a person paranoid.

Bruce Schneier explains how to minimize leaks:

The best defense is to limit the number of trusted people needed within an organization. [NSA Director General Keith] Alexander is doing this at the NSA — albeit too late — by trying to reduce the number of system administrators by 90 percent. This is just a tiny part of the problem; in the U.S. government, as many as 4 million people, including contractors, hold top-secret or higher security clearances. That’s far too many.

The 10,000 Hour Rule

by Patrick Appel

Peter Orszag thinks it has been debunked:

Like many others who read Malcolm Gladwell’s book “Outliers” when it came out five years ago, I was impressed by the 10,000-hour rule of expertise. I wrote a column (for a different publication) espousing the rule, which holds that to become a world-class competitor at anything from chess to tennis to baseball, all that’s required is 10,000 hours of deliberate practice. David Epstein has convinced me I was wrong. His thoroughly researched new book, “The Sports Gene,” pretty much demolishes the 10,000-hour rule — and much of “Outliers” along with it.

Gladwell protests:

“We’ve tested over ten thousand boys,” Epstein quotes one South African researcher as saying, “and I’ve never seen a boy who was slow become fast.” As it happens, I have been a runner and a serious track-and-field fan my entire life, and I have never seen a boy who was slow become fast either. For that matter, I’ve never met someone who thinks a boy who was slow can become fast. Epstein has written a wonderful book. But I wonder if, in his zeal to stake out a provocative claim on this one matter, he has built himself a straw man.

The point of Simon and Chase’s paper years ago was that cognitively complex activities take many years to master because they require that a very long list of situations and possibilities and scenarios be experienced and processed. There’s a reason the Beatles didn’t give us “The White Album” when they were teen-agers. And if the surgeon who wants to fuse your spinal cord did some newfangled online accelerated residency, you should probably tell him no. It does not invalidate the ten-thousand-hour principle, however, to point out that in instances where there are not a long list of situations and scenarios and possibilities to master—like jumping really high, running as fast as you can in a straight line, or directing a sharp object at a large, round piece of cork—expertise can be attained a whole lot more quickly. What Simon and Chase wrote forty years ago remains true today. In cognitively demanding fields, there are no naturals.

Earlier Dish on Epstein’s book here.

Not All Cancer Kills

by Patrick Appel

Virginia Postrel discourages grouping different types of cancer together:

[T]hough [prostate cancer specialist Peter] Carroll thinks calling slow-growing prostate tumors “cancer” is important to encourage vigilance, [urologist Ian] Thompson wants to change the nomenclature, using the term IDLE (indolent lesions of epithelial origin) to describe low-risk cases where waiting isn’t likely to make a difference. Just using the word “cancer,” he argues, creates unnecessary suffering.

“The number of people that will die from those slow-growing prostate cancers is really low,” he says, but the unacknowledged costs of giving them a cancer diagnosis are huge: “the person who can’t sleep for two weeks before his next test results, and all the follow-up biopsies and all the lost wages, and the people who can’t get life insurance because they now have a new cancer diagnosis, the person whose firm says, ‘Well, we’re concerned you have cancer and therefore you can’t be promoted to this job.’”

It’s a compelling case, but changing the vocabulary finesses the fundamental cultural issue: the widespread and incorrect belief that “cancer” is a single condition, defined only by site in the body, rather than a broad category like “infectious disease.” Someone doesn’t develop “cancer” but, rather, “a cancer.” How frightening that diagnosis should be depends on which one.

Deciding Between The Decades

by Patrick Appel

The results of a recent YouGov survey:

[Chart: “Time Travel” — YouGov survey results on which decade Americans would most like to visit]

Drum breaks down the numbers by age:

[T]he most popular choice of nearly every age group is a decade of their youth. Millennials like the ’90s, when they were growing up. My generation likes the ’80s, when we were just out of college. Only the thirtysomethings seem not to care, showing no particular preference for any decade between the ’50s and ’90s.

But it’s the nostalgia of seniors for the ’50s that intrigues me the most. I’d love to see a demographic breakdown of that. I assume that nonwhites aren’t pining away for that era, which means that white seniors must really be in love with it to produce such a high overall number.

Millman compares Republicans and Democrats:

Republicans and Democrats largely agree about the decades before the 1980s. Democrats actually like the Republican-dominated 1920s better than Republicans do, and Republicans slightly prefer the Kennedy-Johnson ’60s over Democratic views of that decade, but from the 1900s through the 1970s, memories move roughly in tandem. Then, in the 1980s, there’s a split, as Republicans prefer the ’80s somewhat over the ’70s, while Democrats feel the opposite. And then, with the ’90s, there’s a huge disparity, with Democrats preferring them slightly over the ’80s, and roughly in-line with the ’50s, ’60s and ’70s, while Republicans loathe the ’90s only slightly less than they do the ’30s or the ’10s.

And, finally, Yglesias asks why the 1990s gets so little love.

What Egypt Has Become

by Patrick Appel

Jon Lee Anderson compares Egypt’s military junta to the former military dictatorships of Latin America that the US supported in the name of fighting Communism:

Today’s Islamists can be yesterday’s Marxists, it seems: killable on behalf of notional constructs of law and order. … The no-holds-barred military terror in Egypt, and the language the military is employing to justify it, is reminiscent of the worst of human legacies. These are the sort of statements made not by ordinary armies but by armies that have embraced ideological convictions that make it easy to shoot down people in the streets, even civilians, if you believe that they are with the terrorists—or whatever it is you decide to call them.

Nathan Brown looks ahead:

What is clear now is that Egypt’s constitutional moment is over.

The hope born in the 2011 uprising was that diverse political forces would come to an agreement on the rules of politics — ones that would protect human rights, provide for a popular voice in governance, and devise mechanisms of accountability, and do such things in ways that were broadly accepted. That hope is not just dead; it was murdered by the country’s feuding leaders. The question is no longer whether the current course is the wisest one for Egypt — it almost certainly is not. But this is the choice that Egyptian leaders have made for each other.

The result, while it is based on a destruction of the hopes of 2011, is one that will have recognizably democratic elements (elections, a multiparty system, civilian leaders). It will likely establish itself as operational even if it does not provide full stability or social and political peace. Its actual working will enable rather than avoid repression. Egypt’s international interlocutors in the West may have advised against this path, but they will have to decide soon whether or not to accept it. The current regime’s insistence that this is a sovereign decision will make Western governments uncomfortable for now but they will likely ultimately accept it. They will still face the question of whether to treat it as a distasteful autocracy or a flawed but aspiring democracy — or whether to bother to make the distinction.

Who Should Cover Climate Change?

by Patrick Appel

Everyone:

Climate change is about rapidly accelerating changes in the substrate of modern civilization, the weather patterns and sea levels that have held relatively steady throughout all advanced human development. By its nature, it affects everything that rests on that substrate: agriculture, land use, transportation, energy, politics, behavior … everything. Climate change is not “a story,” but a background condition for all future stories. The idea that it should or could be adequately covered by a subset of “environmental journalists” was always an insane fiction. It is especially insane given the declining numbers who identify themselves as such.

We need to disentangle the fate of environmental journalism from media coverage of climate change. The two need not be connected. The pressing, nay existential imperative to divert from the status quo and radically reduce greenhouse gas emissions is necessarily enmeshed in all major human decisions. And so journalists who cover those decisions, whatever their “beat,” need to understand how climate change, as a background condition, informs or shapes the decisions. In journalism, as in other fields, climate needs to be freed from the “environmental” straitjacket.

How Choice Fuels Consumerism

by Patrick Appel

Derek Thompson finds evidence that “choices reduce anxiety by making us feel like we’ve searched exhaustively — and now we’re ready to buy”:

A few years ago, Williams-Sonoma had a problem. They couldn’t get anybody to buy their breadmaker, which retailed for $279. So they did something that might strike you as bizarre: They started selling a $429 model, as well. Of course, nobody bought the expensive version. But sales of the cheaper model doubled.

Why?

You could offer a few reasons. One is that the fundamental rule of prices is that consumers don’t know what anything should cost (especially breadmakers) so we’re persuaded by clues. The expensive breadmaker here acted as a clue — a decoy that told other shoppers: Hey, you are getting an amazing deal on this breadmaker!

But another possible reason is that we’re incredibly reluctant to buy certain items — especially expensive items — when only one option is presented. If you walked into a Best Buy store, and there was only one TV left, would you buy it? Even if it was pretty much what you were looking for? [Daniel] Mochon’s research [pdf] would suggest that many of you wouldn’t.

Addicted To The Daily Grind

by Patrick Appel

Jordan Weissmann rounds up research on workaholics:

Even as the precise outlines of workaholism remain a bit fuzzy, various studies have tried to identify its physical and emotional effects. At the risk of carrying on like a Pfizer ad: research has associated it with sleep problems, weight gain, high blood pressure, anxiety, and depression [3]. That’s to say nothing of its toll on family members. Perhaps unsurprisingly, spouses of workaholics tend to report unhappiness with their marriages [4]. Having a workaholic parent is hardly better. A study of college undergraduates found that children of workaholics scored 72 percent higher on measures of depression than children of alcoholics. They also exhibited more-severe levels of “parentification”—a term family therapists use for sons and daughters who, as the paper put it, “are parents to their own parents and sacrifice their own needs … to accommodate and care for the emotional needs and pursuits of parents or another family member” [5].

How many people are true workaholics? One recent estimate suggests that about 10 percent of U.S. adults might qualify [6]; the proportion is as high as 23 percent among lawyers, doctors, and psychologists [7].