A&E is producing a To Catch a Predator-like show about sex workers called 8 Minutes:
Executive producer Tom Forman describes the premise:
[Kevin Brown is] a cop in Orange County who retired a couple years ago, then devoted himself full time to his church. So he was a full-time cop turned full-time pastor. As a cop, he worked vice. He saw girls who had no place else to go, who had been abused by their pimps, girls who really needed a helping hand, and what he had to do as a cop is arrest them. Now that he’s running a church, he can offer them that help.
And Brown has eight minutes to make his case to a woman each episode, hence the title. Samantha Allen pans the series:
Kevin Brown’s vigilante preacher schtick is “disappointingly unsurprising,” as Lane Champagne, a community organizer for Sex Workers Outreach Project New York City (SWOP-NYC) tells me in a phone interview.
The idea that all sex workers are victims who want to be “rescued” is so pervasive that some clients regularly try to convince sex workers to give up their trade—they just usually don’t bring a camera crew along with them. As Champagne suggests in her own critique of 8 Minutes for a sex workers group blog, Brown’s holier-than-thou angle is not a far cry from the “well-meaning but kind-of-a-dick regulars who fall in love with you and think it’s a compliment to say [things] like, ‘You’re so much better than this.’”
And in an e-mail interview, Kaelie Laochra, a sex worker who runs the popular Respect Sex Work Twitter account, agrees that clients who attempt to “save” her are often the most inadvertently insulting.
Elizabeth Nolan Brown recently called the program “an abomination that should never, ever have gotten the greenlight”:
Seeing someone’s photo online and then proceeding to track them down IRL and secretly monitor their movements would, under other contexts, be considered stalking. But apparently anything goes when your aim is to “save” women from exerting their own agency.
In an interview discussing a new volume of essays he edited about the French philosopher, Daniel Zamora portrays Foucault, especially in his later years, as more friendly to and fascinated by neoliberals Hayek and Friedman than many of his votaries on the academic left want to believe. Zamora claims to have been “astonished by the indulgence Foucault showed toward neoliberalism”:
[H]e saw in it the possibility of a form of governmentality that was much less normative and authoritarian than the socialist and communist left, which he saw as totally obsolete. He especially saw in neoliberalism a “much less bureaucratic” and “much less disciplinarian” form of politics than that offered by the postwar welfare state. He seemed to imagine a neoliberalism that wouldn’t project its anthropological models on the individual, that would offer individuals greater autonomy vis-à-vis the state.
Foucault seems, then, in the late seventies, to be moving towards the “second left,” that minoritarian but intellectually influential tendency of French socialism, along with figures like Pierre Rosanvallon, whose writings Foucault appreciated. He found seductive this anti-statism and this desire to “de-statify French society.” Even Colin Gordon, one of Foucault’s principal translators and commentators in the Anglo-Saxon world, has no trouble saying that he sees in Foucault a sort of precursor to the Blairite Third Way, incorporating neoliberal strategy within the social-democratic corpus.
Dan Drezner nods, telling conservatives and libertarians to take a second look at their unlikely ally:
One of the virtues of teaching at a policy school is that Foucault is not quite as central to scholarly conversations as in traditional humanities departments. That said, Zamora’s observation rings true — which is why conservatives should embrace him and his work. From a conservative perspective, the great thing about Foucault’s writing is that it is more plastic than Marx, and far less economically subversive. Academics rooted in Foucauldian thought are far more compatible with neoliberalism than the old Marxist academics.
In some ways, Zamora’s book is an effort by some on the left to try to “discipline” Foucault’s flirtation with the right. It will be interesting to see the academic left’s response to the book. But Zamora also reveals why free-marketeers might want to give Foucault another read and not just dismiss him with the “post-modern” epithet.
Update from a reader:
This being an area I am quite familiar with, let me just say that there is very little new in Zamora’s “discovery.” Only the petty bourgeois of identity politics in US academia could ever embrace Foucault as “left-wing.” And they sure did, bereft that they were of any context or any economic culture (let alone knowledge).
It’s not that Foucault is “right-wing” or “left-wing.” I am not even sure that he could be pegged on some sort of “progressive vs conservative” scale. On the one hand, he once stated that everything that is historically constructed can be politically changed. On the other hand, he blurted out the mother of all elisions in Discipline and Punish (if I remember well): talking of the invention of biopolitics, the thrust by the modern state to catalogue, normalize and regiment living beings (with a particular focus on humans), he wrote “this all took place on the backdrop of the industrial revolution.”
And that was that for economic forces. One sentence! The gall of that man! It’s hilarious. It was a dig at a certain naive and mind-numbing strand of Stalinist historical materialism in vogue in France at the time (the irrelevant and awful Althusser, Sartre, Bourdieu: in the late ’60s-early ’70s Foucault was cleverly waging a positional war against the grand poobahs of France’s intelligentsia).
Of course this played very well in US comp-lit departments, where all things economic or material are sneered at. Now you could be a true armchair radical, no need to worry or study actual economics and the history of capitalism. Boring. Besides, Foucault himself thought it was useless. And that way, as a tenured radical on a US campus, you did not have to question the society, the economic incentives and the motivations that had led you there in the first place. That is, the booming market for mass education in the late sixties. Suddenly any small-town middle-class American, low on cultural capital but armed with good grades, could legitimately compete for a tenured job at any university nearby. Fucking boomers.
Not saying it’s all bad, but hey, you must take the bad with the overall good. Incredible development in human capital comes with a few philistines. It’s a small price to pay.
In an Intelligent Life roundtable on the world’s best cuisine, Josie DeLap argues that Iranian food is underrated:
Politics has kept Iranian food tucked away in the Tupperware box of the Islamic Republic. Other Middle Eastern cuisines are brazen. Lebanon flaunts its sophistication. Morocco flourishes its tagines, with their fruit and meat, so cleverly combined. Turkey brandishes its breads and flashes its kebabs. Who thinks of Iran?
And yet this is the source of it all. Cultivated over millennia, enhanced by numerous invasions both launched and endured, Iranian food has a subtlety and intricacy unrivalled but unrecognised—at least by outsiders. Who knows of its jewelled rice, studded with ruby barberries, flickers of sour sweetness, amid rice gold-stained with saffron, run through with shards of pistachios? Who has heard of caramelised sohan, a nutty brittle, produced mostly in Qom, Iran’s holiest city, its buttery excess so at odds with the austere piety of its creators? What of kuku sabzi, an omelette thick with fistfuls of coriander, parsley, dill, chives, tarragon, fenugreek? The world is missing out.
Katherine Rundell, for her part, sticks up for English cuisine, writing that “British food is best when it has heart, literally as well as figuratively.” Bee Wilson maintains that French is foremost:
French cuisine can be seen as passé and unhealthy. Sure, it’s delicious, but who wants to eat all that heavy meat in fancy Escoffier sauces any more? To dismiss it in this way is to neglect the fact that it has always been about much more than Michelin pretension. Its genius can be seen in delicate fish soups with a dollop of fiery rouille; rare onglet steak and salads of green beans; tiny wedges of big-tasting cheese. It’s there in the habit of avoiding snacks between meals, not from self-denial, but because hunger is the best sauce. French cuisine is the best because it’s founded on an understanding of how to square the circle of pleasure and health.
But Fuchsia Dunlop finds Chinese food does a better job:
No other culture lays such an emphasis on the intimate relationship between food and health. The everyday Chinese diet is based on grains and vegetables, with modest amounts of meat and fish, and very little sugar—a model for healthy and sustainable eating. A good Chinese meal is all about balance: even a lavish banquet should leave you feeling shufu—comfortable and well.
John Paul Rollert, who braved a recent Objectivist conference held in Las Vegas, understands both the philosophy and the city to be escapes from reality. As an example, he cites the response of Yaron Brook, the executive director of the Ayn Rand Institute, to an attendee’s question about the homelessness and poverty he saw all along the Strip:
Brook’s response began unevenly, detouring through an observation about the malice of minimum-wage laws and a presentist history of the progressive era before turning to the young man’s question. “None of these phenomena that you’re seeing out there, homeless people and so on, are phenomena of capitalism,” he declared. The people outside the gates of The Venetian, hustling in the 111-degree heat, their fates are the “phenomena of mixed economy,” the side-effects of social welfare policies and regulations. They exist despite capitalism, not because of it.
The young man did not seem entirely satisfied with the answer, and Brook, himself, seemed hesitant.
By and large, when it comes to questions about the structural shortcomings of capitalism, the most persuasive answers will be of a dry and technical nature. They won’t savor of the sulfurous clash between the forces of good and evil, an ideological battle to which Objectivists might not only contribute, but one which (if you take their word for it) they are destined to lead. “There is nobody out there who can talk about self-esteem, about individualism, and about capitalism with the moral certainty and the moral fervor we can,” Brook declared. “Objectivism is the only bulwark to what the Left is doing. The fate of Western Civilization depends on what we do.”
This is the familiar pledge of a radical philosophy. To the unaccustomed ear, it can sometimes sound like a clarion call, piercing, at last, the din of confusion. Otherwise, it can seem like the unnerving pitch of the card-clicker, an invitation to a strange and sinister world that one is very relieved to escape.
An excellent movie. The obviously unfit individuals are winnowed out through a series of entrepreneurial tests and, in the end, an enterprising young boy receives a factory. I believe more movies should be made about enterprising young boys who are given factories. —Three and a half stars. (Half a star off for the grandparents, who are sponging off the labor of Charlie and his mother. If Grandpa Joe can dance, Grandpa Joe can work.)
Writing in Nature, Arturo Casadevall and Liise-Anne Pirofski urge microbiologists to retire the pathogen paradigm, arguing that “the focus on microbes is hindering research into treatments”:
The term pathogen started to be used in the late 1880s to mean a microbe that can cause disease. Ever since, scientists have been searching for properties in bacteria, fungi, viruses and parasites that account for their ability to make us ill. Some seminal discoveries have resulted — such as the roles of various bacterial and fungal toxins in disease. Indeed, our oldest and most reliable vaccines, such as those for diphtheria and tetanus, work by prompting the body to produce antibodies that neutralize bacterial toxins.
Yet a microbe cannot cause disease without a host.
What actually kills people with diphtheria, for example, is the strong inflammatory response that the diphtheria toxin triggers, including a thick grey coating on the throat that can obstruct breathing. Likewise, it is the massive activation of white blood cells triggered by certain strains of Staphylococcus and Streptococcus bacteria that can lead to toxic-shock syndrome. Disease is one of several possible outcomes of an interaction between a host and a microbe.
It sounds obvious spelled out in this way. But the issue here is more than just semantics: the use of the term pathogen sustains an unhelpful focus among researchers and clinicians on microbes that could be hindering the discovery of treatments. In the current Ebola epidemic in West Africa, for instance, much attention has been focused on the ill and the dead, even though crucial clues to curbing the outbreak may be found in those who remain healthy despite being exposed to the virus.
Reviewing Walter Isaacson’s The Innovators, James Surowiecki draws out an important lesson – we shouldn’t romanticize the role of lone geniuses:
That may sound odd, since the story of invention is usually told as a story of great inventors. But as Isaacson reveals, the true engine of innovation is collaboration. The pairing of a creative visionary and a more practical engineer (such as John Mauchly and J. Presper Eckert, who created ENIAC, or Steve Jobs and Steve Wozniak at Apple) can be enormously productive. And it isn’t just strong pairs, either; the organizations that have done best at innovating have typically been those that have relied on strong teams made up of diverse thinkers from lots of different disciplines. …
One of the reasons diverse teams have tended to be more successful is that they have done a better job of turning ideas into actual products.
This is an important theme in Isaacson’s book: genuine innovations are not just about brilliant insights. They’re the result of taking those insights and turning them into things that people will actually use and then finding a way to get those products into people’s hands. One of the more interesting sections of The Innovators is Isaacson’s account of John Atanasoff’s quixotic quest to build a general-purpose computer by himself in the early 1940s. Atanasoff anticipated important aspects of what would become ENIAC and constructed a prototype. But because he worked alone, in Iowa, rather than in a lab with other scientists and engineers, his computer never became fully functional, and he became a footnote to history, eclipsed by Mauchly and Eckert. Isaacson takes Atanasoff’s efforts seriously, but he notes that “we shouldn’t in fact romanticize such loners.” Real innovation isn’t just about an invention. As Eckert put it, “You have to have a whole system that works.” And that’s hard to do when you’re all by yourself.
Helen Thompson characterizes the holiday plant as “basically a vampire,” calling it “a parasite that spends its days sucking the ‘lifeforce’ from trees round the globe”:
Mistletoe’s parasitism starts with poop and exploding berries. Mistletoe bushes clump on branches like something out of a Dr. Seuss book. Their parasitism is airborne. Birds eat their berries, which are coated in gluey material called viscin. The birds poop all over the forest, and thanks to the viscin, the mistletoe seeds in said poop stick to branches. Once firmly attached to the branch, mistletoe sprouts and drills down into the branch until it reaches the tree’s veins. It sticks a haustorium (basically a straw) in and sips the tree’s mineral and water cocktail.
Another group of mistletoes, dwarf mistletoes, does things a bit differently. In a dramatic twist on mistletoe reproduction, their seeds explode, literally. The blast zone can reach up to 15 feet. Seeds stick to saplings and wedge themselves into the tree’s innards, infecting the entire tree, and sprouting sometimes years later. These guys are full parasites, taking sugar, water, and minerals from the tree. “Dwarf mistletoe is freaky, freaky, freaky stuff,” says David Watson, an ecologist at Charles Sturt University in Australia. “Its [shoots] look like miniature asparagus.”
Eventually, the mistletoe bush grows, blooms, and forms berries, and the cycle begins anew.
For a more festive take on the plant, check out the below Broad City webisode, which follows Abbi’s pursuit of her first under-the-mistletoe kiss:
In his sci-fi series Black Mirror, the great Brit Charlie Brooker uses his mordant wit to send up technologically-dependent modern life. The series’ newest episode, White Christmas, is a holiday special critics are hailing as “about as festive as being bludgeoned to death by a stocking full of coal, but … also an unerringly brilliant piece of lo-fi sci-fi.” Louisa Mellor sums up the episode’s central conceit:
Taking Google Glass to its logical conclusion, the world of White Christmas is populated by augmented humans whose Z-Eye implants let them control and share what they see. It’s in this kind of technological advance – one that doesn’t seem far off in the realm of possibility but that has the potential to shatter human relationships – that Black Mirror specialises.
The episode features Jon Hamm as Matt, who makes a living off such advanced tech. Sam Wollaston thinks Brooker’s “dystopia isn’t outrageous, it’s plausible, and all the more terrifying for it. Less sci-fi, more like now after a couple of software updates”:
Matt’s day job … – working for Smartintelligence, a company that, accompanied by Rossini’s Thieving Magpie overture, extracts “code” from people under anaesthetic, stores it in a widget called a cookie before implanting it into a simulated body, which is what happens to Greta (Oona Chaplin) – well, that’s basically human cloning. There’s probably a company in South Korea that’s about five minutes away from doing that. There’s a company in South Korea that thinks it can clone a woolly mammoth from a piece of 40,000-year-old frozen mammoth meat; putting people’s code inside plastic eggs like this, before turning them into slaves, has got to be easier.
Yes slaves. Because this isn’t just about the technology, it’s about the issues – very real world ethical issues – surrounding the technology. It’s about slavery and morality and torture and separation and access to children as well as the technology, and what the technology does to us. It’s about people, which is its real beauty. Along with all the razor-sharp wit, the nods and the winks, it manages to be a very human story.
Willa Paskin also finds the episode eerily relevant:
In “White Christmas,” wearable tech has advanced beyond the rudimentary stages of Google Glass and become Z-eyes, irremovable implants that let you take pictures and record things and, if you must, “block” other people. As used in “White Christmas,” blocking makes the person who is blocked and the person who has done the blocking look like grey, fuzzy outlines to one another. They can’t hear each other or communicate, and the blocked party has no recourse.
The terminology suggests a lineage with blocking someone on Facebook or Twitter. But here in 2014 we tend to understand blocking as essentially protective: It insulates people from unwanted attention and (often misogynistic) threats—though of course it also keeps ex-friends and other irritants out of your feed. While Black Mirror understands the protective quality of blocking—the episode expressly deals with some messy, vile misogyny—it is more concerned with the ways blocking can be abused. In “White Christmas,” blocking is largely something women do to men in lieu of communicating with them. It’s the technological equivalent of sticking one’s fingers in one’s ears, an all-powerful silent treatment that can, sometimes, take on draconian legal backing.
If that sounds a bit wrong-headed, “White Christmas,” like the most disturbing episode of Black Mirror that exists, the similarly titled “White Bear,” asks questions about what we are willing to do to the least worthy amongst us: the convicted, the guilty, the criminal. When technology makes it easy to ignore, ostracize, manipulate, and torture the worst of us, might not the rest of us comply?
Merry Christmas.
The Christmas special is currently only available outside the UK on DirecTV, but other would-be viewers can stream the previous two seasons on Netflix. The show, as Emily Yoshida explained last year in a spoiler-free guide, is best watched without much foreknowledge.
Bridget’s cards won their internet moment not just because they’re clever and funny, but because they articulate that thought and the way it can become so explosive during the holidays: All this jollity rings false to me. Everyone, no matter what their relationship status, feels a little like Bridget this time of year — outcast, exiled, discontent; her cards capture a sentiment heartfelt enough to catch the attention of millions of newsfeed browsers but also noncontroversial enough to bypass the conformity threshold. Hark! Bridget has tapped our inner misfit, which emerges in a guffaw like a genie from a bottle. Isn’t that a gas?
Meanwhile, Sarah Condon ponders yuletide misery from a theological point of view:
Whether or not we realize it, we pointedly deny the harsh realities of our lives this time of year. I know I do. The moment we remember Jesus coming into the world is the same moment we hold up our perfectly posed Christmas photos. To the one who came to save us we say, “See how good we look? We are totally pulling this off.”
Only, we are not pulling it off, not even remotely. We are disappointed in our children, our spouses, the world, and ourselves. We deny our feelings of loneliness and inadequacy. We scurry to hide them. And the season provides some pretty amazing crutches for our denial: Jim Beam, party mix, and online shopping. All of a sudden, it becomes easy to make failure look like success. To make heartache look like a mild hangover.
Isn’t it odd that, more than any other time of year, Christmastime is when we want everyone to know just how glorious our lives are? I’m already bracing myself for the post-Christmas newsfeed. We can all gather round the old iPhone and sing a hymn of Sanctification by Gift Giving. For the record #besthusbandever #fairtrade and #santarocks are my own self-righteous picks for the season.
But it doesn’t have to be this way. We are actually allowed to admit that we are screwed up, yes, even at Christmas. Let the record show, St. Paul already gave us the bones in Romans 7: I do not understand what I do. For what I want to do I do not do, but what I hate I do.
I know you are not prone to posting Norwegian music videos, but this Santa exposé from a children’s program in 2005 caused some turmoil when every five-year-old in the country was told at the same time that “Santa does not exist”. The video is quite self-explanatory – and very funny! The refrain goes something like “Santa does not exist, it’s just bullshit.”
Another perpetuates that theme:
I love the readers’ stories on learning about Santa Claus. Mine’s probably not as interesting as some of the others, but here it is anyway. I was eight and was deeply fascinated by science. I had just learned about the experimental method and decided to see if I could apply it. The test was simple: I asked Santa for one set of things and told my parents that I had asked for a different set. I then waited to see which I would get. As I suspected, I got the second set and that was that.
Another reader:
It just happened an hour ago with my two kids. We went to a department store to see Santa. After they saw him, the older one said, “If Santa kept us waiting for half an hour, then how can he be clever enough to deliver presents to a million children?”
Another pins the blame on a Muslim:
I think I hung on to the Santa myth a little longer than most kids around me. One day, as I was playing on the floor, I noticed that Muhammad Ali was being interviewed on TV. I had no idea what he was talking about, but I distinctly remember him saying something was a lie, that it was a lie just like Santa Claus. At that moment, it all came to me – the flying reindeer, giving toys to every child in the world in one night, climbing down the chimney even though we didn’t have a fireplace. All of it was exposed as conspiratorial fraud.
Another got greedy:
I was maybe 7 years old, and my dad told me that if you put a list of what you want in your stocking, Santa will bring it for you. I remember that I wrote down every possible toy I could think of and put it in my stocking. My mom took the list out and gave my dad the dirtiest look ever – that was when I knew.
How another’s consciousness was raised:
When I was 7 years old, I got a Super Nintendo for Christmas, and I was probably the most excited boy in the entire world. A lot of my friends wanted Super Nintendos that Christmas, but I and maybe one other kid actually got one. But for some reason, I didn’t think that was fair. Why did I get one when ALL my friends wanted one too?
I decided to ask my mom why Santa wasn’t fair in gift giving. All of my friends, at least to my knowledge, were “good” too, so why did Santa not give them Super Nintendos too? It was then that my mom decided to tell me how Santa worked.
As I got older, this was a lesson that stuck with me when I had kids of my own. It was a sort of realization that what we are doing is equating “being good” with our parents’ socioeconomic status. My wife and I talked a lot about it and decided that we wouldn’t “do Santa” with our kids. We didn’t want our kids thinking that they were “better” than other kids because we could afford to give them a nice Christmas and other kids’ parents might not be able to. Our kids know of Santa and what he represents at Christmas time, but we prefer for our kids to appreciate that their gifts come from love and, to an extent, our financial capabilities, not from their behavior.
Another gets a bit morbid, even for us:
It’s good to teach children to believe in Santa Claus, because they need to learn, early on, that those they love and trust will conspire to deceive them.
Another ends on a cheerier note:
One Christmas, when we had no money because I was unemployed, I spent hours out in the barn in my workshop building a dollhouse for our two girls, aged four and six. It was a replica of our house. We got them scaled people and furniture to go with it. It turned out really nice.
On Christmas morning, I did the whole “Waltons” thing, stolen from “The Homecoming.” I whipped a big rock onto the roof above their bedrooms and made a lot of noise downstairs until they came pounding down. I told the girls that I had just caught a burglar in the house, a fat guy in a red suit with a bag of our stuff. I told them that I had whipped a rock at him and that he had run away and jumped in a sled. I told them that some packages had fallen out of his bag when he ran. They saw the dollhouse and the people and the furniture and their eyes went wide. Merry Christmas.
I am getting to the point. A couple years later, when they were about six and eight years old, the older neighbor girl clued our girls in on the Santa thing. They came back for dinner and asked us directly whether Santa was real. I don’t lie to my kids, so I told them that no, there was no Santa.
“Then who made the dollhouse?”
My wife pointed at me.
Their eyes went wide again. I was Santa. Merry Christmas times two.