Porn is always two clicks away, and hovers at the edge of so many conversations, from analyses of Girls to sending messages on phones to the NSA. The problem, however, is that there are costs to even talking about pornography. This is true even in our supposed bastions of intellectual freedom, as several of the articles make clear. “I have been told ‘You don’t want to be “the porn guy”’ and ‘you will have to deal with the content issue of your work,’” writes Nathaniel Burke in his essay “Positionality and Pornography.” I’d heard similar things from journalists, male and female alike. Very few people want to be “the porn guy.” And so researchers and critics choose to do work on less fraught, less important topics. Perhaps having a publication that serves as a gathering place will create some strength in academic numbers.
Felix Salmon links Zuckerberg’s decision to acquire WhatsApp back to Facebook’s “stroke-of-genius” decision to go public in 2012 and conquer the mobile market through acquisitions:
Zuckerberg knew, circa Facebook’s IPO, that his company was not good at mobile: it didn’t have the problem solved. And he knew that asking his existing corps of engineers to turn their attention to mobile would probably not work. But the good news was that he was now running a public company, with lots of cash, and a highly-valued acquisition currency in the form of Facebook stock. …
Facebook bought Instagram for $1 billion in 2012 not because the product was particularly great, but because the product was insanely popular. The same was true when Zuckerberg offered $3 billion for Snapchat. Sometimes, lightning strikes. And while Facebook is happy writing its own mobile apps in the hope that lightning will strike them, it knows better than to count on such a thing happening. If you want to be certain that hundreds of millions of people are using your mobile products, the only way to do that is to buy mobile products which hundreds of millions of people are using.
Peter Yared sees another rationale for Facebook buying WhatsApp: tapping into people’s mobile contacts.
Pascal-Emmanuel Gobry chides the media for buying that paper predicting the death of Facebook. Why “any journalist with, not a science degree, but with a lick of common sense, could have figured out that the study wasn’t reliable”:
The study uses an epidemiology model. Many stories pointed this out, so they read this part! This tells you two things: 1) this study is based on a model, i.e. an abstract and formal representation of the world, not experimentation, which is the evidentiary gold standard in science. When you have a model, you have an idea and a spreadsheet. You don’t have evidence. 2) An epidemic is when lots of people get a disease. Facebook is a website that people sign up for. Those two things are not the same thing! At all! (Insert your own joke here.) You can apply an epidemiology model to Facebook. You can apply a macroeconomic model (what’s Facebook’s demand curve?). You can apply a financial model. You can really apply any model–a “model” is just a fancy word for playing lego with numbers. You can use all sorts of lego to do all sorts of things, but it doesn’t mean your lego “plane” looks anything like a plane, much less will fly.
Dish readers got there first. Nonetheless, Facebook does have legitimate reasons to be concerned about its declining popularity:
Kottke argues that the once-relevant medium has evolved into something new and disparate:
Instead of blogging, people are posting to Tumblr, tweeting, pinning things to their board, posting to Reddit, Snapchatting, updating Facebook statuses, Instagramming, and publishing on Medium. In 1997, wired teens created online diaries, and in 2004 the blog was king. Today, teens are about as likely to start a blog (over Instagramming or Snapchatting) as they are to buy a music CD. Blogs are for 40-somethings with kids.
Instead of launching blogs, companies are building mobile apps, Newsstand magazines on iOS, and things like The Verge. The Verge or Gawker or Talking Points Memo or BuzzFeed or The Huffington Post are no more blogs than The New York Times or Fox News, and they are increasingly not referring to themselves as such. … Sites like BuzzFeed and Upworthy aren’t seeking traffic from blogs anymore. Even the publicists clogging my inbox with promotional material urge me to “share this on my social media channels” rather than post it to my blog.
He elaborates on his “deliberately provocative” argument:
[A]s someone who’s been doing it since 1998 and still does it every day, it’s difficult to ignore the blog’s diminished place in our informational diet.
Christopher Orr considers Her the best film of the year:
Her is a remarkably ingenious film but, more important, it is a film that transcends its own ingenuity to achieve something akin to wisdom. By turns sad, funny, optimistic, and flat-out weird, it is a work of sincere and forceful humanism. Taken in conjunction with [Spike] Jonze’s prior oeuvre—and in particular his misunderstood 2009 masterpiece Where the Wild Things Are—it establishes him firmly in the very top tier of filmmakers working today. Like Eternal Sunshine of the Spotless Mind—of which Her is a clear descendant—Jonze’s film uses the tools of lightly scienced fiction to pose questions of genuine emotional and philosophical weight. What makes love real: the lover, the loved one, or the means by which love is conveyed? Need it be all three?
Her isn’t, in the end, a political or socio-cultural satire, much less a nostalgic tract about the need to throw away our devices and truly live. It’s a wistful portrait of our current love affair with technology in all its promise and disappointment, a post-human Annie Hall.
Brett McCracken compares the protagonist to the operating system:
Eric Gibson thinks overzealous smartphone users are ruining the museum experience:
Rather than contemplating the works on view, visitors now pose next to them for their portrait. In pre-digital photography the subject was the work of art. Now it is the visitor; the artwork is secondary. Where previously the message of such images was “I have seen,” now it is “I was here.”
If visitors now regard a museum’s treasures as mere “sights,” they might come to regard the institution itself in a similar vein—not as a place offering a unique, one-of-a-kind experience but just another “stop” on a crowded itinerary, and as such interchangeable with any other. At the very least, it’s hard to see how this new culture of museum photography can fail to undermine the kind of long-term visitor loyalty to museums toward which so many of their public engagement efforts are directed. On the one hand, the visitor who makes an emotional connection with a work of art is likely to return. On the other hand, I can’t imagine there are many tourists who, having once had themselves snapped propping up the Leaning Tower, feel compelled to do so again.
Silvia Killingsworth responds to Oxford Dictionaries choosing “selfie” as “Word of the Year”:
Strictly speaking, the modern-day selfie is a digital affair, but it’s a novel iteration of an old form: the self-portrait (a friend on Twitter joked, “was Lascaux the first selfie?”). As Kate Losse points out in her excellent primer, a notable point of inflection in the selfie’s recent meteoric rise was the addition of a front-facing camera to the iPhone 4. A selfie doesn’t even have to be of one’s face; my colleague Emily Greenhouse described Anthony Weiner as “a distributor of below-the-waist selfies.” Jack Dorsey, arguably the pioneer of the mass-distributed selfie, also introduced us to selfie Vines, six-second videos shareable on Twitter. Indeed, the selfie is nothing if not a visual shorthand for Dorsey’s initial vision for Twitter as a status updater—“here’s where I am, here’s what I’m doing.”
Are all selfies the same? I don’t think so. There is quite a difference in taste and message between selfies on various social media platforms. Facebook selfies tend to be the most ordinary self-portraitures; they are pictures posted by people who want to look normal, happy, nice. Instagram is for ‘stylish’ selfies or ‘stylies’. On Instagram, you don’t portray yourself; you paint a desirable persona.
You may think that Wikipedia, Twitter, Snapchat, Google Maps, and so on are valuable. But, as far as G.D.P. is concerned, they barely exist. The M.I.T. economist Erik Brynjolfsson points out that, according to government statistics, the “information sector” of the economy—which includes publishing, software, data services, and telecom—has barely grown since the late eighties, even though we’ve seen an explosion in the amount of information and data that individuals and businesses consume. “That just feels totally wrong,” he told me.
Brynjolfsson is the co-author, with Andrew McAfee, of the forthcoming book “The Second Machine Age,” which examines how digitization is remaking the economy. “We’re underestimating the value of the part of the economy that’s free,” he said. “As digital goods make up a bigger share of economic activity, that means we’re likely getting a distorted picture of the economy as a whole.” The issue is that, as Kuznets himself acknowledged, “the welfare of a nation . . . can scarcely be inferred from a measure of national income.” For instance, most Web sites are built with free, open-source applications. This makes running a site cheap, which has all sorts of benefits in terms of welfare, but G.D.P. ends up lower than it would be if everyone had to pay for Microsoft’s server software. Digital innovation can even shrink G.D.P.: Skype has reduced the amount of money that people spend on international calls, and free smartphone apps are replacing stand-alone devices that once generated billions in sales. The G.P.S. company Garmin was once one of the fastest-growing companies in the U.S. Thanks to Google and Apple Maps, Garmin’s sales have taken a severe hit, but consumers, who now have access to good directions at no cost, are certainly better off.
Yglesias is pleased by Snapchat’s refusal to sell to Facebook for $3 billion:
The company’s founders and investors may or may not be making a terrible mistake. But from the sidelines, I think one should almost always root against the acquisition exit. It’s boring. It’s lame. The bet when you turn down $3 billion is that there’s some chance that in the future your company will be worth $30 billion or $300 billion and you want to reach for those stars and dare to dream.
Joshua Gans calls this “jaw-droppingly bad analysis.”
How credible is it that Obama was unaware that the NSA was tapping the phones of 35 world leaders? I don’t know. But the evidence is mounting that it is not credible. Ed Morrissey, for one, doesn’t buy it:
[W]ho exactly would be the customer of this data, once collected? Here’s a hint: It’s not going to be the undersecretary of agricultural development at the USDA. The only reason to surveil Angela Merkel is to provide real-time intelligence to the highest level of government about the intentions of the German Chancellor. Furthermore, that intelligence would have to be specified as to its source for the policymaker to validate it for its consideration. If that policymaker is not Barack Obama, then perhaps we should be asking who exactly is making decisions at the top level of government.
The idea that Obama didn’t know about this program is absurd on its face. That doesn’t mean it started with Obama, and it’s almost assured that it didn’t. However, more than four years after taking office, Obama can’t seriously think that anyone will believe that he just found out about this NSA effort from the funny papers.
I have a hard time believing that the President in his many hundreds of intelligence briefings – scores of which surely involved intelligence about allied leaders in run-ups to various diplomatic and political meetings – did not know that some of the information was gleaned through collection against the leaders themselves. (I am not saying that the White House is lying about what the President knew – only that its statement about the President’s ignorance is extraordinary, and that I suspect that someone in the White House knew.)
The White House and State Department signed off on surveillance targeting phone conversations of friendly foreign leaders, current and former U.S. intelligence officials said Monday, pushing back against assertions that President Obama and his aides were unaware of the high-level eavesdropping.
Ambers thinks it is possible that Obama wasn’t told about the NSA’s activities:
Life can indeed be sad and lonely, but technology can bridge that gap when used “correctly.” Now you can see photos of your niece who lives on the other side of the country; you can feel as if you’re not completely missing out on her infant years. You can Skype your boyfriend who’s studying abroad and feel relieved once you look into each other’s pixelated eyes. Your friend from college—the one with whom you immediately clicked because his music taste was miraculously similar to yours—sends you a song through Spotify and you listen to it and of course you love it and you appreciate that there’s someone, somewhere who not only shares your taste but appreciates it, and you, enough to try to turn you on to a new band. Or perhaps you hate the song and you laugh as you recall spending countless hours getting high as a fucking kite with that person and debating which bands did and did not suck at that point in time and for what reasons. You might have disagreed, but you were just happy to have someone to sit around with, to get high and listen to music with for hours at a time. Your friend sends you a Snapchat, a 10-second-long video performance—and you and you alone are the intended audience. …
Technology does not necessarily advance or diminish people’s lives. Technology is an extension of life itself: it can be as lonely or happy as you make it. You can use it to get lost in an echo chamber of self-importance, or you can use it to make genuine connections with people.
Pivoting off various news stories about Instagram, Ali Eteraz describes the rise of a world that leaves out the written word:
The virtual world that took off in the mid-’90s started as a place for words. Every person made a screen-name and then used text to communicate their ideas and feelings. But in an extremely accelerated manner the supremacy of text was weakened. First, by progressively smaller bursts of text (websites became blogs, became status updates, became 140-character tweets), and then through the enthronement of the image. Whether it is moving pictures (YouTube, Vimeo, Liveleak), or photo-sharing sites like Instagram, Pinterest, and Snapchat, it goes without saying that we are well on our way to communicating with each other by way of pictures. And let’s not forget about selfies and nudity (where we communicate by way of pictures of our privates).
For many people this transformation hasn’t been jarring. After all, we are descendants of cavemen who told their stories upon stone walls by way of images. And we are descended from societies where the primary language was the hieroglyph, which is nothing more than words represented in imagistic forms. From this perspective we shouldn’t show much concern if our societies transition away from words and move to communicating by way of the image. And, in fact, most people won’t care. Language has only one use, which is to tell a story, and a story can be told in a thousand different ways. In fact, you only have to look at the billions of dollars that the world’s various film industries earn to realize that maybe the transition to communicating by way of the image has already happened.
Patrick McConlogue wants to teach a homeless man to code:
The idea is simple. Without disrespecting him, I will offer two options:
1. I will come back tomorrow and give you $100 in cash.
I was glad to read Susan Jacoby’s op-ed today, pushing back against the Brown-Quinn thesis that women are somehow victims of sexting culture and not full, eager participants. She makes some of the points I did last week:
There is no force involved here; people of both sexes are able to block unwanted advances. Women are certainly safer on the Web than they would be going home with strangers they meet in bars.
But then she veers off into this diatribe:
The morality of virtual sex, as long as no one is cheating on a real partner, is not what bothers me. What’s truly troubling about the whole business is that it resembles the substitution of texting for extended, face-to-face time with friends. Virtual sex is to sex as virtual food is to food: you can’t taste, touch or smell it, and you don’t have to do any preparation or work. Sex with strangers online amounts to a diminution, close to an absolute negation, of the context that gives human interaction genuine content. Erotic play without context becomes just a form of one-on-one pornography.
Bingo: “one-on-one pornography”. That’s the most concise description of sexting at its best that I have yet read. And it prompts me to ask: what, pray, is so wrong with that? Sex has always been about fantasy and reality and the sometimes ridiculous and sometimes incredibly hot experiences that mix can engender. The most fundamental sexual organ is the brain, as my shrink often points out to me. And masturbation – which is solitary sex based on fantasy (sometimes from pornography, sometimes from real life, sometimes from an Old Spice commercial) – is as old as human beings’ brains.
That’s why virtual sex is not like virtual food. You can have an orgasm in your body as well as your mind without any actual “work” in a way you cannot eat or taste something virtually. In fact, your sexual experiences through masturbatory fantasy can be far more satisfying and intense than the actual thing – you know, when one of you has come and the other hasn’t, when the dog jumps on the bed in the middle of it, when one of you farts or queefs, when the word “ow” occasionally surfaces, or when your mind wanders for a bit and your already sated spouse has to look at the ceiling for a while and think of the skim milk that needs buying, as you plug away to get it over with.
Nothing is as over-rated as bad actual sex or as under-rated as good virtual sex. And, yes, it isn’t real in the way that a loving, physical fuck-fest with a loved partner is real. But so what? Since when is the ideal the enemy of the good? And the fact that it isn’t real – that it’s a fantasy deriving from a sexual avatar – means it’s less perilous. It’s a form of play, the kind of activity that marks intelligent beings from those with less developed frontal cortexes. It’s play between two fantasy partners; it victimizes no-one; it transmits no diseases; it risks no pregnancy; it renders both partners radically more equal than they would be in the actual sack; and, as long as it is kosher with your partner, if you have one, it is much more moral than actual adultery, precisely because it isn’t real.
And women would be the most likely to gain sexual pleasure from this without all the attendant headaches and dangers of an actual physical, real-life sexual encounter.
As Megan Garber considers a new high-tech device for capturing smells – called the Madeleine, demonstrated above – she runs through many predecessors:
The tradition of scent-mapping goes back, it seems, to the 1790s, when the physician and pioneering hygienist Jean-Noël Hallé embarked on an odor-recording expedition of Paris. Hallé had a grand vision that was technologically limited: the equipment he used to document his six-mile expedition along the banks of the Seine started and stopped at his notebook and his nose.
Nearly two centuries later, however, in the 1970s, the Swiss fragrance chemist Roman Kaiser developed the odor-preservation technique he dubbed headspace capture: