James Coyne challenges a recent story suggesting that living meaningfully is healthier than living happily, concluding, “Press coverage for this story is pure hokum.” Here he examines the study questions:
Participants completed online assessments of hedonic and eudaimonic well-being [Short Flourishing Scale, e.g., in the past week, how often did you feel. . . happy? (hedonic), satisfied? (hedonic), that your life has a sense of direction or meaning to it? (eudaimonic), that you have experiences that challenge you to grow and become a better person? (eudaimonic), that you had something to contribute to society? (eudaimonic); answered on a six-point frequency metric whereby 0 indicates never, 1 indicates once or twice, 2 indicates approximately once per week, 3 indicates two or three times per week, 4 indicates almost every day, and 5 indicates every day]
These questions are odd, vague, and unlikely to be encountered in everyday life unless someone happens to be Barack Obama or Bill Gates. I don’t know about you, but my wife has not recently asked me at dinner, nor has a friend asked me at a happy-hour get-together, “hey Jim, what did you do today to contribute to society?” “Did you happen to run into any problems that challenge you to grow and become a better person?”
It’s not surprising that research participants asked to answer these questions came up with something vague and affectively toned, i.e., related to their mood at the moment. I’m sure that if the investigators had done a cognitive interview, it would’ve revealed that respondents struggle to find answers and that the bases of those answers vary widely…. This is a particularly poorly constructed assessment instrument. And whatever solid biomedical science was done, it stands or falls on an empirically indefensible distinction derived from a poor assessment of what people have to say about themselves on the Internet.
Andrew popped on the blog today to praise the better-late-than-never endorsement of medical marijuana from Dr. Sanjay Gupta. And his hope in the Francis-led Vatican deepened following the resignation of one of its most homophobic bishops. A few more quotes in honor of Dusty here and here.
During today’s slow news cycle, we weighed the pros and cons of boycotting the Olympics in the face of Russian anti-gay laws and kept tabs on the diplomatic rift between Putin and Obama. As the demand for journalists actually increases these days, bloggers continued to envision what a Bezos-led WaPo could look like. The last post of the day, on the beliefs of scientists, is already taking off on Facebook. And just when you thought US prison statistics couldn’t get more startling, they do.
Get your latest fix of airplane views from readers here. Today’s regular VFYW – from the Wadi Rum Desert in Jordan – was anything but ordinary. And we marked the end of Ramadan with a boy’s quizzical face.
Steven Pinker says scientists are guided by two fundamental precepts:
The first is that the world is intelligible. The phenomena we experience may be explained by principles that are more general than the phenomena themselves. These principles may in turn be explained by more fundamental principles, and so on. In making sense of our world, there should be few occasions in which we are forced to concede “It just is” or “It’s magic” or “Because I said so.” The commitment to intelligibility is not a matter of brute faith, but gradually validates itself as more and more of the world becomes explicable in scientific terms. The processes of life, for example, used to be attributed to a mysterious élan vital; now we know they are powered by chemical and physical reactions among complex molecules.
The second ideal is that the acquisition of knowledge is hard.
The world does not go out of its way to reveal its workings, and even if it did, our minds are prone to illusions, fallacies, and superstitions. Most of the traditional causes of belief – faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty – are generators of error and should be dismissed as sources of knowledge. To understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.
David McRaney observes that the scientific method is totally unlike how humans naturally make sense of the world – which is why it’s so effective:
Your natural tendency is to start from a conclusion and work backward to confirm your assumptions, but the scientific method drives down the wrong side of the road and tries to disconfirm your assumptions. …. You prefer to see causes rather than effects, signals in the noise, patterns in the randomness. You prefer easy-to-understand stories, and thus turn everything in life into a narrative so that complicated problems become easy. Scientists work to remove the narrative, to boil it away, leaving behind only the raw facts. Those data sit there naked and exposed so they can be reflected upon and rearranged by each new visitor.
(Photo: Charles Darwin’s earliest extant sketch of an evolutionary tree, 1837)
Writing on the final day of Ramadan, Sajida Jalalzai notes the absurdity of Guantánamo Bay officials accommodating religious fasters while cracking down on hunger strikers. In fact, she says, both types of hunger are “intimately connected”:
[They] push and pull together: one hunger, mandated in religious texts and traditions, and limited by clearly demarcated temporal bounds; the other, undertaken in protest of injustice, and theoretically as indefinite as the prisoners’ detentions. The military respects the detainees’ refusal of food on religious grounds, but refuses to honor their choice to hunger strike. Taken together, these two types of hunger point out the military’s confusion about just what to do with the prisoners at Guantánamo, not only in terms of where or for how long to keep them, but also in terms of the bases and ways in which to honor or deny their rights and autonomy. The military’s divergent responses to the same action, namely, the detainees’ refusal of food, underlines the irony of paying lip service to the religious rights of indefinitely detained prisoners of war.
A reader can relate to our post on civil asset forfeiture laws:
I’m an attorney living in Indiana. My sister-in-law was recently arrested for calling in a prescription for herself (impersonating a doctor’s office) for a pain killer. She was an addict and needed treatment (which she is receiving now, thank goodness). She was in her car when she picked up the prescription at the drive-thru window of the pharmacy. After her arrest, they began civil forfeiture proceedings to take her car (her dead mother’s car, by the way – with great sentimental value). I read the statute and couldn’t believe how broad it is.
She was lucky; she had money to rent a car for the four months it took her expensive attorney to fight to get her car back. I told her that my reading of the statute made the prospects look awfully grim. As far as I can tell, she got her car back just because they felt like being nice. There was nothing anywhere in the law saying that she had to get it back. I have no faith in our state legislature, of course, but it would be nice if these laws could be re-written to make some kind of sense – only taking property away from the really bad criminals. Someone who has an addiction shouldn’t be in that category.
Update from another attorney:
Here’s where I get pissed about civil forfeiture: it is damn near only used in drug crimes. They don’t forfeit the car when someone commits a DUI. But what I wonder is, why not financial crimes?
When HSBC got indicted for laundering money for drug cartels, on day one the Justice Department should have frozen accounts and seized buildings, phone systems, computers, bank branches, everything that in any way touched on the scheme. Citi? You committed massive mortgage fraud? JP Morgan, you bribed officials in Birmingham? That’s racketeering; we’re taking your computer systems, and seizing accounts that are at all traceable. If there’s a place where civil forfeiture actually makes sense, it’s fraud and financial crimes (what with money being the goal of such crimes), yet forfeiture is never pursued in those cases, and jail is not an option for a corporate defendant, so we are left with a peanuts fine and it’s off to the next scam for the perp.
A young Saudi boy looks up during the morning Eid al-Fitr prayer at Turki bin Abdullah grand mosque in Riyadh, on August 8, 2013. Muslims worldwide observe the Eid al-Fitr prayer to mark the end of the fasting month of Ramadan and the beginning of the new month of blessing, Shawwal. By Fayez Nureldine/AFP/Getty Images.
Daniel Engber thinks that lab-grown meat is nonsensical. He notes that the world’s first test-tube burger began “yellow-white, so [creator Mark Post] colored it with beet juice, caramel and saffron”:
[L]aboratory meat only seems “real”—it only matches up with the taste you know and love—when mixed with additives to improve its color, flavor, and mouthfeel. But if that’s true, then what’s the point? Does a base of cultured cow cells really get us any closer to a perfect substitute for flesh than soy or wheat or mushroom? Last year, Slate’s Farhad Manjoo raved about the latest veggie form of “chicken,” a product made from extruded soy paste that, in the words of its founder, gets “the proteins to align in a way that makes them almost indistinguishable from animal proteins.” Sounds like dopey marketing to me, but Farhad swears Beyond Meat is “90 to 95 percent as realistic as chicken.”
The whole idea, of course, is that lab-made meat can do much better because it’s made from bovine stem cells: Instead of 95-percent authenticity, it will get to 99 percent or even higher. But there’s little reason to be hopeful. It’s true that scientists can make synthetic versions of natural products—much of the vanilla flavor in our food supply, for example, comes not from the vanilla orchid but a process of industrial fermentation. (Either way, the ingredient tastes pretty much the same.) Meat is a far more complicated substance, though—a mix of muscle, fat, tendons, ligaments, and blood—and one that doesn’t lend itself to “nature-identical” formulations. (The life history of an animal also affects its taste, and makes it harder to recreate that taste by hand.)
Earlier Dish on Post’s test-tube burger here and here.
In July, The Wolverine, helmed by supposed movie star Hugh Jackman, opened to relatively underwhelming figures. 2 Guns, last weekend’s big Mark Wahlberg/Denzel Washington vehicle, underperformed too, but not as epically as The Lone Ranger’s flop in July. That movie featured another supposed movie star, Johnny Depp (Hollywood’s third-most valuable, according to Vulture), and movie-star hopeful Armie Hammer. “Tumbleweeds blew through theaters playing The Lone Ranger over the weekend, calling into question Johnny Depp’s star power,” fretted the Times. This calamity came just one week after Channing Tatum’s White House Down took in a disappointing $25.7 million its opening weekend. “[D]oes he even deserve his A-list status?,” wrote Vulture’s Kyle Buchanan. And summer movie season started with After Earth, an appallingly bad Will Smith star vehicle that failed to connect with audiences, prompting The Independent to wonder, “Is Will Smith’s reign at the summer box office over?”
Why all the performance anxiety when it comes to male leads?
Yes, male movie stars tend to be more bankable than their female counterparts, and so it’s not great for the business as a whole if there are fewer of them. But that doesn’t entirely explain the endless, nervous parsing of what Channing Tatum’s stardom (or non-stardom) means. This isn’t solely a crisis about profits; it’s a cultural identity crisis. We go to the movies to see heroes doing heroic things, unlike the small screen, where the episodic nature of television has given way to the rise of the anti-hero. The emphasis on actors being able to singlehandedly, swaggeringly “open” or “carry” or “rescue” a movie seems like an extension of that wish. And now movie stars, like sports and political figures before them, have let us down. Or maybe not “us,” but more specifically, America’s men. Hollywood movies are made to appeal to a male audience, after all. It’s not so much that women are rejecting Hollywood’s vision of what manhood is; it’s more that American men don’t know who they want to be any more.
Update from a reader:
Wow, there are a lot of fudged facts and bad assumptions in the rather uninspired piece you posted by Noreen Malone. As someone in the film business, let me run a few things down:
1) The Wolverine opened just fine and will make a nice profit over its run. I’m not sure where she comes up with “underwhelming”, unless she was expecting it to make $100 million first weekend, which nobody out here expected it to do.
2) 2 Guns opened at #1 at the BO, made more than a third of its budget first weekend, and will end up as a minor hit. It was also made for a lower budget ($61 million) than the names in it would suggest, and marketed at that lower budget, so the success it did have was due in large part to its stars’ likability.
3) White House Down has the unfortunate distinction of being the second White-House-related terrorism movie of the summer. It was destined to fail. The fact that it even opened at $25 million is a testament to Tatum’s fan base. And, other than that one, Tatum has been gold, and he’ll have another huge hit next year with the 21 Jump Street sequel. The guy’s as legit a movie star as exists in the world today.
4) The Lone Ranger has been a well-publicized disaster for two years. Its budget was such that it has essentially no chance to make money, and they pulled a lot of advertising dollars at the end to try to cut their losses. Also, Johnny Depp has said repeatedly that he doesn’t care if his movies make money. He’s an accidental movie star on the strength of the Pirates movies and Alice in Wonderland, but other than that, he’s done quirky, indie things, many of which have failed at the BO. A huge name, but not a guy who does enough promotion or gives enough of a shit that he should be a barometer for the state of movie stardom generally.
5) After Earth was a vanity project for Will Smith that he essentially bought as a present for his son. It’s a complete anomaly and will likely not be repeated. It’s his Battlefield Earth. If he did Men in Black 4 tomorrow, it would be a huge hit.
Most importantly, what Malone says is nothing new. People have been saying that for 20 years. William Goldman wrote about it extensively in his topical books on Hollywood. No movie star has ever been able to get a huge opening for a terrible movie. Adam Sandler had 10 hits in a row … then he did Little Nicky, and nobody came. That doesn’t mean he’s no longer a movie star. It means that people don’t want to see movie stars in shit movies. Which, if you think about it, is actually a reassuring thing to know about American audiences. It’s comforting to know that quality still matters somewhat.
(Unless we’re talking about Grown Ups 2. I don’t know what to make of the fact that that POS is a hit.)
I see that in the context of your Spotify conversation, a reader has claimed that musicians make “all their money from touring, and to a lesser extent, merchandise.” The music industry is sort of unusual in that many people have strong opinions about how musicians “ought” to be making money, despite not having any first-hand experience in how musicians actually DO make a living. We actually surveyed over 5,000 musicians about their income, and among the things we’ve learned is that only 27% of it comes from touring – much less than many people assume. And only 2% comes from merchandise, like t-shirts, posters, etc.
With regard to Spotify and streaming services, an important thing to understand is that what works for some artists may not work for others. In particular, streaming services are set up in a way that makes sense for labels and artists whose goals are based on mass-audience assumptions, like the major labels (which own equity stakes in the services). They may not work as well for artists who are trying to build sustainable careers on a more intimate scale. Here’s a blog post we wrote on the subject. In general we’d like to see systems that allow people who aren’t Justin Bieber or Taylor Swift to make a living.
Update from a reader:
The statistics offered by your reader are very misleading.
It would appear that the 5,000 musicians surveyed include very few who actually record and release records as featured artists for a living. The survey appears to be dominated by orchestra players, music teachers, studio musicians, and others. None of this is relevant to the issue at hand, which is whether Spotify and the other streaming services reduce the income recording artists make from their recordings and whether or not this is significant.
If one were to survey recording artists, I think you would find that a much larger percentage of their income is derived from live performance, particularly since record sales have declined in the last decade.
Your reader also misunderstands the nature of the relationship between major record companies and the streaming services. Sure, the three majors have equity interests in Spotify and in some other services. But they also have licensing agreements with the services under which the labels are paid royalties. I have some familiarity with this area and I can tell you that the equity interest owned by the major record companies doesn’t lead to a lower royalty rate under the licenses.
The real divide is between current artists who release new albums and so-called “catalog” artists. Anecdotal evidence suggests that streaming services have had a negative impact on sales of new releases. On the other hand, for older artists who have aging catalogs of recordings, sales of physical records and downloads have in many cases withered away, and for those artists, streaming services, as well as digital radio services, provide a new source of income.
Another reader:
I found the discussion of artist royalties very interesting. Two years ago I earned my master’s degree in music business and focused my final paper on independent artists and what the business looks like for them. I’ll agree that royalties earned through Spotify, Pandora, and other similar free services are very low. But I’d argue that the alternative to those services isn’t people purchasing a download, but finding other free ways to access the music.
The bottom line for artists is that the cat is out of the bag with regards to getting music for free. It’s been over a decade since the Napster years, and the consumer expectations have changed drastically. Big-name artists like Thom Yorke can get away with pulling their music from Spotify because dedicated fans will pay for it anyway. For new artists, though, it is unlikely that unfamiliar fans would be willing to invest money up front in something they haven’t heard before. They’ll look for it for free, and if it’s not on Spotify or YouTube then there are other means to get the same thing without even a fraction of a cent going towards the artist. There are still dedicated music fans out there willing to invest their time and money to find the music they want, as the niche vinyl market shows, but they’re the minority.
I do feel sorry for people working to make it in music today. Technology has made it easier to record music and distribute it worldwide instantly, but harder to get noticed. Few artists have ever been able to make their living only through music, but more people than ever are trying to with little success. I’m currently working at SoundExchange, a performing rights organization that processes royalties for public performances of sound recordings on digital radio stations, so I know what those numbers look like.