There’s no legitimate doubt that some of the chemicals in cannabis have medical value. But “marijuana” doesn’t name a medicine, if a medicine is a material of known chemical composition that clinical trials have shown, at some specific dosage and route of administration, to be safe and effective in the treatment of some specific ailment. The huge variations from strain to strain, and from one means of administration to another, mean that clinical trials would have to be done on specific cannabis preparations, not on “marijuana” as a general category. And it’s only those specific preparations that would then qualify for “downscheduling.”
Even an arbitrary decision to move the plant itself from Schedule I to Schedule II (or even Schedule III) would have mostly symbolic effects. It would still be a federal offense to grow, sell or possess cannabis except as a Food and Drug Administration-approved drug available by prescription. Downscheduling would be a consequence of clinical trials leading to FDA approval and prescription availability, not a substitute for them.
That essay doesn’t include one item on which the discussion has been especially confused: the claim that the President, by himself, has the power to reschedule. In fact, the Controlled Substances Act gives that power to the Attorney General, and requires that the AG get medical advice from the Secretary of HHS and take that advice as authoritative. The AG has delegated his responsibility to the DEA Administrator, and the HHS secretary has delegated hers to the FDA Commissioner.
Put me in the category of those who believe that naming winter storms trivializes how serious they have become. Unlike hurricanes, which are given names based on an agreed-upon format (it used to be all female names; now they alternate between male and female, which probably pisses off someone on the right), giving snowstorms names like “Snowmageddon” is all hype, no information. While there is no question that this winter has been weird (I like to tell my conservative friends it’s a good thing that climate change is a hoax or we’d be in trouble), you’d think the media would take these storms more seriously than they do, because people are being hurt and some are dying.
Another agrees:
Naming storms that aren’t hurricanes is – excuse my French – bullshit. It’s a Weather Channel marketing ploy so people can use hashtags to talk about snow, driving search traffic to TWC’s web page. It’s click-bait, pure and simple.
The names are chosen to sound scary, but unlike the National Hurricane Center names, they don’t undergo any sort of peer review. That leads them to choose really insensitive names like XENIA. What’s wrong with calling a storm Xenia? Well, it’s a Greek word for hospitality … but it’s also the name of a town in Ohio (just outside Dayton) with a history of being hit by massive damaging tornadoes. Xenia is famous among meteorologists. Nobody with any understanding of American weather history would use that name to refer to a snowstorm, any more than Boeing would build a “Model 911” commercial passenger aircraft.
Maybe I’m biased because I grew up in hurricane country, but for me, if it doesn’t spin and come yowling off the ocean like an angry cat between August and October, it doesn’t deserve a name.
Previous Dish on Weather Channel click-bait here. Another reader:
Let’s be clear that WaPo and TWC are not competing to name storms. WaPo, specifically the Capital Weather Gang (CWG), only names a few of the bigger winter storms and only after they’ve already hit. It makes it easier to refer to these events in future discussion, rather than saying “Oh hey, remember the east coast nor’easter of February 12-13, 2014?” Yeah, I don’t remember that one either. They named one storm last year, March’s “Snowquester”, and that followed a two-year lull in naming after 2011’s appropriately titled “Commutageddon”. Also, folks going to WaPo for weather news are likely those living in the DC metro area, so its names probably don’t resonate much beyond the Beltway.
What TWC is doing is far more debated in meteorological circles (CWG has a great blog post about it here, which explores both the good and the bad). The biggest beef held by most non-TWC meteorologists seems to be that such a large private, for-profit entity should not appoint itself the arbiter for naming winter storms and simply hope that everyone goes along with it. The federal government does this duty for summer hurricanes, and many feel that if we DO need to start naming storms, the government should be in the lead so there’s no naming confusion across different media outlets. Hurricanes are also very distinct and dangerous storm systems, with a very clear need for uniformity of advance information and warning.
However, most seem to agree that we don’t need to name winter storms since they are totally different animals, often made up of the collision of multiple high and low pressure zones across vast areas. Sure, they can produce serious consequences, but TWC has named more than a few that had very little noticeable impact. You could even have multiple TWC-named winter storms combining to form one large event. How do you apply the naming for that type of event without further confusing the public?
Sadie Stein investigates the practice of dipping strawberries into layers of white and milk chocolate, or the “art of dressing strawberries in tuxedos”:
As we know, chocolate-dipped strawberries have long held a cherished place in the echelons of Romantic Signifiers. Apocryphal sources from around the Internet claim that a woman named Lorraine Lorusso invented them in the sixties, when she was a candy buyer for Stop & Shop in Chicago—she used to demonstrate new products at the front of the store, and as she looked at the strawberries one day, it occurred to her that they ought to be dipped, immediately, in chocolate. But who was the genius who decided to dress the strawberry in formal wear? Google yields no results, although one suspects the eighties. Like the large martini glass filled with mashed potatoes, or the bed strewn with rose petals, its origins are lost in the mists of time. Like love itself, it simply Is. In the immortal words of James Baldwin, “Love is a battle, love is a war; love is a growing up.” Love is a strawberry dressed in a chocolate tuxedo.
Meanwhile, Meg Favreau notes that “from the time Spanish explorers brought chocolate to Europe all the way until the 1900s, chocolate was considered to be more medicine and less treat”:
Even when the humoral theory of medicine faded out of fashion, people remained convinced of chocolate’s cure-all properties. In an 1845 issue of The Magazine of Science, and Schools of Art, there’s this note:
Chocolate is a very important article of diet, as it may be literally termed meat and drink; and were our half-starved artisans, over-wrought factory children, and rickety millinery girls induced to drink it instead of the innutritious beverage called tea, its nutritive qualities would soon develop themselves in their improved and more robust constitutions.
And of course, in the 1800s and early 1900s, there were chocolate tonics to cure all ills, some more legit than others – Dr. Day’s Chocolate Tonic Laxative and Hauswaldt Vigor Chocolate, among others.
Throughout all of this, people did also consume chocolate solely for pleasure – although not nearly to the extent that they do today. By some accounts, the wives of Spanish colonists were obsessed with drinking chocolate, and in The True History of Chocolate, writers Sophie D. Coe and Michael D. Coe state that, “There is a misconception among some food writers that solid chocolate confectionery is a fairly modern invention… Yet there is evidence that such sweets were being manufactured early on in Mexico… [and] They almost certainly graced many a banquet table in Baroque Europe.” But despite all of this eating of chocolate as a sweet, it wasn’t until the 1950s that chocolate was marketed solely for pleasure, no medical claims attached.
American wives are finally better educated than their husbands:
For the first time since Pew Research has tracked this trend over the past 50 years, the share of couples in which the wife is the one “marrying down” educationally is higher than those in which the husband has more education. Among married women in 2012, 21 percent had spouses who were less educated than they were – a threefold increase from 1960, according to a new Pew Research Center analysis of Census data. The share of couples where the husband’s education exceeds his wife’s increased steadily from 1960 to 1990, but has fallen since then to 20 percent in 2012.
The trend toward wives being more educated than their husbands is even more prevalent among newlyweds, partly because younger women have surpassed men in higher education in the past two decades. In 2012, 27 percent of newlywed women married a spouse whose education level was lower than theirs. By contrast, only 15 percent of newlywed men married a spouse with less education.
Considering how much educational attainment in general has risen over the past 50 years, you could argue record numbers of men are marrying up. But as Kelly Faircloth notes, better-educated women aren’t necessarily better-paid.
In a surprise announcement Tuesday, Facebook revealed that it is now offering custom categories in the gender selection option beyond the traditional “male” and “female” demarcations.
Unlike Google+, however, which uses “other” as a totalizing option for anybody who wants to express their gender identity outside of the male/female binary (which may help explain why only one percent of its users choose to use it), Facebook is doing its users one better: offering 50 unique terms to choose from. So now instead of just selecting either “male” or “female,” you can choose “custom,” which lets you select your preferred gender identification.
Facebook isn’t allowing users to just write whatever they want, mind you. (I just tried to enter the term “walrus” and had no luck.) But it’s giving more than 50 new options, including everything from “genderqueer” to the defiantly simple “neither.” … As for pronouns, the service is adding the vaguely neutral term “them” to the existing “him” or “her” options.
I must admit, I haven’t the slightest clue what some of these terms mean, much less the distinction between, say, Trans Female, Trans*Female, Trans*Woman, Transsexual Female, and Transgender Female. For that matter, if I’m understanding the terms correctly, I’m not only male, I’m also Cis, Cis Male, Cis Man, Cisgender, and Cisgender Male; I have no idea what advantage is created by having so many choices to describe the exact same thing.
My point isn’t to make light of this. On balance, I think this is the right move. The fact that simply choosing male was a no-brainer for me under the old category system—indeed, it never occurred to me to wonder why additional choices weren’t provided—is an indicator of privilege. Facebook may have overcompensated here. But at least they’ve inspired a useful conversation.
I doubt Facebook did this as an act of pure progressiveness to cater for its transgender users and I can’t help wondering what the commercial imperative is. What money is there to be made by sorting people into ever more specific boxes? Advertisers will be wetting themselves – particularly anyone selling wigs, chest binders, or any of the other specialised products aimed at those of us who seek to change our gender. It’s the monetisation of minorities.
But, as many of my Facebook friends have pointed out, wouldn’t it be better to leave a blank box for people to dream up their own gender identities? Why choose from 56?
Update from a reader, who answers Lees’ question:
Speaking as someone in the tech field: it’s because data normalization on a free-form field is next to impossible. You earlier quoted someone calling this the “monetization of minorities,” which is entirely correct – but there’s no way to target people if FB doesn’t know exactly who it’s targeting. That’s why you have to choose from a specific list of genders: so FB knows exactly what it’s targeting.
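The reader’s point can be illustrated with a minimal sketch (hypothetical data, not Facebook’s actual system): values drawn from a closed list group into clean, targetable buckets, while free-form text fragments into variants that no simple normalization reliably merges.

```python
from collections import Counter

# Enumerated field: every value comes from a fixed list, so grouping is trivial
# and server-side validation is possible (this is what keeps "walrus" out).
ALLOWED = {"male", "female", "genderqueer", "neither"}
enum_responses = ["male", "female", "genderqueer", "female"]
assert all(r in ALLOWED for r in enum_responses)
print(Counter(enum_responses))  # clean buckets an advertiser can target

# Free-form field: the "same" identity splinters into spellings and synonyms
# that survive even basic cleanup like trimming and lowercasing.
free_responses = ["Male", "male ", "M", "dude", "walrus"]
normalized = Counter(r.strip().lower() for r in free_responses)
print(normalized)  # four distinct buckets remain for what may be one group
```

With the enumerated field, two lines of code produce exact audience counts; with free text, even after normalization the data is ambiguous, which is why an advertising-driven platform would prefer a long fixed list over a blank box.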
That’s the name of the above animated musical short by Matt Berenty and David Bokser, which Casey Chan praises as “adorable in all the right ways” and “just as good as any Pixar short”. In an interview last November, the filmmakers spoke about their inspiration for the project:
Matt Berenty: “The project started when David Bokser and I were just about to graduate from the Savannah College of Art and Design to pursue jobs in the animation industry. We were driving to lunch one day toward the end of school and I looked up at a billboard and asked, ‘What would it be like to live in a billboard?’ This instantly sparked an interest in telling a story based around this idea.”
David Bokser: “I think what made it stick so well for us was that the field of advertising was so ripe for satire. We are constantly being bombarded by advertisements, which are essentially messages that somebody, somewhere has thought up in order to influence our actions and emotions. We wanted to find a way to wrap that concept inside a traditional love story without it being too heavy-handed in its message or too sentimental.”
Workers at the Volkswagen plant in Chattanooga vote today on whether to join the United Automobile Workers. Lydia DePillis outlines what’s at stake:
If a majority of Volkswagen’s 1,570 hourly workers vote yes, it would mark the first time in nearly three decades of trying that the UAW has successfully organized a plant for a foreign brand in the United States. This time, the union has a powerful ally: Volkswagen itself, which is hoping the union will collaborate in a German-style “works council” and help manage plant operations.
Tennessee’s GOP leaders — along with well-funded conservative activists such as Grover Norquist — aren’t letting the UAW in without a fight.
Seth Michaels looks at the Republican campaign to stop the plant from unionizing:
Sen. Bob Corker held a press conference Tuesday specifically to ask VW employees to reject a union, claiming potential negative impact on the state’s economy. More insidiously, state legislators suggested that they might punish VW if employees vote for the union – withholding tax incentives for future expansions, for example. In addition to likely being illegal, these threats have a special kind of irony: “we’re so worried that you’ll have a negative jobs impact that we’re willing to block future jobs to prevent it.”
Waldman thinks this reveals something ugly about the Republican anti-union agenda:
Here you have a highly profitable company that wants to have a more cooperative relationship with its workers, and obviously sees a union as a path to that relationship, because they know that they can work that way with unions, since they do it already all over the world. But the Republican politicians don’t care about what the corporation wants. They are so venomously opposed to collective bargaining that they’ll toss aside all their supposed ideals about economic liberty in a heartbeat.
[T]he truth is that there’s little evidence that the anti-union manufacturing model (exemplified by companies like Nissan) works better than the unionized one, particularly when it incorporates German-style works councils. VW actually credits the system with making it the largest and richest auto firm in the world. What’s more, the workers I’ve spoken with in Chattanooga say they are less interested in pay hikes than in having some say over which cars get made in the factory, what sorts of training programs they’ll have access to, etc.
Composer Alexis Kirke creates music that changes in response to listeners’ physical states, as tracked by biosensors. Others are working on similar experiments:
In 2011, tech firm Sensum debuted its interactive horror film Unsound at the South by Southwest festival in Austin, Texas. The firm uses galvanic skin response sensors to alter film content to heighten – or reduce – emotional reaction to it. For my own short film, Many Worlds, algorithms use brainwaves, muscle tension, perspiration and heart rate in a selection of the audience to adjust the story in real time, choosing the most appropriate of the film’s four narratives to maximize intensity.
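The branch-selection step Kirke describes might look something like the following sketch (entirely hypothetical sensor values, scoring, and narrative names; the actual Many Worlds algorithm has not been published): fold each viewer’s sensor channels into one arousal score, average across the sensed audience, and pick whichever of the four narratives is predicted to be most intense for that state.

```python
# Hypothetical sketch of biosensor-driven branch selection. Sensor channels
# are assumed pre-normalized to the 0-1 range.

def arousal(reading):
    """Collapse one viewer's sensor channels into a single arousal score."""
    channels = ("heart_rate", "skin_conductance", "muscle_tension", "eeg_beta")
    return sum(reading[c] for c in channels) / len(channels)

# Assume each narrative hits hardest for audiences near a given arousal level.
NARRATIVES = {"calm-build": 0.2, "slow-burn": 0.4, "tense": 0.6, "frenetic": 0.8}

def choose_branch(audience_readings):
    """Pick the narrative whose target band best matches the audience state."""
    mean = sum(arousal(r) for r in audience_readings) / len(audience_readings)
    return min(NARRATIVES, key=lambda name: abs(NARRATIVES[name] - mean))

viewers = [
    {"heart_rate": 0.7, "skin_conductance": 0.6, "muscle_tension": 0.5, "eeg_beta": 0.6},
    {"heart_rate": 0.5, "skin_conductance": 0.5, "muscle_tension": 0.4, "eeg_beta": 0.6},
]
print(choose_branch(viewers))  # prints "tense" for this audience
```

Run in a loop against live sensor feeds, a selector like this could re-evaluate the branch at each story juncture, which is all “adjusting the story in real time” requires structurally.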
Writing this kind of multiple narrative is a challenge, and one that science fiction author Hannu Rajaniemi has tackled with technologist Samuel Halliday for an e-book project called Neurofiction. The first short story, Snow White is Dead, has 48 different narrative paths, selected automatically based on the results of an EEG brainwave monitor attached to the computer on which the e-book is being read. “Readers have been enthusiastic – several commented that they found the story very moving,” says Rajaniemi.