The World’s First Trillionaire

Elliot Hannon wonders who it will be:

The two industries best positioned to take a billionaire to the next level are technology and retail, for the same reason: Both use a global labor pool to make affordable products, inexpensive enough that workers can soon become the consumers of the products they are making. Of course, the tech industry is building on Internet infrastructure to create billionaires faster than ever. In October 2010, photo-sharing social network Instagram came into being; less than two years and a mere 13 employees later, Facebook snapped up the company for a cool billion.

A look at the Forbes list of the world’s wealthiest shows that the rise of technology and telecommunications has made a big impact:

Mexico’s Carlos Slim, who vies with Gates for the top spot on global-richest lists, made the bulk of his fortune in telecom. Lurking just below these technology magnates are Amancio Ortega, the founder of Zara; the Walton family and their ubiquitous Walmart chain; the Mars family of candy fame; Stefan Persson, chairman of H&M; and Jeff Bezos of Amazon. The rise of these individuals and families is perhaps more surprising than that of Gates, Slim, and their peers—and may be revealing about the clearest sustainable path to 10 figures. These magnates are best positioned, in different ways, to find and use available labor no matter where it is. That will help them sell their products at prices that a wealthier world will be able to afford most quickly. For that to happen, they’ll need emerging markets to continue emerging, creating not only employees but customers, too. If that happens, in two generations – about 60 years – Credit Suisse predicts there could be as many as 11 trillionaires walking among us.

Of course, forecasts can be way off; in 1999, Wired predicted that Bill Gates would reach the trillion-dollar mark by August 2015. He’s currently about $928 billion short.

The Consciousness Of A Cockroach?

Responding to the experiments of the cockroach-controlling neuroscientists at Backyard Brains – featured previously on the Dish – Brandon Keim looks at the evidence for insect sentience:

Before dismissing bug consciousness out of hand — their brains are so tiny! And, they’re bugs! — it’s worth recalling that one of the first scientists to seriously consider the notion was Charles Darwin, who spent most of his adult life, even as he completed The Descent of Man (1871) and On the Origin of Species (1859), thinking about earthworms. … Among the surprising — to me, anyway — facts detailed by [ethologist Mathieu] Lihoreau, [and researchers James] Costa and [Colette] Rivault about Blattella germanica (the German, or small cockroach) and Periplaneta americana (the American, or large cockroach), found in kitchens and sewers worldwide, are their rich social lives:

one can think of them as living in herds. Groups decide collectively on where to feed and shelter, and there’s evidence of sophisticated communication, via chemical signals rather than dances. When kept in isolation, individual roaches develop behavioural disorders; they possess rich spatial memories, which they use to navigate; and they might even recognise group members on an individual basis. Few researchers have studied their cognition, says Lihoreau, but cockroaches likely possess ‘comparable faculties of associative learning, memory and communication’ to honeybees.

As to whether cockroaches possess a self, in the pages of Cockroaches: Ecology, Behavior, and Natural History (2007), co-written by William J Bell, Louis M Roth and Christine A Nalepa, I happened upon a reference to Archy, a popular early-20th-century cartoon cockroach who said: ‘Expression is the need of my soul.’ Archy’s inclusion was intended in fun, but there was a grain of truth. Cockroaches could very well possess a sense of self, and one that’s perhaps not entirely alien to our own.

Painting America’s Imaginary Past

James Parker reviews Deborah Solomon’s American Mirror: The Life and Art of Norman Rockwell, which tells the story of how “this rather strange, marginal-feeling man contrived to represent the inner life of a mass audience” – a less straightforward portrayal of “simpler times” than you might surmise:

The secret, clearly, is that Rockwell’s productions, his tableaux of American innocence, are not simple at all. If they were, we would have forgotten them by now. Instead they are loaded with unconscious energy, with nervous hum and erogenous gleam. The grotesque, like some goblin field of gravity, seems to bend and warp even his most conventional subjects. Shiny noses, stringy throats, eyes too avid or too dull; a freakish quiddity in his rendering of shoes, elbows, baskets, bricks. Which to us, post-Freudians that we all inevitably are, can mean only one thing: repression! Rockwell’s preference for the company of boy models and rugged apprentices over that of, say, women is a recurring note in Solomon’s book. (“Draws Boys Not Girls” was the headline of a 1923 Boston Globe article.) He married unhappily, twice, and more happily a third time. And although his demons were not very sulphurous, not very rock-and-roll, they were demons nonetheless: shame, awkwardness, a cyclical sense of his own artistic insufficiency, and an enormous, hellacious self-imposed pressure to get everything right.

And about that repression:

If he was gay, he was never—on the evidence currently available—actively so. Solomon’s point is more that he wasn’t gay, he wasn’t anything. He hovered fraughtly, jamming his canvases with inference. Of his habit of placing a dog, person, hatbox, or similar item in the foreground of his paintings, between the viewer and the action, she writes: “It is one of the tensions in Rockwell’s art. He paints objects with the kind of fastidious realism intended to bring you closer to the touchable, handleable world. But then he paints a barricade in the foreground to keep you from touching. He cannot allow himself to touch what he wants.”

Ben Davis emphasizes the way the new biography shows the paradoxical relationship of the artist’s life to his work:

What drama there is comes not from the incidents of Rockwell’s life but from how, in Solomon’s telling, everything in his art actually represents its opposite. Rockwell created the imagery of the Boy Scouts—his most lucrative and long-lasting gig was for the annual Boy Scout calendar—but he was himself not particularly outdoorsy, a neat freak who couldn’t bear to get dirty. He created memorable images of piety (Saying Grace, 1951), but his clan was uninterested in religion; captured scenes of scampy rebellion (The Shiner, 1954) but was rule-bound and order-obsessed; and, most damningly, painted odes to family togetherness (The Homecoming, 1948) but was so affectionless that his own family despaired of ever knowing him. His first bride, Irene O’Connor, divorced him in 1930 on grounds of “mental cruelty”; his second wife, Mary Barstow, was driven to alcoholism and finally the mental hospital by his remove. Only his third wife, Molly Punderson, whom he met when he was 65 and she 64, seems to have been a fit, and they slept in separate beds. “At last he had found his feminine ideal,” Solomon writes: “an elderly schoolteacher who was unlikely to make sexual demands on him.”

Recent Dish on Rockwell here and here.

(Image of Scout at Ship’s Wheel, Rockwell’s first published magazine cover illustration, from the September 1913 cover of Boys’ Life, via Wikimedia Commons)

The Good Old Days Of Hideous Webpages

Amid the sleek uniformity of Twitter and Facebook, Joe Kloc longs for the Geocities era:

Geocities began in 1994, advertising an enticing 15 megabytes of free space to any homesteader looking to make their place on the Web. The World Wide Web was only a few years old when this digital Northwest Ordinance was issued, and so its users, often referred to as netizens, were necessarily having their first interactions with the Internet, learning to make a place for themselves in the newly discovered online world. Millions of netizens with little to no experience or understanding of how a webpage “should” look utilized the site’s built-in development tools to create clapboard homes spattered with stray GIFs, looping MIDI files, and busy backgrounds. It was the Internet’s Wild West.

It is easy to dismiss these pages as a sort of outsider art. But outside of what? There was no such thing as a personal page before Geocities. And, in almost every meaningful sense of that word, there is no equivalent today.

Consider a page from [Geocities’] Heartland neighborhood, where one resident wrote, “Hi! My name is Sherry, my husband is Richard. We have three children, Colleen, Alicia, and James and we are out here in the desert of southern California.” Further down on the page is a link to “Richard’s Original Bedtime Stories.” … Where today do families publish their homemade bedtime stories about giant pickles? Sites like these have simply disappeared.

Geocities was bought by Yahoo in 1999, during the height of its popularity. Then along came Myspace in 2003, Facebook in 2004, and Twitter in 2006. And by 2009, Petsburg, Heartland, and the rest of Geocities had been shuttered. The world had chosen the pre-fab aesthetics of social networks over the 15-megabyte tracts of open land offered by Geocities. Jacques Mattheij, the founder of the Geocities archive site Reocities, explained this choice to me: “The Geocities environment offered more freedom for expression. Don’t like blue? Then Facebook probably isn’t for you.”

You’ll certainly find less Comic Sans. Update from a reader:

Know about Neocities? Started by a bitcoin “cyberpunk” dude named Kyle in Portland, Oregon.

From the About page:

(Screenshot via Reocities)

Sharing In The Pain

Jamil Zaki looks to a classic psychological experiment about the desire for company:

One of my all-time favorite studies, conducted by Stanley Schachter in the late 1950s, examined whether fear might bring people together. Schachter convinced college-aged women that they would receive a series of electric shocks about 15 minutes later. Some were told that these shocks would barely tickle, and others were told they would be very painful. Participants were then asked whether they wanted to wait for their shocks in a room alone, or with other people. People who believed that shocks would be painful strongly preferred being near others, whereas those who believed the shocks would be mild generally did not care whether or not they had neighbors in their waiting room. On Schachter’s logic, this exposed a powerful rule about social behavior: in times of anxiety, people seek each other out. Like penguins in February, we tend to face adversity by huddling up.

Schachter refined this rule through a follow-up study. Some participants were given the option to be alone or with others who would also be shocked; others could wait alone or with people who would receive no shocks. When individuals could no longer wait with fellow shock-fearers, their preferences for company disappeared. This suggests that the benefit of crowds depends on our belief that others share our experiences. Schachter put this more poetically, claiming that his finding “removes one shred of ambiguity from the old saw ‘Misery loves company.’ Misery doesn’t love just any kind of company, it loves only miserable company.”

Who Becomes A Surrogate?

In an excerpt from The Baby Chase: How Surrogacy Is Transforming the American Family, Leslie Morgan Steiner has the answer:

The typical profile runs like this: married, Christian, middle class, with two to three biological children, working a part-time job, living in a small town or suburb rather than a big city, with a degree of college education but usually without a college degree. Women who shop at Wal-Mart and Costco, not Whole Foods and Neiman-Marcus. In the United States, statistics show that surrogates fall into the average household income category of under $60,000. About 15 to 20 percent are military wives. Some are single women. Those who are married have husbands who support paid surrogacy; surrogacy is obviously not something you can hide, or withstand with a spouse who is not on board emotionally.

She also notes some “quirks” in the system:

There are very few Jewish surrogates and almost zero Jewish egg donors. … Daughters of surrogates frequently decide to be surrogates themselves; surrogacy can become a family tradition. Gay men are the favorite clients of many surrogate moms; one emotional complication is removed from the tricky relationship, because the gay intended parents don’t suffer the understandable jealousy/inferiority issues that can plague infertile intended mothers.

Kat Stoeffel zooms in on the financial aspects of surrogacy:

No one gets into the surrogate parenting business for the money – for one thing, the agencies won’t allow it. … But if one were to get into it for the money, hypothetically speaking, one would probably be interested to know that surrogates do not pay taxes on the payments from intended parents, “which are technically for pain and suffering incurred, not for carrying a baby,” and run $20,000 to $30,000 a pregnancy, not counting good karma.

Why “Black Friday”?

Daven Hiskey presents the above video, which explores (and dispels) some of the myths about how we refer to the day after Thanksgiving:

“Black Friday” as a name for the day after Thanksgiving was coined by police officers in New England. One of the earliest documented references to this was in December 1961, when Denny Griswold of Public Relations News stated: “in Philadelphia, it became customary for officers to refer to the post-Thanksgiving days as Black Friday and Black Saturday. Hardly a stimulus for good business, the problem was discussed by… merchants with their Deputy City Representative… He recommended adoption of a positive approach which would convert Black Friday and Black Saturday to Big Friday and Big Saturday.” (Referring to the traffic and number of accidents.) “Big Friday” never caught on, but over the next decade, more and more references can be found in various newspaper archives of this particular Friday being called “Black Friday” for this reason.

In the 1980s, as the name’s popularity spread throughout the United States, a new origin theory popped up, often touted by the media: that most retailers operated at a financial loss for the majority of the year, and that Black Friday was named such because it was the day of the year when retailers would finally see a profit, moving out of the red and into the black.

This simply isn’t true. While there are some retailers that depend on the Christmas season’s revenue to make a profit for the year, most see profits every quarter, judging by the quarterly SEC filings of major retailers in the United States. There are also no documented references to this potential origin predating November of 1981.

Picking The Poor Kids Last

John Greenya is concerned about the affordability of youth sports:

An examination of who plays youth sports from ESPN The Magazine finds that while there may be 21.5 million kids between the ages of six and 17 playing on a team, including teams at schools, the earliest participants come from upper-income families. “We also see starkly what drives the very earliest action: money,” wrote Bruce Kelley and Carl Carchia. “The biggest indicator of whether kids start young, [sports researcher Don] Sabo found, is whether their parents have a household income of $100,000 or more.” Kids from low-income families are the least likely to be on multiple teams.

And disturbingly, 3.5 million kids are expected to lose school sports by 2020, especially in financially strapped states like California and Florida and big inner cities: “Living in poor corners of cities culls even more kids from sports. Nationwide, according to the Robert Wood Johnson Foundation, only a quarter of eighth- to 12th-graders enrolled in the poorest schools played school sports.”