God In The Hands Of American Sinners

by Dish Staff

In an interview about his new book, Our Great Big American God, Matthew Paul Turner dissects the problems with an all-too-Americanized God, our habit of “affecting, reimagining, shaping, and changing God’s story”:

God was never meant to be a nationalized deity. The very idea that God would showcase geographical favorites or advance the kingdom of one at the expense of another or several others goes against many of Jesus’s basic teachings. Moreover, our relationship with God has caused a large majority of America’s Christians to possess an elitist attitude or worldview, at times even imperialistic. Rather than humility, mercy, and redemption, God seems to have made us controlling, know-it-alls, materialistic, and far too certain of what God thinks about political, social, and spiritual issues.

Throughout our history we’ve branded God into a deity that works for us, one that mixes well with American values, one that agrees with our wars, and one who not only adheres to our way of life; in many cases, our way of life is God’s ideal, which we often suggest is one of the reasons he blesses us with prosperity. The biggest issue perhaps is that many of us are so comfortable with our American God, so certain of his ways, that to believe that we might be wrong is impossible.

In an excerpt from the book, Turner explores the complex legacy of Jonathan Edwards, the theologian and preacher most famous for his hellfire and brimstone sermon, “Sinners in the Hands of an Angry God,” and whom he identifies as one of the key influences on the American understanding of the divine:

Jonathan Edwards changed the story of America’s God. He changed how the people of his time engaged God, editing a theology that was often portrayed harshly and dogmatically. He made strides to shape it with words into an almost beloved relationship between a grandiose God and a broken and depraved American heart. His words set the stage for what would become a steady foundation for America’s God to revolt against the Old World and bring about revolution. Historian Perry Miller suggests that America’s Enlightenment began and ended with Jonathan Edwards. And Edwards played a most defining role in bridging the space between Puritanism and what would eventually become American evangelicalism.

It was Edwards’s talent as a writer that, on one hand, makes him unforgettably important to so many still today. Preachers like John Piper, Tim Keller, Mark Driscoll, and others wouldn’t have much to write or preach about without Edwards dedicating the majority of his existence to literally emoting his version of John Calvin’s God onto the page. But it’s that same talent, that profound ability to create rich imagery with sentences and paragraphs, that would ultimately backfire on him. Rather than his gift becoming defined by his thoughts on God’s glory or God’s beauty, Edwards’s words helped to Americanize God’s hell, turning this country’s doctrines about fire and brimstone into HELL™, an idea that would eventually become a method for introducing millions to God.

His thoughts on the future of this American God:

We’re beginning to see conservative and progressive ideas about God morphing together. We’re seeing younger Christians shift their ideas about God’s thoughts on homosexuality, and other social justice issues. We’re seeing, perhaps, more understanding of how God came to be. On the flip side, there’s this Tea Party God: they’ve dug in their heels, and created this God. It’s like they’re saying, ‘You didn’t like our evangelical God? OK, then wait until you get to know our Tea Party God!’ In some ways, this God is an asshole.

But again, what you’re seeing — is it God that you’re seeing or just reflections of people? We change, and God changes with us. We should be careful about what our actions and words suggest about God. Rather than forcing God to be in politics, or to be the middleman on social issues, we should bring God back to our communities, and invite everyone to the table.

Dogs vs Cats: The Great Debate, Ctd

by Dish Staff

Henri Cole cites Rilke’s thoughts on the age-old divide:

Look at the dogs: their confident and admiring attitude is such that some of them appear to have renounced the oldest traditions of dogdom in order to worship our own customs and even our foibles. It is just this which renders them tragic and sublime. Their choice to accept us forces them to dwell, so to speak, at the limits of their real natures, which they continually transcend with their human gazes and melancholy snouts.

But what is the demeanor of cats?—Cats are cats, briefly put, and their world is the world of cats through and through. They look at us, you say? But can you ever really know if they deign to hold your insignificant image for even a moment at the back of their retinas? Fixating on us, might they in fact be magically erasing us from their already full pupils? It is true that some of us let ourselves be taken in by their insistent and electric caresses. But these people should remember the strange, abrupt manner in which their favorite animal, distracted, turns off these effusions, which they’d presumed to be reciprocal. Even the privileged few, allowed close to cats, are rejected and disavowed many times.

Montaigne’s take:

“When I play with my cat”, he wrote, “who knows if I am not a pastime to her more than she is to me?”

He borrowed her point of view in relation to him just as readily as he occupied his own in relation to her. And, as he watched his dog twitching in sleep, he imagined the dog creating a disembodied hare to chase in its dreams – “a hare without fur or bones”, just as real in the dog’s mind as Montaigne’s own images of Paris or Rome were when he dreamed about those cities. The dog had its inner world, as Montaigne did, furnished with things that interested him.

Meanwhile, Jessica Love ponders why dogs are “so good at reading our nonverbal cues—so much better, even, than chimpanzees and bonobos, to whom we’re more closely related”:

Researchers now believe that dogs’ ability likely evolved during domestication, probably due to selective breeding. There’s some disagreement about whether our own ancestors were selecting for communicative skills specifically (perhaps to create better hunters, retrievers, or herders), or whether this prowess was merely a by-product of selecting for something else, like tameness.

But though the sensitivity dogs exhibit is truly impressive, it nonetheless falls short of what humans—even very young ones—are capable of. Infants will communicate information to their adults when they know that it is of interest to the caregivers; dogs will only do so if they are the ones interested. Young children also pick up on information conveyed to a third party; dogs, not so much. And a brand new study finds that two-year-old humans are much better than dogs at gauging from a situation whether a communicative signal is unintentional (and thus ignorable).

Meanwhile, the cat—mere feet away from a tuna treat, and despite the best efforts of an insistent pointing hand—does nothing.

Thoughts from Andrew and Dish readers here.

A Founder Left Behind By The Left

by Dish Staff


Christian Parenti advises liberals to look to Alexander Hamilton for inspiration, not Thomas Jefferson. He especially praises Hamilton for his far-reaching economic insights:

Hamilton was alone among the “founding fathers” in understanding that the world was witnessing two revolutions simultaneously. One was the political transformation, embodied in the rise of republican government. The other was the economic rise of modern capitalism, with its globalizing networks of production, trade, and finance. Hamilton grasped the epochal importance of applied science and machinery as forces of production.

In the face of these changes, Hamilton created (and largely executed) a plan for government-led economic development along lines that would be followed in more recent times by many countries (particularly in East Asia) that have undergone rapid industrialization. His political mission was to create a state that could facilitate, encourage, and guide the process of economic change — a policy also known as dirigisme, although the expression never entered the American political lexicon the way its antonym, laissez-faire, did.

Parenti goes on to suggest how an appreciation of Hamilton might connect with a pressing contemporary issue, climate change:

Even today, Hamilton’s ideas about state-led industrialization offer much. Consider the crisis of climate change. Alas, we do not have the luxury of making this an agenda item for our future post-capitalist assembly. Facing up to it demands getting off fossil fuels in a very short time frame. That requires a massive and immediate industrial transformation, which must be undertaken using the actually existing states and economies currently on hand. Such a project can only be led by the state — an institution that Hamilton’s writing and life’s work helps us to rethink.

Unfortunately, many environmental activists today instinctively avoid the state. They see government as part of the problem — as it undoubtedly is — but never as part of the solution. They do not seek to confront, reshape, and use state power; the idea of calling for regulation and public ownership makes them uncomfortable.

(Portrait of Hamilton by John Trumbull, 1806, via Wikimedia Commons)

The Disparities In Dining Out

by Dish Staff


Roberto Ferdman flags a report that reveals gender wage gaps in the restaurant industry, even when tips are accounted for:

The median hourly wage paid to women is less than it is for men in all but one of the eleven jobs surveyed in a report by the Economic Policy Institute. In some cases, the gap is slight—for cashiers, dishwashers, food preparation workers, and hosts and hostesses, it’s a matter of cents. But in others, including supervisors and bartenders, the difference is well over a dollar. For managers, the highest earning occupation, the disparity was nearly three dollars per hour.

“This is what we identify as pay discrimination,” said Valerie Wilson, an economist at the Economic Policy Institute. “The work women are doing is being valued at less than the work men do in the same job.” Women, however, aren’t merely being paid less to do the same job—they’re being paid less and less compared to men as they move up in the ranks, too. Some of the highest earning occupations—managers, bartenders, and supervisors—are also the ones with the largest gender pay gap.

Drum adds his two cents:

There are other reasons besides gender for pay gaps, and the EPI report lists several of them. Whites make more than blacks. High school grads make more than dropouts. Older workers make more than younger ones. You’d need to control for all this and more to get a more accurate picture of the gender gap.

But in a way, that misses the point. There are lots of reasons for the gender gap in pay. Some is just plain discrimination. Some is because women take off more time to raise children. Some is because women are encouraged to take different kinds of jobs. But all of these are symptoms of the same thing. In a myriad of ways, women still don’t get a fair shake.

But that’s not to say that the business is especially kind to men, either. Tom Philpott pulls some other salient findings from the EPI report, which paints a grim portrait of restaurant workers overall:

Restaurant workers’ median wage stands at $10 per hour, tips included—and hasn’t budged, in inflation-adjusted terms, since 2000. For nonrestaurant US workers, the median hourly wage is $18. That means the median restaurant worker makes 44 percent less than other workers. Benefits are also rare—just 14.4 percent of restaurant workers have employer-sponsored health insurance and 8.4 percent have pensions, vs. 48.7 percent and 41.8 percent, respectively, for other workers. …

As a result, the people who prepare and serve you food are pretty likely to live in poverty. The overall poverty rate stands at 6.3 percent. For restaurant workers, the rate is 16.7 percent. For families, researchers often look at twice the poverty threshold as a proxy for what it takes to make ends meet, EPI reports. More than 40 percent of restaurant workers live below twice the poverty line—that’s double the rate of nonrestaurant workers.

Slight Is Right?

by Dish Staff


Research suggests that people are willing to pay a premium to support small businesses:

In one experiment, the researchers gave subjects a $5-off coupon upon entering a local bookstore. Some of the coupons had short blurbs explaining that the store was in direct competition with a store on the scale of Barnes & Noble, and some explained that the store was competing with another mom-and-pop bookstore. (A control group was given coupons without any text like this.) Those who received the large-competitor coupon were far more likely to make a purchase than those who got small-competitor coupons—even though the coupons made no comments on the store’s superior quality.

Proximity also matters. The researchers analyzed Yelp reviews of local coffee shops, and found that, in general, the closer a shop was to the nearest Starbucks, the higher its rating was on Yelp.

Firms may already be aware that the narrative of the underdog carries immense sway with consumers—researchers suspect people are drawn to companies whose stories they perceive to mirror their own experiences. But this study expands on that: “Consumers may want to punish stronger competitors…[and watch] them fail,” the study’s authors write. And this doesn’t just have to do with products, like coffee, that benefit from being seen as authentic; a local electronics store could theoretically sell more batteries by simply playing up its stiff competition with RadioShack.

Keeping The World On Track

by Dish Staff


Of the 15.5 million miles of road estimated to be added to the planet’s surface by the year 2050, 90% will be in developing countries. For a new paper led by conservation biologist William Laurance, scientists created a Global Road Map to mark “regions that should stay road-free, those where roads would be most useful and those where there is likely to be conflict between the competing interests of human development and protecting nature”:

Laurance and his colleagues argue that roads could have the most benefit when they link agricultural areas to the rest of society, since global food demand is expected to double by the middle of this century. With that in mind, the researchers identified the regions of the world that are most suitable for intensifying agricultural production. These are largely areas that are warm for at least part of the year and have enough rainfall to grow crops. Then the team created a map of regions that would be best to preserve, such as those with high biodiversity, those important for carbon storage and protected areas like national parks. …

The Amazon, Siberia and southwest Africa were among the regions where further road building would be unwise, according to the maps. India, Africa just south of the Sahara, large swaths of land stretching from Eastern Europe west into Russia, and the central United States would be home to prime spots for new roads that would assist agriculture. Central America, Southeast Asia, Madagascar, Turkey and Spain, though, have a lot of area where the nations would have to weigh the needs of their populations against the desire to protect the land. Many of the conflict areas are in poor countries, “and telling those countries not to build roads is hardly going to be popular,” Stephen Perz of the University of Florida, Gainesville, writes in an accompanying commentary. However, “a global road plan is not intended to ‘keep developing countries poor’, but rather to highlight the costs as well as the benefits of building roads,” he argues.

(GIF showing roads and croplands encroaching on the Amazon rainforest in Brazil between 2000 and 2012 via NASA Earth Observatory and Smithsonian.com)

Don’t Knock Weird Science

by Dish Staff

That’s Josie Glausiusz’s takeaway after reading a paper published earlier this year by Patricia Brennan, an evolutionary biologist who’s received federal funding for her research into the sexual anatomy of ducks (a subject explored in a hilarious video from Ze Frank):

Brennan and her colleagues explain that many people believe the federal government should fund only applied science designed to “cure disease, develop renewable energy, or improve agriculture.” They may not understand that the scientific process is “convoluted and unpredictable,” or that it takes a great deal of basic science work before its application leads to significant health or economic benefits. Another problem, Brennan told me, is that many people “have absolutely no idea how science is funded and how little money we actually get for it.” In fact, as she notes, federal science funding “has declined from 2.91 to 2.77 percent of our GDP between 2009-2011 (and that percentage includes the science budget for the Department of Defense, which is about half of all our research budget).” For comparison, 19 percent of the U.S. budget, or $643 billion, was allocated for defense and “security-related international activities” in 2013.

She and her colleagues cite a number of technologies inspired by esoteric evolutionary innovations. Examples include Geckskin, “a reusable, glue-free adhesive pad” invented after decades of research on the soft hairs coating gecko toepads, which enable the lizards to walk upside down; and widespread use of an enzyme called Taq polymerase—first isolated in 1965 from a bacterium surviving in hot springs in Yellowstone National Park—to replicate short strings of DNA. That enzyme has brought “vast benefits” to medicine, agriculture, and the criminal justice system, they say. Brennan’s own research could lead to improved understanding of hypospadias, a birth defect that causes malformation of the penis in baby boys.

(Un)happy Labor Day

by Dish Staff

Is the American labor movement “friendless”? Roger Martin reflects on its unpopularity in partisan politics:

A key marker occurred in 1993, when President Bill Clinton signed into law a tax change that allowed only the first $1 million in CEO compensation to be deducted for corporate income tax purposes. It was supposed to discourage corporations from paying their CEOs more than what was then thought to be an excessive $1 million (imagine that!) – and it failed spectacularly, as CEOs were given stock options instead, which made them wealthier than ever before.

But in whose favor was this measure intended? Labor? Hardly. There was no obvious benefit to them. Capital? Yes indeed. Shareholders were complaining about CEOs demanding ever-higher compensation – and the Democrats responded to help capital rein in CEO talent. Arguably the attention to the needs of capital has continued in the Obama administration. This administration featured an enthusiastic embrace of the TARP bailouts of banks that protected their shareholders first and foremost, and the continued low-interest-rate policies that favor capital owners. Of course, the argument can be made that these policies help labor too, by avoiding a recession/depression. But the careful attention to capital first is a relatively new behavior for the Democrats.

Meanwhile, the Republican Party has increasingly shifted its allegiance to high-end talent, a tiny offshoot of labor that began to emerge around 1960. During the Reagan era, for instance, they cut the top marginal income tax rate from 70% in 1980 to 50% just two years later. By 1988 it was 28%. In seven years, an executive earning a million-dollar salary went from keeping $340,000 after federal taxes to keeping $725,000. That’s quite a raise.

Lessons From A Long-Time Loner

by Dish Staff


Christopher Knight spent nearly three decades living alone in the woods of Maine, earning him the nickname “the North Pond Hermit,” before getting caught for theft and sentenced to prison. Michael Finkel asked Knight about what he learned from a solitary, hardscrabble existence:

Anyone who reveals what he’s learned, Chris told me, is not by his definition a true hermit. Chris had come around on the idea of himself as a hermit, and eventually embraced it. When I mentioned Thoreau, who spent two years at Walden, Chris dismissed him with a single word: “dilettante.”

True hermits, according to Chris, do not write books, do not have friends, and do not answer questions. I asked why he didn’t at least keep a journal in the woods. Chris scoffed. “I expected to die out there. Who would read my journal? You? I’d rather take it to my grave.” The only reason he was talking to me now, he said, is because he was locked in jail and needed practice interacting with others.

“But you must have thought about things,” I said. “About your life, about the human condition.”

Chris became surprisingly introspective.

“I did examine myself,” he said. “Solitude did increase my perception. But here’s the tricky thing—when I applied my increased perception to myself, I lost my identity. With no audience, no one to perform for, I was just there. There was no need to define myself; I became irrelevant. The moon was the minute hand, the seasons the hour hand. I didn’t even have a name. I never felt lonely. To put it romantically: I was completely free.”

That was nice. But still, I pressed on, there must have been some grand insight revealed to him in the wild. He returned to silence. Whether he was thinking or fuming or both, I couldn’t tell. But he did arrive at an answer. I felt like some great mystic was about to reveal the Meaning of Life.

“Get enough sleep.”

He set his jaw in a way that conveyed he wouldn’t be saying more. This is what he’d learned. I accepted it as truth.

(Photo by Flickr user Ctd 2005)