Prose For The Road

Reviewing Patrick Leigh Fermor’s posthumously published The Broken Road: From the Iron Gates to Mount Athos, Daniel Mendelsohn samples the charms of the man “considered by some to be the greatest travel writer of the twentieth century”:

The author’s chattiness, his inexhaustible willingness to be distracted, his susceptibility to detours geographical, intellectual, aesthetic, and occasionally amorous constitute, if anything, an essential and self-conscious component of the style that has won him such an avid following. It has more than a little in common with the “centrifugal lambency and recoil” he found in Central European design, the “swashbuckling, exuberant and preposterous” aesthetic that he so extravagantly admired in a picture of Maximilian I’s knights, which he came across one night while leafing through a book on German history in the luxurious apartment of a charming girl he met and ended up staying with in Stuttgart. (The strange new city, the chance meeting, the aesthetic reverie, the hints of money and eros: this would prove to be the pattern of the young man’s progress across the continent.)

It is indeed odd that, among the many classical authors to whom Leigh Fermor refers in his writing—none more famously than Horace, verses of whose Soracte Ode the author found himself swapping, in Latin, with a German general he had kidnapped on Crete during World War II, a famous incident that was later turned into a film starring Dirk Bogarde—Herodotus does not figure more prominently. There is no writer whose technique Leigh Fermor’s more closely resembles. Expansive, meandering, circular, it allows him to weave what is, after all, a relatively straightforward tale of a youthful backpacking hike into a vast and highly colored tapestry, embroidered with observations, insights, and lessons about the whole panorama of European history, society, architecture, religion, and art.

When Leigh Fermor died in 2011, David Bentley Hart paid enthusiastic tribute to his writing:

He was…a man of boundless erudition: a classicist, a linguist, an historian, deeply and broadly read, widely and wisely traveled, with impeccable taste in literature and the arts. As it happens, his formal education was of the most irregular and intermittent kind. He was sent as a boy to a “progressive school” (which was something of a nudist colony), had a good private tutor for a while, got himself expelled from Canterbury’s King’s School, was drummed out of Sandhurst before beginning studies, and never attended university. And yet few men of his time could match him for breadth of learning.

Early on in life, he acquired a passion for Greece and all things Byzantine (in part, under the influence of Robert Byron). He even celebrated his twentieth birthday by staying at the Russian monastery of St. Panteleimon on Mt. Athos, and his book Roumeli includes some of the most illuminating writing on Orthodox monasticism in English. (Even Leigh Fermor’s close friends seem uncertain whether he had any particular religious convictions, but he definitely had a fascination with the monastic life.) And for a great deal of his life, he kept his home in southern Greece.

In the end, Leigh Fermor will chiefly be remembered for his prose, which has few credible rivals in modern English letters. He was an exacting and excruciatingly slow writer, by all accounts. He could polish a single sentence obsessively, draft upon draft, for months on end. Nothing went to print before it met his highest standards, which were already far higher than most of his contemporaries could hope to achieve. He also spent a great deal of his life living rather than writing. The result is that, when one adds up the sum of his published works, one sometimes cannot help but feel he was a little parsimonious towards his readers.

Dreher recently discovered Leigh Fermor as well, and offers similar praise, especially for his A Time of Gifts. Check out his excerpt-heavy posts on the man here, here, and here.

A Short Story For Saturday

The opening paragraphs of Leo Tolstoy’s 1885 story, “Where Love Is, God Is”:

IN A CERTAIN TOWN there lived a cobbler, Martin Avdéitch by name. He had a tiny room in a basement, the one window of which looked out on to the street. Through it one could only see the feet of those who passed by, but Martin recognized the people by their boots. He had lived long in the place and had many acquaintances. There was hardly a pair of boots in the neighbourhood that had not been once or twice through his hands, so he often saw his own handiwork through the window. Some he had re-soled, some patched, some stitched up, and to some he had even put fresh uppers. He had plenty to do, for he worked well, used good material, did not charge too much, and could be relied on. If he could do a job by the day required, he undertook it; if not, he told the truth and gave no false promises; so he was well known and never short of work.

Martin had always been a good man; but in his old age he began to think more about his soul and to draw nearer to God. While he still worked for a master, before he set up on his own account, his wife had died, leaving him with a three-year old son. None of his elder children had lived, they had all died in infancy. At first Martin thought of sending his little son to his sister’s in the country, but then he felt sorry to part with the boy, thinking: ‘It would be hard for my little Kapitón to have to grow up in a strange family; I will keep him with me.’

Read the rest here. For more, check out Great Short Works of Leo Tolstoy. Previous SSFSs here.

Face Of The Day

Luisa Whitton photographs the future:

London-based photographer Luisa Whitton devoted several months to documenting the efforts of scientists who are striving to create robots that are nearly indistinguishable from humans. The results are as fascinating as they are unsettling.

Whitton says that the goal of her project, titled What About the Heart?, is to “subvert the traditional formula of portraiture and allure the audience into a debate on the boundaries that determine the dichotomy of the human/not human. The photographs become documents of objects that sit between scientific tool and horrid simulacrum.”

See more of Whitton’s work here.

Failing The Screen Test

When F. Scott Fitzgerald moved to Hollywood, he took screenwriting seriously – which was a serious mistake, Richard Brody argues:

Much to his credit and much to his misfortune, he was unable to sell out. He didn’t condescend to the movies, but took them seriously—so seriously that he made the mistake of thinking that screenwriting was writing, and that it could take its place in his oeuvre, which, in turn, would mark the cinema with his original artistry. In the introduction to Fitzgerald’s screenplay for his story “Babylon Revisited,” the novelist and screenwriter Budd Schulberg (who fictionalized their relationship in the novel “The Disenchanted”) explained:

Instead of rejecting screenwriting as a necessary evil, Fitzgerald went the other way and embraced it as a new art form, even while recognizing that it was an art frequently embarrassed by the “merchants” more comfortable with mediocrity in their efforts to satisfy the widest possible audience. …

In short, Fitzgerald was undone by his screenwriting-is-writing mistake. It’s a notion that has its basis in artistic form. Look at Fitzgerald’s books: they are stylistically pellucid, following on the great realistic tradition, brushed only lightly by the wings of self-consciously interventionist, modernist formalism (as in the list of party guests in “The Great Gatsby”). By contrast, William Faulkner, who went to Hollywood in the early nineteen-thirties, had no such illusions about screenwriting—in part because his sinuous and syntactically profuse writing bore so little relation to the lens-like transparency of a screenplay’s overt storytelling.

How, then, could Fitzgerald have sold out successfully? Brody proposes an alternate history: “Had Fitzgerald only been born half a century later, he … might have made the successful transition to a television-series showrunner.”

Previous Dish on Faulkner’s Hollywood detour here.

Murakami In Motion

Pointing to the animated film seen above, psychologist Ilana Simons pays tribute to her favorite living writer, the “psychologically minded” Haruki Murakami:

Haruki Murakami’s fans tend to say he touches the otherwise-private areas in their minds. Murakami is a Japanese novelist who is a minimalist with language: He inspires grand journeys of imagination without using too many adjectives. He says he sometimes writes his novels in English—which is not his first language—to limit his vocabulary and assume an odd relationship to ordinary life.

Read Murakami’s recent New Yorker short story, “Yesterday,” here.

“The Least Understood Of The Great Modernists”

Adam Plunkett awards that honor to Robert Frost, “despite the aspects of his work that children can understand easily”:

His work tends not to challenge us in the ways that we have come to expect from good poetry. His famously intricate forms are often rather easy to spell out, as are his less famous allusions (Romantic poets, Emerson, Shakespeare, Dante, Virgil, Catullus, Plato, Swedenborg, Darwin). His singular achievement in meter – what he called “the sound of sense” – was such an achievement precisely because he could convey a wide range of emotions by sentence sounds alone, even when we haven’t made sense of them. He was no great student of philosophy or history or politics, and most of his abstract ideas, almost all about poetry, are right there in the Collected Prose or the poems.

The challenge, then, is the lack of challenge: that we experience the poems with more depth than we can usually comprehend. In this we are uncannily like the people in his poems walking past forests or down country roads or finding other sights that entrance them, who know that they reflexively project their desires and fears onto what they see even though they can’t help themselves. No matter how false their projections, their experience of them is as real and as intricate as that of what they see (and as that of their own theories and doubts). To read Frost is to feel his characters’ inner conflicts and to feel as conflicted as his characters, who are all too often lost in themselves. So the critic is tasked with the slippery business of tracing her patterns of feeling and thought back to the source without leaving too much of herself or, like most critics, too little.

The View From Your Window Contest

You have until noon on Tuesday to guess it. City and/or state first, then country. Please put the location in the subject heading, along with any description within the email. If no one guesses the exact location, proximity counts. Be sure to email entries to contest@andrewsullivan.com. Winner gets a free copy of The View From Your Window or two free gift subscriptions to the Dish. Have at it.

When Is Appropriation Appropriate?

Nabeelah Jaffer considers how culture comes together:

[Philosophers James O.] Young and [Conrad G.] Brunk are among the first philosophers to have begun exploring the morality of different forms of appropriation, and in The Ethics of Cultural Appropriation (2009) they deem it ‘reasonable’ to be offended at the inauthentic representation of your religious beliefs. But Young also argues that giving reasonable offence might not be morally wrong. …

Every nation – every community – has its invented rituals and symbols, which spring up to help people grasp the terms of this intangible ‘communion’ in everyday life, fading away when they no longer do the job. And then new practices that suit new ways of life rise up to fill their place. Sir Walter Scott improvised many supposedly ‘ancient’ clan tartans in the 19th century – the cloth that had once been banned as a symbol of Scottish patriotism was being reimagined as a symbol of British unity. Middle Eastern belly dancing might originally have been connected with pagan fertility rituals or with travellers from India, just as early Christmas celebrations appropriated the festival of the pagan winter solstice. Private aural confession emerged as a Christian ritual in Ireland only in the sixth century, before spreading as a sacrament to the rest of Europe. All rituals and symbols are constructed, and inherited tradition is not always the best test of cultural authenticity. Religions and cultures – and indeed nations – have survived only by being open to new ways of representing themselves. Few have scrupled about drawing inspiration from others in the process. No matter how offensive – and even destructive – cultural appropriation can be, it is almost impossible to separate every murky incident from this wider process of exchange and adaptation.

(Photo: In a full Stewart Clan Tartan Kilt, Norm Taflinger, 3rd Air Force, RAF Mildenhall, plays “Amazing Grace” on the bagpipes as part of a memorial held at RAF Lakenheath, via Flickr user Beverly & Pack)

A Death Knell For “Disruption”?

Jill Lepore has had enough of it:

Ever since The Innovator’s Dilemma, everyone is either disrupting or being disrupted. There are disruption consultants, disruption conferences, and disruption seminars. This fall, the University of Southern California is opening a new program: “The degree is in disruption,” the university announced. “Disrupt or be disrupted,” the venture capitalist Josh Linkner warns in a new book, The Road to Reinvention, in which he argues that “fickle consumer trends, friction-free markets, and political unrest,” along with “dizzying speed, exponential complexity, and mind-numbing technology advances,” mean that the time has come to panic as you’ve never panicked before. …

Most big ideas have loud critics. Not disruption. Disruptive innovation as the explanation for how change happens has been subject to little serious criticism, partly because it’s headlong, while critical inquiry is unhurried; partly because disrupters ridicule doubters by charging them with fogyism, as if to criticize a theory of change were identical to decrying change; and partly because, in its modern usage, innovation is the idea of progress jammed into a criticism-proof jack-in-the-box.

It’s not just an overused buzzword, she argues. In fact, the influential concept of “disruptive innovation” is probably bunk:

Disruptive innovation as a theory of change is meant to serve both as a chronicle of the past (this has happened) and as a model for the future (it will keep happening). The strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it. Historical analysis proceeds from certain conditions regarding proof. None of these conditions have been met. … The handpicked case study, which is [Innovator’s Dilemma author Clayton] Christensen’s method, is a notoriously weak foundation on which to build a theory.

Timothy B. Lee will hear nothing of the sort:

[T]he term “disruption” has often come hand in hand with a certain amount of hucksterism. People who have no connection to Christensen, many of whom don’t seem to even understand his theory, have declared everyone and everything to be disruptive. The problem has become so bad that many intelligent people have begun writing it off as a meaningless buzzword. Yet Lepore’s nitpicking aside, Christensen’s theory has a lot of explanatory power. It’s impossible to talk about what’s happening to companies as diverse as Kodak, Microsoft, and the New York Times without the vocabulary and concepts Christensen developed. And while understanding the theory won’t solve all the problems these companies face, it will certainly allow them to make more thoughtful decisions than if they follow Lepore’s advice and write it off altogether.

Will Oremus has a measured approach:

Lepore’s piece largely succeeds in skewering the zaniest of the “disruptive innovation” zealots and highlighting the fallibility of their prophet and holy text. And her revisionist reading of Christensen’s book adds some important caveats to his model. Importantly, the “disruptors” don’t always win in the end, and in some cases established businesses might harm themselves more by overreacting than underreacting. It would also be a great service if Lepore’s story had the effect of making people stop and think before they throw around “disrupt” as a buzzword. That said, Lepore’s cherry-picked counterexamples don’t definitively overthrow Christensen’s theory any more than his own cherry-picked examples definitively prove it.

Drake Bennett wonders, “If the evidence is, in fact, more ambiguous, why have Christensen’s ideas proven so broadly popular?”

Some part of it is clearly the rise of the Internet. In the age of Uber and Craigslist, the idea certainly feels true (particularly to those, like journalists, whose business is feeling disrupted at the moment).  … Part of it, too, I’d argue, is that Christensen’s description of how the world works matches how Silicon Valley sees itself, and Silicon Valley has gotten a lot more culturally important. The disruption narrative is one in which the upstarts are the heroes. Their eventual victory over the established order is foreordained, and they are the force that moves society – or at least technology – forward, disruption by disruption. Starting a company holds the potential to be not only lucrative, but also revolutionary.

At the same time, Kevin Roose resents the way some use the term as an “all-purpose rhetorical bludgeon [to] distract us from the real issues with emergent products and companies”:

Frequently, when start-ups working in heavily regulated industries encounter resistance from lawmakers or industry overseers, the concept of disruption is invoked almost instinctively. “But we’re disruptive!” the start-up pleads. “How can you be against disruption?” The problem with this reaction is that it lumps all opposition to new technology into the same category – anti-progress Luddites protecting the status quo at the expense of innovation. In reality, motives differ widely. Maybe a flashy new biotech start-up is being opposed because regulators are in the pocket of Big Pharma. Or maybe the FDA is holding it up because the founder is a charlatan selling fake stem-cell treatments to children. When every new innovation is cast as disruptive, there’s no way to distinguish between legitimate opposition and mere protectionism.