Thursday, February 10, 2011

Escaping the middle-income trap

I returned a few days ago from Kuala Lumpur, the capital of Malaysia, where the talk of the town – well, at least among economists – is the “middle-income trap.” What's that, you ask? A developing nation gets “trapped” when it reaches a certain, relatively comfortable level of income but can't seem to take that next big jump into the true big leagues of the world economy, with per capita wealth to match. Every go-go economy in Asia has confronted this “trap,” or is dealing with it now. Breaking out of it, however, is extremely difficult. The reason is that escaping the “trap” requires an entire overhaul of the economic growth model most often used by emerging economies.

Malaysia's caught in the “trap” right now, and getting out of it is going to be tough. Simply put, Malaysia needs to change what it has been doing economically for the past 40 years. How Malaysia got itself into the “trap,” and how it could escape from it, can provide us with some valuable lessons on development and, more specifically, on how developing nations can graduate into becoming fully advanced economies.

The concept behind the “middle-income trap” is quite simple: It's easier to rise from a low-income to a middle-income economy than it is to jump from a middle-income to a high-income economy. That's because when you're really poor, you can use your poverty to your advantage. Cheap wages make a low-income economy competitive in labor-intensive manufacturing (apparel, shoes and toys, for example). Factories sprout up, creating jobs and increasing incomes. Every rapid-growth economy in Asia jumpstarted its famed gains in human welfare in this way, including Malaysia.

However, that growth model eventually runs out of steam. As incomes increase, so do costs, undermining the competitiveness of the old, low-tech manufacturing industries. Countries (like Malaysia) then move “up the value chain,” into exports of more technologically advanced products, like electronics. But even that's not enough to avoid the “trap.” To get to that next level – that high-income level – an economy needs to do more than just make stuff by throwing people and money into factories. The economy has to innovate and use labor and capital more productively. That requires an entirely different way of doing business. Instead of just assembling products designed by others, with imported technology, companies must invest more heavily in R&D on their own and employ highly educated and skilled workers to turn those investments into new products and profits. It is a very, very hard shift to achieve. Thus the “trap.”

South Korea is probably the best current example of a developing economy making the leap into the realm of the most advanced. Companies like Samsung and LG are becoming true leaders in their fields. Taiwan isn't far behind. China's policymakers are fully aware that, with labor costs rising, their country needs to follow suit.


Malaysia, though, is quite far from where it wants to be. That's a bit surprising, given its remarkable recent history. Malaysia has been among the best-performing economies in the world since World War II, one of only 13 to record an average growth rate of 7% over at least a 25-year period. The country has an amazing record of improving human welfare. In 1970, some 50% of Malaysians lived in absolute poverty; now less than 4% do. Yet Malaysians also feel that they've become somewhat stuck where they are. GDP growth has slowed, from an annual average of 9.1% between 1990 and 1997 to 5.5% between 2000 and 2008. Meanwhile, other Asian economies have zipped by Malaysia. According to the World Bank, the per capita gross national income (GNI) of South Korea in 1970 was below that of Malaysia ($260 versus $380), but by 2009, South Korea's was more than three times Malaysia's ($21,530 versus $6,760). Malaysia is getting “trapped” as a relatively prosperous but still middle-income nation.
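As a back-of-the-envelope check on that divergence, here is a minimal sketch (in Python) that turns the two pairs of GNI figures quoted above into implied compound annual growth rates. The figures are nominal US dollars, so the implied rates bundle together real growth, inflation and exchange-rate movements; the function and formatting are my own illustration, not the World Bank's calculation.

```python
# Implied compound annual growth of per capita GNI, using only the
# World Bank figures quoted above. Values are nominal US dollars, so
# the rates mix real growth, inflation and exchange-rate effects.

def cagr(start_value, end_value, years):
    """Compound annual growth rate between two observations."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

gni_per_capita = {
    # country: (level in 1970, level in 2009)
    "South Korea": (260, 21_530),
    "Malaysia": (380, 6_760),
}

for country, (start, end) in gni_per_capita.items():
    print(f"{country}: about {cagr(start, end, 2009 - 1970):.1%} a year")
# South Korea: about 12.0% a year
# Malaysia: about 7.7% a year
```

Roughly 12% a year against under 8% a year doesn't sound dramatic, but compounded over four decades it is enough to turn a small Malaysian lead into a threefold Korean advantage.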

Can Malaysia escape? The initial indications are not encouraging. The economy's growth engine remains unchanged – export-oriented manufacturing backed by foreign investment. Its companies are just not innovating or adding much value to what they produce. You can find all of the ugly details in a very thorough study by the World Bank, released in April. Private investment has fallen precipitously, from more than a third of GDP in the mid-1990s to only some 10% today. Labor productivity is growing more slowly than in the 1990s. The “value added” in manufacturing in Malaysia trails that of many of its neighbors – an indication that Malaysian factories are mainly assembling goods designed elsewhere. R&D spending remains frighteningly low, at about 0.6% of GDP (compared to 3.5% in South Korea). If Malaysia is going to break the “trap,” it has to reverse all of these trends.

How can Malaysia achieve that? The World Bank report has pages of recommendations. The basics include slicing through the bureaucratic red tape that stifles competition and suppresses investment, bolstering the education system so it can churn out more top-notch graduates, and funneling more financial resources to start-ups and other potentially innovative firms. To its credit, the government of Malaysia is fully aware of what it needs to do. In March, Prime Minister Najib Razak introduced a reform program called the New Economic Model. You can read the initial report here. The NEM shows that Najib realizes that excessive government interference in the economy is dampening investor sentiment and holding back Malaysian industry. All eyes are now on the more detailed policy recommendations for the NEM (though it is not clear when those might appear).

Yet I'm wondering if getting policy right is really enough. Of course, it would help, by setting in place better incentives for private businessmen to invest in innovative projects, and creating the tools they need to make those projects work. But I don't think that's the whole story. I've been musing on the differences between South Korea and Malaysia. Why has Korea jumped so far ahead? I think the reason is embedded in the different methods the two countries used to spur rapid growth.

Both countries relied on exports to create rapid gains in income, but they did so differently. South Korea, from its earliest days of export-led development in the mid-1960s, was determined to create homegrown, internationally competitive industries. Though Korean firms supplied big multinationals with components or even entire products, that was never enough – Korea wanted to manufacture its own products under its own brands. The effort was often a painful one – remember Hyundai's first disastrous foray into the U.S. car market in the late 1980s and early 1990s – but Korea is where it is today because its private companies have been working on getting there for a very long time, backed in full by the financial sector and the government.

Malaysia, on the other hand, relied much, much more on foreign investment to drive industrialization. That's not a bad thing – multinational companies provide an instant shot of capital, jobs, expertise and technology into a poor country. MNCs, however, aren't going to develop Malaysian products; that has to take place in the labs and offices of Malaysia's private businesses. But those businessmen have been content to squeeze profits from serving MNCs and maintaining their original, assembly-based business models.

In other words, what Malaysia needs to break out of the “middle-income trap” is a greater national commitment to innovate on its own. Entrepreneurs and bankers have to be willing to take more risks to support inventive ventures and new technologies. Talented workers have to be willing to take jobs at home instead of in Silicon Valley. The Malaysian private sector has to be more devoted to the country's future. This is fuzzy stuff, outside the realm of conventional economics. But I fear that the kind of commitment needed to escape the “trap” can't be created by government initiatives alone.

Wednesday, February 9, 2011

Avoiding the Middle Income Trap

New York Times, 25 October 2010

GYEONGJU, SOUTH KOREA — The past is not an infallible guide to the future, but a reading of how economies have developed suggests that China needs to get ready for a slowdown in economic growth in the coming years.

And that same history lesson could have Beijing praying that it can follow in the footsteps of vibrant South Korea, not stagnant Japan.

The gathering of finance officials from the Group of 20 major economies last weekend was aimed at securing short-term economic growth and currency stability. But the opulence of the resort where the Group of 20 met was a vivid illustration of how South Korea has avoided the so-called middle income trap and continued to push living standards closer to those of rich economies.

For decades, many countries in Latin America and the Middle East have failed in this task. In Asia, the Philippines is a prominent example.

“Many countries make it from low income to middle income, but very few actually make that second leap to high-income,” said Ardo Hansson, a World Bank economist in Beijing. “They seem to get stuck in a trap where your costs are escalating and you lose competitiveness.”

Not so South Korea. When war on the divided peninsula came to a halt in 1953, the south was poorer than the north. By 1997, though, the South Korean per capita gross domestic product (at purchasing power parity exchange rates) had reached 57 percent of the average of the Organization for Economic Cooperation and Development, a group of free-market democracies which Seoul joined in 1996.

The 1997-98 Asian financial meltdown set back many countries across the region. Investment, vital to sustaining medium-term economic growth, has still not recovered to precrisis levels in Malaysia, the Philippines and Thailand.

South Korea, though, after nearly defaulting on its debts at the end of 1997, pulled itself together and resumed its march up the value chain.

The key reason is that Seoul embarked on far-reaching market changes. In particular, the government reduced the power of the chaebol, the sprawling debt-heavy conglomerates whose links to the state created the impression that they were too big to fail.

But many did fail as South Korea injected more competition into the economy, liberalized imports and deregulated the financial sector, which was a captive source of financing for the chaebol.

“They really changed the rules of the game for the large corporations,” said Randall Jones, who heads the O.E.C.D.’s South Korea desk. “It became clear that being big and being close to government was not enough to keep you alive.”

Since the crisis, the South Korean economy has grown more than twice as fast as the O.E.C.D. average, propelling per capita gross domestic product to 83 percent of the group average by 2008.

“Korea is a success story because of what they’ve been able to do during the past decade, and it’s the wave of reform back in 1997-98 that gave them that second wind,” Mr. Jones said.

The lessons for Beijing seem evident. The chaebol can be likened to China’s state-owned enterprises, which generally enjoy cozy monopolies and favorable financing from state-owned banks that are themselves cosseted.

Beijing needs to emphasize the efficiency of investment, not its scale. It must foster innovation and make it easier for more productive private companies to enter sectors like finance and logistics.

“Part of it is just making sure that you are creating new sources of growth all the time,” said Mr. Hansson of the World Bank.

A particular lesson from South Korea is that investing in human capital is critical to avoiding the middle income trap.

“Korea, 50 years ago, already had very high levels of educational attainment,” Mr. Hansson said. “There has to be some sense in which making that final leap really depends upon widespread access to high-quality education.”

Emulating South Korea would help China to improve the structure of its economy and actually benefit from the loss of momentum that history suggests is looming.

According to data compiled by Angus Maddison, an economic historian, and cited by Morgan Stanley, about 40 economies have attained a per capita gross domestic product level of $7,000 over the past century or so.

Remarkably, the average economic growth rate of 31 of those 40 economies was 2.8 percentage points less in the decade after the $7,000 inflection point was reached than in the preceding decade.

Japan and South Korea reached the $7,000 mark around 1969 and 1988, respectively, whereupon their annual average economic growth rates decelerated in the following decade by 4.1 and 2.4 percentage points, respectively, Morgan Stanley calculates.
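To make the arithmetic of that inflection-point claim concrete, here is a minimal sketch of the calculation in Python: find the year a per-capita GDP series (in purchasing-power dollars) first crosses $7,000, then compare average annual growth in the decade before and the decade after. The series in the sketch is invented purely for illustration and is not Maddison's data; only the $7,000 threshold and the decade-by-decade comparison come from the passage above.

```python
# Sketch of the "$7,000 inflection point" calculation: find the year a
# per-capita GDP series first crosses the threshold, then compare
# average annual growth in the decade before and the decade after.
# The series below is hypothetical, NOT Maddison's data.

THRESHOLD = 7_000

def decade_growth_around_threshold(series, threshold=THRESHOLD):
    """series: dict mapping year -> per-capita GDP, covering at least
    ten years on either side of the crossing year. Returns the average
    annual growth rates for the decades before and after the crossing."""
    years = sorted(series)
    cross = next(y for y in years if series[y] >= threshold)

    def avg_growth(start, end):
        # geometric-mean annual growth between two years
        return (series[end] / series[start]) ** (1.0 / (end - start)) - 1.0

    return avg_growth(cross - 10, cross), avg_growth(cross, cross + 10)

# Hypothetical series: 8% growth a year until the threshold is crossed,
# 5% a year afterwards, starting from $3,500 in 1980.
series = {}
level = 3_500.0
for year in range(1980, 2001):
    series[year] = level
    level *= 1.08 if level < THRESHOLD else 1.05

before, after = decade_growth_around_threshold(series)
print(f"decade before: {before:.1%}, decade after: {after:.1%}")
# decade before: 8.0%, decade after: 5.0%
```

On the made-up series this prints an 8% decade followed by a 5% decade, the same shape of slowdown, if not the exact magnitudes, that Morgan Stanley reports for Japan and South Korea.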

China’s per capita gross domestic product is less than $4,000 at market exchange rates, but Morgan Stanley said China had reached Mr. Maddison’s magic number, which is based on purchasing power, in 2008.

“If history is a guide and the law of gravity applies to China, China’s economic growth is set to slow,” Morgan Stanley said in a report.

China’s slowdown might be gentler given its continental-size economy and the potential for catch-up in the poorer interior. But the development experience of its neighbors, including Taiwan, is a benchmark too powerful to ignore.

Morgan Stanley has penciled in average economic growth for China of 8 percent a year between 2010 and 2020, down from 10.3 percent between 2000 and 2009.

Slower, though, can mean a better balance. In Japan and South Korea, consumption and labor income rose sharply as a share of gross domestic product in the decade after the growth rate peaked, while their service sectors expanded strongly.

China’s new five-year plan proclaims the same goals.

“China is not unique,” said Steven Zhang, a Morgan Stanley economist in Shanghai. “It will follow the pattern of Korea and Japan and, after the inflection point, consumption will take off and investment will decline.”

Monday, January 31, 2011

EYES ON THE PRIZE

“Wolf Hall”, by Hilary Mantel, didn’t just win the Man Booker prize last year: it became the fastest-selling Booker winner ever. But behind her triumph, as she reveals in this memoir, lay a complicated relationship with awards ...

From INTELLIGENT LIFE Magazine, Autumn 2010

In 1994 I brought out a novel called “A Change of Climate”, which was shortlisted for a small prize given to books with a religious theme. It was the first time a novel had got so far with the judges, and I was surprised to be in contention. The main characters in my book were Christian missionaries, but after their God had watched unblinking while they endured trials that would shake any faith, their religion became to them no more than a habit, a set of behavioural tics, and in the absence of a belief in a benign universe they carried on grimly trying to be good because they hardly knew how to do anything else.

The winner was to be announced at a low-key gathering at an old-fashioned publishing house near the British Museum. I had never been to a literary party that was anything like this. Some of the invitees seemed to be taking, with shy simpers, their first alcoholic drink of the year. Conversation was a struggle; all we had in common was God. After I didn’t win, I came out into the fine light evening and hailed a cab. What I felt was the usual flatness after a wasted journey; I told myself I hadn’t really expected to win this one. But as we inched through the traffic, a reaction set in. I was swept, I was possessed, by an urge to do something wicked: something truly odious, something that would reveal me as a mistress of moral turpitude and utterly disqualify me from ever being shortlisted for that prize again. But what can you do, by yourself, in the back of a taxi on the way to Waterloo? Wishing to drain the chalice of evil to the dregs, I found myself out of ideas. I could possibly lean out of the window and make hideous faces at pedestrians; but how would they know that it was my hideous face? They might think I was always like that.

For a week or so, after I won the 2009 Man Booker prize for fiction with “Wolf Hall”, people in the streets did recognise me. They’d seen my triumph face, my unpretending grin of delight stretched as wide as a carved pumpkin. Sometimes they would burble happily at me and squeeze my hand, and sometimes they would just smile warmly as they passed, not quite sure who I was but knowing that they’d seen me in the paper, and in a happy context. On the train home one evening, a pale glowing woman with a Vermeer complexion, alighting prosaically at Woking, followed me through a carriage and whispered to me, leaving on my shoulder a ghost-touch of congratulation. All this was new to me. Before the Man Booker, I had trouble being recognised by a bookseller when I was standing next to a stack of my own books. 

I am a veteran of shortlists. I have served my time in the enclosures where the also-rans cool down after the race, every back turned, the hot crowds sucked away as if by a giant magnet to where the winner basks in the camera-flash. I have sat through a five-hour presentation ceremony in Manchester, where the prize was carried off by Anthony Burgess, then a spindly, elderly figure, who looked down at me from his great height, a cheque between thumb and finger, and said, “I expect you need this more than me,” and there again I experienced a wicked but ungratified impulse, to snatch the cheque away and stuff it into my bra. After such an evening, it’s hard to sleep; your failure turns into a queasy mess that churns inside you, mixed in with fragments from the sponsors’ speeches, and the traitorous whispers of dissatisfied judges. Lunchtime ceremonies are easier; but then, what do you do with the rest of the day? Once, when I was trudging home from my second failure to win the £20,000 Sunday Express Book of the Year award, a small boy I knew bobbed out on to the balcony of his flat.

“Did you win?”

I shook my head.

“Never mind,” he said, just like everyone else. And then, quite unlike everyone else: “If you like, you can come up and play with my guinea pig.”

That’s what friends are for. You need distraction; or you need to go home (as I do these days when I lose) and defiantly knock together a paragraph or two of your next effort. At my third shortlisting, I did win the Sunday Express prize. This time it was an evening event, and as the announcement approached I found myself pushing and shoving through a dense crowd of invitees, trying to get somewhere near the front just in case; and getting dirty looks, and elbows in the ribs. At the moment of the announcement I thought that a vast tray of ice-cubes had been broken over my head; the crackling noise was applause, the splintered light was from flashbulbs. The organisers made me hold up, for the cameras, one of those giant cheques they used to give to winners of the football pools. I did it without demur. Did I feel a fool? No. I felt rich.


How to conduct yourself as winner or loser is something the modern writer must work out without help from the writers of the past. As a stylist, you may pick up a trick or two from Proust, but on prize night he’d just have stayed in bed. As prizes have proliferated and increased, advances and royalties have fallen, and the freakish income that a prize brings is more and more important. Prizes bring media attention, especially if the judges can arrange to fall out in public. They bring in-store displays, and press advertising, and all the marketing goodies denied the non-winner; they bring sales, a stimulus to trade at a time when bookselling is in trouble. By the time I won the Man Booker I had scrabbled my way to half a dozen lesser awards, but in the 1980s and 1990s marketing was less sharp, and the whole prize business looked less like a blood sport. I had been publishing for over 20 years, and although the reviewers had been consistently kind, I had never sold in great numbers. But moments after I took my cheque from the hands of the Man Booker judges, an ally approached me, stabbing at an electronic device in her hand: “I’ve just checked Amazon—you’re number one—you’re outselling Dan Brown.”

Amazon itself—with its rating system, its sales charts, its reader reviews—feels like part of the prize industry, part of the process of constantly ranking and categorising authors, and ranking and categorising them in the most public way. To survive the scrutiny you must understand that (much as you love winning them) prizes are not, or not necessarily, a judgment on the literary merit of your work. Winners emerge by negotiation and compromise. Awards have their political aspects, and juries like to demonstrate independence of mind; sometimes a book which has taken one major award is covertly excluded from consideration for others. Sometimes the judges are actors or politicians, who harbour a wish to write fiction themselves—if, of course, they had the time. I have sat on juries where the clashing of celebrity egos drowned out the whispers from the pages surveyed, and the experience has been so unfair and miserable that I have said to myself “never again”. But you do learn from being a judge that, in a literary sense, some verdicts matter and some don’t.

Sometimes the pressure on judges seems intolerable. When I was a Booker judge myself, back in 1990, we read about 105 books. Last year there were 132. I was lucky enough to serve under a businesslike chairman, Sir Denis Forman, a man who knew how to run a meeting from his time at Granada Television. All the same, I remember the final judging session as one of the great ordeals of my life. So much depended on it, for the winners and the losers. I was already nettled by the leaking of tittle-tattle and misinformation to journalists. It came from the administration, I think, not the judges. “Just mischief,” someone suggested, smiling. I was taking it too seriously, I suppose: as if it were a capital trial, and we were going to hang five authors and let one escape. But we were, it seemed to me, giving one author a life—a different life. There was an element of bathos when the winner, A.S. Byatt, said that she would use the money to build a swimming pool at her second home. At times of crisis—and winning this prize is a crisis—people say the most extraordinary things. I seem to recall one novelist saying more humbly that his winner’s cheque would pay for an extra bathroom. For years I dreamt of pursuing the watery theme: of flourishing my £50,000 with a cry of, “At last, I see my way to an indoor lavatory.”

I didn’t say it, of course. Jokes are wasted at prize time. I had never been shortlisted for the Booker till the year of my win, but I looked at those who were already winners with a narrow eye; I read their books, and also searched their faces. But whether it has changed their lives as it has changed mine is a mystery to me. The smiling repression—so many years of congratulating others, so many trudges home, so many taxis, guinea pigs; the sheer hypocrisy of pretending you don’t mind losing. These take their toll. You become a worse person, though not necessarily a worse writer, while you’re waiting for your luck to turn. When finally last year at the Guildhall in London, after an evening of dining and of speeches that, it seemed to me, were excruciatingly prolonged, when finally the moment came and I heard the name of a book and that book was mine, I jumped out of my chair as if I had been shot out of a catapult, and what I felt was primitive, savage glee. You have to win the Man Booker at the right time, pious folk tell you. You have to win it for the right book, runs the received wisdom. Balderdash, said my heart; but it used a stronger, shorter word. You just have to win it, right now. Hand it over. It’s been long enough.

The writer inside you feels no sense of entitlement. She—or it—judges a work by internal standards that are hard to communicate or define. The “author”, the professional who is in the prose business, has worldly concerns. You know the first question from the press will be, “What will you do with the money?” The truth was that I would use it to reduce my mortgage. But that reply would by no means do, and I felt obliged to say “Sex, drugs and rock’n’roll.” The public don’t like to think of authors as citizens who pay their debts. They like to think of them living lives of fabulous dissipation in warm climates, at someone else’s expense. The public want to regard you as a being set apart, with some quirk of brain function or some inbuilt moral freakishness that would explain everything, if only you would acknowledge it. They want to know, what is the stimulus to your creativity? What makes you write?

Sometimes you want to shrug and say, it’s my job. You don’t ask a plumber, what makes you plumb? You understand he does it to get his living. You don’t draw him aside and say, “Actually I plumb a bit myself, would you take a look at this loo I fitted? All my friends say it’s rather good.” But it’s little use insisting that writing is an ordinary job; you’d be lying. Readers understand that something strange is going on when a successful work of fiction is created—something that, annoyingly, defies being cast into words. If we poke the author with a stick for long enough, hard enough, he’ll crack and show us the secret, which perhaps he doesn’t know himself. We have to catch him when he’s vulnerable—and he is never more vulnerable than when someone has caught him on a public platform and given him a big cheque. He may be grinning from ear to ear, but he’s swarming with existential doubts.


“You are currently the top writer in the world,” an interviewer said to me on Booker night.
“It’s not the Olympics,” I said, aghast. The progress of the heart—which is what your writing is—cannot be measured like the progress of your feet on a race track. And yet, you can’t deny, it has been. On a particular night in October, you’ve got your nose ahead of J.M. Coetzee.
   
I have found I can live with the contradictions. I think there is one kind of writer who might be scalped and skinned by the demands the prize imposes, and that is the writer who finds public performance difficult, who has failed to create a persona he can send out to do the show. As people often observe, there is no reason why skill in writing and skill in platform performance would go together; I have witnessed some horrible scenes in the back rooms of bookshops, as writers sweat and stutter and suffer a mini-breakdown before going out to face 20 people, some of whom have wandered in because they saw a light, some of whom have manuscript-shaped parcels under their seats, some of whom have never heard of you before tonight, and have come on purpose to tell you so. Generally, it seems to me, authors are better at presenting themselves than they were ten years ago. Festivals flourish, we get more practice; you could give a reading somewhere every week of the year if you liked. For me the transition between desk and platform seems natural enough. I think of writing fiction as a sort of condensed version of acting and each book as a vast overblown play. You impersonate your characters intensively, you live inside their skins, wear their clothes and stamp or mince through life in their shoes; you breathe in their air. “Madame Bovary, c’est moi.” Of course she is. Who else could she be?

Some nine months on, I can report that the Man Booker has done me nothing but good. Because I am in the middle of a project—my next book is the sequel to the prize-winner—it has not destabilised me, just delayed me. The delay is worthwhile, because the prize has helped me find publishers in 30 countries. It has made my sales soar and hugely boosted my royalties. In doing these things it has cut me free. For the next few years at least, I can write what I like, just as I could before I was ever in print. I wrote for 12 years before I published anything, and in those years I felt a recklessness, a hungry desire, a gnawing expectation, that I lost when I became a jobbing professional who would tap you out a quick 800 words, to a deadline, on almost anything you liked. It is hard to make a good income from fiction alone, but now perhaps I can do it. I haven’t lived in a glamorous whirl since I won the prize. I could have taken up any number of invitations to festivals abroad, but only if I ditched the commitments at home that were already in my diary. I am, anyway, a bit world-weary and more than a bit ill, and intensely interested in the next thing I will write. Even when you are taking your bow, lapping up applause, you do know this brute fact: that you are only as good as your next sentence. You might wake up tomorrow and not be able to do it. The process itself will not fail you. But your nerve might fail.

On the evening of the Man Booker, if you are a shortlisted candidate, you are told that if you win you will be speaking live on air within a moment or two, and that after a long and late night you must be up early for breakfast TV, and that you will be talk-talk-talking into the middle of next week, to an overlapping series of interviewers. You must be ready, poised; so everyone is given a copy of the winner’s schedule to tuck away in their pocket or bag. So, for some hours, no one is the winner and you are all the winner. I already had plans for my week should I lose, and as I waited, watching the TV cameras manoeuvre in the run-up to the chairman’s speech, I split neatly into two component parts: one for schedule A, one for schedule B. All such decisions are narrow ones. You win by a squeak or you lose. Your life changes or it doesn’t. There is really no cause for self-congratulation: no time, either. You do not know till the moment you know; or at least, no wash of rumour reached me, lapping towards the stage from the back of the hall. So I wonder, what happened to the woman on schedule B, the one with the sinking heart and the sad loser’s smile? I can’t help worrying she’s escaped and she’s out there by night, in the chill of last autumn, wandering the city streets in a most inappropriate gold dress.

"Wolf Hall" is out now in paperback (Fourth Estate)

(Hilary Mantel is the author of ten novels, including "Wolf Hall".)
Picture Credit: Diver Aguilar

Friday, January 28, 2011

The Philosophical Novel

by James Ryerson
New York Times, 20 January 2011

Can a novelist write philosophically? Even those novelists most commonly deemed “philosophical” have sometimes answered with an emphatic no. Iris Murdoch, the longtime Oxford philosopher and author of some two dozen novels treating highbrow themes like consciousness and morality, argued that philosophy and literature were contrary pursuits. Philosophy calls on the analytical mind to solve conceptual problems in an “austere, unselfish, candid” prose, she said in a BBC interview broadcast in 1978, while literature looks to the imagination to show us something “mysterious, ambiguous, particular” about the world. Any appearance of philosophical ideas in her own novels was an inconsequential reflection of what she happened to know. “If I knew about sailing ships I would put in sailing ships,” she said. “And in a way, as a novelist, I would rather know about sailing ships than about philosophy.”

Some novelists with philosophical backgrounds vividly recall how they felt when they first encountered Murdoch’s hard-nosed view. Rebecca Newberger Goldstein, whose first novel, “The Mind-Body Problem” (1983), was published after she earned a Ph.D. in philosophy from Princeton, remembers being disappointed and confused. “It didn’t ring true,” she told me. “But how could she not be being truthful about such a central feature of her intellectual and artistic life?” Still, Goldstein and other philosophically trained novelists — including David Foster Wallace, William H. Gass and Clancy Martin — have themselves wrestled with the relationship between their two intellectual masters. Both disciplines seek to ask big questions, to locate and describe deeper truths, to shape some kind of order from the muddle of the world. But are they competitors — the imaginative intellect pitted against the logical mind — or teammates, tackling the same problems from different angles?

Philosophy has historically viewed literature with suspicion, or at least a vague unease. Plato was openly hostile to art, fearful of its ability to produce emotionally beguiling falsehoods that would disrupt the quest for what is real and true. Plato’s view was extreme (he proposed banning dramatists from his model state), but he wasn’t crazy to suggest that the two enterprises have incompatible agendas. Philosophy is written for the few; literature for the many. Philosophy is concerned with the general and abstract; literature with the specific and particular. Philosophy dispels illusions; literature creates them. Most philosophers are wary of the aesthetic urge in themselves. It says something about philosophy that two of its greatest practitioners, Aristotle and Kant, were pretty terrible writers.

Of course, such oppositions are never so simple. Plato, paradoxically, was himself a brilliant literary artist. Nietzsche, Schopenhauer and Kierkegaard were all writers of immense literary as well as philosophical power. Philosophers like Jean-Paul Sartre and George Santayana have written novels, while novelists like Thomas Mann and Robert Musil have created fiction dense with philosophical allusion. Some have even suggested, only half in jest, that of the brothers William and Henry James, the philosopher, William, was the more natural novelist, while the novelist, Henry, was the more natural philosopher. (Experts quibble: “If William is often said to be novelistic, that’s because he is widely — but wrongly — thought to write well,” the philosopher Jerry Fodor told me. “If Henry is said to be philosophical, that’s because he is widely — but wrongly — thought to write badly.”)

David Foster Wallace, who briefly attended the Ph.D. program in philosophy at Harvard after writing a first-rate undergraduate philosophy thesis (published in December by Columbia University Press as “Fate, Time, and Language”), believed that fiction offered a way to capture the emotional mood of a philosophical work. The goal, as he explained in a 1990 essay in The Review of Contemporary Fiction, wasn’t to make “abstract philosophy ‘accessible’ ” by simplifying ideas for a lay audience, but to figure out how to recreate a reader’s more subjective reactions to a philosophical text. Unfortunately, Wallace declared his most overtly philosophical novel — his first, “The Broom of the System” (1987), which incorporates the ideas of Ludwig Wittgenstein — to be a failure in this respect. But he thought others had succeeded in writing “philosophically,” especially David Markson, whose bleak, abstract, solitary novel “Wittgenstein’s Mistress” (1988) he praised for evoking the bleak, abstract, solitary feel of Wittgenstein’s early philosophy.

Another of Wallace’s favorite novels was “Omensetter’s Luck” (1966), by William H. Gass, who received his Ph.D. in philosophy from Cornell and taught philosophy for many years at Washington University in St. Louis. In an interview with The Paris Review in 1976, Gass confessed to feeling a powerful resistance to the analytical rigor of his academic schooling (“I hated it in lots of ways”), though he ultimately appreciated it as a kind of mental strength-training. Like Murdoch, he claimed that the influence of his philosophical education on his fiction was negligible. “I don’t pretend to be treating issues in any philosophical sense,” he said. “I am happy to be aware of how complicated, and how far from handling certain things properly I am, when I am swinging so wildly around.”

Unlike Murdoch, Gass and Wallace, Rebecca Newberger Goldstein, whose latest novel is “36 Arguments for the Existence of God,” treats philosophical questions with unabashed directness in her fiction, often featuring debates or dialogues among characters who are themselves philosophers or physicists or mathematicians. Still, she says that part of her empathizes with Murdoch’s wish to keep the loose subjectivity of the novel at a safe remove from the philosopher’s search for hard truth. It’s a “huge source of inner conflict,” she told me. “I come from a hard-core analytic background: philosophy of science, mathematical logic. I believe in the ideal of objectivity.” But she has become convinced over the years of what you might call the psychology of philosophy: that how we tackle intellectual problems depends critically on who we are as individuals, and is as much a function of temperament as cognition. Embedding a philosophical debate in richly imagined human stories conveys a key aspect of intellectual life. You don’t just understand a conceptual problem, she says: “You feel the problem.”

If you don’t want to overtly feature philosophical ideas in your novel, how sly about it can you be before the effect is lost? Clancy Martin’s first novel, “How to Sell” (2009), a drug-, sex- and diamond-fueled story about a high-school dropout who works with his older brother in the jewelry business, was celebrated by critics as a lot of things — but “philosophical” was not usually one of them. Martin, a professor of philosophy at the University of Missouri at Kansas City, had nonetheless woven into the story, which is at its heart about forms of deception, disguised versions of Kant’s argument on the supposed right to lie in order to save a life, Aristotle’s typology of four kinds of liars, and Nietzsche’s theory of deception (the topic of Martin’s Ph.D. dissertation). Not that anyone noticed. “A lot of my critics said: ‘Couldn’t put it down. You’ll read it in three hours!’ ” Martin told me. “And I felt like I put too much speed into the fastball. I mean, just because you can read it in three hours doesn’t mean that you ought to do so, or that there’s nothing hiding beneath the surface.”

Which raises an interesting, even philosophical question: Is it possible to write a philosophical novel without anyone knowing it?

James Ryerson is an editor at The New York Times Magazine. He wrote the introduction to David Foster Wallace’s “Fate, Time, and Language: An Essay on Free Will,” published in December.

Nonfiction: Nabokov Theory on Butterfly Evolution Is Vindicated

by Carl Zimmer
New York Times, 25 January 2011


A male Acmon blue butterfly (Icaricia acmon). Vladimir Nabokov described the Icaricia genus in 1944.

Vladimir Nabokov may be known to most people as the author of classic novels like “Lolita” and “Pale Fire.” But even as he was writing those books, Nabokov had a parallel existence as a self-taught expert on butterflies.

He was the curator of lepidoptera at the Museum of Comparative Zoology at Harvard University, and collected the insects across the United States. He published detailed descriptions of hundreds of species. And in a speculative moment in 1945, he came up with a sweeping hypothesis for the evolution of the butterflies he studied, a group known as the Polyommatus blues. He envisioned them coming to the New World from Asia over millions of years in a series of waves.

Few professional lepidopterists took these ideas seriously during Nabokov’s lifetime. But in the years since his death in 1977, his scientific reputation has grown. And over the past 10 years, a team of scientists has been applying gene-sequencing technology to his hypothesis about how Polyommatus blues evolved. On Tuesday in the Proceedings of the Royal Society of London, they reported that Nabokov was absolutely right.

“It’s really quite a marvel,” said Naomi Pierce of Harvard, a co-author of the paper.

Nabokov inherited his passion for butterflies from his parents. When his father was imprisoned by the Russian authorities for his political activities, the 8-year-old Vladimir brought a butterfly to his cell as a gift. As a teenager, Nabokov went on butterfly-hunting expeditions and carefully described the specimens he caught, imitating the scientific journals he read in his spare time. Had it not been for the Russian Revolution, which forced his family into exile in 1919, Nabokov said that he might have become a full-time lepidopterist.

In his European exile, Nabokov visited butterfly collections in museums. He used the proceeds of his second novel, “King, Queen, Knave,” to finance an expedition to the Pyrenees, where he and his wife, Vera, netted over a hundred species. The rise of the Nazis drove Nabokov into exile once more in 1940, this time to the United States. It was there that Nabokov found his greatest fame as a novelist. It was also there that he delved deepest into the science of butterflies.


Nabokov spent much of the 1940s dissecting a confusing group of species called Polyommatus blues. He developed forward-thinking ways to classify the butterflies based on differences in their genitalia. He argued that what were thought to be closely related species were actually only distantly related.

At the end of a 1945 paper on the group, he mused on how they had evolved. He speculated that they originated in Asia, moved over the Bering Strait, and moved south all the way to Chile.

Allowing himself a few literary flourishes, Nabokov invited his readers to imagine “a modern taxonomist straddling a Wellsian time machine.” Going back millions of years, he would end up at a time when only Asian forms of the butterflies existed. Then, moving forward again, the taxonomist would see five waves of butterflies arriving in the New World.

Nabokov conceded that the thought of butterflies making a trip from Siberia to Alaska and then all the way down into South America might sound far-fetched. But it made more sense to him than an unknown land bridge spanning the Pacific. “I find it easier to give a friendly little push to some of the forms and hang my distributional horseshoes on the nail of Nome rather than postulate transoceanic land-bridges in other parts of the world,” he wrote.

When “Lolita” made Nabokov a star in 1958, journalists were delighted to discover his hidden life as a butterfly expert. A famous photograph of Nabokov that appeared in The Saturday Evening Post when he was 66 is from a butterfly’s perspective. The looming Russian author swings a net with rapt concentration. But despite the fact that he was the best-known butterfly expert of his day and a Harvard museum curator, other lepidopterists considered Nabokov a dutiful but undistinguished researcher. He could describe details well, they granted, but did not produce scientifically important ideas.

Only in the 1990s did a team of scientists systematically review his work and recognize the strength of his classifications. Dr. Pierce, who became a Harvard biology professor and curator of lepidoptera in 1990, began looking closely at Nabokov’s work while preparing an exhibit to celebrate his 100th birthday in 1999.

She was captivated by his idea of butterflies coming from Asia. “It was an amazing, bold hypothesis,” she said. “And I thought, ‘Oh, my God, we could test this.’ ”

To do so, she would need to reconstruct the evolutionary tree of blues, and estimate when the branches split. It would have been impossible for Nabokov to do such a study on the anatomy of butterflies alone. Dr. Pierce would need their DNA, which could provide more detail about their evolutionary history.

Working with American and European lepidopterists, Dr. Pierce organized four separate expeditions into the Andes in search of blues. Back at her lab at Harvard, she and her colleagues sequenced the genes of the butterflies and used a computer to calculate the most likely relationships between them. They also compared the number of mutations each species had acquired to determine how long ago they had diverged from one another.

There were several plausible hypotheses for how the butterflies might have evolved. They might have evolved in the Amazon, with the rising Andes fragmenting their populations. If that were true, the species would be closely related to one another.

But that is not what Dr. Pierce found. Instead, she and her colleagues found that the New World species shared a common ancestor that lived about 10 million years ago. But many New World species were more closely related to Old World butterflies than to their neighbors. Dr. Pierce and her colleagues concluded that five waves of butterflies came from Asia to the New World — just as Nabokov had speculated.

“By God, he got every one right,” Dr. Pierce said. “I couldn’t get over it — I was blown away.”

Dr. Pierce and her colleagues also investigated Nabokov’s idea that the butterflies had come over the Bering Strait. The land surrounding the strait was relatively warm 10 million years ago, and has been chilling steadily ever since. Dr. Pierce and her colleagues found that the first lineage of Polyommatus blues that made the journey could survive a temperature range that matched the Bering climate of 10 million years ago. The lineages that came later are more cold-hardy, each with a temperature range matching the falling temperatures.
Nabokov’s taxonomic horseshoes turn out to belong in Nome after all.

"What a great paper," said James Mallet, an expert on butterfly evolution at University College London. "It's a fitting tribute to the great man to see that the most modern methods that technology can deliver now largely support his systematic arrangement."

Dr. Pierce says she believes Nabokov would have been greatly pleased to be so vindicated, and points to one of his most famous poems, “On Discovering a Butterfly.” The 1943 poem begins:

I found it and I named it, being versed
in taxonomic Latin; thus became
godfather to an insect and its first
describer — and I want no other fame.

“He felt that his scientific work was standing for all time, and that he was just a player in a much bigger enterprise,” said Dr. Pierce. “He was not known as a scientist, but this certainly indicates to me that he knew what it’s all about.”

Tuesday, January 25, 2011

Where have all the thinkers gone?

By Gideon Rachman
Financial Times, 24 January 2011


A few weeks ago I was sitting in my office, reading Foreign Policy magazine, when I made a striking discovery. Sitting next door to me, separated only by a narrow partition, is one of the world’s leading thinkers. Every year, Foreign Policy lists the people it regards as the “Top 100 Global Thinkers”. And there, at number 37, was Martin Wolf.

I popped next door to congratulate my colleague. Under such circumstances, it is compulsory for any English person to make a self-deprecating remark and Martin did not fail me. The list of intellectuals from 2010, he suggested, looked pretty feeble compared with a similar list that could have been drawn up in the mid 19th century.

This was more than mere modesty. He has a point. Once you start the list-making exercise, it is difficult to avoid the impression that we are living in a trivial age.

The Foreign Policy list for 2010, it has to be said, is slightly odd since the magazine’s top 10 thinkers are all more famous as doers. In joint first place come Bill Gates and Warren Buffett for their philanthropic efforts. Then come the likes of Barack Obama (at number three), Celso Amorim, the Brazilian foreign minister (sixth), and David Petraeus, the American general and also, apparently, the world’s eighth most significant thinker. It is not until you get down to number 12 on the list that you find somebody who is more famous for thinking than doing – Nouriel Roubini, the economist.

But, as the list goes on, genuine intellectuals begin to dominate. There are economists such as Joseph Stiglitz, journalists (Christopher Hitchens), philosophers (Martha Nussbaum), political scientists (Michael Mandelbaum), novelists (Mario Vargas Llosa) and theologians (Abdolkarim Soroush). Despite an inevitable bias to the English-speaking world, there are representatives from every continent including Hu Shuli, a Chinese editor, and Jacques Attali, carrying the banner for French intellectuals.

It is an impressive group of people. But now compare it with a similar list that could have been compiled 150 years ago. The 1861 rankings could have started with Charles Darwin and John Stuart Mill – On the Origin of Species and On Liberty were both published in 1859. Then you could include Karl Marx and Charles Dickens. And that was just the people living in and around London. In Russia, Tolstoy and Dostoevsky were both at work, although neither had yet published their greatest novels.

Even if, like Foreign Policy, you have a preference for politicians, the contrast between the giants of yesteryear and the relative pygmies of today is alarming. In 1861 the list would have included Lincoln, Gladstone, Bismarck and Garibaldi. Their modern equivalents would be Mr Obama, Nick Clegg, Angela Merkel and Silvio Berlusconi.

Still, perhaps 1861 was a freak? So let us repeat the exercise, and go back to the year when the second world war broke out. A list of significant intellectuals alive in 1939 would have included Einstein, Keynes, TS Eliot, Picasso, Freud, Gandhi, Orwell, Churchill, Hayek, Sartre.

So why does the current crop of thinkers seem so unimpressive? Here are a few possible explanations.

The first is that you might need a certain distance in order to judge greatness. Maybe it is only in retrospect that we can identify the real giants. It is certainly true that some of the people I have listed were not widely known or respected at the time. Marx worked largely in obscurity; Dickens was dismissed as a hack by some of his contemporaries; and Orwell’s reputation has also grown hugely since his death. But most of the giants of 1861 and 1939 were recognised as great intellects during their lifetime and some – such as Einstein and Picasso – became much-admired celebrities.

A second possibility is that familiarity breeds contempt. Maybe we are surrounded by thinkers who are just as great as the giants of the past, but we cannot recognise the fact because they are still in our midst. The modern media culture may also lead to overexposure of intellectuals, who are encouraged to produce too much. If Mill had been constantly on television, or Gandhi had tweeted five times a day, they might have seemed less impressive people and been less profound thinkers.

Another theory is that the nature of intellectual life has changed and become more democratic. The lists of 1861 and 1939 are dominated by that notorious species – the “dead white male”. In fact, “dead, white British males” seem to predominate. Perhaps there are intellectual giants at work now, but they are based in China or India or Africa – and have yet to come to the notice of Foreign Policy or the Financial Times.

In the modern world more people have access to knowledge and the ability to publish. The internet also makes collaboration much easier and modern universities promote specialisation. So it could be that the way that knowledge advances these days is through networks of specialists working together, across the globe – rather than through a single, towering intellect pulling together a great theory in the reading room of the British Museum. It is a less romantic idea – but, perhaps, it is more efficient.

And then there is a final possibility. That, for all its wealth and its gadgets, our generation is not quite as smart as it thinks it is.

Monday, January 17, 2011

Wake up and smell the jasmine

By David Gardner in London
Financial Times, 16 January 2011
A demonstrator in Tunis

The ignominious demise of Zine al-Abidine Ben Ali in Tunisia’s “Jasmine Revolution” has put a dent in the armour of the Arab national security state that will set tyrants trembling across the Middle East. The idea that Arab autocracies, with their backbone in the military and their central nervous system in the security services, are uniquely resilient to popular pressure has evaporated in the smoke of Tunis.

While that does not necessarily herald a wave of uprisings across the Arab world, such as those that swept across eastern Europe after the fall of the Berlin Wall, autocrats from Algiers to Amman and from Rabat to Cairo are at last aware that they now live in a different era. They will be on hyper-alert not only to stirrings among their usually cowed peoples but to any hint of change from a west that has acquiesced in their tyranny in the interests of short-term stability in a volatile and strategic region.
 
The west’s long connivance in this “Arab Exception” may be a welcome casualty of the Tunisian drama. The last 30 years have seen waves of democracy burst over almost every other despot-plagued region of the world, from Latin America to eastern Europe, and from sub-Saharan Africa to south-east Asia. Yet the Arab world remained marooned in tyranny. In the post-Communist era there is no other part of the world – not even China – treated by the west with so little regard for the political and human rights of its citizens.

The rationale has changed over time. In the late 19th and first half of the 20th century, France and Britain aborted the normal evolution of constitutional politics in the Arab colonies they carved out of the Ottoman Empire. For Britain the imperative was to secure the western approaches to India. After World War Two and the onset of the Cold War, the priority became to secure cheap oil, safeguard Israel and restrict the intrusion of the Soviets.

More recently, Arab regimes have frightened the west into believing that, but for them, Islamists (and Iran’s Shia theocrats) would take over the region. They maintain residual opposition parties – such as Egypt’s Wafd – as down-at-heel courtiers to exhibit to preachy westerners. Meanwhile they have laid waste to the political spectrum, leaving their opponents no rallying point except the mosque.

In the era of satellite TV and social media that has now changed. Tunisia was the second instance of this. Lebanon’s 2005 “Cedar Revolution” was a precursor, a civic uprising that ended three decades of Syrian occupation in less than three months. The digital revolution has reintegrated a fragmented Arab world in ways its technologically challenged leaders did not foresee and means socioeconomic grievances can quickly translate into broader political demands.

Economic hardship is, of course, the tinder that tends first to ignite, especially in a period of food- and fuel-price inflation. The lack of opportunity for young, increasingly educated populations, where between half and two-thirds are under the age of 25, is also a timebomb. The kleptocratic monopoly by most Arab regimes of resources as well as power is another.

But the narrative that economic reform must precede political reform – “let’s build the middle classes and then we’ll have some liberals to liberalise with” as one US ambassador once put it – is crudely determinist and an alibi for indefinitely postponing any political opening. Liberalising the economy quickly hits the wall of the national security states and the interests vested in them – which have no time for liberals.

Under US pressure, President Hosni Mubarak of Egypt allowed the liberal Ayman Nour to stand against him in 2005. He restricted his majority to a mere 88 per cent, and then jailed his opponent on bogus charges. When Mr Mubarak took power three decades ago, 39 per cent of Egyptians were in absolute poverty; now 43 per cent are.

Mr Ben Ali was a western poster boy for economic reform, even as his family fed off the economy.

Last week, as the fire in Tunisia raged, Hillary Clinton, US secretary of state, highlighted the region’s economic stagnation. Michèle Alliot-Marie, France’s foreign minister, even suggested sending French riot police to help. Wake up and smell the jasmine: it’s the politics, stupid.