Monday, January 31, 2011

EYES ON THE PRIZE

“Wolf Hall”, by Hilary Mantel, didn’t just win the Man Booker prize last year: it became the fastest-selling Booker winner ever. But behind her triumph, as she reveals in this memoir, lay a complicated relationship with awards ...

From INTELLIGENT LIFE Magazine, Autumn 2010

In 1994 I brought out a novel called “A Change of Climate”, which was shortlisted for a small prize given to books with a religious theme. It was the first time a novel had got so far with the judges, and I was surprised to be in contention. The main characters in my book were Christian missionaries, but after their God had watched unblinking while they endured trials that would shake any faith, their religion became to them no more than a habit, a set of behavioural tics, and in the absence of a belief in a benign universe they carried on grimly trying to be good because they hardly knew how to do anything else.

The winner was to be announced at a low-key gathering at an old-fashioned publishing house near the British Museum. I had never been to a literary party that was anything like this. Some of the invitees seemed to be taking, with shy simpers, their first alcoholic drink of the year. Conversation was a struggle; all we had in common was God. After I didn’t win, I came out into the fine light evening and hailed a cab. What I felt was the usual flatness after a wasted journey; I told myself I hadn’t really expected to win this one. But as we inched through the traffic, a reaction set in. I was swept, I was possessed, by an urge to do something wicked: something truly odious, something that would reveal me as a mistress of moral turpitude and utterly disqualify me from ever being shortlisted for that prize again. But what can you do, by yourself, in the back of a taxi on the way to Waterloo? Wishing to drain the chalice of evil to the dregs, I found myself out of ideas. I could possibly lean out of the window and make hideous faces at pedestrians; but how would they know that it was my hideous face? They might think I was always like that.

For a week or so, after I won the 2009 Man Booker prize for fiction with “Wolf Hall”, people in the streets did recognise me. They’d seen my triumph face, my unpretending grin of delight stretched as wide as a carved pumpkin. Sometimes they would burble happily at me and squeeze my hand, and sometimes they would just smile warmly as they passed, not quite sure who I was but knowing that they’d seen me in the paper, and in a happy context. On the train home one evening, a pale glowing woman with a Vermeer complexion, alighting prosaically at Woking, followed me through a carriage and whispered to me, leaving on my shoulder a ghost-touch of congratulation. All this was new to me. Before the Man Booker, I had trouble being recognised by a bookseller when I was standing next to a stack of my own books. 

I am a veteran of shortlists. I have served my time in the enclosures where the also-rans cool down after the race, every back turned, the hot crowds sucked away as if by a giant magnet to where the winner basks in the camera-flash. I have sat through a five-hour presentation ceremony in Manchester, where the prize was carried off by Anthony Burgess, then a spindly, elderly figure, who looked down at me from his great height, a cheque between thumb and finger, and said, “I expect you need this more than me,” and there again I experienced a wicked but ungratified impulse, to snatch the cheque away and stuff it into my bra. After such an evening, it’s hard to sleep; your failure turns into a queasy mess that churns inside you, mixed in with fragments from the sponsors’ speeches, and the traitorous whispers of dissatisfied judges. Lunchtime ceremonies are easier; but then, what do you do with the rest of the day? Once, when I was trudging home from my second failure to win the £20,000 Sunday Express Book of the Year award, a small boy I knew bobbed out on to the balcony of his flat.

“Did you win?”

I shook my head.

“Never mind,” he said, just like everyone else. And then, quite unlike everyone else: “If you like, you can come up and play with my guinea pig.”

That’s what friends are for. You need distraction; or you need to go home (as I do these days when I lose) and defiantly knock together a paragraph or two of your next effort. At my third shortlisting, I did win the Sunday Express prize. This time it was an evening event, and as the announcement approached I found myself pushing and shoving through a dense crowd of invitees, trying to get somewhere near the front just in case; and getting dirty looks, and elbows in the ribs. At the moment of the announcement I thought that a vast tray of ice-cubes had been broken over my head; the crackling noise was applause, the splintered light was from flashbulbs. The organisers made me hold up, for the cameras, one of those giant cheques they used to give to winners of the football pools. I did it without demur. Did I feel a fool? No. I felt rich.


How to conduct yourself as winner or loser is something the modern writer must work out without help from the writers of the past. As a stylist, you may pick up a trick or two from Proust, but on prize night he’d just have stayed in bed. As prizes have proliferated and increased, advances and royalties have fallen, and the freakish income that a prize brings is more and more important. Prizes bring media attention, especially if the judges can arrange to fall out in public. They bring in-store displays, and press advertising, and all the marketing goodies denied the non-winner; they bring sales, a stimulus to trade at a time when bookselling is in trouble. By the time I won the Man Booker I had scrabbled my way to half a dozen lesser awards, but in the 1980s and 1990s marketing was less sharp, and the whole prize business looked less like a blood sport. I had been publishing for over 20 years, and although the reviewers had been consistently kind, I had never sold in great numbers. But moments after I took my cheque from the hands of the Man Booker judges, an ally approached me, stabbing at an electronic device in her hand: “I’ve just checked Amazon—you’re number one—you’re outselling Dan Brown.”

Amazon itself—with its rating system, its sales charts, its reader reviews—feels like part of the prize industry, part of the process of constantly ranking and categorising authors, and ranking and categorising them in the most public way. To survive the scrutiny you must understand that (much as you love winning them) prizes are not, or not necessarily, a judgment on the literary merit of your work. Winners emerge by negotiation and compromise. Awards have their political aspects, and juries like to demonstrate independence of mind; sometimes a book which has taken one major award is covertly excluded from consideration for others. Sometimes the judges are actors or politicians, who harbour a wish to write fiction themselves—if, of course, they had the time. I have sat on juries where the clashing of celebrity egos drowned out the whispers from the pages surveyed, and the experience has been so unfair and miserable that I have said to myself “never again”. 
But you do learn from being a judge that, in a literary sense, some verdicts matter and some don’t.

Sometimes the pressure on judges seems intolerable. When I was a Booker judge myself, back in 1990, we read about 105 books. Last year there were 132. I was lucky enough to serve under a businesslike chairman, Sir Denis Forman, a man who knew how to run a meeting from his time at Granada Television. All the same, I remember the final judging session as one of the great ordeals of my life. So much depended on it, for the winners and the losers. I was already nettled by the leaking of tittle-tattle and misinformation to journalists. It came from the administration, I think, not the judges. “Just mischief,” someone suggested, smiling. I was taking it too seriously, I suppose: as if it were a capital trial, and we were going to hang five authors and let one escape. But we were, it seemed to me, giving one author a life—a different life. There was an element of bathos when the winner, A.S. Byatt, said that she would use the money to build a swimming pool at her second home. At times of crisis—and winning this prize is a crisis—people say the most extraordinary things. I seem to recall one novelist saying more humbly that his winner’s cheque would pay for an extra bathroom. For years I dreamt of pursuing the watery theme: of flourishing my £50,000 with a cry of, “At last, I see my way to an indoor lavatory.”

I didn’t say it, of course. Jokes are wasted at prize time. I had never been shortlisted for the Booker till the year of my win, but I looked at those who were already winners with a narrow eye; I read their books, and also searched their faces. But whether it has changed their lives as it has changed mine is a mystery to me. The smiling repression—so many years of congratulating others, so many trudges home, so many taxis, guinea pigs; the sheer hypocrisy of pretending you don’t mind losing. These take their toll. You become a worse person, though not necessarily a worse writer, while you’re waiting for your luck to turn. When finally last year at the Guildhall in London, after an evening of dining and of speeches that, it seemed to me, were excruciatingly prolonged, when finally the moment came and I heard the name of a book and that book was mine, I jumped out of my chair as if I had been shot out of a catapult, and what I felt was primitive, savage glee. You have to win the Man Booker at the right time, pious folk tell you. You have to win it for the right book, runs the received wisdom. Balderdash, said my heart; but it used a stronger, shorter word. You just have to win it, right now. Hand it over. It’s been long enough.

The writer inside you feels no sense of entitlement. She—or it—judges a work by internal standards that are hard to communicate or define. The “author”, the professional who is in the prose business, has worldly concerns. You know the first question from the press will be, “What will you do with the money?” The truth was that I would use it to reduce my mortgage. But that reply would by no means do, and I felt obliged to say “Sex, drugs and rock’n’roll.” The public don’t like to think of authors as citizens who pay their debts. They like to think of them living lives of fabulous dissipation in warm climates, at someone else’s expense. The public want to regard you as a being set apart, with some quirk of brain function or some inbuilt moral freakishness that would explain everything, if only you would acknowledge it. They want to know, what is the stimulus to your creativity? What makes you write?

Sometimes you want to shrug and say, it’s my job. You don’t ask a plumber, what makes you plumb? You understand he does it to get his living. You don’t draw him aside and say, “Actually I plumb a bit myself, would you take a look at this loo I fitted? All my friends say it’s rather good.” But it’s little use insisting that writing is an ordinary job; you’d be lying. Readers understand that something strange is going on when a successful work of fiction is created—something that, annoyingly, defies being cast into words. If we poke the author with a stick for long enough, hard enough, he’ll crack and show us the secret, which perhaps he doesn’t know himself. We have to catch him when he’s vulnerable—and he is never more vulnerable than when someone has caught him on a public platform and given him a big cheque. He may be grinning from ear to ear, but he’s swarming with existential doubts.


“You are currently the top writer in the world,” an interviewer said to me on Booker night.
“It’s not the Olympics,” I said, aghast. The progress of the heart—which is what your writing is—cannot be measured like the progress of your feet on a race track. And yet, you can’t deny, it has been. On a particular night in October, you’ve got your nose ahead of J.M. Coetzee.
   
I have found I can live with the contradictions. I think there is one kind of writer who might be scalped and skinned by the demands the prize imposes, and that is the writer who finds public performance difficult, who has failed to create a persona he can send out to do the show. As people often observe, there is no reason why skill in writing and skill in platform performance would go together; I have witnessed some horrible scenes in the back rooms of bookshops, as writers sweat and stutter and suffer a mini-breakdown before going out to face 20 people, some of whom have wandered in because they saw a light, some of whom have manuscript-shaped parcels under their seats, some of whom have never heard of you before tonight, and have come on purpose to tell you so. Generally, it seems to me, authors are better at presenting themselves than they were ten years ago. Festivals flourish, we get more practice; you could give a reading somewhere every week of the year if you liked. For me the transition between desk and platform seems natural enough. I think of writing fiction as a sort of condensed version of acting and each book as a vast overblown play. You impersonate your characters intensively, you live inside their skins, wear their clothes and stamp or mince through life in their shoes; you breathe in their air. “Madame Bovary, c’est moi.” Of course she is. Who else could she be?

Some nine months on, I can report that the Man Booker has done me nothing but good. Because I am in the middle of a project—my next book is the sequel to the prize-winner—it has not destabilised me, just delayed me. The delay is worthwhile, because the prize has helped me find publishers in 30 countries. It has made my sales soar and hugely boosted my royalties. In doing these things it has cut me free. For the next few years at least, I can write what I like, just as I could before I was ever in print. I wrote for 12 years before I published anything, and in those years I felt a recklessness, a hungry desire, a gnawing expectation, that I lost when I became a jobbing professional who would tap you out a quick 800 words, to a deadline, on almost anything you liked. It is hard to make a good income from fiction alone, but now perhaps I can do it. I haven’t lived in a glamorous whirl since I won the prize. I could have taken up any number of invitations to festivals abroad, but only if I ditched the commitments at home that were already in my diary. I am, anyway, a bit world-weary and more than a bit ill, and intensely interested in the next thing I will write. Even when you are taking your bow, lapping up applause, you do know this brute fact: that you are only as good as your next sentence. You might wake up tomorrow and not be able to do it. The process itself will not fail you. But your nerve might fail.

On the evening of the Man Booker, if you are a shortlisted candidate, you are told that if you win you will be speaking live on air within a moment or two, and that after a long and late night you must be up early for breakfast TV, and that you will be talk-talk-talking into the middle of next week, to an overlapping series of interviewers. You must be ready, poised; so everyone is given a copy of the winner’s schedule to tuck away in their pocket or bag. So, for some hours, no one is the winner and you are all the winner. I already had plans for my week should I lose, and as I waited, watching the TV cameras manoeuvre in the run-up to the chairman’s speech, I split neatly into two component parts: one for schedule A, one for schedule B. All such decisions are narrow ones. You win by a squeak or you lose. Your life changes or it doesn’t. There is really no cause for self-congratulation: no time, either. You do not know till the moment you know; or at least, no wash of rumour reached me, lapping towards the stage from the back of the hall. So I wonder, what happened to the woman on schedule B, the one with the sinking heart and the sad loser’s smile? I can’t help worrying she’s escaped and she’s out there by night, in the chill of last autumn, wandering the city streets in a most inappropriate gold dress.

"Wolf Hall" is out now in paperback (Fourth Estate)

(Hilary Mantel is the author of ten novels, including "Wolf Hall".)
Picture Credit: Diver Aguilar

Friday, January 28, 2011

The Philosophical Novel

by James Ryerson
New York Times, 20 January 2011

Can a novelist write philosophically? Even those novelists most commonly deemed “philosophical” have sometimes answered with an emphatic no. Iris Murdoch, the longtime Oxford philosopher and author of some two dozen novels treating highbrow themes like consciousness and morality, argued that philosophy and literature were contrary pursuits. Philosophy calls on the analytical mind to solve conceptual problems in an “austere, unselfish, candid” prose, she said in a BBC interview broadcast in 1978, while literature looks to the imagination to show us something “mysterious, ambiguous, particular” about the world. Any appearance of philosophical ideas in her own novels was an inconsequential reflection of what she happened to know. “If I knew about sailing ships I would put in sailing ships,” she said. “And in a way, as a novelist, I would rather know about sailing ships than about philosophy.”

Some novelists with philosophical backgrounds vividly recall how they felt when they first encountered Murdoch’s hard-nosed view. Rebecca Newberger Goldstein, whose first novel, “The Mind-Body Problem” (1983), was published after she earned a Ph.D. in philosophy from Princeton, remembers being disappointed and confused. “It didn’t ring true,” she told me. “But how could she not be being truthful about such a central feature of her intellectual and artistic life?” Still, Goldstein and other philosophically trained novelists — including David Foster Wallace, William H. Gass and Clancy Martin — have themselves wrestled with the relationship between their two intellectual masters. Both disciplines seek to ask big questions, to locate and describe deeper truths, to shape some kind of order from the muddle of the world. But are they competitors — the imaginative intellect pitted against the logical mind — or teammates, tackling the same problems from different angles?

Philosophy has historically viewed literature with suspicion, or at least a vague unease. Plato was openly hostile to art, fearful of its ability to produce emotionally beguiling falsehoods that would disrupt the quest for what is real and true. Plato’s view was extreme (he proposed banning dramatists from his model state), but he wasn’t crazy to suggest that the two enterprises have incompatible agendas. Philosophy is written for the few; literature for the many. Philosophy is concerned with the general and abstract; literature with the specific and particular. Philosophy dispels illusions; literature creates them. Most philosophers are wary of the aesthetic urge in themselves. It says something about philosophy that two of its greatest practitioners, Aristotle and Kant, were pretty terrible writers.

Of course, such oppositions are never so simple. Plato, paradoxically, was himself a brilliant literary artist. Nietzsche, Schopenhauer and Kierkegaard were all writers of immense literary as well as philosophical power. Philosophers like Jean-Paul Sartre and George Santayana have written novels, while novelists like Thomas Mann and Robert Musil have created fiction dense with philosophical allusion. Some have even suggested, only half in jest, that of the brothers William and Henry James, the philosopher, William, was the more natural novelist, while the novelist, Henry, was the more natural philosopher. (Experts quibble: “If William is often said to be novelistic, that’s because he is widely — but wrongly — thought to write well,” the philosopher Jerry Fodor told me. “If Henry is said to be philosophical, that’s because he is widely — but wrongly — thought to write badly.”)

David Foster Wallace, who briefly attended the Ph.D. program in philosophy at Harvard after writing a first-rate undergraduate philosophy thesis (published in December by Columbia University Press as “Fate, Time, and Language”), believed that fiction offered a way to capture the emotional mood of a philosophical work. The goal, as he explained in a 1990 essay in The Review of Contemporary Fiction, wasn’t to make “abstract philosophy ‘accessible’ ” by simplifying ideas for a lay audience, but to figure out how to recreate a reader’s more subjective reactions to a philosophical text. Unfortunately, Wallace declared his most overtly philosophical novel — his first, “The Broom of the System” (1987), which incorporates the ideas of Ludwig Wittgenstein — to be a failure in this respect. But he thought others had succeeded in writing “philosophically,” especially David Markson, whose bleak, abstract, solitary novel “Wittgenstein’s Mistress” (1988) he praised for evoking the bleak, abstract, solitary feel of Wittgenstein’s early philosophy.

Another of Wallace’s favorite novels was “Omensetter’s Luck” (1966), by William H. Gass, who received his Ph.D. in philosophy from Cornell and taught philosophy for many years at Washington University in St. Louis. In an interview with The Paris Review in 1976, Gass confessed to feeling a powerful resistance to the analytical rigor of his academic schooling (“I hated it in lots of ways”), though he ultimately appreciated it as a kind of mental strength-training. Like Murdoch, he claimed that the influence of his philosophical education on his fiction was negligible. “I don’t pretend to be treating issues in any philosophical sense,” he said. “I am happy to be aware of how complicated, and how far from handling certain things properly I am, when I am swinging so wildly around.”

Unlike Murdoch, Gass and Wallace, Rebecca Newberger Goldstein, whose latest novel is “36 Arguments for the Existence of God,” treats philosophical questions with unabashed directness in her fiction, often featuring debates or dialogues among characters who are themselves philosophers or physicists or mathematicians. Still, she says that part of her empathizes with Murdoch’s wish to keep the loose subjectivity of the novel at a safe remove from the philosopher’s search for hard truth. It’s a “huge source of inner conflict,” she told me. “I come from a hard-core analytic background: philosophy of science, mathematical logic. I believe in the ideal of objectivity.” But she has become convinced over the years of what you might call the psychology of philosophy: that how we tackle intellectual problems depends critically on who we are as individuals, and is as much a function of temperament as cognition. Embedding a philosophical debate in richly imagined human stories conveys a key aspect of intellectual life. You don’t just understand a conceptual problem, she says: “You feel the problem.”

If you don’t want to overtly feature philosophical ideas in your novel, how sly about it can you be before the effect is lost? Clancy Martin’s first novel, “How to Sell” (2009), a drug-, sex- and diamond-fueled story about a high-school dropout who works with his older brother in the jewelry business, was celebrated by critics as a lot of things — but “philosophical” was not usually one of them. Martin, a professor of philosophy at the University of Missouri at Kansas City, had nonetheless woven into the story, which is at its heart about forms of deception, disguised versions of Kant’s argument on the supposed right to lie in order to save a life, Aristotle’s typology of four kinds of liars, and Nietzsche’s theory of deception (the topic of Martin’s Ph.D. dissertation). Not that anyone noticed. “A lot of my critics said: ‘Couldn’t put it down. You’ll read it in three hours!’ ” Martin told me. “And I felt like I put too much speed into the fastball. I mean, just because you can read it in three hours doesn’t mean that you ought to do so, or that there’s nothing hiding beneath the surface.”
Which raises an interesting, even philosophical question: Is it possible to write a philosophical novel without anyone knowing it?

James Ryerson is an editor at The New York Times Magazine. He wrote the introduction to David Foster Wallace’s “Fate, Time, and Language: An Essay on Free Will,” published in December.

Nonfiction: Nabokov Theory on Butterfly Evolution Is Vindicated

by Carl Zimmer
New York Times, 25 January 2011


A male Acmon blue butterfly (Icaricia acmon). Vladimir Nabokov described the Icaricia genus in 1944.

Vladimir Nabokov may be known to most people as the author of classic novels like “Lolita” and “Pale Fire.” But even as he was writing those books, Nabokov had a parallel existence as a self-taught expert on butterflies.

He was the curator of lepidoptera at the Museum of Comparative Zoology at Harvard University, and collected the insects across the United States. He published detailed descriptions of hundreds of species. And in a speculative moment in 1945, he came up with a sweeping hypothesis for the evolution of the butterflies he studied, a group known as the Polyommatus blues. He envisioned them coming to the New World from Asia over millions of years in a series of waves.

Few professional lepidopterists took these ideas seriously during Nabokov’s lifetime. But in the years since his death in 1977, his scientific reputation has grown. And over the past 10 years, a team of scientists has been applying gene-sequencing technology to his hypothesis about how Polyommatus blues evolved. On Tuesday in the Proceedings of the Royal Society of London, they reported that Nabokov was absolutely right.

“It’s really quite a marvel,” said Naomi Pierce of Harvard, a co-author of the paper.

Nabokov inherited his passion for butterflies from his parents. When his father was imprisoned by the Russian authorities for his political activities, the 8-year-old Vladimir brought a butterfly to his cell as a gift. As a teenager, Nabokov went on butterfly-hunting expeditions and carefully described the specimens he caught, imitating the scientific journals he read in his spare time. Had it not been for the Russian Revolution, which forced his family into exile in 1919, Nabokov said that he might have become a full-time lepidopterist.

In his European exile, Nabokov visited butterfly collections in museums. He used the proceeds of his second novel, “King, Queen, Knave,” to finance an expedition to the Pyrenees, where he and his wife, Vera, netted over a hundred species. The rise of the Nazis drove Nabokov into exile once more in 1940, this time to the United States. It was there that Nabokov found his greatest fame as a novelist. It was also there that he delved deepest into the science of butterflies.


Nabokov spent much of the 1940s dissecting a confusing group of species called Polyommatus blues. He developed forward-thinking ways to classify the butterflies based on differences in their genitalia. He argued that what were thought to be closely related species were actually only distantly related.

At the end of a 1945 paper on the group, he mused on how they had evolved. He speculated that they originated in Asia, moved over the Bering Strait, and moved south all the way to Chile.

Allowing himself a few literary flourishes, Nabokov invited his readers to imagine “a modern taxonomist straddling a Wellsian time machine.” Going back millions of years, he would end up at a time when only Asian forms of the butterflies existed. Then, moving forward again, the taxonomist would see five waves of butterflies arriving in the New World.

Nabokov conceded that the thought of butterflies making a trip from Siberia to Alaska and then all the way down into South America might sound far-fetched. But it made more sense to him than an unknown land bridge spanning the Pacific. “I find it easier to give a friendly little push to some of the forms and hang my distributional horseshoes on the nail of Nome rather than postulate transoceanic land-bridges in other parts of the world,” he wrote.

When “Lolita” made Nabokov a star in 1958, journalists were delighted to discover his hidden life as a butterfly expert. A famous photograph of Nabokov that appeared in The Saturday Evening Post when he was 66 is from a butterfly’s perspective. The looming Russian author swings a net with rapt concentration. But despite the fact that he was the best-known butterfly expert of his day and a Harvard museum curator, other lepidopterists considered Nabokov a dutiful but undistinguished researcher. He could describe details well, they granted, but did not produce scientifically important ideas.

Only in the 1990s did a team of scientists systematically review his work and recognize the strength of his classifications. Dr. Pierce, who became a Harvard biology professor and curator of lepidoptera in 1990, began looking closely at Nabokov’s work while preparing an exhibit to celebrate his 100th birthday in 1999.

She was captivated by his idea of butterflies coming from Asia. “It was an amazing, bold hypothesis,” she said. “And I thought, ‘Oh, my God, we could test this.’ ”

To do so, she would need to reconstruct the evolutionary tree of blues, and estimate when the branches split. It would have been impossible for Nabokov to do such a study on the anatomy of butterflies alone. Dr. Pierce would need their DNA, which could provide more detail about their evolutionary history.

Working with American and European lepidopterists, Dr. Pierce organized four separate expeditions into the Andes in search of blues. Back at her lab at Harvard, she and her colleagues sequenced the genes of the butterflies and used a computer to calculate the most likely relationships between them. They also compared the number of mutations each species had acquired to determine how long ago they had diverged from one another.

There were several plausible hypotheses for how the butterflies might have evolved. They might have evolved in the Amazon, with the rising Andes fragmenting their populations. If that were true, the species would be closely related to one another.

But that is not what Dr. Pierce found. Instead, she and her colleagues found that the New World species shared a common ancestor that lived about 10 million years ago. But many New World species were more closely related to Old World butterflies than to their neighbors. Dr. Pierce and her colleagues concluded that five waves of butterflies came from Asia to the New World — just as Nabokov had speculated.

“By God, he got every one right,” Dr. Pierce said. “I couldn’t get over it — I was blown away.”

Dr. Pierce and her colleagues also investigated Nabokov’s idea that the butterflies had come over the Bering Strait. The land surrounding the strait was relatively warm 10 million years ago, and has been chilling steadily ever since. Dr. Pierce and her colleagues found that the first lineage of Polyommatus blues that made the journey could survive a temperature range that matched the Bering climate of 10 million years ago. The lineages that came later are more cold-hardy, each with a temperature range matching the falling temperatures.
Nabokov’s taxonomic horseshoes turn out to belong in Nome after all.

"What a great paper," said James Mallet, an expert on butterfly evolution at University College London. "It's a fitting tribute to the great man to see that the most modern methods that technology can deliver now largely support his systematic arrangement."

Dr. Pierce says she believes Nabokov would have been greatly pleased to be so vindicated, and points to one of his most famous poems, “On Discovering a Butterfly.” The 1943 poem begins:

I found it and I named it, being versed
in taxonomic Latin; thus became
godfather to an insect and its first
describer — and I want no other fame.

“He felt that his scientific work was standing for all time, and that he was just a player in a much bigger enterprise,” said Dr. Pierce. “He was not known as a scientist, but this certainly indicates to me that he knew what it’s all about.”

Tuesday, January 25, 2011

Where have all the thinkers gone?

By Gideon Rachman
Financial Times, 24 January 2011


A few weeks ago I was sitting in my office, reading Foreign Policy magazine, when I made a striking discovery. Sitting next door to me, separated only by a narrow partition, is one of the world’s leading thinkers. Every year, Foreign Policy lists the people it regards as the “Top 100 Global Thinkers”. And there, at number 37, was Martin Wolf.

I popped next door to congratulate my colleague. Under such circumstances, it is compulsory for any English person to make a self-deprecating remark and Martin did not fail me. The list of intellectuals from 2010, he suggested, looked pretty feeble compared with a similar list that could have been drawn up in the mid 19th century.

This was more than mere modesty. He has a point. Once you start the list-making exercise, it is difficult to avoid the impression that we are living in a trivial age.

The Foreign Policy list for 2010, it has to be said, is slightly odd since the magazine’s top 10 thinkers are all more famous as doers. In joint first place come Bill Gates and Warren Buffett for their philanthropic efforts. Then come the likes of Barack Obama (at number three), Celso Amorim, the Brazilian foreign minister (sixth), and David Petraeus, the American general and also, apparently, the world’s eighth most significant thinker. It is not until you get down to number 12 on the list that you find somebody who is more famous for thinking than doing – Nouriel Roubini, the economist.

But, as the list goes on, genuine intellectuals begin to dominate. There are economists such as Joseph Stiglitz, journalists (Christopher Hitchens), philosophers (Martha Nussbaum), political scientists (Michael Mandelbaum), novelists (Mario Vargas Llosa) and theologians (Abdolkarim Soroush). Despite an inevitable bias to the English-speaking world, there are representatives from every continent including Hu Shuli, a Chinese editor, and Jacques Attali, carrying the banner for French intellectuals.

It is an impressive group of people. But now compare it with a similar list that could have been compiled 150 years ago. The 1861 rankings could have started with Charles Darwin and John Stuart Mill – On the Origin of Species and On Liberty were both published in 1859. Then you could include Karl Marx and Charles Dickens. And that was just the people living in and around London. In Russia, Tolstoy and Dostoevsky were both at work, although neither had yet published their greatest novels.

Even if, like Foreign Policy, you have a preference for politicians, the contrast between the giants of yesteryear and the relative pygmies of today is alarming. In 1861 the list would have included Lincoln, Gladstone, Bismarck and Garibaldi. Their modern equivalents would be Mr Obama, Nick Clegg, Angela Merkel and Silvio Berlusconi.

Still, perhaps 1861 was a freak? So let us repeat the exercise, and go back to the year when the second world war broke out. A list of significant intellectuals alive in 1939 would have included Einstein, Keynes, TS Eliot, Picasso, Freud, Gandhi, Orwell, Churchill, Hayek, Sartre.

So why does the current crop of thinkers seem so unimpressive? Here are a few possible explanations.
The first is that you might need a certain distance in order to judge greatness. Maybe it is only in retrospect that we can identify the real giants. It is certainly true that some of the people I have listed were not widely known or respected at the time. Marx worked largely in obscurity; Dickens was dismissed as a hack by some of his contemporaries; and Orwell’s reputation has also grown hugely since his death. But most of the giants of 1861 and 1939 were recognised as great intellects during their lifetime and some – such as Einstein and Picasso – became much-admired celebrities.

A second possibility is that familiarity breeds contempt. Maybe we are surrounded by thinkers who are just as great as the giants of the past, but we cannot recognise the fact because they are still in our midst. The modern media culture may also lead to overexposure of intellectuals, who are encouraged to produce too much. If Mill had been constantly on television, or Gandhi had tweeted five times a day, they might have seemed less impressive people and been less profound thinkers.

Another theory is that the nature of intellectual life has changed and become more democratic. The lists of 1861 and 1939 are dominated by that notorious species – the “dead white male”. In fact, “dead, white British males” seem to predominate. Perhaps there are intellectual giants at work now, but they are based in China or India or Africa – and have yet to come to the notice of Foreign Policy or the Financial Times.

In the modern world more people have access to knowledge and the ability to publish. The internet also makes collaboration much easier and modern universities promote specialisation. So it could be that the way that knowledge advances these days is through networks of specialists working together, across the globe – rather than through a single, towering intellect pulling together a great theory in the reading room of the British Museum. It is a less romantic idea – but, perhaps, it is more efficient.

And then there is a final possibility. That, for all its wealth and its gadgets, our generation is not quite as smart as it thinks it is.

Monday, January 17, 2011

Wake up and smell the jasmine

By David Gardner in London
Financial Times, 16 January 2011
A demonstrator in Tunis

The ignominious demise of Zine al-Abidine Ben Ali in Tunisia’s “Jasmine Revolution” has put a dent in the armour of the Arab national security state that will set tyrants trembling across the Middle East. The idea that Arab autocracies, with their backbone in the military and their central nervous system in the security services, are uniquely resilient to popular pressure has evaporated in the smoke of Tunis.

While that does not necessarily herald a wave of uprisings across the Arab world, such as those that swept across eastern Europe after the fall of the Berlin Wall, autocrats from Algiers to Amman and from Rabat to Cairo are at last aware that they now live in a different era. They will be on hyper-alert not only to stirrings among their usually cowed peoples but to any hint of change from a west that has acquiesced in their tyranny in the interests of short-term stability in a volatile and strategic region.
 
The west’s long connivance in this “Arab Exception” may be a welcome casualty of the Tunisian drama. The last 30 years have seen waves of democracy burst over almost every other despot-plagued region of the world, from Latin America to eastern Europe, and from sub-Saharan Africa to south-east Asia. Yet the Arab world remained marooned in tyranny. In the post-Communist era there is no other part of the world – not even China – treated by the West with such little regard for the political and human rights of its citizens.
The rationale has changed over time. In the late 19th and first half of the 20th century, France and Britain aborted the normal evolution of constitutional politics in the Arab colonies they carved out of the Ottoman Empire. For Britain the imperative was to secure the western approaches to India. After World War Two and the onset of the Cold War, the priority became to secure cheap oil, safeguard Israel and restrict the intrusion of the Soviets.

More recently, Arab regimes have frightened the west into believing that, but for them, Islamists (and Iran’s Shia theocrats) would take over the region. They maintain residual opposition parties – such as Egypt’s Wafd – as down-at-heel courtiers to exhibit to preachy westerners. Meanwhile they have laid waste to the political spectrum, leaving their opponents no rallying point except the mosque.

In the era of satellite TV and social media that has now changed. Tunisia was the second instance of this. Lebanon’s 2005 “Cedar Revolution” was a precursor, a civic uprising that ended three decades of Syrian occupation in less than three months. The digital revolution has reintegrated a fragmented Arab world in ways its technologically challenged leaders did not foresee and means socioeconomic grievances can quickly translate into broader political demands.

Economic hardship is, of course, the tinder that tends first to ignite, especially in a period of food- and fuel-price inflation. The lack of opportunity for young, increasingly educated populations, where between half and two-thirds are under the age of 25, is also a timebomb. The kleptocratic monopoly by most Arab regimes of resources as well as power is another.

But the narrative that economic reform must precede political reform – “let’s build the middle classes and then we’ll have some liberals to liberalise with” as one US ambassador once put it – is crudely determinist and an alibi for indefinitely postponing any political opening. Liberalising the economy quickly hits the wall of the national security states and the interests vested in them – which have no time for liberals.

President Hosni Mubarak of Egypt, under US pressure, in 2005 allowed the liberal Ayman Nour to stand against him. He restricted his majority to a mere 88 per cent, and then jailed his opponent on bogus charges. When Mr Mubarak took power three decades ago, 39 per cent of Egyptians were in absolute poverty; now 43 per cent are.

Mr Ben Ali was a western poster boy for economic reform, even as his family fed on the economy.

Last week, as the fire in Tunisia raged, Hillary Clinton, US secretary of state, highlighted the region’s economic stagnation. Michèle Alliot-Marie, France’s foreign minister, even suggested sending French riot police to help. Wake up and smell the jasmine: it’s the politics, stupid.

Darkness on the Edge of the Universe

By Brian Greene
New York Times, 15 January 2011

In a great many fields, researchers would give their eyeteeth to have a direct glimpse of the past. Instead, they generally have to piece together remote conditions using remnants like weathered fossils, decaying parchments or mummified remains. Cosmology, the study of the origin and evolution of the universe, is different. It is the one arena in which we can actually witness history.

The pinpoints of starlight we see with the naked eye are photons that have been streaming toward us for a few years or a few thousand. The light from more distant objects, captured by powerful telescopes, has been traveling toward us far longer than that, sometimes for billions of years. When we look at such ancient light, we are seeing — literally — ancient times.

During the past decade, as observations of such ancient starlight have provided deep insight into the universe’s past, they have also, surprisingly, provided deep insight into the nature of the future. And the future that the data suggest is particularly disquieting — because of something called dark energy.

This story of discovery begins a century ago with Albert Einstein, who realized that space is not an immutable stage on which events play out, as Isaac Newton had envisioned. Instead, through his general theory of relativity, Einstein found that space, and time too, can bend, twist and warp, responding much as a trampoline does to a jumping child. In fact, so malleable is space that, according to the math, the size of the universe necessarily changes over time: the fabric of space must expand or contract — it can’t stay put.

For Einstein, this was an unacceptable conclusion. He’d spent 10 grueling years developing the general theory of relativity, seeking a better understanding of gravity, but to him the notion of an expanding or contracting cosmos seemed blatantly erroneous. It flew in the face of the prevailing wisdom that, over the largest of scales, the universe was fixed and unchanging.

Einstein responded swiftly. He modified the equations of general relativity so that the mathematics would yield an unchanging cosmos. A static situation, like a stalemate in a tug of war, requires equal but opposite forces that cancel each other. Across large distances, the force that shapes the cosmos is the attractive pull of gravity. And so, Einstein reasoned, a counterbalancing force would need to provide a repulsive push. But what force could that be?

Remarkably, he found that a simple modification of general relativity’s equations entailed something that would have, well, blown Newton’s mind: antigravity — a gravitational force that pushes instead of pulls. Ordinary matter, like the Earth or Sun, can generate only attractive gravity, but the math revealed that a more exotic source — an energy that uniformly fills space, much as steam fills a sauna, only invisibly — would generate gravity’s repulsive version. Einstein called this space-filling energy the cosmological constant, and he found that by finely adjusting its value, the repulsive gravity it produced would precisely cancel the usual attractive gravity coming from stars and galaxies, yielding a static cosmos. He breathed a sigh of relief.
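For readers who want to see the notation, which the essay itself leaves out, Einstein’s modification amounts to adding a single new term, the cosmological constant Λ, to his field equations. In standard LaTeX form they read:

    R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

Setting Λ = 0 recovers the original 1915 equations; a small positive Λ plays the role of the uniform, invisible energy described above, and its gravity pushes rather than pulls.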

A dozen years later, however, Einstein rued the day he introduced the cosmological constant. In 1929, the American astronomer Edwin Hubble discovered that distant galaxies are all rushing away from us. And the best explanation for this cosmic exodus came directly from general relativity: much as poppy seeds in a muffin that’s baking move apart as the dough swells, galaxies move apart as the space in which they’re embedded expands. Hubble’s observations thus established that there was no need for a cosmological constant; the universe is not static.

Had Einstein only trusted the original mathematics of general relativity, he would have made one of the most spectacular predictions of all time — that the universe is expanding — more than a decade before it was discovered. Instead, he was left to lick his wounds, summarily removing the cosmological constant from the equations of general relativity and, according to one of his trusted colleagues, calling it his greatest blunder.
But the story of the cosmological constant was far from over.

Fast forward to the 1990s, when we find two teams of astronomers undertaking painstakingly precise observations of distant supernovae — exploding stars so brilliant they can be seen clear across the cosmos — to determine how the expansion rate of space has changed over the history of the universe. These researchers anticipated that the gravitational attraction of matter dotting the night’s sky would slow the expansion, much as Earth’s gravity slows the speed of a ball tossed upward. By bearing witness to distant supernovae, cosmic beacons that trace the universe’s expansion rate at various moments in the past, the teams sought to make this quantitative. Shockingly, however, when the data were analyzed, the teams found that the expansion rate has not been slowing down. It’s been speeding up.

It’s as if that tossed ball shot away from your hand, racing upward faster and faster. You’d conclude that something must be driving the ball away. Similarly, the astronomers concluded that something in space must be pushing galaxies apart ever more quickly. And after scrutinizing the situation, they have found that the push is most likely the repulsive gravity produced by a cosmological constant.

When Einstein introduced the cosmological constant, he envisioned its value being finely adjusted to exactly balance ordinary attractive gravity. But for other values the cosmological constant’s repulsive gravity can beat out attractive gravity, and yield the observed accelerated spatial expansion, spot on. Were Einstein still with us, his discovery that repulsive gravity lies within nature’s repertoire would have likely garnered him another Nobel prize.

As remarkable as it is that even one of Einstein’s “bad” ideas has proven prophetic, many puzzles still surround the cosmological constant: If there is a diffuse, invisible energy permeating space, where did it come from? Is this dark energy (to use modern parlance) a permanent fixture of space, or might its strength change over time? Perhaps most perplexing of all is a question of quantitative detail. The most refined attempts to calculate the amount of dark energy suffusing space miss the measured value by a gargantuan factor of 10^123 (that is, a 1 followed by 123 zeroes) — the single greatest mismatch between theory and observation in the history of science.

These are vital questions that rank among today’s deepest mysteries. But standing beside them is an unassailable conclusion, one that’s particularly unnerving. If the dark energy doesn’t degrade over time, then the accelerated expansion of space will continue unabated, dragging away distant galaxies ever farther and ever faster. A hundred billion years from now, any galaxy that’s not resident in our neighborhood will have been swept away by swelling space for so long that it will be racing from us at faster than the speed of light. (Although nothing can move through space faster than the speed of light, there’s no limit on how fast space itself can expand.)
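A rough, back-of-the-envelope illustration (my numbers, not Dr. Greene’s): by Hubble’s law the recession speed of a galaxy grows in proportion to its distance, so beyond the so-called Hubble distance c/H_0 a galaxy already recedes faster than light:

    v = H_0\, d, \qquad d_H = \frac{c}{H_0} \approx \frac{3\times 10^{5}\ \text{km/s}}{70\ \text{km/s per Mpc}} \approx 4.3\times 10^{3}\ \text{Mpc} \approx 14\ \text{billion light-years}

Under sustained accelerated expansion that threshold behaves, roughly, as a horizon: light emitted from beyond it never catches up with us, which is the fading-sky future the essay goes on to describe.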

Light emitted by such galaxies will therefore fight a losing battle to traverse the rapidly widening gulf that separates us. The light will never reach Earth and so the galaxies will slip permanently beyond our capacity to see, regardless of how powerful our telescopes may become.

Because of this, when future astronomers look to the sky, they will no longer witness the past. The past will have drifted beyond the cliffs of space. Observations will reveal nothing but an endless stretch of inky black stillness.

If astronomers in the far future have records handed down from our era, attesting to an expanding cosmos filled with galaxies, they will face a peculiar choice: Should they believe “primitive” knowledge that speaks of a cosmos very much at odds with what anyone has seen for billions and billions of years? Or should they focus on their own observations and valiantly seek explanations for an island universe containing a small cluster of galaxies floating within an unchanging sea of darkness — a conception of the cosmos that we know definitively to be wrong?

And what if future astronomers have no such records, perhaps because on their planet scientific acumen developed long after the deep night sky faded to black? For them, the notion of an expanding universe teeming with galaxies would be a wholly theoretical construct, bereft of empirical evidence.

We’ve grown accustomed to the idea that with sufficient hard work and dedication, there’s no barrier to how fully we can both grasp reality and confirm our understanding. But by gazing far into space we’ve captured a handful of starkly informative photons, a cosmic telegram billions of years in transit. And the message, echoing across the ages, is clear. Sometimes nature guards her secrets with the unbreakable grip of physical law. Sometimes the true nature of reality beckons from just beyond the horizon.

Brian Greene, a professor of physics and mathematics at Columbia, is the author of the forthcoming book “The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos.”

Friday, January 14, 2011

Japan finds there is more to life than growth

By David Pilling

Financial Times, 5 January 2011
Is Japan the most successful society in the world? Even the question is likely (all right, designed) to provoke ridicule and have you spluttering over your breakfast. The very notion flies in the face of everything we have heard about Japan’s economic stagnation, indebtedness and corporate decline.

Ask a Korean, Hong Kong or US businessman what they think of Japan, and nine out of 10 will shake their head in sorrow, offering the sort of mournful look normally reserved for Bangladeshi flood victims. “It’s so sad what has happened to that country,” one prominent Singaporean diplomat told me recently. “They have just lost their way.”

It is easy to make the case for Japan’s decline. Nominal gross domestic product is roughly where it was in 1991, a sobering fact that appears to confirm the existence of not one, but two, lost decades. In 1994, Japan’s share of global GDP was 17.9 per cent, according to JPMorgan. Last year it had halved to 8.76 per cent. Over roughly the same period, Japan’s share of global trade fell even more steeply to 4 per cent. The stock market continues to thrash around at one-quarter of its 1990 level, deflation saps animal spirits – a common observation is that Japan has lost its “mojo” – and private equity investors have given up on their fantasy that Japanese businesses will one day put shareholders first.

Certainly, these facts tell a story. But it is only partial. Underlying much of the head-shaking about Japan are two assumptions. The first is that a successful economy is one in which foreign businesses find it easy to make money. By that yardstick Japan is a failure and post-war Iraq a glittering triumph. The second is that the purpose of a national economy is to outperform its peers.

If one starts from a different proposition, that the business of a state is to serve its own people, the picture looks rather different, even in the narrowest economic sense. Japan’s real performance has been masked by deflation and a stagnant population. But look at real per capita income – what people in the country actually care about – and things are far less bleak.

By that measure, according to figures compiled by Paul Sheard, chief economist at Nomura, Japan has grown at an annual 0.3 per cent in the past five years. That may not sound like much. But the US is worse, with real per capita income rising 0.0 per cent over the same period. In the past decade, Japanese and US real per capita growth are evenly pegged, at 0.7 per cent a year. One has to go back 20 years for the US to do better – 1.4 per cent against 0.8 per cent. In Japan’s two decades of misery, American wealth creation has outpaced that of Japan, but not by much.

The Japanese themselves frequently refer to non-GDP measures of welfare, such as Japan’s safety, cleanliness, world-class cuisine and lack of social tension. Lest they (and I) be accused of wishy-washy thinking, here are a few hard facts. The Japanese live longer than citizens of any other large country, boasting a life expectancy at birth of 82.17 years, much higher than the US at 78. Unemployment is 5 per cent, high by Japanese standards, but half the level of many western countries. Japan locks up, proportionately, one-twentieth of those incarcerated in the US, yet enjoys among the lowest crime levels in the world.

In a thought-provoking article in The New York Times last year, Norihiro Kato, a professor of literature, suggested that Japan had entered a “post-growth era” in which the illusion of limitless expansion had given way to something more profound. Japan’s non-consuming youth was at the “vanguard of the downsizing movement”, he said. He sounded a little like Walter Berglund, the heroic crank of Jonathan Franzen’s Freedom, who argues that growth in a mature economy, like that in a mature organism, is not healthy but cancerous. “Japan doesn’t need to be No 2 in the world, nor No 5 or 15,” Prof Kato wrote. “It’s time to look to more important things.”

Patrick Smith, an expert on Asia, agrees that Japan is more of a model than a laggard. “They have overcome the impulse – and this is something where the Chinese need to catch up – to westernise radically as a necessity of modernisation.” Japan, more than any other non-western advanced nation, has preserved its culture and rhythms of life, he says.

One must not overdo it. High suicide rates, a subdued role for women and, indeed, the answers that Japanese themselves provide to questionnaires about their happiness, do not speak of a nation entirely at ease with itself in the 21st century. It is also possible that Japan is living on borrowed time. Public debt is among the highest in the world – though, significantly, almost none of it is owed to foreigners – and a younger, poorer-paid generation will struggle to build up the fat savings on which the country is now comfortably slumbering.
If the business of a state is to project economic vigour, then Japan is failing badly. But if it is to keep its citizens employed, safe, economically comfortable and living longer lives, it is not making such a terrible hash of things.