Archive for the ‘Periodical Note’ Category

Periodical Note:
John Lanchester on Austerity, in the LRB

Monday, July 18th, 2011

In the London Review of Books, John Lanchester’s comment, “Once Greece goes…” (dated 30 June), approaches its conclusion (on the vexing differentness of the German economy) with an eloquent capture of the growing public mood that is pushing back at the consequences of generations of European political condescension. I have rendered the strongest feelings in bold.

The Greek people are furious to be told by their deputy prime minister that ‘we ate the money together’; they just don’t agree with that analysis. In the world of money, people are privately outraged by the general unwillingness of electorates to accept the blame for the state they are in. But the general public, it turns out, had very little understanding of the economic mechanisms which were, without their knowing it, ruling their lives. They didn’t vote for the system, and no one explained the system to them, and in any case the rule is that while things are on the way up, no one votes for Cassandra, so no one in public life plays the Cassandra role. Greece has 800,000 civil servants, of whom 150,000 are on course to lose their jobs. The very existence of those jobs may well be a symptom of the three c’s, ‘corruption, cronyism, clientelism’, but that’s not how it feels to the person in the job, who was supposed to do what? Turn down the job offer, in the absence of alternative employment, because it was somehow bad for Greece to have so many public sector workers earning an OK living? Where is the agency in that person’s life, the meaningful space for political-economic action? She is made the scapegoat, the victim, of decisions made at altitudes far above her daily life – and the same goes for all the people undergoing ‘austerity’, not just in Greece. The austerity is supposed to be a consequence of us all having had it a little bit too easy (this is an attitude which is only very gently implied in public, but it’s there, and in private it is sometimes spelled out). But the thing is, most of us don’t feel we did have it particularly easy. When you combine that with the fact that we have so little real agency in our economic lives, we tend to feel we don’t deserve much of the blame. This feeling, which is strong enough in Ireland and Iceland, and which will grow steadily stronger in the UK, is so strong in Greece that the country is heading for a default whose likeliest outcome, by far, is a decade of misery for ordinary Greeks.

As we ponder this divide between the general public and “money,” it seems that our navigational charts are severely out of date. Territories previously marked “Here Be Communists” are now otherwise occupied. Almost everyone wants a decent chance to do better, which means that almost no one is interested in social levelling. But we have learned the hard way that while financiers may know how to make money, that’s usually all they know. Allowing them to set public policy is what produced the worldwide debt crisis.

Our old poles — communist/capitalist, liberal/conservative — are less and less magnetic; they don’t serve to organize our ideas effectively. Complicating the current world situation is the gradual ebbing of a tide of social conservatives who will never be replaced — not, at least, for a very long time. Which problems will persist in the wake of that ebbing, and therefore require serious consideration?

Our governments, for all their modern apparatus, date back to ancien régime foundations. The distinction between public and private sectors has its origins in Renaissance state formation, and represents the breakdown of the feudal hegemony, in which the distinction made no sense. Prior to the Seventeenth Century, the state was looked to for little more than military defense and the odd festal monument. It was supposed to uphold the tangle of established rights that had grown up in the highly localized Middle Ages, but this was just another kind of military support; the state wasn’t supposed to take initiative with respect to these rights. That changed with the activist centralizers of early modern France, and the conflict between government and private interests was launched.

I think that we’re due for a new model.

Periodical Note:
Louis Menand on Higher Ed, in The New Yorker

Thursday, June 2nd, 2011

In his review of some of the depressing books about higher education that have been appearing right and left lately — among them, Professor X’s In the Basement of the Ivory Tower, which I think I’m going to have to read after all — Louis Menand advances three theories of education. They’re not essentially incompatible, and I don’t see why we can’t operate them concurrently. But most people, it seems, naturally favor one theory over the other two, and wonder why anyone would consider the others viable. That’s almost as interesting as the theories themselves. What everyone does agree about is that the Theory 1 approach is not for everyone.

Theory 1 holds that the purpose of higher education is to make the most of society’s most talented minds. This means not wasting energy on minds that display little or no academic talent. It means winnowing and culling, with high standards and tough grading. Theory 2, looking through the other end of the telescope, regards education as a socializing tool that, for that very reason, ought to be made available to everyone. Theory 3 is vocational: education enlarges your skill set by teaching you things that you need to know to get by and/or ahead in real-world situations. All three approaches are utilitarian, which is what makes them compatible in the end. Nobody is arguing that education is an inherent good. I don’t have a problem with that, so long as we make it easy for people who feel that education is inherently good to educate themselves.

What distinguishes the first theory from the second and third is not its apparent high-mindedness but its faith in abstraction and indirection. In other words, the liberal arts. Or maybe not. I’m not sure that “liberal arts” means anything anymore; Menand keeps coming back to “toughing it out with Henry James.” As synecdoche goes, it’s not inapt, because Henry James, at least in his late style, is so extraordinarily articulate that he is difficult to follow, and the ability to follow James’s sentences fluently enough actually to enjoy them is a good sign of the literate competence that we expect of so-called professional people — people who are effectively a law unto themselves, as doctors and lawyers quite often are. (The compact that we make with professional people is that the law that they implement will be sound.) There is an almost hieratic vagueness about the liberal arts that becomes palpable the minute you start looking for books about critical thinking. Everyone agrees that critical thinking is a key component of a liberal arts education. But you can’t buy books that will teach you how to do it, the way you can buy chemistry handbooks. Critical thinking turns out to be more of an experience than a skill. Those of us who have had the experience recognize it in others, like vacations in Paris.

To some extent, in short, “the liberal arts” is simply a racket that the proponents of Theory 1 have settled on — did I say, “racket”? That was rude; I meant “convention” — as a “measure of intellectual capacity and productive potential,” as Menand puts it. It happens to be an academic convention, in that mastering the liberal arts entails a lot of reading and writing. Professor X, quoted by Menand, puts it very well: 

“I have come to think,” he says, “that the two most crucial ingredients in the mysterious mix that makes a good writer may be (1) having read enough throughout a lifetime to have internalized the rhythms of the written word, and (2) refining the ability to mimic those rhythms.”

It is impossible to demonstrate a mastery of the liberal arts without the ability to write clearly and effectively — which means, engagingly. “Mysterious mix” is putting it mildly. My point here, however, is not to talk about what makes good writers. It is rather to suggest that an education scheme bottomed on the liberal arts is going to serve Theory 2 and Theory 3 very poorly, because most people are not good writers. Why? Most of the bad writers are probably poor readers — of the kinds of liberal arts materials that are adumbrated throughout elementary school and that make high school so tedious for any student who is not in a frantic mood to read David Copperfield. That’s the part that Professor X leaves out. The writing from which he expects readers to internalize the rhythms of the written word is the kind of writing that he teaches from his liberal arts curriculum.

What the beneficiaries of Theory 2 and Theory 3 need (and this would seem to be everyone who doesn’t attend a liberal arts college) is a kind of reading material — let’s not call it “literature” yet — that is aimed away from the abstractions and the assumptions of liberal arts prose. I have no idea, really, what this writing would look like, or — most intriguingly — whether liberal arts readers would read it with enthusiasm. Maybe it wouldn’t be reading at all — it might be visual (the dichotomy between “reading” and “seeing” never ceases to surprise me). The one thing I do know is that this new material would put an end to the twin complaints that Theory 1 people have about the alternatives, which are that Theory 2 and Theory 3 offer watered-down versions of Theory 1 education, and that they thereby threaten the integrity of Theory 1’s all-important standards.

We need to know a lot more about why most people don’t enjoy reading. It may have a lot to do with what they have been offered.

Periodical Note:
In The Atlantic
Christian, Myers, Hitchens

Monday, February 14th, 2011

Brian Christian’s feature article in the current issue of The Atlantic, “Mind vs Machine,” is billed on the cover as “Why Machines Will Never Beat the Human Mind,” which nicely captures the distance between what the magazine’s editors think will sell and Christian’s rather different point, reflected in the title of his forthcoming book, from which the piece was adapted: The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive. You can win an award for being “the most human human,” as Christian himself has done, if you participate in the Loebner Prize, an annual event that recreates the Turing Test, and snatch victory from the jaws of artificial-intelligence engineers more frequently than the other human contestants. The crux of Christian’s report is that what makes the Turing Test compelling is the insight that it generates into human complexity. The funhouse aspect of the exercise — trying to fool judges into thinking that they’re talking to people when they’re in fact talking to machines — makes for good headlines, but if Brian Christian is correct, we can expect a jockeying back and forth between man and motherboard in which human beings, regularly losing the title to ever-smarter computers, just as regularly figure out how to win it back.

Christian reminds us of Alan Turing’s brilliant condensation of the thorny question that emerged after World War II: would the new computing machines ever be capable of thought?

Instead of debating this question on purely theoretical grounds, Turing proposed an experiment. Several judges each pose questions, via computer terminal, to several pairs of unseen correspondents, one a human “confederate,” the other a computer program, and attempt to discover which is which. The dialogue can range from small talk to trivia questions, from celebrity gossip to heavy-duty philosophy — the whole gamut of human conversation. Turing predicted that by the year 2000 computers would be able to fool 30 percent of human judges after five minutes of conversation, and that as a result, one would “be able to speak of machines as thinking without expecting to be contradicted.” 
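
To make the mechanics of the test concrete, here is a minimal sketch in Python of how a Turing-style run might be scored against the 30 percent threshold quoted above. The verdict data and the names are hypothetical, mine rather than Turing’s or Christian’s.

```python
# Minimal sketch of scoring a Turing-style test (hypothetical data).
# Each trial pairs one judge with a hidden human confederate and a hidden
# program; after five minutes of conversation the judge labels one
# correspondent "machine". A trial counts as "fooled" when the judge
# labels the human as the machine.

TURING_THRESHOLD = 0.30  # Turing's year-2000 prediction, quoted above

# Hypothetical verdicts: True means the judge was fooled in that trial.
verdicts = [True, False, False, True, False, False, False, True, False, False]

fooling_rate = sum(verdicts) / len(verdicts)
print(f"Fooling rate: {fooling_rate:.0%}")
if fooling_rate >= TURING_THRESHOLD:
    print("By Turing's criterion, one could speak of machines as thinking.")
else:
    print("Below Turing's 30 percent threshold.")
```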

The millennium turned without smiling on Turing’s forecast, but, in 2008, a computer program came within a hair of winning the Loebner Prize.  This inspired Christian to participate in 2009 — and not only that, but to go for the “most human human” award while he was at it. There was nothing frivolous about his undertaking; it’s quite clear that he didn’t sign up for the test so that he could write a breezy article about it. He was motivated by the fear that human beings were giving up too easily — weren’t, in fact, trying to win. While the AI teams poured boundless time and effort into the design of their simulators, the human confederates were being advised, fatuously, to “just be yourself.” As Christian says, it’s hard to tell whether this pap reflected an exaggerated conception of human intelligence or an attempt to fix the fight in the machines’ favor. 

To the extent that “just be yourself” means anything, it is better expressed in one word: “Relax.” That’s what coaches always seem to be telling their athletes before the big fight, and for highly trained minds and bodies, it’s probably sound. You can’t show your stuff to true advantage if you’re worrying about what you’ve got. But ordinary people — this is what “ordinary” means — don’t have any stuff to show. What “just be yourself” says to them is “don’t sweat it.” So, on one side, we have ardent engineers, with their brilliant insights and excruciating attention to detail — and probably some serious funding. On the other, “don’t sweat it.” Rocket scientists versus slackers — not much of a contest.

Christian doesn’t pursue this peculiar asymmetry (not in The Atlantic, anyway), but what’s at work here is the same decayed snobbishness with which the Educational Testing Service insists that special preparatory courses and other preliminary efforts are irrelevant to success on its examinations. This is patently untrue, but the cachet of the ETS achievement and aptitude tests remains bound up in the idea that success in life does not require specialized training. This was the lesson taught to us by the great English gentlemen of Victorian fact and fiction, men who, by following their whims as far as fortune allowed, acquired skills and insights of almost universal application. Boy Scouts varied this theme by straining to remain semper paratus while carrying the lightest backpack. Executive suites are still stuffed with affable generalists who have learned what they know about life from playing golf. In this clubby atmosphere, study and preparation, “boning up” of any kind, look like a kind of cheating.

Even Christian is blown sideways by the gale force of this prejudice; he sounds crashingly unsportsmanlike. 

And so another piece of my confederate strategy fell into place. I would treat the Turing Test’s strange and unfamiliar textual medium more like spoken English, and less like the written language. I would attempt to disrupt the turn-taking “wait and parse” pattern that computers understand, and create a single, flowing chart of verbal behavior, emphasizing timing. If computers understand little about verbal “harmony,” they understand even less about rhythm. 

If nothing was happening on my screen, whether or not it was my turn, I’d elaborate a little on my answer, or add a parenthetical, or throw a question back at the judge — just as we offer and/or fill audible silence when we talk out loud. If the judge took too long composing the next question, I’d keep talking. I would be the one (unlike the bot) with something to prove. If I knew what the judge was about to write, I’d spare him the keystrokes and jump in.

It’s almost funny, how shot through this passage is with the air of deception. All these conscious little tricks, all designed to “fool,” you almost think, the judge into regarding Christian as exactly what he is: a real person.  

“Mind vs Machine” shares some invaluable observations about vernacular discourse. In heated exchanges, for example, people respond more and more exclusively to whatever has just been said, and less and less to the overall tenor of the argument. Researcher (and three-time winner of the “most human computer” prize) Richard Wallace has discovered that “most casual conversation is ‘state-less,’ that is, each reply depends only on the current query, without any knowledge of the history of the conversation required to formulate the reply.” This is a windfall for programmers, because a sudden lurch into ill-tempered language is all too convincing evidence of a human-nature tantrum, and very easy for a machine to fake. Christian draws a very practical lesson:

Aware of the stateless, knee-jerk character of the terse remark I want to blurt out, I recognize that that remark has more to do with a reflex reaction to the very last sentence of the conversation than with either the issue at hand or the person I’m talking to. All of a sudden, the absurdity and ridiculousness of this kind of escalation become quantitatively clear, and, contemptuously unwilling to act like a bot, I steer myself toward a more “stateful” response: better living through science. 
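
Wallace’s observation is easy to see in code. Below is a minimal sketch of a “state-less” reply function in his spirit: the answer is a pure function of the current query alone, with no record of what came before. The pattern table is a hypothetical toy, not Wallace’s actual program.

```python
# Minimal sketch of a "state-less" chatbot in the spirit of Wallace's
# observation: each reply depends only on the current query, never on
# the history of the conversation. The pattern table is hypothetical.

RULES = {
    "hello": "Hi there! How are you?",
    "how are you": "I'm fine. What have you been up to?",
    "no you're wrong": "No, YOU'RE wrong!",  # ill temper is trivially fakeable
}

def reply(query: str) -> str:
    """Return a canned reply keyed on the current query alone."""
    key = query.lower().strip(" ?!.")
    return RULES.get(key, "Interesting. Tell me more.")

# Note there is no conversation object and no memory: the same query
# always yields the same reply, whatever was said before.
print(reply("Hello"))
print(reply("No you're wrong"))
```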

I hope that Christian’s book will make that final point more clearly and happily than his article does. I was deeply put off by a passage that I read before I knew what Christian was up to, when, that is, it seemed that he was doing nothing more interesting than moaning about the possibility that we might some day be overtaken by our mechanical creations. 

The story of the 21st century will be, in part, the story of the drawing and redrawing of those battle lines, the story of Homo sapiens trying to stake a claim on shifting ground, flanked by beast and machine, pinned between meat and math.

That’s pungent prose, but the metaphor of military conflict could hardly be less welcome — or less apposite to Christian’s far more gracious point, which is that computers, instead of supplanting us, can show us how to be better at what we already are. 

***

If you believe that human beings are the Lords of Creation, then there is nothing to worry about when you sit down to dinner; but if you believe rather that we’re just one species among many, then eating becomes tragic, because it requires us to kill. My own view is that the only way to draw a line between eating flesh and eating anything at all is to subscribe to a variant of the pathetic fallacy, according to which animals, being more like us than plants, merit kinder treatment — so it’s okay to finish your vegetables. We have to wonder what the editors of The Atlantic were thinking when they assigned a passel of recent “foodie” books to BR Myers, the Green and vegan professor of North Korean literature. Oh, they were probably hoping for exactly what he delivered, a steaming denunciation of the lot. It’s easy to see why the prim Myers would dislike the louche Anthony Bourdain or the spiritual Kim Severson. But Michael Pollan?

The moral logic in Pollan’s hugely successful book now informs all food writing: the refined palate rejects the taste of factory-farmed meat, of the corn-syrupy junk food that sickens the poor, of frozen fruits and vegetables transported wastefully across oceans — from which it follows that to serve one’s palate is to do right by small farmers, factory-abused cows, Earth itself. This affectation of piety does not keep foodies from vaunting their penchant for obscenely priced meals, for gorging themselves, even for dining on endangered animals — but only rarely is public attention drawn to the contradiction.

If you can find a passage in which Michael Pollan endorses any of the crimes enumerated in the second sentence, please write to Myers to thank him for the tip. Otherwise — and I’m fairly confident that it will have to be “otherwise” — you must still deal with Myers’s attack on everyone else mentioned in his review. I feel none of Myers’s hostility to today’s chic food writers, but I have lost interest in what they have to say, partly because they don’t begin to be honest about the economic elitism that underpins their outlook — those simple, slow-food pleasures are luxury goods, and always will be — and partly because, without getting excited about it, I do agree with Livy (referenced by Myers) that “the glorification of chefs” is probably unhealthy. Writing about food ought to be modest — that’s one of the appeals of Julia Child’s books. Child agreed with the fundamental French precept that there is one (1) right way to do everything, and she sought to convey the rules as clearly as possible to heterodox Americans; but she never raised her voice or succumbed to rapture. Today’s foodies haven’t got Child’s good manners.

The more lives sacrificed for a dinner, the more impressive the eater. Dana Goodyear: “Thirty duck hearts in curry — The ethos of this kind of cooking is undeniably macho.” Amorality as ethos, callousness as bravery, queenly self-absorption as machismo; no small perversion of language is needed to spin heroism out of an evening spent in a chair. 

Well, I couldn’t put it down.  

***

In his favorable review of Sean McMeekin’s The Berlin-Baghdad Express: The Ottoman Empire and Germany’s Bid for World Power, Christopher Hitchens identifies the people who ought to read this book (which would include me):

If asked to discuss some of the events of that period that shaped our world and the world of Osama, many educated people could cite T E Lawrence’s Arab Revolt, the secret Anglo-French Sykes-Picot Agreement portioning out the post-war Middle East, and the Balfour Declaration, which prefigured the coming of the Jewish state. But who can speak with confidence of Max von Oppenheim, the godfather of German “Orientalism” and a sponsor of holy war? An understanding of this conjuncture is essential. It helps supply a key to the collapse of the Islamic caliphate — bin Laden’s most enduring cause of rage — and to the extermination of the Armenians, the swift success of the Bolshevik Revolution, and the relative independence of modern Iran, as well as the continuing divorce between Sunni and Shia Muslims. 

Check!