Archive for the ‘Big ideas’ Category

Big Ideas:
The Rentier Party

Friday, June 17th, 2011

Last week, Paul Krugman published a column that caught my eye. I don’t read Krugman as a rule, because I already agree with what he has to say, and it irks me that anyone who doesn’t should wield any influence in Washington or elsewhere. But I read Friday’s column because its title, “Rule by Rentiers,” not only coincided with my own ideas but struck the same new note: “rentiers.” It had occurred to me only days earlier that the Republican Party, which used to be the party of business, had become the party of rentiers. As Krugman suggests, it’s not just Republicans. It’s political elites everywhere in the West. All seem to be in the pockets of wealthy people whose wealth no longer derives from personal effort.

A word about the word, which means the opposite of its English false-cognate. Rentiers, unlike renters, own things, and their income is derived from the “profits,” or surplus revenue, that their properties generate, whether they be farms, mines, or investment portfolios. (You might say that the French simply looked at the rental process from the other side; a rentier is someone who rents property out; our renters pay rent — to rentiers.)

I don’t mean to demonize rentiers. There may be nothing admirable about living on interest and dividend payments, but there’s nothing shameful about it, either. The mystery, though, is why leaders are attending to rentiers on the one subject that rentiers care nothing about, jobs. In the rentiers’ paradise, there would be no workers, only robots. There may be nothing wrong with that prospect, either. But surely in any discussion of serious social issues such as employment and health care, a class with every reason not to sympathise with workers ought to have a very limited voice at best.

Yesterday, an even fresher insight blossomed on the one that I shared with Paul Krugman. The men and women who run this country’s large corporations (whether as executives or board members) are often members but always agents of the rentier class. That is why they are paid without any regard to their firms’ official profitability. Corporations are only incidentally commercial nowadays. They’re primarily strip mines for wealth whose operations are protected from outside interference by the executive class. It is the same with the big bankers. None of these people is any more interested in business as we know it than a medieval duke.

I don’t fear rentiers themselves. They’re not, as a rule, very bright — that’s how they dragged us into the credit collapse of 2008. As oligarchs, they have little solidarity except when under attack; getting them to agree is like herding cats. Except with regard to two things: the sanctity of contract in good times and an entitlement to bailouts in bad times.

The problem is that American campaign-finance laws have allowed the rentier class to buy the allegiance of the political class. The rentiers are the only people who can foot the bill of our preposterously bloated campaign-advertising programs. Rentiers also fund the think tanks that foment voter dissatisfaction with progressive causes. Rentiers have little interest in the social questions, such as abortion and same-sex marriage, that mobilize Republican Party supporters. But their spokesmen have astutely welded socially conservative issues to the economically regressive ones that mean a great deal to rentiers, and in this they are helped by the fact that economic progressives tend to be social progressives as well.

The only hope for a progressive party in America is to develop a genuine and knowledgeable passion for every kind of business except the conglomerate kind (which is no business at all). An economy of healthy businesses is the sine qua non of healthy societies generally, and it’s time for progressives to stop looking down their noses at people who are driven to earn money by making things and providing services. And to stop confusing these people with the three-card monte artists of Wall Street and their coupon-clipping (oh, for the days!) patrons.

Periodical Note:
Louis Menand on Higher Ed, in The New Yorker

Thursday, June 2nd, 2011

In his review of some of the depressing books about higher education that have been appearing right and left lately — among them, Professor X’s In the Basement of the Ivory Tower, which I think I’m going to have to read after all — Louis Menand advances three theories of education. They’re not essentially incompatible, and I don’t see why we can’t operate them concurrently. But most people, it seems, naturally favor one theory over the other two, and wonder why anyone would consider the others as viable alternatives. That’s almost as interesting as the theories themselves.  What everyone does agree about is that the Theory 1 approach is not for everyone.

Theory 1 holds that the purpose of higher education is to make the most of society’s most talented minds. This means not wasting energy on minds that display little or no academic talent. It means winnowing and culling, with high standards and tough grading. Theory 2, looking through the other end of the telescope, regards education as a socializing tool that, for that very reason, ought to be made available to everyone. Theory 3 is vocational: education enlarges your skill set by teaching you things that you need to know to get by and/or ahead in real-world situations. All three approaches are utilitarian, which is what makes them compatible in the end. Nobody is arguing that education is an inherent good. I don’t have a problem with that, so long as we make it easy for people who feel that education is inherently good to educate themselves.

What distinguishes the first theory from the second and third is not its apparent high-mindedness but its faith in abstraction and indirection. In other words, the liberal arts. Or maybe not. I’m not sure that “liberal arts” means anything anymore; Menand keeps coming back to “toughing it out with Henry James.” As synecdoche goes, it’s not inapt, because Henry James, at least in his late style, is so extraordinarily articulate that he is difficult to follow, and the ability to follow James’s sentences fluently enough actually to enjoy them is a good sign of the literate competence that we expect of so-called professional people — people who are effectively a law unto themselves, as doctors and lawyers quite often are. (The compact that we make with professional people is that the law that they implement will be sound.) There is an almost hieratic vagueness about the liberal arts that becomes palpable the minute you start looking for books about critical thinking. Everyone agrees that critical thinking is a key component of a liberal arts education. But you can’t buy books that will teach you how to do it, the way you can buy chemistry handbooks. Critical thinking turns out to be more of an experience than a skill. Those of us who have had the experience recognize it in others, like vacations in Paris.

To some extent, in short, “the liberal arts” is simply a racket that the proponents of Theory 1 have settled on — did I say, “racket”? That was rude; I meant “convention” — as a “measure of intellectual capacity and productive potential,” as Menand puts it. It happens to be an academic convention, in that mastering the liberal arts entails a lot of reading and writing. Professor X, quoted by Menand, puts it very well: 

“I have come to think,” he says, “that the two most crucial ingredients in the mysterious mix that makes a good writer may be (1) having read enough throughout a lifetime to have internalized the rhythms of the written word, and (2) refining the ability to mimic those rhythms.”

It is impossible to demonstrate a mastery of the liberal arts without the ability to write clearly and effectively — which means, engagingly. “Mysterious mix” is putting it mildly. My point here, however, is not to talk about what makes good writers. It is rather to suggest that an education scheme bottomed on the liberal arts is going to serve Theory 2 and Theory 3 very poorly, because most people are not good writers. Why? Most of the bad writers are probably poor readers — of the kinds of liberal arts materials that are adumbrated throughout elementary school and that make high school so tedious for any student who is not in a frantic mood to read David Copperfield. That’s the part that Professor X leaves out. The writing from which he expects readers to internalize the rhythms of the written word is the kind of writing that he teaches from his liberal arts curriculum.

What the beneficiaries of Theory 2 and Theory 3 need (and this would seem to be everyone who doesn’t attend a liberal arts college) is a kind of reading material — let’s not call it “literature” yet — that is aimed away from the abstractions and the assumptions of liberal arts prose. I have no idea, really, what this writing would look like, or — most intriguingly — if liberal arts readers would like to read it with enthusiasm. Maybe it wouldn’t be reading at all — it might be visual (the dichotomy between “reading” and “seeing” never ceases to surprise me). The one thing I do know is that this new material would put an end to the twin complaints that Theory 1 people have about the alternatives, which are that Theory 2 and Theory 3 offer watered-down versions of Theory 1 education, and that they thereby threaten the integrity of Theory 1’s all-important standards. 

We need to know a lot more about why most people don’t enjoy reading. It may have a lot to do with what they have been offered.

Big Ideas:
Animals and Bons Bourgeois

Tuesday, May 31st, 2011

One of the sweeter jokes in Midnight in Paris has Woody Allen’s stand-in, played by Owen Wilson, suggest a movie idea to Luis Buñuel, on one of his nocturnal sojourns in the Paris of the Twenties. The guests at a fancy dinner party find, when it is over, that they cannot leave the dining room, and in their frustration and panic they emerge from their civilized personas as the “animals” that they really are. Buñuel shrugs and says he doesn’t get it. Why don’t they just leave? he mutters. We’re tickled because we know that when the real Buñuel made Le Charme discret de la bourgeoisie in 1972, he improved on Gil Pender’s idea by backing up the problem: time after time, his six principals sit down to a meal that never appears. There are always reasons (sometimes surrealist reasons) why dinner can’t be served, and it is not the frustration of withheld dinners that exposes the bons bourgeois as animals — worse than animals, really. Gil’s idea, in short, is worked out cinematically from his one-line gag to a feature-length story. The elegance of Woody Allen’s joke is that it prompts the knowledgeable viewer to think about all of this. Buñuel’s shrugging incomprehension asks you to foresee that he will chew on the idea (for decades!) before finding a form for it that he does understand. And if you think that Gil Pender gets the credit for the idea by the time Buñuel is through with it, you haven’t been paying attention. 

Le charme discret de la bourgeoisie is of course about the supposed fraudulence of the bourgeoisie. The meal that is never served stands for the communion that the characters cannot achieve partly because they are actually engaged in narcotics trafficking but also because they are immured in the carapace of their social manners. Instead of food, they are served several courses of dreams, than which nothing could be more solitary. In the end, they have no reason to behave as properly as they do; in the movie’s terms, it doesn’t get them anywhere. The delivery of this lesson could not have been better-timed; by 1972, everyone, not just young people, was ready to consider the possibility that something was wrong with the bourgeois ideal — laughed at for nearly two centuries, no doubt, but no less eagerly pursued by those who could afford it. 

Nearly forty years later, it seems that many people still believe, paradoxically, that the bourgeois manner is both false and genuine. It’s false because it covers up all the rough stuff that’s left at home. It’s genuine because bourgeois people fall for their own cover-up, and start to deny the rough stuff that’s left at home. This second point turns out to have been what was wrong with the old bourgeoisie. As tradesmen and others worked their way into the urban middle classes, starting in the late Eighteenth Century, they were prone to expect thoroughgoing personal transformation, and to believe that they would become bourgeois. But nobody is really bourgeois. Being bourgeois is simply a well-articulated manner of behaving with people whom you don’t know well — or, as it may be, with everyone but the one person with whom you’re in love. Only a dolt would think that it’s hypocritical to be as pleasant as possible in the street, or to bottle up one’s furies and resentments while waiting at the checkout counter. It’s bottling things up and then letting all hell break loose in the company of loved ones that’s a problem.

From the coordinates that I’ve sketched here, it is understandably difficult to make sense of the heart of Jonathan Franzen’s Op-Ed piece in Sunday’s Times. (“Liking Is For Cowards. Go For What Hurts.”) What’s bothering the novelist is Facebook’s appropriation of the verb “like”; unfortunately, he lets his annoyance run away with him. “The simple fact of the matter is that trying to be perfectly likable is incompatible with loving relationships.” Well, yes! But who’s trying to be “perfectly likable”? On the basis of his writing, I’d venture that Franzen himself has a problem with this; he smoulders with that peculiarly Midwestern desire to be liked (which in no way derogates from his achievement as an artist). At the same time, like any smart fellow, he doesn’t want to be situated too far from the banks of coolness. But being cool is not on his mind at the moment; being loving is. He sees that being lovable is not the same thing as being likeable. What I don’t understand is why he thinks that being likeable makes it hard to be lovable. One wants to be liked by the world — not perfectly liked, by any means, but generally liked — and, if healthy, one also wants to be loved by a very small number of people, no two in quite the same way, and with only one love commanding one’s complete candor. I suppose that it can be very hard to realize both of these desires if one situates them on the same mental plane. But they no more belong on the same mental plane than one belongs in the same clothes round the clock. You don’t wear a suit to bed and you don’t go outside draped in a towel. Perhaps it’s that simple: adjust your behavior to the outfit that you allow another person to see. If you are never to be seen without dress shoes, then it’s safe to say that you have intimacy issues. Being cool and being nice are just different personas — not so much masks designed to hide as projections designed to inspire (they do share that). Being loving and lovable means shutting the projections down as far as we can without becoming oafs.

As recently as forty years ago, many bourgeois people felt it necessary to be shocked by the details of the sex lives of others (even when the others in question were made-up characters in novels). Now we know better, and ask only not to witness these details. What plays in the bedroom stays in the bedroom (please!); we can assume that silence does not imply inertia. Although I’ve had some pretty hot-tempered moments, I’ve never acted in a way that afterward put me in mind of an animal. This is not to say that I should be untiringly pleasant if I found myself locked in a dining room after dinner, unable to say good night to fellow guests. But it wouldn’t take me long to suggest ripping up the tablecloth to partition the room into tolerably private spaces. That’s probably where I’d have gone with Gil Pender’s idea. 

Periodical Note:
On Higher Education

Monday, May 16th, 2011

In a review, at The Nation, of twelve recent books about the plight of higher education, William Deresiewicz writes, 

Our system of public higher education is one of the great achievements of American civilization. In its breadth and excellence, it has no peer. It embodies some of our nation’s highest ideals: equality, opportunity, self-improvement, useful knowledge and collective public purpose… Now the system is in danger of falling into ruins.

After meditating on this passage for a few days, I’ve decided that it provides an accurate mission statement for the great American state universities (and the larger private ones as well), but that its objectives have never been persistently realized. Ideals are targets, and American leaders of the past can be praised for having aimed at “some of our nation’s highest” in the design of a public university system. But in order to approximate ideals in reality, you have to understand the reality that you’re working with, and candid self-assessment has never been a marked feature of the American character. We’re optimists and visionaries who don’t in fact see very clearly with our eyes. The system has always been in danger of falling into ruins, much like an overheated real estate development that never gets to the point of housing a living tenant.

I am not a scholar; I haven’t studied facts and figures. I’m willing to rely on those who have, such as Richard Arum and Josipa Roksa, whose Academically Adrift, which figures among the books under review by Deresiewicz, suggests that students aren’t learning much of anything in college these days. But my willingness is born of experience: when I look round the Internet, for example, I see hosts of very bright young people who are teaching themselves things that they ought to have been taught in college. The only thing that’s new about this state of affairs is that the Internet provides a medium for observing it on a much larger scale than was available before. Even in the days when students put in longer hours studying than they do now, the objective was rarely to attain true understanding of a subject. The point was to pass tests. Then as now the only way to learn about something was to write about it in a manner that would meet a modest level of critical inquiry; then as now the only way to evaluate such writing was to read it and to grade it, something most intelligent people would rather not do. Most writing is not pleasant to read. And when it is, the disappointment of a gifted student’s sudden unsuspected drop into error can be maddening. With very, very rare exceptions, young minds have little of interest to contribute to the general discussion of ideas. But they must be educated if that discussion is to continue, and the fact that educating them is in large part a tedious slog must be faced squarely. Going to college may be fun overall, but education is always going to be as painful, and in much the same ways, as any other form of demanding exercise. Clever students have been getting better and better at avoiding the rut of education for nearly fifty years, ever since the introduction of course evaluations. That really has to stop.

I was a very clever student: I managed to avoid ever having to parse Caesar’s Gallic Wars. I was also bright enough to see where this cleverness would lead me if unchecked, and before it was too late I signed up for a Great Books Program, five semesters of talking about thinkers from the pre-Socratic to the post-Revolutionary. (As I recall, I bluffed my way through discussions of only one book, Moby-Dick, a book that, forty years later, I put down as utter rubbish; everything else I mostly read.) In other words, I surrendered my options and read what I was told to read. I doubted that the reading list was as good as it might be, but, flush with youthful arrogance as I was, I was also determined to get something out of school beyond a passel of interesting personal experiences. I’m reminded of a nightmare that used to trouble me: The print in a book that I was reading in my dream would grow fainter and fainter, as a realization slowly dawned that the words were becoming invisible because I hadn’t written them yet. This always woke me up with a shudder. That’s what college seems to have become for many students: an environment in which they produce everything out of themselves.

The American university system underwent a tremendous expansion after World War II; so did the foundations of learning, which would soon support such unimagined realms as the superstructure of information technology. The old model of university education, germinated in Enlightenment Germany and polished to a high gloss at dozens of late Nineteenth-Century colleges and research universities, was primarily an apprenticeship system in which the stock of knowledge was transmitted from teachers to students. The stock of knowledge was known to be expanding, but the expansion was thought to be manageable because professors increased their sub-specialties. After World War II, it was increasingly understood that the unknown — the unmanageably learnable — vastly outbulks the masteries of credentialed professors. In 1800, it might not have been possible to read all of the books that were thought to be important, but it was certainly possible to house them all in a library. This was not the case in 1950, by which time the authority of transmission that formerly underwrote the virtue of traditional education had evaporated. But the model remained, and to some extent it is still with us in the liberal arts graduate departments with which Deresiewicz begins his review. It made so little sense, however, that the sprawling American university saw no reason not to corrupt itself by recruiting graduate students as the underpaid teachers of undergraduates. That was the end of the apprenticeship system: graduate students took the jobs that former students a couple of years older, on the other side of completing doctoral programs, ought to have had. It hasn’t taken long for this cannibalism to create a small but powerful class of tenured professors who have no more interest than any other powerful group in expanding their ranks and diluting their privileges. Whether this class is dooming itself to extinction is only a variant of the question that confronts most modern institutions, which, as mature institutions always and everywhere do, have concentrated power in ever fewer hands and increased the inequality of access to resources of all kinds.

I don’t believe that the modern research university has ever justified itself as a provider of the kind of education that matters in civil society, which, as I’ve said, is a matter of training young minds to participate in public discussion. (I shy away, in these polarized times, from all thought of “debate”). It seems obvious that a solid grounding in the history of the nation’s problems would be the one indispensable subject, but this has never been on offer in the way that, say, art-history survey courses used to introduce students to centuries of imaginative creation. From what I can tell, colleges and universities have dismissed history surveys as belonging to the high-school curriculum. But what high school students can’t be expected to digest is the contentiousness of American historiography: true history is never settled, and can never be reduced to the tenets of a creed. (Doubtless it would be grand if high-school students could be required to memorize a lot of dates, freeing college students to explore their significance.) Nor has the research university been adept at inculcating social values, perhaps for the simple reason that academic knowledge is diffracted into shards of sub-specialty. The ideal university, it seems to me, would prepare students for what I call the social paradox: the strength of any society is a function of its constituents’ cooperative pursuit of distinctions and differences. “Knowing thyself” is only part of what a good education imparts; figuring out where you might fit and what you might improve — matters that require learning a lot about other people — is just as important.

I see no reason for higher education to be as expensive as it is. The current university, like some sort of interplanetary rocket, consists of three stages, with an athletic and quality-of-life stage consuming most of the resources, a capital-intensive stage of scientific research consuming most of the rest, and only a tiny nugget of money going to the actual teaching of undergraduates. I don’t see why these three stages ought to operate as parts of a unit. Graduate professional schools ought to be free-standing, and vastly less numerous than undergraduate schools. To me, the model college professor is a more sophisticated high-school teacher, not a grudging research scientist. Secondary and higher education might well be unified, with some sort of non-academic national service interposed at varying points. As for athletics, their presence on the academic campus has always been the snake in the garden. How wonderful it would be if national service, and not education, were infused with the atmospheric attractions of today’s university life!

The origins of the university stretch back a thousand years, and its rich heritage must not only be preserved but kept alive. But whether the university has ever been or will ever be suited to equipping the citizens with the knowledge and intellectual habits that a liberal democracy depends on is a question that ought to be answered without reference to ideals.

Big Ideas:
Op-Ed Boilerplate

Thursday, May 12th, 2011

The other night, at a cocktail party, I excused myself from holding forth about Pakistan so that I could refill my wine glass. While I stepped away, the gentleman to whom I was talking whispered to Kathleen, “Your husband sure knows a lot.” Indeed, I was shamelessly gratified, the next morning, to find nearly everything that I had said echoed (as it were) in Lawrence Wright’s latest contribution to The New Yorker‘s Annals of Diplomacy, “The Double Game.” I did know a lot — and I learned almost all of it from reading articles at 3 Quarks Daily. 

I recommend the site highly, especially to Zalmay Khalilzad, the Afghan-born American diplomat whose Op-Ed piece in the Times, “Demanding Answers From Pakistan,” struck me this morning as almost hygienically pure of common sense, not to mention common knowledge about Pakistan. Mr Khalilzad envisions diplomacy as a sort of chess game, which every nation plays in the same way and with a single purpose. Did diplomacy ever function in that way? I understand that treaties are designed to pretend that it does, but beyond that I have always understood that each nation plays the diplomacy game according to its characteristic bent. It would be treason not to do so. Every sovereign power owes a far greater duty of care to its subject people than it does to other sovereigns, and most sovereign powers represent multiple constituencies with inconsistent, sometimes colliding interests. And while it is laudable to encourage one’s own sovereign to play the game fairly and candidly, it is childish to expect other sovereigns to do so.

This is why the second prong of Mr Khalilzad’s “two-stage strategy” for obliging Pakistan to behave more like an ally is so fatuous.

Then we should follow up with demands that Pakistan break the backbone of Al Qaeda in Pakistan by moving against figures like Bin Laden’s deputy, Ayman al-Zawahri; remove limits on the Predator drone campaign; uproot insurgent sanctuaries and shut down factories that produce bombs for use against American and Afghan soldiers; and support a reasonable political settlement in Afghanistan.

This assumes a sovereign unity in Pakistan that simply does not exist. Lawrence Wright quotes the late Benazir Bhutto’s description of the powerful Inter-Services Intelligence directorate as “a state within the state.” Then he proceeds to write about S Wing, a supposedly secret organization composed of retired ISI officials that operates within (or alongside) the ISI. It is thought that if Osama bin Laden had any support from Pakistani officials, it was the men of S Wing who knew where he was and who respected his cover. It would seem that the official government of Pakistan has little or no say in the doings of S Wing. The Predator drone campaign is understandably unpopular with the Pakistani people, and expanding it in any way would increase the unpopularity of the government, which, whatever its party affiliation or campaign promises, is invariably drawn from the wealthy, largely feudal (landowning) elite. This ruling class has undertaken for decades to distract disaffected Pakistanis with the pursuit and acquisition of Kashmir, which was foolishly allotted to India in the last hours of the Raj. Pakistan’s dealings with Afghanistan — another country that, as we have found to our cost, harbors a number of mutually hostile elements that jockey for nominal control of the nominal government — are in contrast cloudy and multifarious. It is unclear what Pakistan stands to gain from “a reasonable political settlement” in Afghanistan, and Mr Khalilzad’s blather actually serves to underline this point.

It is in neither America’s interest nor Pakistan’s for relations to become more adversarial. But Pakistan’s strategy of being both friend and adversary is no longer acceptable.

While Mr Wright and many others entertain the possibility of withholding financial aid from the government of Pakistan, in retaliation for the double game that Pakistan appears to have been playing (against itself as much as the United States), that option figures nowhere in Mr Khalilzad’s four-pronged backup plan (in case the “two-stage strategy” fails). Doubtless his diplomatic proposals are pregnant with significance for diplomatic insiders, not so much for their patent content as for their timing (and Mr Khalilzad is in any case far more concerned about Afghanistan than he is about Pakistan). But I can’t imagine what use the editors of the Times expected the bulk of their most educated and well-informed readers might make of Mr Khalilzad’s boilerplate. It’s almost criminal of them to make precious Op-Ed space available to a writer who has so little genuine information to offer.

Big Ideas:
A Question of Timing
The Most Human Human, by Brian Christian

Wednesday, April 27th, 2011

Is it age? Every book that I read these days seems to be utterly remarkable, unprecedented, world-changingly important. At my age, it ought, one would think, to be the other way round. Nothing new under the sun and all that. But no: I appear to have lived just long enough to glimpse the sun for the first time.

The subtitle of Brian Christian’s The Most Human Human has a banal and fatuous ring, just like most other subtitles these days. “What Talking with Computers Teaches Us About What It Means to Be Alive.” That’s comprehensive! “Talking with computers”! “Being alive”! The only way to make it sound even more portentous would be to throw in something about changing the world forever. But that’s exactly what’s already implicit: the suggestion that we wouldn’t know much about being alive if we couldn’t talk to computers — something that almost everyone knows was impossible until quite recently. The world has been changed, probably forever. To demonstrate the point, you have only to turn Christian’s subtitle into a question, and interrogate the recent past. What, in 1960, say, did talking with computers have to tell us about what it meant to be alive?

Exactly nothing. For one thing, there was no talking with computers in those days. ELIZA, the first AI program capable of dialogue, was still a few years in the future. More to the point, nobody (beyond a tiny handful of visionaries like Claude Shannon) had any idea that a machine of any kind had anything to teach about being human (which, for our purposes, is what “being alive” means). Machines were tools, and they were also threats, just as they had been since the early days of the Industrial Revolution, when French workers invented sabotage by throwing their wooden sabots into looms, and Mary Shelley envisioned Dr Frankenstein’s monster. The computer was simply the latest in a string of inventions whose unanticipated powers might, many feared, bring down Promethean punishment on mankind. Computers were what made annihilation by ballistic missiles possible. They told us what it meant not to be alive.

So, if you were a bright young person in 1960, an interest in computers would probably carry you away from the pursuit of humanist wisdom that motivates the prodigious Brian Christian. (Can he really have been born in 1984?)  And if you didn’t choose cold blue steely science, you would make your mark in the wilds of the Peace Corps, then in the first flower of Rousseauvian idealism.

In 1960, computers were gigantesque, exorbitantly expensive, and not very powerful. As that changed, so did the world of business, which went from humdrum to hot in the same years that computers shrank to PC proportions. In 1985, asking a computer what it meant to be alive might very well have taken the form of a Lotus spreadsheet, splaying out competing mortgage options. Not very transcendent stuff, but something: a computer might help you live a (marginally) better life. By the late Nineties, talking with a computer meant talking with the entire world. Email displayed a dark side of bad manners that the sheer clunkiness of snail mail had helped to conceal: not only did hitting “Send” prematurely expose one’s id in unflattering ways, but it also tempted your correspondent to “share” the spectacle of your bad behavior with, ultimately, everyone else with a computer. That something genuinely new was going on seems to be clearly indicated by the fact that the era’s countless flame wars, far from killing anyone, provided a global learning experience. Not only did we all learn something about self-restraint, but we learned it together.

If you want to know what a computer has to tell you about being alive today, you face the daunting threshold problem of defining “computer.” Is your iPhone a computer? Your iPod? What about all the diagnostic tools, such as fMRI, that depend upon computational wizardry for their effectiveness? Physical gadgets will be with us for a long time, but “computers” seem to be dissolving into “computing,” something performed by many types of device. Unlike the computing of 1960, modern computing is ubiquitous and intimate; we wouldn’t want to live without it.

When I read the excerpt from The Most Human Human that appeared in The Atlantic a few months ago, I pegged Brian Christian for a forty-something journalist specializing in science and ethics, a field that ordinarily leaves me cold. I thought that, before his piece came to an end, Christian would be sounding hair-tearing alarms about the growing power of AI chatbots, which, he pointed out early on, nearly passed the Turing Test in 2008. All we need say about the Turing Test at this point is that it posits a point at which computers might be said to be capable of thought, by appearing, to a jury of human beings, to be capable of human conversation. The Turing Test became a bulwark of humanity, the breach of which by human-seeming computers would signify a Something Awful that I expected Christian to spell out in hectoring detail. What toppled instead, however, was my own expectation.

Christian turned out to be an extraordinarily well-educated young man — how right he is to dedicate his book to his teachers! — without a moping bone in his body. Far from being a passive journalist (or disgusted observer), Christian had the computer-science chops that would enable him to enter the contest himself. It’s at this point that we need to look back at his subtitle. What he does not say is that he beat the computers in the Turing Test. There are no computers, really, in The Most Human Human. There are only people “being themselves,” and other people trying to make machines simulate people being themselves.

“Being himself” is the very first thing that Christian decides not to do, and herein lies the glory of his undertaking. Like legions of high-school students facing aptitude tests, he is advised by the Test’s manager that there is no special training that will enhance his performance. Piffle, says Christian.

So, I must say, my intention from the start was to be as thoroughly disobedient to the organizers’ advice to “just show up at Brighton in September and ‘be myself’” as possible — spending the months leading up to the test gathering as much information, preparation, and experience as possible and coming to Brighton ready to give it everything I had.

Ordinarily, there wouldn’t be very much odd about this notion at all, of course — we train and prepare for tennis competitions, spelling bees, standardized tests, and the like. But given that the Turing test is meant to evaluate how human I am, the implication seems to be that being human (and being oneself) is about more than simply showing up. I contend that it is.

Unlike Christian, I lived through the late Sixties, and I’ve been contending that just showing up is not enough for a long time now. It has been an unfashionable thing to say. But unlike Christian I came of age long before learning from AI (“computers”) was an option. It’s very hard not to be jealous of a bright young man whose timing appears to have been excellent. He can do much better than mouth plausible platitudes about contemplation and great books. He can “do the math,” and he does it in a way that any intelligent reader will grasp at once and with pleasure. If I weren’t so grateful, and if I didn’t so much admire Christian for making the most of his superlative opportunity, I’d be eaten alive by the green-eyed monster.

Periodical Note:
Franzen on Wallace

Monday, April 18th, 2011

It is very difficult to imagine Jennifer Egan, the latest winner of the Pulitzer Prize for fiction, ever taking her life, or indeed doing anything that would cause her readers sorrow. I have nothing but a packet of intuitions to support what I’ve just said, but I can boil them down to one point of surmise. Both as a writer and as a woman standing in front of strangers reading from her work and answering questions about it, Egan seems to me to be Not A Romantic. She also appears to be untroubled by mental illness. Her fiction is dark but clear, and it recurs to an old and unfashionable view of human nature, according to which we do more harm to other people than we do to ourselves. This is at odds with tragic modernism, which pierces the gifted hero with the spears of his own strength. Egan doesn’t believe in heroes. I think that she believes in curiosity — a curiosity that kills some other cat.

These thoughts are occasioned by perusal of The Pale King, and by the strange multifaceted essay that Jonathan Franzen published in last week’s New Yorker. I’m not sure that it was a good career move for Franzen to sift the ashes of David Foster Wallace’s career and death; there are too many moments in the piece where Franzen sounds like the earnest older brother whose lamentation for a fallen sibling muffles unmistakable cackles of self-satisfaction.

That he was blocked with his work when he decided to quit Nardil — was bored with his old tricks and unable to muster enough excitement about his new novel to find a way forward with it — is not inconsequential.

Franzen (who seems much more like Jennifer Egan than he does his late friend) inflects his observation that “David died of boredom” with the accent of someone amazed by such a remarkable feat, and clearly incapable of it himself.

Which was it that killed David Foster Wallace — his romanticism or his depression? I suspect that they were densely intertwined. What interests me about Wallace isn’t the accretion of small-scale discouragements that must have surrounded the locality of his suicide, but rather the doom that was presaged by the unruly immensity of Infinite Jest, twenty-odd years ago. I haven’t read it, but I understand it to be an attempt, as Wallace put it in conversation with David Lipsky, to capture “what it feels like to be alive right now.” Although he was the keenest of reporters, capable of reducing almost every observation to fully-articulated prose, I’m not sure that Wallace will be remembered for his fiction. I do mean to read The Pale King, but not as a novel. Because it’s unfinished, it retains a rough documentary quality, attesting to the author’s accumulation of views, that a final editing would almost certainly have effaced. Or maybe not!

The people who knew David least well are most likely to speak of him in saintly terms. What makes this especially strange is the near-perfect absence, in his fiction, of ordinary love. Close loving relationships, which for most of us are a foundational source of meaning, have no standing in the Wallace fictional universe. What we get, instead, are characters keeping their heartless compulsions secret from those who love them; characters scheming to appear loving or to prove to themselves that what feels like love is really just disguised self-interest; or, at most, characters directing an abstract or spiritual love toward somebody profoundly repellent…David’s fiction is populated with dissemblers and manipulators and emotional isolates, and yet the people who had only glancing contact with him took his rather laborious hyper-considerateness and moral wisdom at face value.

That is not the profile of a writer of mature literary fiction. Not every great novel is about love (and its failure), but most are, and to write instead about the simulation of love seems to me an adolescent exercise, something that you do before you have fallen in love yourself, while you are still tempted to think that love is an illusion, a puff of poetry, as a way of excusing the monstrous defect that must be preventing you from sharing a transcendent experience. For my part, I see adolescence — struggling to be an adult — at the bottom of addiction (substance abuse provides both a shortcut to the desired state of mind and a distraction from the search for it) and at the heart of romanticism as well. As if to prove my point, Wallace not only dressed like a teenager, but like a teenager from an earlier era — a hippie, in fact. I’m always surprised that Wallace’s appearance goes unmentioned, when it was so patently a costume. His shambolic appearance is at complete odds with the precision of his voice, even when that voice is registering the blur of ambiguity.

It’s not a stretch to imagine Jennifer Egan writing a novel about David Foster Wallace.

Big Ideas:
Artistic Value

Monday, March 21st, 2011

At Ward Six the other day, J Robert Lennon tossed in a note about Tadzio Koelb’s deflating review (in the NYTBR) of Rebecca Hunt’s Mr Chartwell.  Koelb wrote,

Now England has seen the rise of “Mr. Chartwell,” a humorous and amiable novel about which such extravagant claims have been made — for its prose, psychological insight and emotional depth — that one might imagine a work to rival Robert Burton’s “Anatomy of Melancholy” instead of what is, in fact, well-packaged chick lit.

The end of Lennon’s note stuck with me: 

While I am enjoying the democratization of literary discourse that the internet has brought us, the trend Koelb describes is a consequence of the decline of newspapers and print magazines–hardly anyone is being paid to recognize artistic value anymore. And so, I fear, hardly anyone is bothering.

My first reaction was to protest: I’m not being paid, and yet I am bothering to recognize artistic value. My second reaction was to wonder if the first was actually correct. I’m conscious of being on the lookout for interesting things, and of trying to explain what it is about things that interest me that interests me. But: recognizing artistic value? I’m not sure that I believe in it. And it’s not as simple as doubting that “artistic value” exists. There’s the matter of recognition, too, the sense, which I think Lennon intends, of making an award. You pin a blue ribbon on something, and, voilà, it has artistic value. (The ribbon is what you have to say about it, and the quality of that ornament is for others to judge.) You go on to the next thing, leaving your little ribbon behind for all time. 

This old model of critical authority has almost completely broken down, not because we don’t have faith in people who make authoritative pronouncements (we’re if anything too credulous) but because we don’t have time for them. All we want to know is whether to read the book or not. Will our friends all be reading the book? There is no need for much of a ribbon; a letter grade will do. This is indeed what has happened in “the democratization of literary discourse.” I’m unfamiliar with the string of admiring reviews that Mr Chartwell evidently garnered — I hadn’t heard of the book before reading Koelb’s review — but his description suggests an excited readership enthusing over a shiny bauble. I daresay that careful readers of those reviews were not deceived into thinking that Rebecca Hunt might take a place alongside Trollope and Tolstoy. They could probably tell that satisfaction was guaranteed by a plausible patina of “history.” I’m reminded of Frederick Arbuthnot, the happily faithless husband in The Enchanted April, who writes sexy potboilers under an assumed name.

He wrote immensely popular memoirs, regularly, every year, of the mistresses of kings. There were in history numerous kings who had had mistresses, and there were still more numerous mistresses who had had kings; so that he had been able to publish a book of memoirs during each year of his married life, and even so there were great further piles of these ladies waiting to be dealt with. 

What has changed since those days is that nobody is being punished for publishing critical flummery anymore; it’s unlikely that anyone is going to lose a gig because Tadzio Koelb has seen through a gushing review or two. The people who care about psychological insight and The Anatomy of Melancholy won’t be lodging complaints, because they won’t have been tricked into buying the book. 

So, then, what am I doing? I’ve already said, putting it with cheeky complication: I’m “trying to explain what it is about things that interest me that interests me.” What’s left out of that formulation is the time-stamp, which is always set to “right now.” What interests me now? It’s not necessarily what interested me last week or last year, or when I was in my twenties. And what interests me now has been shaped by what has interested me (recently, for the most part, but not always), so that my liking a book this week may be tied up in my having liked another one last week, or last month, or whenever. Far from being an unchanging authority who makes judgments according to some fixed protocol, I’m more or less impressed, literally, by everything that I read. It would almost be better to say that the book judges me. 

That’s what I was thinking on Saturday night, listening to Rudolf Buchbinder and Orpheus play Mozart’s D minor concerto — the most dramatic of the lot and destined to stand at or near the top of anyone’s ranking. I was trying out a new metric: the measure of a performance’s excellence is the extent to which it blots out all others. It would not occur to me, in connection with any actual concert, to judge the concerto itself. I might say that it reminds me of my sheltered youth, when I could hardly imagine what real tragedy would be like; or I might wonder what the first audience made of it — I expect that everyone who stopped talking and listened was aware of unprecedented music; or I might quote Donald Francis Tovey (well, no, I couldn’t; Tovey didn’t write it up). But nothing in any of these passing, colorful remarks would address the music that Mozart wrote. I am no longer sufficiently conceited to believe that I have anything useful to add to the overflowing store of Mozart commentary. So much for his artistic value. As for that of the performance, my new metric keeps things simple. I can report that the horns had a bit of a flub in the Romanze — one that happens often enough to make its way onto recordings — but I don’t expect you to find this news interesting.

Of Mr Buchbinder’s reading, I’ll say that it coincided with an ideal of the concerto that I carry around in my head. His playing was temporally acute (by which I mean that he kept time in an interesting way) and dynamically expressive (he used the shift between loud and soft to structure the thematic lines). His left hand was particularly gratifying: the low notes were always sonorously there, assuring us that the current of music was flowing through clear and capacious channels. But because I never heard a thing that I did not hope or expect to hear, I cannot say that it was the best imaginable performance of the concerto. I have to be content with pronouncing it extremely well done, and very satisfying to sit through. What I hope I’m conveying is that this “judgment” is really about me. Beyond a presumable level of competence, the performance of music is so peculiar to time and place, to the vibration of the air between player and listener, as to have the quality of magic. It makes no sense to attempt objective descriptions of such things. 

In his blog entry, Lennon mentions Jonathan Franzen’s Freedom. 

I am still bewildered by the fact that nobody seems to have recognized Freedom as Jonathan Franzen’s worst book; it’s a lopsided domestic drama with a lot of timely and unnecessary sociopolitical nonsense slathered over it.  (FWIW, I enjoyed it anyway–but it is not up to Franzen’s usual standard.)  In that book, we were seduced, I think, by its ambitious title, its environmental subplot, its political undertones.

Worst book? That’s more impish than intelligent. “Least successful,” perhaps — and I say that not because I agree with Lennon about Freedom but because there is no call to speak of the “worst book” of a writer who always turns out excellent, sometimes extraordinary, work. I do agree that there was a lot of tedious hype about the novel last spring and summer, and that the novel was tedious to write about because one couldn’t begin without clearing away at least some of the critical lumber.

What matters more, in Lennon’s commentary, is that he enjoyed the book even though it wasn’t, in some way or other, good enough. What is the point of Jonathan Franzen’s maintaining his “usual standard” if readers will like what he writes even if he doesn’t? What is this standard, this excess beyond enjoyment? Let’s talk about that. 

Reading Note:
Lilla on Bakewell on Montaigne

Thursday, March 10th, 2011

In the current issue of The New York Review of Books, Mark Lilla gives Sarah Bakewell’s wonderful Montaigne book, How to Live, what begins as a nice review. He praises it as a genuine introduction to Montaigne’s work and to the circumstances in which it was written; and he takes the occasion to deplore the “scholarly detritus” that has supplanted the informative prefaces that used to be aimed at the general reader.

Bakewell begins at ground zero, much as Montaigne did, without assuming anything more than that her readers have an interest in themselves and a desire to live well, which she addresses by cleverly organizing her book as a series of suggestions Montaigne makes for doing just that.

Then Lilla sums up the contemporary consensus about Montaigne, which is that his essays have no agenda. As Bakewell puts it, Montaigne’s collection of self-portraits and miscellaneous musings “does not have designs on you; you can do as you please with it.” With this proposition Lilla heartily disagrees, and he spends the rest of his lengthy review making a case that Montaigne’s transparency is an illusion wrought by his immense influence: he is, as many readers feel him to be without perhaps knowing why, the father of modern man.

Stated in a positive sense, Montaigne was the first liberal moralist. Ancient virtues like valor and nobility, and Christian ones like piety and humility, were unattainable for most people, he thought, and only made them vicious and credulous. But rather than say that directly, a suicidal act, Montaigne sang a song of Montaigne, giving himself virtues that we accept without question today as being more reasonable and attractive: sincerity, authenticity, self-awareness, self-acceptance, independence, irony, open-mindedness, friendliness, cosmopolitanism, tolerance. He was an idealist, though, not a realist. And his ideal reshaped our reality. It is hardly an exaggeration to say that the reason Montaigne knows us so well is that he made us what we are (or at least what we profess to be).

In the very next sentence, Lilla contradicts this claim: “But he did not entirely remake us.” No, what Montaigne couldn’t alter was the desire for transcendence that springs naturally in the human breast. He could counsel against yielding to it, by referring to the horrors that the search for transcendence throws off in its most widespread form, organized religion. Montaigne was writing through the religious wars that wracked France in the final third of the Sixteenth Century; in no other country did the old faith and the new struggle so relentlessly to extirpate each other. Lilla finds that, the more you read of Montaigne — especially if you read the Essays in order, and more or less all at one go — the more clearly an anti-transcendence message emerges. Lilla construes this as necessarily an anti-Christian message, as well as an anti-heroic one. And he faults Bakewell for not pointing out that Montaigne is a corrupter.

Lilla takes the longing to transcend the limitations of everyday life — and the corresponding contempt for the “mediocre life” extolled by Montaigne, that connoisseur of comforts — as a natural good. He writes with the air of a breathless messenger who, by reminding us of something vital, something that we had been lulled into forgetting, brings us back to our senses at the last minute, before we rashly sign away everything important about life.

By refusing to recognize the grandeur in our desire for transcendence, our urge to understand what is, to experience rapture, to face and overcome danger, to create something bold and lasting, Montaigne offered no guidance for coping with it, let alone directing it to good ends. And his silence had consequences. The Essays not only inspired a skeptical Enlightenment that aimed to make modern life softer, freer, and more humane, with some success; they also, through Rousseau, helped inspire a Romantic cult of the self that beatified the individual genius and worshiped his occult powers—also with some success. The easy inner reconciliation Montaigne offered his readers has proved as impossible for them to attain as sainthood was for his Christian contemporaries. Suggesting, perhaps, that the most we can ever hope to achieve is reconciliation to the fact that we will never be reconciled.

I’m not so sure. Reading Bakewell’s book, I felt encouraged to hope that Montaigne may inspire a third development, that of a society of sociable individuals, of men and women who have outgrown the urges that Lilla enumerates — the rapture and danger and boldness and grandeur that always beckon from outside and beyond our mortal frames but that only carry us deeper into the prison of our own individual experience (no matter how powerful the illusion of connecting with “something greater” — it is the subjective feeling that matters). I read the other day that children are natural philosophers; I agree, and I think that it says something about systematic philosophy, which in the last couple of years has come to seem to me to be rather astonishingly juvenile (given its august if dusty place in the scheme of things), yet another attempt to justify persisting in a childish pastime by giving it a serious look.

Now that I am an old man, these caperings are more obvious as such. Whenever I hear the word “hero,” I think of the adolescent impulses that seasoned old codgers have been exploiting for millennia. I draw the self-sacrificing line at taking risks in order to assist those who are weaker; self-immolation is to be confined to the opera stage. I remember all the outsized longings, but I regard them as signs of immaturity, and I’m delighted to have survived them.

I depend entirely upon my fellow man and woman for meaning and pleasure. I hope that a few men and women can depend upon me for some of the same, but I myself am not a source of interest to me. I've necessarily got to take an interest in the fact of myself, as a problem-in-progress, if you like. But that's a responsibility, not a pursuit. I have no objection to your concern for a soul, if you believe yourself to be possessed of one, but I do object to your placing that concern ahead of your concern for the rest of us. I believe that Jesus shared this objection.

You know there’s a commandment against murder. Where would you draw the line? Would you say murder is wrong, but beating someone is maybe a little less wrong, and just being angry with them isn’t wrong at all? I’m telling you that if you’re angry with a brother or a sister, by which I mean anyone at all, even if you’ve just got a grudge against them, don’t dare to go and offer a gift in the temple until you’ve made your peace with them. Do that first of all.

That's part of the Sermon on the Mount as reconceived by Philip Pullman in his wonderful little book, The Good Man Jesus and the Scoundrel Christ. It's interesting to note that, for all his talk of Christian values, Mark Lilla mentions Jesus only once, and then only in passing, as a byword for Christianity. The Christian wisdom that he likes to quote goes back to Augustine, the inventor of many onerous and unforgiving Christian dogmas. Before launching tirades against the likes of Montaigne, Lilla ought to examine his own Christianity with a view to casting out the un-Jesus-like selfishness of seeking personal redemption. Happily, he is not so preoccupied by the need “to become other than we are” that he can't share his well-put thoughts with us.

Big Ideas:
Humanities and Higher Ed

Tuesday, March 8th, 2011

In the current issue of The New York Review of Books, the eminent professor of comparative literature Peter Brooks considers a clutch of books about higher education in America, all of them critical but most of them, in Brooks’s view, off-target or even “pernicious.” Every time I tune into this querelle, I assume the posture of a critic myself, ready to strip the ivy from the walls and the tenure from the profs. Lately, however, I’ve been asking myself why I’m hostile to the modern university, since I got an excellent education at one myself. Now, it’s true that I got an excellent education at Notre Dame more or less in spite of institutional constructs; I did not perform very well “academically,” largely because I was never persuaded that it was important to go to classes and to do well at tests. (I’d have been done for if I hadn’t been keen on writing, which I took very seriously.) But what’s wrong with this picture? Like a co-ed who goes to college to find a husband, I hung around libraries and seminar rooms hoping to develop an intellect. That certainly happened, even if it did take decades to manifest itself. And if my studies were economically useless, I did emerge with a trade skill, one that, at least in those days, was pretty much taught only on an extra-curricular basis at the nation’s colleges (radio broadcasting). I would have been a slow- to late-bloomer no matter what kind of school I’d gone to. I really shouldn’t complain. 

And yet I do — for the very reason that I did get an education. I did. Most of my classmates, who were more assiduous about following the rules, did not. They jumped a series of hurdles until they landed on one kind of career track or another, and that was it for “education.” I don't mean that they stopped reading good books. What I mean is that what they studied in college was how to do well in school. It was laid out very carefully for them, with teachers ready to help them from one step to the next. If they were diligent and obedient, they wound up with a nice degree — which is to say, a burnished resume. Their good grades would get them into a good professional school (business, medicine, law, architecture), where the aptitude for doing well in school would yield even higher rewards. They might even become teachers themselves. But their educations were over.

What we call “education” in this country — the institutions of higher and lower learning — is aimed at young students, and it is designed to produce “outcomes.” Genuine education, in contrast, is an ongoing, never-ending affair. It gets harder (if more satisfying) as you get older — because you have to swallow the bitter pill of recognizing that most of what you were taught when you were young is now deemed to be wrong, misguided, or simply out of fashion. You have to start all over again. And again. Even if you're a teacher yourself. This does not come naturally, and it's even harder to get traction in a system that “graduates” its students by dumping them back into a real world full of defiantly uneducated people.

I’m all for professional school. I don’t want to board an airplane that has been designed and fabricated by well-meaning amateurs. I wish that I’d done better at math, and I don’t know whom to be angrier with about my lack of Latin, myself or my schools. I was a consistently sloppy student — it wasn’t until I got to law school that I understood the vital importance of diligence — and I don’t wish that I’d been graded for creativity and imagination. Measurable academic achievement really meant nothing to me, and it still means nothing, possibly because I don’t believe that anything important is ever achieved. (I treat the idea of “achievement” as do the French, for whom the word is a term for death — game over.) If you build a great building, whether by designing it or obtaining the funding, you had better have a plan for maintaining it, because it is going to need to be rebuilt in twenty-five to fifty years or else demolished (“achieved”). I’m glad that the skills of engineers and architects are rigorously tested. But there is no skill set that will teach you what might be expected of a great public building. That is what the humanist curriculum is for.  

This isn't the place for me to lay out my ideas for teaching the humanities seriously. I merely wish to register my conviction that current higher education in the humanities is wasteful and ineffective: the liberal arts are not being taught, and at great expense, too. There may be fine professors out there, like Peter Brooks, who really do illumine the minds of their students, but Brooks himself highlights the problematic nature of his own work, vis-à-vis “successful outcomes” and such, when he observes that it “might best be evaluated, I have often thought, by what students are thinking about and dreaming twenty years after graduation.” If there is a way to subject that kind of teaching to cost-benefit analysis, then it still lies far beyond the grasp of our most brilliant AI research. The one book that Brooks admires, Martha Nussbaum's Not For Profit: Why Democracy Needs the Humanities, argues against the “marginalization” of liberal-arts studies “by technocratic and business-oriented demands.” But how can a major research university, with its staggeringly expensive laboratories and its equally costly (but loss-leading) athletic programs, shelter the humanities without resorting to exceptionalist arguments that it will never make persuasively?

Contrary to received understanding, the university as we know it was never intended to be the liberal-arts finishing school for the well-to-do that the Ivy League colleges became in the later Nineteenth Century. It still isn't, and neither are the great public institutions. There is no free lunch in this country, and the loftiest benefactions contemplate measurable results. If we all agree that citizenship in a democracy effectively requires some exposure to humanist values and habits of mind, let's stop counting on schools to provide the service as a sideline. Let's re-think.

Big Ideas:
Porter on Pricing

Tuesday, March 1st, 2011

My one quibble with The Price of Everything: Solving the Mystery of Why We Pay What We Do, Eduardo Porter's wonderfully readable survey of the function of pricing, is that he didn't put his last chapter, “When Prices Fail,” at the beginning of his book; I also wish that he had dealt a little more aggressively with the toxic strain of Chicago thinking exemplified by Eugene Fama. Mr Fama says that talk of economic bubbles “drives me nuts”; it's his Panglossian belief in efficient markets that drives me nuts. The first thing to learn about prices is that they are often wrong, and wrong for the very reason whose existence thinkers of Mr Fama's persuasion deny: neither buyers nor sellers have enough information to set a correct price. Market prices, moreover, are always somewhat arbitrary, in that they're spot prices, reflecting the needs of the moment. There is no way for the buyer and seller of a barrel of oil to develop an agreeable estimate of the environmental cost of the use of that oil, whether as fuel or otherwise. Environmental costs are necessarily determined outside the market. We are still pricing oil as if they did not exist — as if the twenty metric tons of carbon dioxide that the average American produces every year were not a problem.

There is no reason to expect us to be any better at setting environmental prices than we are; the problem is simply too new. Until three or four hundred years ago, the long-term consequences of human activity were limited to the supply of fertile soil. We could, as the Maya did, run out of the resources needed to support civilization, but exhausting the environment was a temporary thing. It was only with the large-scale industrial and engineering projects of the Nineteenth Century that we began to test the limits of the natural world's recuperative powers, and we were understandably slow to assess our impact. Blake's dark Satanic mills were objectionable for their human costs; nobody seems to have wondered what caused those famous London fogs until the town ceased to belch tons of coal soot into the air every day. Anyone who foresaw what the proliferation of vehicular traffic would do to air quality in Los Angeles or Denver would have been dismissed as a crank.

As Porter shows us, Sir Nicholas Stern, author of the 2006 Stern Review on the Economics of Climate Change, has been dealt a more polite version of crank-dismissal by William Nordhaus, a Yale professor who does not doubt that we’ve got to do something to reverse course on climate change, but who questions the importance of preventing damages set to accrue after the year 2800. These are early days indeed for the economics of stewardship. 

The recommendations to combat climate change in the Stern Review stand uncomfortably alongside this principle of social justice. If income per person were to grow by 1 percent a year over the next two centuries, less than half the pace of growth of the last century, people in the year 2200 would be 6.3 times as rich as they are today. Why should the poorer people of the present scrimp and save in order to protect the environment for their richer descendants, who could afford more environmental investments than we can?
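For readers who want to check the compound-growth arithmetic behind that figure — the one-percent rate and the roughly two-century horizon come from the quotation itself; the exact base year is not given, so treat this as a sketch — the multiple is simply

\[
(1+g)^{t}\,: \qquad (1.01)^{185} \approx 6.3, \qquad (1.01)^{200} \approx 7.3 .
\]

A growth rate that sounds trivially small compounds, over two hundred years, into a six- or seven-fold increase in income, which is what gives the fairness question its bite.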

It's the asking of questions like this that highlights the importance of The Price of Everything. The important thing to do right now is not to stop carbon emissions — important as that certainly is. Before embarking on any ambitious schemes to curtail this environmental damage or to encourage that environmental boon, we need to match our anxieties about the future with an awareness of the past, the history of which has only begun to be written. How did we get here? What were we thinking? In “The Price of Work,” Porter analyzes the worker-friendly policies of bygone giants such as AT&T and Eastman Kodak. Today's corporations, he writes,

can no longer afford the generosity of the corporate leviathans of the early twentieth century, which relied on a unique feature of American capitalism of the time: monopoly profits. As a dominant company in a new industry with high barriers to entry, Eastman Kodak had a near monopoly over photographic film. Ford also enjoyed fat profits unheard of in the cutthroat competitive environment of today.

Monopolies can be good, in other words, for workers. That they may not be optimal for consumers is a consideration that has to be balanced against the recognition that consumers are workers, too. In what circumstances might monopolies serve consumers as well as they do workers and (of course) investors? It may be time for a fresh inquiry. (It's my view that the facilities for delivering power and water to consumers ought to be municipal monopolies maintained at public expense, and geared to local demands. What's also needed, if this is to happen, is an improved model of political accountability, one that deftly blends the virtues of transparency with the operational baffles that protect administrators from the whims and caprices of popular enthusiasm.) Everything that a person of my age was taught in school is probably wrong forty-odd years on. Just like every other aspect of human affairs, economic conditions change over time. Searching the marketplace for scientific principles with the eternal applicability of Newtonian physics is misguided, simplistic, and childish.

Readers who aren't much interested in the dismal science will find a fascinating extension of the very idea of pricing in “The Price of Faith.” In Porter's hands, religion looks a lot like a luxury brand that becomes more appealing as it becomes more expensive, not less. Why should that be? Because “more expensive” means “more exclusive,” naturally enough. A religion that imposes personal sacrifice and ritual burden on its members is more likely to hold onto them — as the Catholic Church found out after Vatican II, a loosening (a price reduction, in effect) that went too far for some communicants but not far enough for others. The history of the Roman church also shows that it is never a good idea to substitute money prices for those sacrifices and burdens — a very undogmatic development that lay at the heart of the Protestant disaffection.

The Price of Everything is an intelligent book that, for all its surprising nuggets of information, avoids the contrarian and the counterintuitive. But it is enormously provocative, because it encourages the reader to look for the prices embedded in every aspect of life, and to recognize that money is only one way of making payment. The most common alternative to money is time, and the more you have of the one, the more willing you'll be to spend it for the other.

Big Ideas:
Marshall McLuhan

Wednesday, February 16th, 2011

How supremely piquant it was to read, in one swallow, Douglas Coupland’s book, Marshall McLuhan: You Know Nothing of My Work! (the subtitle comes from a line spoken by McLuhan himself in Woody Allen’s Annie Hall), on the day when Borders’ bankruptcy, long anticipated, was finally announced. Way back when Borders was taking off, expanding nationally, buying WaldenBooks, hadn’t anybody read The Gutenberg Galaxy?

I’m not going to pretend that I read it, not the whole thing. Like everyone else, I thought, at the time, that Marshall McLuhan was hostile to the high culture of the West, and that he relished its immolation in staticky, low-resolution images of bad television. I thought that he welcomed the End of Civilization As We Knew It. I also thought that he was impossible to read. I regarded McLuhan as a mad Canadian, driven by the boredom of the prairies to predict a human cataclysm. But I sensed that he was right about books, somehow or other.

The Enlightenment dream of mass readerships turns out not to have been psychologically acute. For most people, reading is an escapist, not an instructive pastime. Few people read to learn if they're not required to do so. The vast run of retail history books, for example, is hardly more scholarly than the romance fiction and knitting manuals that “history buffs” look down their noses at on their wives' and girlfriends' nightstands; weighty tomes as they may be, the books simply massage pre-existing accumulations of facts relating to this or that war. Reading, ironically, is not a visual activity; it puts our ocular apparatus to an unintended use. (Nothing is more natural than unintended uses.) Most people would rather sit back and watch something. For a century and a half or so, beginning in 1800, a combination of civic virtue — democracies have been thought to depend upon literate electorates — and the absence of alternative entertainments conspired to create the illusion of a vast reading public. Well, there may actually have been a vast reading public, for a while. But it was not a willing one, and when technology advanced after World War II, and authority retreated, books were replaced by screens.

Coupland’s biography, of course, is merely an extended essay, blending stories from McLuhan’s life with glancing meditations on the vastness of Canada, academic pettifoggery, and the Internet — something that McLuhan would have loved to hate, according to the author. This is the kind of book that we like to read now: brisk, knowing, and personal. Of course a biography ought to be personal, you might say, but I mean personal with respect to the writer, who is something of a cultural groundbreaker himself. (Coupland coined the term “Generation X.”)  It will not replace the serious studies by Marchand and Gordon that are mentioned at the outset (but identified only in the notes), but who would read those now save students of intellectual history? You Know Nothing of My Work! links the mad scientist to the mad world that he foresaw. If it fails to deliver a plausible account of the transformation of a Renaissance scholar into a media guru for whom that very term had to be invented, it does a fine job of suggesting why nobody — not McLuhan, not the businessmen who retained him, not even Pierre Trudeau — was able to mine any practical advantage from his work. If McLuhan sensed the outlines of a coming era, he was nevertheless unable to speed the coming. Much of the time, he comes across as a more successful John Forbes Nash, possessed of a beautiful mind that was better attuned to perceptible patterns.

At the end of the book, Coupland tells us that he was inspired to write it by the history of his own Canadian family, and he evokes the life of his cement-salesman grandfather in a passage that’s worthy of Alice Munro.

What thoughts would fill the mind of Arthur Lemuel Campbell? Did he hate the past? Did he want to drive into the future, and, if so, where did he perceive the future as being — to the west? To the east? Above his head? All that driving and all that flatness, all those Sundays and rooming house meals with pursed lips and ham hock dinners with creamed corn and the fear of God. Our Father, who art in heaven. And always the family left behind — High River; Regina; Edmonton; Swift Current — family gone crazy, family gone religious, family dying young. Don't complain and don't explain. Cut your losses. Cut your family before they cut you. Be weak. Be crazy. Be insane. Be humble. Bow before God. Pretend you're something you're not. Rise above your station and pay the price. Keep your opinions to yourself. Die alone, even when surrounded by others. You will be judged. There will never be peace. There will never be sanctuary, because there will always be something lurking on the other side of the horizon that will be a threat to you. Pay cash. Credit is the devil.

Indeed.

Of all the bookstores that I’ve ever visited, Borders was easily the most decadent, the most intoxicated by the idea that books are precious objects that radiate their contents in glimmering auras; there can’t be any need to read books if you’re surrounded by so many excellent titles. (The only thing missing was a line of fragrances named after beloved classics and redolent of the freshest sawdust.) I detected nothing cynical about this projection; the good people at Borders were good people. But there were far too many of them. A proper bookshop ought to be a bit creaky, inconvenient, and forbidding — just a bit. Borders was entirely too dreamy.

Big Ideas:
On Persuasion

Tuesday, February 1st, 2011

Forty-odd years ago, it seemed that, week after week, the Notes and Comment piece that opened the Talk of the Town section of The New Yorker (now just “Comment”) launched a new attack on the conduct of the Vietnam War; failing that, the administration of the day was taken to task for bungling the Cold War. This scolding seemed brave and daring at the time; national magazines didn't cast the government in a relentlessly unflattering light. It wasn't the magazine's disagreements with specific policies that one remembered. It was its impatience with the clutch of morons who seemed unaccountably to be in charge of things. In the end, The New Yorker turned out to be right: Vietnam was a pointless waste — we are doing business today with the same government that we sought to crush for nearly twenty years — and when the Cold War came to an abrupt end in 1989, it was clear that George Kennan's modest policy of containment would have been sufficient to halt the spread of Russian influence in the world — without the need for bellicose imagery. But the fact remains that nobody in power paid attention to The New Yorker while the war was underway, and nobody in power is paying attention to it now, when climate change poses a far greater menace to the American way of life than Communism ever did, and the magazine is still at it, scolding away, week after week, in pieces that are signed, usually, by Hendrik Hertzberg.

It’s not enough to be right; what’s wanted is persuasiveness — persuasiveness aimed at the unpersuaded. What would such exhortation look like? It would not look like this: 

Meanwhile, the “overwhelming evidence” that Obama used to cite continues to mount, relentlessly and ominously. The decade just ended was the warmest since systematic recordkeeping began, in 1880; the year just ended was tied (with 2005) for the warmest on record, and it was the wettest. The vast energies released by moister air and warmer oceans are driving weather to extremes. Hence epic blizzards as well as murderous heat waves, unprecedented droughts alongside disastrous floods, coral reefs bleached white and lifeless while ice caps recede and glaciers melt. 

We may ask, what is the point of this paragraph? Aside from the pleasure, which for Mr Hertzberg, we imagine, must be considerable, of delivering the thwack of a sound, articulate rebuke, we can’t imagine what sort of effect the passage is supposed to have on those who peruse it. No individual reader is in a position to do anything about moister air or bleached coral reefs. Surely no one at The New Yorker is naive enough to suppose that Mr Hertzberg’s words would serve their purpose if the magazine’s readers were motivated by them to vote, en bloc, for a slate of candidates committed to reversing the dire trends herein outlined. Any such candidates, once elected, would be mown down by representatives of the much larger portion of the electorate that wants to muddle through, with as little inconvenience as possible — the majority of Americans who will never be prodded into constructive action by being made to feel ashamed of themselves. Nor will ordinary homeowners ever worry more about faraway environmental problems than they do about their own real estate; they won’t even worry half as much. 

Maybe concern for the environment isn't much of a prod when it comes to reducing the consumption of energy and the production of waste. Complaining about the callousness of the general public isn't going to accomplish anything. Fear isn't much of a spur, either; it makes people either sullen or escapist (or unhelpfully crazy: remember those bomb shelters!). It's unlikely Americans will do much of anything about the environment until they feel good about trying, and can see that their efforts are making their own territory a better place. Just how to make them feel good about visible results — that's probably as daunting a challenge as actually reversing global warming. (It is certainly not the job of the President of the United States, whose principal duty is to hold the country together, or at least to prevent it from flying apart.) But reversing global warming isn't going to happen first. And for my part, instead of rapping the knuckles of politicians who oppose the progressive environmental agenda, and spanking their supporters, I prefer to bemoan Hendrik Hertzberg's failure to write something useful.

Reading Note:
Permission
Caitlin Flanagan and Natasha Vargas-Cooper in The Atlantic

Wednesday, January 19th, 2011

Until last night, I hadn’t heard of Karen Owen and her PowerPoint presentation. Ms Owen, an undergraduate at Duke University, decided to treat thirteen college athletes with whom she had sex as “subjects” of a faux-sociological report — which is to say that she rated them with astringent candor. Having read about the presentation in Caitlin Flanagan’s aghast article in The Atlantic, I can see that there’s no need to continue beyond the first couple of slides. 

But the 42 slides of Owen’s report on her “horizontal academics” are so dense with narrative detail, bits of dialogue, descriptions of people and places, and reproduced text-message conversations that they are a chore to read. It’s as though two impulses are at war with one another: the desire to recount her sexual experiences in a hyper-masculine way—marked by locker-room crudeness and PowerPoint efficiency—fighting against the womanly desire to luxuriate in the story of it all.

A chore to read at best. A Calvary, I should think, for her family. If nothing else, it confirms my ancient conviction that, classrooms aside, male and female students ought not to share the same campus.

Sex education needs a serious re-think: the sexes need to be taught about one another. It would appear that learning the mechanics of the thing is the least of the problems that face young people. Boys in addition need to learn that gratifying their own desires, whatever these might be, is always less important than respecting the human independence of their partner(s). This principle is right up there with the prohibition of murder and the rules against stealing things. Which brings me to the other Atlantic article that I read last night, Natasha Vargas-Cooper's pornography update, “Hard Core.” Vargas-Cooper grasps an aspect of sexuality that doesn't, I think, get enough frank discussion; when it comes up, it's tarted up as “role-playing.” I'm speaking of power, as in the exercise of it — and of the tremendous ambivalence that men feel in the face of a partner's surrender.

Never was this made plainer to me than during a one-night stand with a man I had actually known for quite a while. A polite, educated fellow with a beautiful Lower East Side apartment invited me to a perfunctory dinner right after his long-term girlfriend had left him. We quickly progressed to his bed, and things did not go well. He couldn’t stay aroused. Over the course of the tryst, I trotted out every parlor trick and sexual persona I knew. I was coquettish then submissive, vocal then silent, aggressive then downright commandeering; in a moment of exasperation, he asked if we could have anal sex. I asked why, seeing as how any straight man who has had experience with anal sex knows that it’s a big production and usually has a lot of false starts and abrupt stops. He answered, almost without thought, “Because that’s the only thing that will make you uncomfortable.” This was, perhaps, the greatest moment of sexual honesty I’ve ever experienced—and without hesitation, I complied. This encounter proves an unpleasant fact that does not fit the feminist script on sexuality: pleasure and displeasure wrap around each other like two snakes.

Although Vargas-Cooper doesn't seem terribly upset by her encounter with the “polite, educated fellow” — educated in what? — it made me sick that anyone would corrupt an intimate encounter by asking to inflict pain — to introduce an absolute distance. (I hope that you grasp the difference between wanting to make someone else uncomfortable — this fellow's stated objective — and asking to “try something out” that, while causing some discomfort, might also afford the partner a kinky sexual satisfaction.) I'm not shocked by the desire, but the guy's bad manners are astonishing. Not that there is anything about Vargas-Cooper's report that likens him to a rapist. He seems to have remained “polite” in bed (his problem, perhaps?). It's that he seems to have believed that his partner could give him permission to make her uncomfortable.

It is easy to hold up the stories told in the two Atlantic pieces against nostalgia for the good old days of respectability, which afforded a woman who chose to take advantage of it a great deal of protection from the predations of male sexuality. But that’s a mistake as well as a distraction. It’s a good thing that women can’t “fall” anymore — not even Karen Owen. The question isn’t who gets to have sex with whom. The question is what kind of sex is wrong for everybody. We seem to be on the same page about children and non-consensual sex — verboten. Perhaps those are the only workable general rules. But the idea that “sex is good full stop” is preposterously naive, and if the flower children of the Sixties may be forgiven for their ignorant excesses, no such innocence is available today. 

As an aside, I think that it's worth asking whether the uncorseting of the American libido had the side-effect of eliminating shame on the subject of income inequality.

Big Ideas:
The New Academy, cont’d

Tuesday, January 18th, 2011

There were two stories in this morning’s Times that interested me even more when I sensed that there was a connection between them. The connection is not obvious, and I want to try to work it out here.

The first piece is David Brooks's column about Amy Chua. Amy Chua is a lawyer of Chinese background who, in her book, Battle Hymn of the Tiger Mother, describes her career as a very tough mom. Brooks's “paradoxical” point is that she's not tough at all; on the contrary, Chua is protecting her daughters from the tough cognitive tests that determine both success and happiness in life. This carries forward the thinking that Brooks outlined in his New Yorker essay last week, and underlines my conclusion that an educational model that takes the cognitive revolution into account will put much less emphasis on individual examinations than today's best schools do. Chua, subscribing to the received wisdom that finds a high correlation between achievement and advancement, imposed upon her daughters a rigorous program of “practice makes perfect,” neither questioning the value of perfection nor assessing its cost. We agree with Brooks that this is not very intelligent.

Practicing a piece of music for four hours requires focused attention, but it is nowhere near as cognitively demanding as a sleepover with 14-year-old girls. Managing status rivalries, negotiating group dynamics, understanding social norms, navigating the distinction between self and group — these and other social tests impose cognitive demands that blow away any intense tutoring session or a class at Yale.

The other piece is about Apple. Steve Jobs is ailing, and he'll be taking another leave of absence from running the company. Miguel Helft and Claire Cain Miller report that the “deep bench” of leaders at Apple ought to ensure its continued success. But they quote a couple of observers who aren't so sure. One of them is David Yoffie, a professor at the Harvard Business School.

“The company could not thrive if Steve didn’t have an extremely talented team around him,” said David B. Yoffie, a professor at Harvard Business School who has studied the technology industry for decades. “But you can’t replace Steve on some levels.” 

Mr. Yoffie and others said Mr. Jobs’s creativity, obsession with a product’s design and function, and management style, as well as the force of his personality, were unusual, not only in Silicon Valley, but also in American business. They said that it would take several people with different skills to fill Mr. Jobs’s shoes.

The reinvention of Steve Jobs is one of the most interesting stories in American business. Always a visionary as well as a gifted engineer, Mr Jobs appears to have lacked good people skills in his first shift at Apple's helm. He was tossed out of his own company, and languished in the wilderness for over ten years. For the past fifteen years, he has shown himself to be the nucleus of a brilliantly creative team. That I call him the nucleus of the team rather than its leader is an indication of the connection that I see between the two stories. A leader is much easier to replace than a nucleus; it's also much easier to measure the effectiveness of a leader. Leadership is just another individual skill that can be learned. By “nucleus,” I have something far more complex in mind: the center of a web of semi-conscious (or even unconscious) signals, suggestions, cues, and associations that bind Apple's top engineers and marketers in a productive unit. Steve Jobs runs that web — again, I would argue, unconsciously. I have no idea what lessons he learned during his exile, but it is not necessarily the case that becoming a more “understanding” leader was one of them. The only thing that Mr Jobs had to understand, at the end of the day, was how to pick compatible executives. (And this may be nothing more or less complicated than a matter of wearing clothes — although the heir-apparent, Timothy Cook, looks as if he'd be much more comfortable in a jacket and tie.) All he had to learn was how to be a more powerful nucleus, a more efficient reader and emitter of personal communications. Insofar as the result was to unleash the true power of Steve Jobs — and this does seem to be what happened — it is very unlikely that Apple will continue on its course of staggeringly successful innovation.

Continuing on that course may not be necessary for Apple to continue to be a successful enterprise — but that's a matter for some other time. My point is that Apple's success under Steve Jobs appears to have depended on the very skills that, in David Brooks's view, Amy Chua has denied her daughters the opportunity to acquire. We certainly don't know very much about teaching them.

Big Ideas:
The New Academy

Wednesday, January 12th, 2011

What would it be like, I can only wonder, to be eighteen or twenty-one today (instead of sixty-three), and to have just read “Social Animal,” David Brooks’s little fable in this week’s New Yorker. Even harder to grasp is what it is going to be like for my grandson, who is going to grow up in the aftermath of the cognitive revolution that forms the backbone of Brooks’s tale.

A core finding of this work is that we are not primarily the products of our conscious thinking. The conscious mind gives us one way of making sense of our environment. But the unconscious mind gives us other, more supple ways. The cognitive revolution of the past thirty years provides a different perspective on our lives, one that emphasizes the relative importance of emotion over reason, of social connections over individual choice, moral intuition over abstract logic, perceptiveness over I.Q. It allows us to tell a different sort of success story, an inner story to go along with the conventional surface one.

I'm wholly on board with this new story; I've spent the past ten or fifteen years coming to understand it — an experience that has oddly reminded me of Plato's theory of knowledge. Plato thought that learning was actually a form of remembering (he believed in reincarnation), and the new story that Brooks outlines seems more familiar to me every time I learn a new detail. I say “oddly,” though, because the new story reduces Plato's intellectual accomplishment to rubble. What a waste of time it was, laboring under the notion that “man is a rational animal” — the notion, rather, that man's “rationality” is what's important and distinctive about him. How typical of the old story to talk of “man” in the singular, and to compare him to other creatures. The new story compares people, asking, earnestly, “Who's happy?”

Plato’s ideas about human nature may range from the mistaken to the bizarre (they are certainly misanthropic), but it is in his methods that the seeds of his undoing lie. It is not hard to imagine the appeal of his teaching, at least in ancient times. Irrationality is frightening and painful, as anyone in the throes of unrequited love can attest. And it is thanks to the tradition of systematic clear thinking that Plato began that today’s researchers are able to study the unconscious: the cognitive revolution is nothing but the deployment of what we call “science” to the study of its place of origin, the human mind.

Of all the things that the cognitive revolution is bound to change, education comes in at the top of the list, or very near it.

Scientists used to think that we understand each other by observing each other and building hypotheses from the accumulated data. Now it seems more likely that we are, essentially, method actors who understand others by simulating the responses we see in them.

It doesn't sound very interesting to say that man is a social animal (one of many), but bearing that in mind is going to be the cardinal virtue. We're going to be more vitally conscious that happiness is mediated by society — by civilization itself — and that it is for most people wholly unobtainable in solitude. (Even in solitude, men and women keep company with the society of ideas that they've acquired at other times.) The model of education that is centered on a periodic examination of individual skills will be seen to be as intrinsically useless as most people instinctively think that it is. What fun it will be, at last, to go to school!

 

Editorial Note:
Blow It Up!

Tuesday, June 29th, 2010

 

Fortunately, I was never a big fan of Barack Obama. He's a smart guy, certainly, but even before he was elected — even before he campaigned for the presidency — he took prudence to the point of vice. Did anyone hear from him in the ethnic-cleansing aftermath of Hurricane Katrina? Let me know if you did.

No, all I cared about was his party affiliation, because all I cared — and care — about is fixing the Supreme Court, which is gushing toxic opinions at a clip to rival the Macondo well’s. As currently constituted, the Roberts Court poses a greater ecological threat to the United States than twenty broken wells. It won’t be happy, I think, until it has rearranged the population into two groups: wealthy rentiers and the people who service them. If you’re not working for somebody rich, you probably won’t be working at all.

Even so, although I'm not disappointed by President Obama, I am genuinely alarmed by his quiescence. The oil pours on and on and he does nothing. What can he do? He can order the Navy to demolish the well with conventional explosives, that's what he can do. Or, in the alternative, he can explain to me why he doesn't want to do that. But the proper course of action seems barred to him not because he's unaware of unusual military solutions but because he still thinks that, as the private property of a group of corporations (each pointing fingers at the others), the well is beyond his reach. For him, that seems to be the end of the story. The well is dealing incredible damage to large swathes of this country, devastating livelihoods and destroying habitats — but he does nothing because, it seems, the well is private property. No can do.

Correct me if I’m wrong, please! And please tell me how the president’s reaction is different from expecting homeowners to extinguish their own conflagrations in case of fire. Four alarm blaze : fire department :: Macondo disaster : naval intervention. Yes? Correct me if I’m wrong!

President Obama’s agenda is littered with murky problems — a monstrously uncertain war in Afghanistan, economic instability at home, an increasingly dysfunctional Congress — but this is not one of them. The only thing that’s murky about the Macondo well disaster is what it’s doing to the Gulf of Mexico.

Just ask Bill!

Reading Note:
Why would you want the iPad to function like a laptop?

Wednesday, June 9th, 2010


Why, I wonder, have the editors at The New York Review of Books drafted Sue Halpern to cover the iPad, when it’s clear that she is unsympathetic to the device? Her long piece in the 10 June issue of the Review never touched on the iPad’s function — its primary function, in my view — as an Internet reader. Now, in an entry at NYRBlog dated 8 June (but obviously written long after the printed piece), she comes closer, but lets the point slip out of her hands.

As it is built now, the iPad is the ultimate consumer device, meant primarily to consume media, not to produce it. That’s why, in its first iteration, it has no native printing application, no camera, no USB ports for peripherals.

Because Ms Halpern wants the iPad to be a computer. Why on earth, I wonder?

But the impulse to make it into something else, a lightweight computer that can stand in for a PC in the classroom, at a meeting, on the road, wherever, is strong. This is why iPad users have been buying keyboards to bypass the touchscreen, and finding apps that allow for rudimentary multitasking, printing, and remote access to one’s home computer in order to use non-iPad-enabled software like Microsoft Word. The paradox of having designed the ultimate consumer device is that ultimately the consumers will make of it what they want—if Google, with its rumored Chrome Tablet, doesn’t get there first.

Doesn’t she already have a computer?

Heading the blog piece is the image of an extravagantly marked-up book; we’re told that it is David Foster Wallace’s copy of Don DeLillo’s Players. Ms Halpern helpfully outlines a hack for writing notes on books that you’re reading on your iPad, although she complains that you have to know what you’re doing “to avoid getting tripped up.” Awkward or not, I won’t be giving the hack a try, because I don’t write in books. Except to insert the odd “Ex libris,” I do not mark my books. Possibly because I am really very bad at multitasking, I find taking notes to be unhelpful. I find that it’s better to let strong impressions simmer untended; if I feel that I have something to say when I’ve finished reading, then I try to write it out in as finished a manner as possible, often in the form of entries that, without too much editing, appear on this site. But that is me; that is my idiosyncrasy. In the end, reading books is not what the iPad is really for.

Well, that's precisely what the iPad may be for — the specific tablet sold by Apple — that and all the other apps that Apple markets. I don't have much time for apps, and, like James Kwak, I think that there's something retrograde about them. Eventually, there will be other tablets, with or without their own apps markets. Some of them may support browsers superior to Apple's Safari. I may come to prefer one of them to the iPad. All that is down the road. As Jason Kottke wrote when the iPad first appeared, it's a “proof-of-concept gadget for adults.” But the concept that it proves is that reading the Internet can be as pleasurable as, or at any rate no less pleasurable than, reading a book or a magazine.

Since my way of reading the Internet is pretty much the same as my way of reading books, I am not incommoded by the difficulty of taking notes in a browser. The Daily Blague might indeed be regarded as a notebook, even if it’s a notebook that’s designed to be intelligible to other readers. Or just plain intelligible: in my note-taking days, I was often at a complete loss to make sense of a good many scribblings, even when they were perfectly legible.

When I acquired my first personal computer — an IBM Peanut — in 1985, I had high hopes of using it to organize my life. But life is far too complicated to be addressed by one machine. For several years now, I’ve been writing longer things on a laptop, in another room, without opening any email apps. The sensed difference between the computer where I work (with its two screens) and the one on which I think (in order to write) is intense. Now the iPad has introduced a third — and, I suspect, a completing — mode: a computer on which to read. Sue Halpern may try to tarnish the device by using the dirty word “consumer,” but I’ll embrace the description. As I stroke through Safari, I’m letting the other guy speak.

Reading the Feeds:
Elites and Entrepreneurs

Wednesday, June 2nd, 2010


A word or two about Tyler Cowen's recent entry at Marginal Revolution, “When does large-scale public ownership work?” The burden of this piece is a compare-and-contrast of the cultures of China and France on the one hand and of the United States on the other. Here is what Americans do not share with the French and the Chinese:

In part this is a puzzle but in part France and China have one important feature in common: it’s high status to be a ruler. Very smart Frenchmen often grow up wanting to work for the government. Hardly anyone in France thinks that is weird and so the French bureaucracy has some of the best talent in the country.

Americans get carried away by sentimental patriotism as eagerly as any bunch of guys, but their love of country is pallid compared with what France and China inspire in their nationals: both countries are “middle kingdoms,” central sovereignties habituated to setting broad cultural standards and to solving regional problems in terms of sheer chthonic majesty. (Whether either France or China is “entitled” to this drastically superior self-image is, here, beside the point.) In both countries, being Chinese/French is a civil project that's at least as important as any other that the government might foster (defense, industry). No wonder, then, that smart and ambitious graduates find civil service to be a top-drawer career.

The relationship between “being American” and the American government is starkly opposite: the government’s only task is to stay out of the way of Americans’ self-realization. Fervently patriotic Americans are like the appreciative children of very permissive parents: their taste for freedom cannot be exaggerated. And where once the canonical image of the free American was the cowboy, now it is the entrepreneur. Mr Cowen doesn’t use the word in his entry, but he carves out a space that only it could fill.

I also prefer to live in a society where the public sector does not have so much prestige. Very often governmental prestige stifles innovation and implies a series of more general insider, elitist, and sometimes authoritarian attitudes. It’s also worth a quick look at the histories of what France and China had to do to build up so much governmental prestige; not pretty.

My sympathies are altogether opposed to these sentiments; it will always be a matter of the deepest regret that I did not contrive to live my life in a society that exalted the public sector above all others. I believe that entrepreneurs are of rather marginal importance in a society as complex as ours, and even if they're ten times more important than I think they are, they're no model for the rest of us. While I would not call myself an “elitist,” I will not deny or try to hide the fact that I was born, by whatever accidents, into the American elite and that I have spent my entire life in that atmosphere. Mr Cowen certainly lives in that atmosphere now, whatever his origins. The odds are very strong that you yourself, the reader, are a member of the American elite.

What’s great about France and China — and wretched about Anglophone civilization — is that elite groups do not hypocritically pretend that they do not enjoy great advantages — the sort of advantages that will accrue to people at or near the levers of power so long as human beings walk on two legs. I don’t know that French or Chinese elites are qualitatively more responsible to their cultures than privileged Americans are, but at least they’re not vacuously pretending that they’re not Old Etonians.

Dear Diary: How Everybody Felt About 2009

Thursday, December 31st, 2009

© 2009 Tom Waterhouse

(Except me. I had a great year, really; I was the spring in this guy’s step.)