Archive for the ‘Big ideas’ Category

Gotham Diary:
Dames
25 January 2013

Friday, January 25th, 2013

Reading Caroline de Margerie’s American Lady: The Life of Susan Mary Alsop a few months ago made me realize that I’ve got an interest in well-written books about well-born American women who weren’t supposed to do much of anything beyond marrying and raising children, but who became famous, which they certainly weren’t supposed to do. What they became famous for was, essentially, being themselves in a publicly interesting way: Dames. They created stylish careers to suit themselves as women; they were rarely certified professionals. (My Dames were not WASP princesses who went on to be leading doctors or lawyers.) They were not feminists — or, rather, the women who were would have said that they weren’t. Susan Mary Alsop was a diplomat’s wife with a couturier’s dream figure who became an important Georgetown hostess, entertaining the nation’s political élite. Not without a little help from her second husband, who, to be sure, married her for her Dame potential.

That’s a thing about Dames: they generally needed to be married before they could get started on whatever it was that would make them famous. Whether their husbands provided lots of money (Bill Paley) or lots of creative support (Paul Child), they were indispensable. Dames might not be beauties, but they were attractive, and their achievements depended on important lifts from men. Some worked harder than others — nobody worked harder than Julia Child, once she got going — and some were just famous for being famous, like the beautiful, beautifully-dressed Babe Paley.

Now lands Empress of Fashion: A Life of Diana Vreeland, by Amanda Mackenzie Stuart. Like American Lady, this is the first serious biography of its subject, and I will be refining my still rather vague ideas about Dames. How many other such lives can be found on my shelves?

***

Meanwhile, who knew? It was very interesting to read, on page 15, that Emily Key Hoffman, who would become Diana Vreeland’s mother, attended the recently-established Brearley School, of which my dear Kathleen is an alumna (and, currently, a trustee). But the bombshell on page 34 — Vreeland herself, who also went to Brearley (!), was effectively thrown out by the headmaster, for not being “Brearley material” — this startling revelation put ants in my pants at eleven-thirty last night. I got quite excited, mulling it over. Kathleen had had no idea that DV was a Brearley girl (much less a rejected one). Had we known it and forgotten it? Unlikely — but not impossible. I must find my copy of DV, Vreeland’s “factional” memoir.

***

Kathleen, during moments of relative calm in last night’s conversation (who knew?), asked me about the Mitford sisters — were they Dames? I am pondering this. There was something so prodigious about Lord and Lady Redesdale’s brood of girls that they none of them had to endure the floundering, or at least aimlessness, of my Dames’ early adulthood. Nancy always wanted to be a writer, and Jessica seems always to have wanted to be a socialist. Unity worshiped Hitler from quite a young age, and had a lot of comfortably sexless quality time with the monster. Deborah, who did everything she could think of to avoid becoming a Dame, is perhaps the only one among the lot. In becoming a highly idiosyncratic châtelaine on a grand scale, Debo was assisted not only by her husband (who came into the castle, as it were) but by the Inland Revenue and some very untimely deaths. (Q: do Inland Revenue collect death duties?) But can a Dame be a duchess? We shall have to perpend.

Gotham Diary:
Photography and Accident
15 January 2013

Tuesday, January 15th, 2013

When I was young, I thought that the Nineteenth Century must have been the very worst time in human history to be alive. I was brought to this conclusion by photographs, by the pictures that were taken of nineteenth-century people. Faces were either stern or inexpressive. Hair looked dirty generally; men’s hair was often a frightful mess — and those untrimmed beards! Everyone seemed to be wearing too much clothing, too much fabric that managed to be stiff and rumpled at the same time. The overall effect was infernally dismal.

There were no photographs of the people of earlier times, of course; instead, there was a very wide range of painted portraits. Each period had a definite style, but at least the rich and powerful people who sat to the most skilled artists looked lively and interesting. (Or appealingly wicked — van Dyck’s Richelieu, for example.) The light seemed to have gone out of life in about 1850, when photography began to be common. By intriguing coincidence, this is about when Queen Victoria lost her girlish charm. As a young woman, she was often painted in a way that brought out her youthful warmth. As a middle-aged woman, almost immediately a widow, Victoria was a monument to loss. The century’s other great figure, Abraham Lincoln, was equally rebarbative.

It was only a few years ago that I saw what the problem really was, an aspect of photography that hadn’t occurred to me. Oh, I had long known that early sitters had to hold their poses for aching stretches of time. That accounted for the frowns and the blank stares. But it didn’t explain everything, certainly not the clothes. Everyone seemed to be wearing a sofa!

One day, at the Museum, I was looking at Nadar’s portrait of Rossini, taken in 1856. It took a moment for me to correct the impression that Rossini had spilled soup on his coat; in fact, the print was damaged at some point by a splash of something. But the sense of dishevelment persisted. Rossini’s coat and overcoat are ribbed with creases. His shirt cuff appears not to be properly buttoned. Now, Rossini was a natty gent, a true bon viveur, with plenty of money and, presumably, a good tailor. Indeed, Rossini looks well turned-out when contrasted with another Nadar subject in the Museum’s collection, Théophile Gautier. But there’s still something frumpy and off-putting about his photograph.

Photography itself wasn’t the problem. Rather, it was the novelty of photography, the fact that nobody was prepared to be photographed. Painters had been straightening hair, smoothing bodices, and polishing shoes for centuries. We may assume that very few ancien régime grandees looked as good in everyday life as they did in their portraits. Correspondingly, no one expected bandbox freshness of friends and neighbors. Soup spots simply didn’t “show.” Minor accidents of disarray were overlooked.

Nearly two centuries later, we have so adapted to photography that it is the minor accidents that show most powerfully. Take a picture of your living room after you’ve tidied it up. The odds are that, looking at the photograph instead of the room itself, you’ll be assaulted by something terribly out of place, such as an unconsidered stack of magazines or a mug thoughtlessly abandoned on a shelf. (The very thoughtlessness is captured in the image.) With your own eyes, you simply won’t have seen these as faults. Welcome to the world of art direction!

Most clothing on the market today is made to look good in a photograph, and expensive clothing is designed to make love to the camera. Increasing platoons of young people are determined to do the same in their birthday suits. Photography has taught us to attend to the details of our personal appearance — of every kind of appearance. Appearance is nothing but a welter of details. There are no more accidents — no more slight derangements that “no one will notice.” Even the rejection of well-groomed neatness has become canonical: in the last thirty years, a stubbly chin and a loosened tie have become intentional “looks,” and are no longer the regrettable marks of the slob. (The slob is almost always overweight, and in need of a barber.)

The classical concept of the “accident” was a tool for guiding the attention away from the chaos of detail and toward the posited essence of a thing. Accidents today are, conversely, serious: unexpected collisions that result in breakage. The wishful simplicity of the classical outlook is no longer on offer. Photography, I believe, played a major role in the transformation of accidents from meaningless nonessentials into glaring defects.

Gotham Diary:
Rude
29 November 2012

Thursday, November 29th, 2012

Aaron James, we’re told on the “About the Author” page of his new book, is a surfer, and he is not an asshole — the subject of same, Assholes: A Theory. There. I hope to be done with the unpleasantness. I find the word hideously unpleasant to write in this space, although like everyone else I find it an enormously effective emollient when in need of invective. Although only midway through the book, I am heartened to find that, in general, I use the epithet in conformity with James’s theory. It is also a great relief to learn that, if I worry about being one of James’s subjects, and would be ashamed to be so regarded by anyone else, then, no matter how foolish my behavior, I am probably not one myself. Modified rapture.

But I fasten on the surfing in the author’s CV, which I’d already gathered from the text, since the bad behavior of some surfers provides an inordinate number of examples. (I had no idea that Brazilian louts had made themselves detested on the north coast of Oahu.) Surfing is incomprehensible to me: I cannot imagine devoting so much attention to the swells of the sea, particularly since my idea of a thrill is walking into a cloud of freshly-baked bread. (Actually, my idea of a thrill is a very funny line.) It is all too primordial, too at-one-with-unchanging-but-unpredictable-nature. Nature, as is well documented, I find to be a great bore, at least in its untamed avatars.

So, I am going to attribute the failings in Theory to an excess of sun, saltwater, and perhaps a concussion. These failings are two. (So far!) First, the preoccupation with philosophical argument. I think that it’s possible to study human failings without getting caught up in the conundrum of free will. I hope that it is, anyway, because free will is not going to be established or disproved anytime soon. My eyes glazed over as James struggled to draw a bright line between psychopaths and his subjects. If society at large is responsible for the creation of assholes — and I believe that it is — then the question of culpability ceases to be interesting. We’re left with a problem — the pains-in-the-neck are still with us — and we have to figure out what to do about them. Not how to think about them.

One of the worst failings of philosophy is its complete ahistoricism: it dismisses changes in circumstances as “accidents,” and deems “essence” to be eternal. There’s no question that James is going after the essence of his subject, and there’s no question (in my mind) that he might as well be pondering the zodiac. The simple fact is this: assholes are a modern problem, an after-effect of the dismantling of structures of birth-determined status. (Once upon a time, in the ancien régime, affairs were so managed that assholes were a protected class, the aristocracy.) This is the other error of the book at hand. It is fatuous in the extreme to appraise a figure such as Cecil Rhodes in terms of the Theory, because the terms of James’s three-pronged test don’t translate meaningfully back into the Nineteenth Century. Or to any earlier time, or to any culture that isn’t, officially at least, “liberal democratic.” There were moments, as I read the book last night, when I expected James to come out and declare a correlation between the phenomenon that interests him and the individualism and obsessive personal autonomy that flourish particularly among Anglophone males. (He does share a brilliant, highly localized hypothesis: Anglophones prefer to line up in orderly queues because they dislike touching and being touched by strangers.) So far, alas, the connection has not been made.

Here is the Theory, Aaron James’s three-pronged test: An  asshole is someone who

(1) allows himself to enjoy special advantages and does so systematically;
(2) does this out of an entrenched sense of entitlement; and
(3) is immunized by his sense of entitlement against the complaints of other people.

In the ancien régime, this sense of entitlement was legally protected. It is true that we have moved on, or tried to move on, but it seems to me more useful to consider the asshole as a relic of historical conditions, a would-be member of an extinct social class, and bear in mind that his now annoying behavior used to be conventional, than it is to tramp through the semantic swamps of personal responsibility.

I have a third problem with this book, but it’s not a shortcoming on James’s part. It’s an uneasiness about the egalitarian claims that underlie it. We learned, over centuries of experience, that status based on birth is a terrible idea. So we got rid of that, or thought we did. But it has been shown in case after case that the children of wealthy people are better equipped to cope with life’s ups and downs (especially the downs, which are heavily upholstered) than other people, and also better able to manipulate circumstances in their own favor. Taken too far, egalitarianism gives these unofficially privileged children the power, if not the right, to pursue separate and superior trajectories, and they seem to do a good job of taking their money with them. As France’s miserable record with the assimilation of outsiders proves, the declaration of equality, without more (much more), is an empty thing. We have to be more candid about our inequalities, many of which are the result of circumstances beyond human control — if only to determine which of them aren’t.

But Assholes: A Theory is a helpful, timely book, simply for having inaugurated a conversation about civility and socialization. We all tend to think that we’re nicer than we are (and than other people), and smarter, too; and yet we’re all jerks now and then. A firm grasp of Aaron James’s three-pronged test will go far to keep us from thinking too highly of ourselves while being jerks as a matter of course.

Gotham Diary:
Genius
8 November 2012

Thursday, November 8th, 2012

Permit me to begin on a scholarly note: most of what I have to say about genius just occurred to me the other day while thinking over Virginia Woolf’s autobiographical writings and, for context, re-reading parts of Hermione Lee’s 1996 biography of the writer. My thoughts are entirely sketchy and impressionistic. Aside from a glance at Wikipedia, no study of any kind was involved.

What I brought to this deliberation was something between a hunch and a conviction to the effect that “genius” is not a helpful concept, not a meaningful label. It means nothing more than “very smart person,” where “very” may be exchanged for any number of emphatic adverbs (“extraordinarily,” “unusually,” “immeasurably,” &c). In other words, it is not a grade; it does not signify a class. “Genius” is a romantic word, loaded with hokum.

As the Wikipedia entry points out, genius was the ancient Roman term for a family’s “tutelary deity,” whatever that was. It picked up something of its modern sense when the genius of certain families came to be thought of as the explanation for their prominence in affairs. But until the Nineteenth Century, genius was a possession, not an identity. You might speak of “the genius of Shakespeare,” but without claiming that “Shakespeare was a genius.” The genius of Shakespeare is reflected in his plays, in the works that this genius inspired. Shakespeare taking a walk or a nap was not being a genius.

The idea of being a genius seems to have emerged in response to the grandeur of Romantic art and philosophy. Did Kant and Hegel write about genius? It doesn’t matter. They came to be regarded as geniuses themselves, as were Keats and Shelley, posthumously. Mozart and Beethoven also became geniuses after their deaths. (In Beethoven’s case, the reception of his late quartets is a gauge of the development; thought to be the product of a diseased or deranged mind when they were first played, they came, by virtue of their craggy inscrutability, to be proofs of genius.) Genius was sublime.

***

Genius was allowed to be eccentric. Tennyson dressed like a tramp. Genius was not obliged to behave like a gentleman — witness Carlyle and poor Jane. Geniuses, as the Victorian era deepened, became a sort of upmarket Barnum attraction. Here is a picture that Woolf paints of her mother’s youth:

Little Holland House was her world then. But what was that world like? I think of it as a summer afternoon world. To my thinking Little Holland House is an old white country house, standing in a large garden. Long windows open onto the lawn. Through them comes a stream of ladies in crinolines and little straw hats; they are attended by gentlemen in peg-top trousers and whiskers. The date is round about 1860. It is a hot summer day. The tables with great bowls of strawberries and cream are scattered about the lawn. They are “presided over” by some of the six lovely sisters, who do not wear crinolines, but are robed in splendid Venetian draperies; they sit enthroned, and talk with foreign emphatic gestures — my mother too gesticulated, throwing her hands out — to the eminent men (afterwards to be made fun of by Lytton); rulers of India, statesmen, poets, painters. … The sound of music also comes from those long low rooms where the great Watts pictures hang; Joachim playing the violin; also the sound of a voice reading poetry — Uncle Thoby would read his translations from the Persian poets. How easy it is to fill in the picture with set pieces that I have gathered from memoirs — to bring in Tennyson in his wideawake; Watts in his smock frock; Ellen Terry dressed as a boy; Garibaldi in his red shirt — and Henry Taylor turned from him to my mother — “the face of one fair girl was more to me” — so he says in a poem. But if I turn to my mother, how difficult it is to single her out as she really was; to imagine what she was thinking, to put a single sentence into her mouth! I dream; I make up pictures of a summer’s afternoon.

It is difficult to put a sentence in Julia Jackson’s mouth, I surmise, because she is a young girl in the shadow of geniuses. The presence of genius drives out triviality and invests everything with significance. Everything, even the household chores. Woolf describes the process in “Reminiscences,” a journeyman piece composed under the influence of Henry James, before she found her own voice, with an ingenuousness that it’s impossible to imagine her older self not taking issue with.

She [Julia Stephen] delighted to transact all those trifling businesses which, as women feel instinctively, are somehow derogatory to the dignity which they like to discover in clever men; and she took it as a proud testimony that he came to her ignorant of all depressions and elations but those that high philosophy bred in him.

In her mid-twenties, Woolf (or Virginia Stephen as she then was) still bought this brand of the feminine mystique. It would take years for her to acknowledge and articulate her disgust with her father’s genius act.

This frustrated desire to be a man of genius, and the knowledge that he was in truth not in the first flight — a knowledge which led to a great deal of despondency, and to that self-centredness which in later life at least made him so childishly greedy for compliments, made him brood so disproportionately over his failure and the extent of it and the reasons for it — these are qualities that break up the fine steel engraving of the typical Cambridge intellectual.

There is a glee in this deconstruction of her father’s aura that makes “A Sketch of the Past” just about the most exciting thing that Virginia Woolf ever wrote.   

***

Even when that disgust was disgorged (beginning with To the Lighthouse), Woolf continued to live and write as though the “dignity,” of which her mother was so solicitous, continued to glimmer in her life, a lamplight that would give all other things their contours of significance. The most menial chores would be relieved of drudgery by the presence of this light. But the light did not shine for her as it had for her mother; Virginia herself wished to be a genius. She was able to wish it, without sounding the depths of her father’s miserable self-doubt, and the prospect must have seemed provisional to any woman born in the 1880s, growing up in a thicket of geniuses all of whom, with the arguable exception of George Eliot, were male. But her relentless high-mindedness interfered with her sense of humor. It placed a high lower limit on admissible fun.

Had she been able to forget this dignity from time to time, she might have left us much more in the vein of “Am I a Snob?”, a speech that she wrote in the Thirties to be read before old friends. What does it mean to be a snob? It means setting true values aside, hobnobbing with aristocrats, and having a lot of guilty fun.

Margot Asquith — “a lady whose birth is no better — perhaps worse — than my own” — was, nevertheless, the Countess of Oxford when she wrote to Virginia to ask a fatuous favor: “When I die, I would like you to write a short notice in The Times to say you admired my writing, and thought that journalists should have made more of me.” It seems that Virginia had actually allowed that Margot was a “good” writer. “This, coming from you, might have turned my head as you are far the greatest female writer living.”

Now I was not, I think, flattered to be the greatest female writer in Lady Oxford’s eyes; but I was flattered to be asked to lunch with her alone. “Of course,” I replied, “I will come and lunch with you alone.” And I was pleased when on the day in question Mabel, our dour cook, came to me, and said, “Lady Oxford has sent her car for you, ma’am.” Obviously, she was impressed by me; I was impressed by myself. I rose in my own esteem because I rose in Mabel’s.

When I reached Bedford Square there was a large lunch party; Margot was rigged up in her finery; a ruby cross set with diamonds blazed on her breast; she was curled and crisp like a little Greek horse; tart and darting like an asp or an adder. Philip Morrell was the first to feel her sting. He was foolish and she snubbed him. But then she recovered her temper. She was very brilliant. She rattled off a string of anecdotes about the Duke of Beaufort and the Badminton hunt; how she got her blue; … about Lady Ripon, Lady Bessborough; L Balfour and “the Souls.” As for age, death and obituary articles, The Times, nothing was said of them. I am sure she had forgotten that such things existed. So had I. I was enthralled. I embraced her warmly in the hall; and the next thing I remember is that I found myself pacing along the Farringdon Road talking aloud to myself, and seeing the butchers’ shops and the trays of penny toys through an air that seemed made of gold dust and champagne.

Now no party of intellectuals has ever sent me flying down the Farringdon Road. I have dined with H G Wells to meet Bernard Shaw, Arnold Bennett and Granville Barker and I have only felt like an old washerwoman toiling step by step up a steep and endless staircase.

I think that Virginia Woolf felt like an old washerwoman a lot.

Gotham Diary:
Court Art
2 August 2012

Thursday, August 2nd, 2012

The one thing that I didn’t mention in yesterday’s entry — I mentioned it, but not as one of the draws that keep me buying the latest Donna Leons — was Venice. I have never been to Venice, but I should be very surprised if it does not seem familiar the moment I arrive, not because I’ve seen so many photographs of the famous buildings and curious canals, but because I’ve followed Guido Brunetti, resident of the San Polo district, throughout the city, and in all weathers. I’ve followed him across the campi and about the markets and into little trattorie tucked into the corners. I may not have seen anything, actually, but I’ve absorbed some of the pace and quirkiness of life in Venice, which is essentially life as it is everywhere but with a few little wrinkles. Walk or boat, for example; how many instances, in the Brunetti novels, are there of the commissario’s working out whether a given destination is better reached by boat or on foot? A conventional realist, Donna Leon knows how to bring Brunetti’s Venice to life, and the danger for me, if and when I visit the actual city, is that I’ll unpack too much of the one that I’ve brought with me.

Reading Lauren Collins’s profile of Tino Sehgal in the current New Yorker, it struck me that the rejection of representation that characterizes modern art (even pop art, which substitutes hallucination for representation) is also a rejection of everyday life. The Sehgal installations that Collins describes are systematic violations of the conventions that govern everyday life. These violations are tolerable, presumably, because they occur in museum spaces, and most viewers, even if they don’t know quite what they’re getting into, know that they’re getting into something. But, again like so much modern art, they have a jokey air, an aspect of Let’s Pretend This Is How Things Are. There is no educational mission here; the object is not to discover something about how things actually are. The sociologists and cognitive scientists tackle that project. Contrasted with their disciplinary rigor, installation designers such as Tino Sehgal are pretty clearly fooling around.

I don’t object to fooling around, although I’ll admit that I haven’t much patience for it. What really bothers me is that anyone thinks of it as “art.” Art is something else. Art descends from Court Art.

***

Court Art, eh — minuets danced by nymphs and shepherds in pastoral nowheres? Very expensive objects, such as bronze clocks and tables made out of silver. Objets de vertu that you might spend a fortune on at A la Vieille Russie. That’s a part of it, but not the important part. 

Court Art is art intended to be enjoyed and appreciated by the most important people in the land, gathered together at least occasionally in a town, usually a capital, centered on some sort of palace. The palace is the building in which the most important people in the land come together, especially for ceremonial occasions honoring the ruler or the patron saints. The court is the secular ritual that takes place within the palace and that spills out of it into the town. It is here that women approach most closely the freedom of men. The court is grave for the most part, but tempered by sophisticated, well-mannered frivolity. Grave or frivolous, the court is always serious, because it is the center of power.

I’m describing the earliest courts, which arose in the Thirteenth Century in Italy. By the Eighteenth Century, Court Art was so well understood that it flourished in England quite apart from the royal coteries. One might argue that all of London, together with the satellites of the great country houses, comprised the English Court in 1800.

By 1900, there were no more courts anywhere. There were still kings and even a grand duke or two, but there were no more gatherings of all the important people of the land, and few ceremonies with any claim on attentiveness other than that of great age. There were too many important people for such gatherings, and there were too many ways of being important, many of them having little to do with sovereign power. Like electric power, sovereign power was domesticated by 1900. It hummed along smoothly, doing whatever needed to be done with as little violence as possible. The personal rule of warlords was replaced by the elected authority of committees.

Court Art, as it had been known to the most important people of the land, for whom it had constituted a kind of mirror, was no longer called for (and so the genre of large paintings depicting historical or mythological scenes came, along with others, to an end). But it did not disappear altogether. It persisted in two ways, ways that have today become antithetical.

The first is the persistence of craft. The techniques of Court Art are still studied assiduously, and traditional artists — that’s what they’re called — continue to turn out paintings and drawings and even sculptures that serve the people who enjoy them, quite aside from decorative value, as mirrors. Looking at a traditional painting that appeals to you, you see yourself in part. The vision is a serious pleasure, just as it was for people at court.

The second is the persistence of expositions of new works. For a long time, these expositions continued to feature the same kind of art that was produced for the court, but traditional artists were developing other interests, subjects that, in court eyes, appeared insignificant and unworthy of attention. These interests, curiously, could be traced back to the first sovereignty to dispense with a court: the Netherlands. It was there that mirroring art was first created for people who did not regard themselves as among the most important in the land. This tradition flowered beautifully at the end of the Nineteenth Century.

But how to continue the officially-sponsored expositions? It did not take long for Marcel Duchamp to show up with a urinal. Next week, I shall take up the mystery of how this “fountain,” and the installations to follow, continued to be considered “art.”

Gotham Diary:
Example
13 June 2012

Wednesday, June 13th, 2012

This was probably not the best time to be reading about Bryan Fischer, the American Family Association broadcaster who has forged opposition to same-sex rights into the sharpest brand in the evangelical attempt to institute an American theocracy. I’m feeling wobbly enough as it is, and panic attacks are not helpful. That’s my first reaction to people like Fischer: panic. Monsters of wrong-headed self-assurance, they seem designed to prove the ancients right about the impracticability of democracy. I can only echo Fischer’s charge about President Obama, “who, I believe, despises this nation.” I certainly despise this nation, insofar as it endows Fischer’s organization with tax-free status and allows him to argue, on the airwaves (which are allotted in public trust), that, for example, First Amendment rights are available only to Christians and that men are superior to women. His ideas are ridiculous or treasonous or both — so why am I reading about him in The New Yorker? 

Having worked in radio myself, I am not surprised by the pile-up of talk celebrities with checkered pasts in other lines of work. Fischer, according to Jane Mayer’s profile, has consistently strained his relations with the organizations that employed him by insisting upon male supremacy. He simply refuses to subject himself to the authority of a woman. In this, he accords with most pious observers of the Abrahamic strictures. I panic because I grew up in a time when such piety was marginalized (as I believe it ought to be). This isn’t the place to consider how and why that changed, but of course it did change, and now there is only one reason for progressive optimism: opinion polls that align rigorous conservatism with age. Eventually, according to this prospect, the supporters of Bryan Fischer and others like him will die off. I should like to rest my hopes for the future on more positive developments than mass-mortality.

The nation seems divided between people who want to be left alone to do their thing, and who are willing to leave others alone to do theirs, on the one hand, and zealots who wish to impose Iron Age laws on everyone. The division is very far from equal, but the people who want to be left alone are, ipso facto, unmotivated to take public action. Many of them are too busy following interesting HBO series to pay attention to the Fischers at work in the land. (Everyone who tells me how much “fun” Downton Abbey is seems bemused and wishful, as if things would be better if we could all live like that.) The hipsters who pipe up at The Millions and The Rumpus are preoccupied by job prospects, naturally, but I sense that, like sharp young people everywhere, they’re disinclined to engage with people who have invested in deeply uncool policies. Fischer’s evangelicals think there’s a war on, that the country is about to burn — and they’ll be happy to light the match. Who is going to stop them?

One thing that isn’t going to stop them is the passage of laws that they don’t like, laws permitting a fully equal distribution of civil rights among all citizens. That is only going to encourage them. Passing rightful laws is not enough. It is the beginning of the progressive project, not the climax. What follows the laws is the behavior that, over time, makes the laws unnecessary. But if no one is paying attention to good behavior, what then?   

***

Happily, the antidote to my panic attack, the intellectual Xanax, as it were, lay close to hand: Stuart Firestein’s Ignorance: How It Drives Science. Science, as it has developed over the past four hundred years, begins with a proposition of general ignorance: nothing is known unless it has been tried. To put this even more sharply: nothing is really known until it has been disproven. We know that there is no such thing as phlogiston. What we know about oxygen is subject to further refinement. We may know all that we need to know about oxygen for the time being, but we don’t know everything about it, and we probably never will. Which is grand, because, as Firestein explains, scientists “don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out.”

Bryan Fischer and people who regard him as an intelligent speaker clearly have an opposed view of knowledge. They believe that everything worth knowing was handed down to us long ago, in a book inspired by God. It’s a very attractive idea, and, because the Bible is actually a compilation of mutually inconsistent texts written over a long time and from changing points of view, its complications keep happily puzzled minds busy. It is no wonder that fundamentalist Christians try to discredit what I’ve just called Science, because Science insists on a complete disavowal of the Bible as a compendium of facts. There is, in fact, not one single fact in all of Scripture. That is the point, you might say; it is a work of faith. But the Bryan Fischers of this world want more than faith. They want knowledge. And they are right to discredit Science in this pursuit, for all that Science can give us, truly, is, as Stuart Firestein argues, Ignorance.

It’s because we don’t know right from wrong that we have worked out a number of conventions, most of them backed at one time or another by supernatural claims. In fact, it takes nothing but the modern imagination to understand that murder is wrong, and that no one occupies a position of inherent authority. It would be convenient if these conventions yielded compelling exceptions, but they don’t, ever. Murder is always and everywhere wrong. No one human being — and certainly no gang of human beings — has the right to take the life of another. (I believe that the complete denial of personal liberty known as modern incarceration is almost always wrong, too; prisons, I hope, will one day be seen as no less wicked than the machines of torture. But perhaps the world is not ready for that convention.) Most of our conventions are trivial, involving nothing more obscure nor less arbitrary than which side of the road to drive on. They change and improve along with our understanding. No divinity gave us a road map at the beginning of our journey, and nobody’s talk of one ought to impede our halting progress in reforming and recreating conventions that more closely answer our needs — one of which, certainly, is to live in a world without certainty. It’s precisely for that reason that the electric power must always spring on when summoned, and the water flow from the faucet. These sophisticated simplicities are the price of our everyday agnosticism, a negative capability, as Keats put it, to live with our very limited knowledge of the world. Those without the capability will have to find happiness in caves.

Gotham Diary:
Doctrinally Focused
1 June 2012

Friday, June 1st, 2012

The other day, Maureen Dowd complained about the authoritarian attitude of the Roman Catholic Church. Let’s breeze right past whatever it was that inspired a woman brought up in a pious Catholic family to imagine that such complaints might not be a complete waste of time, and proceed to a letter that a former Jesuit priest wrote in agreement. Ms Dowd had said,

So it makes me sad to see the Catholic Church grow so uncatholic, intent on loyalty testing, mind control and heresy hunting. Rather than all-embracing, the church hierarchy has become all-constricting.

To which Tim Iglesias, of Oakland, California, responded,

… I believe that they are pursuing a very deliberate strategy. They have decided that a smaller, more unified and fervently doctrinally focused church community is preferable to a welcoming, diverse and unruly one. All of their actions are consistent with this strategy. I mourn the loss of the catholic Catholic Church.

This exchange seemed as good an occasion as any for teasing out what might lie at the center of the purer church’s doctrinal focus. But first, a little history.

The Roman Catholic Church emerged from its near-death experience at the end of the Eighteenth Century with an appearance that seemed to go back for centuries but a severely altered interior. Beneath the preserved and even refreshed decorative fabric — the church buildings, the vestments, the liturgies, the religious orders, and of course the Apostolic Succession itself — the Church was not at all what it had been. How could it be? Like everything and everyone, it had been cast out of the ancien régime, and obliged to make its own way on its merits. Stripped of properties and the administrative power that went with them, as well as morally authoritative alliances with the governments of Catholic powers (far fewer in number when the commotion subsided), the Church ceased to be a temporal ruler with the unification of Italy. It became a sentimental operation, dispensing pastoral care and comfort to a secular world ever less interested in traditional ideas of the sacred.

This isn’t the place to consider the role played by the industrial revolution in causing huge shifts in how educated people imagined the universe, but I think that we can safely claim that the intellectual energy formerly devoted to working out fine points of metaphysical dogma was now harnessed to understanding the material mechanics of the universe. God, as it were, was left to take care of himself, dogmatically speaking. It is sobering to see how completely such issues as the Trinity and Transubstantiation lost the power to rile up civil discord; matters for which people had been prepared to kill (and die) no longer meshed with anyone’s intellectual outlook. It was against the background of this general indifference to speculative certainties that the Papacy pushed through two dogmatic alterations that would have been hotly contested in earlier times: the doctrines of the Immaculate Conception (1854) and of papal infallibility (1870). The most conspicuous exercise of the latter power concerned the Assumption of Mary (1950). This isn’t the place to assess the drift of ecclesiastical concerns from the core beliefs that were shared by Paul and Augustine, nor to tease out the misogyny implicit in making one woman, the mother of Jesus, alone in holiness of all her sex. It’s enough to note that bodies and sexuality were promoted as matters of doctrinal concern, while no attempt was made to alter the institution of the priesthood.

Inevitably, Church leaders would find themselves confronted with ever-louder complaints from the likes of Maureen Dowd. The Augustinian settlement of Christian sexuality, in which laymen were permitted to marry and procreate but religious men and women were not, dedicating their lives to God, no longer made sense. The Church was no longer functionally above and apart from the secular world of the faithful; as the Protestants had demonstrated, spirituality did not suffer when prelates took wives. (And as for the unspeakable alternative, sought out by too many priests under the benighted protection of bishops whose allegiance to the confraternity was far stronger than any pastoral concern, we shall not speak of it.) While there might not be any objection to retirement from the world, insofar as it remained in and of the world, the Church’s insistence that it was nevertheless apart transformed it into what it has become: a hierarchical confraternity of celibate males that claims to have the first word, the last word, and every word in between on the subject of the Church’s constitution. If you are not a member of this confraternity, you have no standing to engage in a discussion — not that discussion is encouraged among those who are. This is the unlovely organization that, according to Tim  Iglesias, Church leaders would like to purify.

The only object of doctrinal focus, therefore — at least, for Roman Catholics who are not priests — is the identity of the Church as an organization run and officiated by unmarried men. That is what being a member of the Roman Catholic Church has come down to.

Gotham Diary:
Unthinkable
14 February 2012

Tuesday, February 14th, 2012

Two stories in today’s Times seem to me to ring the same bell. Joe Nocera lights into the NCAA yet again, this time in a somewhat backward fashion, by “praising” the organization for turning a blind eye on entrenched practices in amateur/collegiate hockey that it would prohibit in any other sport. Nocera believes that these practices are beneficial to young athletes and ought to be the rule, not the exception. Why aren’t they? Because they interfere with revenue streams that accrue to colleges at the athletes’ expense, that’s why. It’s pretty sickening stuff.

The other story needs even less in the way of summary. Truly independent fair-labor watchdogs are laughing at Apple’s pious decision to sponsor the investigation of the Foxconn City plants, where many of its products are made, by the Fair Labor Association — an outfit that, like the NCAA, is funded by the very enterprises that it is supposed to regulate.

(Technical point: the FLA will be investigating Apple’s supplier, not Apple itself. Ultimately, however, it is Apple’s decision to continue working with the Foxconn City folks that is on the line.)

Into those stories, stir the debate about the Volcker Rule. Bankers are complaining that the Rule will cost them inordinate amounts of money and also lead to job loss. Volcker to banks: tough noogies.

What all three of these stories have in common, I think, is that the behavior to be prevented or regulated is wicked. Not criminal, necessarily, but certainly nasty. Take sweatshop conditions, where the factory might be clean and well-ventilated but the workers are subject to a midnight wake-up call to meet the whims of some dork at Apple (happily no longer likely to be “the best businessman in the world today,” Dork-in-Chief Steve Jobs). How do you rationalize treating workers badly? Here’s how you do it in the early Twenty-first Century: you exploit Chinese workers. You exploit Confucian ethics and Asian authoritarianism: They’re like that anyway. And maybe they are, but of course that’s not the point, not if you’re an American wondering what it cost to put an iPad in your hands for only $500.

The case of the NCAA is darkly fascinating. The point of the Association is to protect athletes from commercial exploitation. All well and good, but the mission was corrupted when the schools belonging to the Big Ten and the other football circuits were in a position to do the commercial exploitation. Once again, the underlying behavior, the exploitation of adolescent athletes, is obviously wicked. No matter who does it, it’s wrong! And Joe Nocera is certainly making the case that schools are exploiting their so-called student athletes. Even if the player gets a degree, what has he actually learned in school? How well are these “students” prepared for the lives that they will, with any luck, live to live after their bodies cease to be profit centers?

***

As for the bankers, can anyone tell me where, at this particular moment on the Blue Planet, big banking is being done in a responsible, constructive way that does more than pour the odd bonus into punters’ pockets? South America, perhaps. You don’t hear terrible things these days about South American banking. Maybe it’s in no shape to keep up with the smart alecks in Japan, China, Europe, and the United States who have developed a broad portfolio of fucked-up strategies.

(And while we’re on the subject of smart alecks, may I suggest to Andrew Ross Sorkin that addressing Paul Volcker, even in hypothetical punditry, with a sentence beginning “C’mon…” is unbecoming? ) 

My question is this: how do you “regulate” wickedness? Is it possible?

Gotham Diary:
Procedural
25 January 2012

Wednesday, January 25th, 2012

One of the craziest things about law school — in my day, anyway — was the presence of Criminal Procedure in the first-year curriculum. I had a terrible time with the class because I could not suppress the conviction that we were not being taught things in the proper order. Criminal Procedure is, basically, Constitutional Criminal Law, meaning the body of procedural requirements ordained by interpretations of the Bill of Rights and its piecemeal superimposition upon state law. Law schools are not interested in the array of federal criminal laws, such as the Mann Act or Rule 10b-5. They’re even less interested in the states’ various criminal laws. But what’s worse about learning criminal procedure in the first year of law school is the postponement of learning about evidence to the second year. Criminal Procedure ostensibly lays out the rules for playing the Go To Trial game fairly. It’s only in Evidence that you learn how bizarre, deranged, and no-longer-just that game really is.

I was thinking about all of this yesterday as I read Adam Gopnik’s Critic At Large piece in this week’s New Yorker, “The Caging of America.” Nobody who reads The New Yorker needs to be told that we have a massive prison problem, with a far higher percentage of men behind bars than in any other advanced nation; or that this prison problem is also, outrageously, a race problem, with a sickening disproportion of black and Latino inmates. Adam Gopnik brings two new items to his discussion. Well, one of them was new to me, William J Stuntz’s The Collapse of American Criminal Justice, published last year right before the Harvard professor’s death. If I heard mention of the new title, nobody went on to tell me that Stuntz came to the conclusion that it’s our Bill of Rights itself that’s the root of the prison problem. “The trouble with the Bill of Rights, he argues, is that it emphasizes process and procedure rather than principles.”

This emphasis, Stuntz thinks, has led to the current mess, where accused criminals get laboriously articulated protection against procedural errors and no protection at all against outrageous and obvious violations of simple justice. You can get off if the cops looked in the wrong car with the wrong warrant when they found your joint, but you have no recourse if owning the joint gets you locked up for life. You may be spared the death penalty if you show a problem with your appointed defender, but it is much harder if there is merely enormous accumulated evidence that you weren’t guilty in the first place.

This is just another typical, sad result of Anglophone credulousness when it comes to playing fair. The problem with trying to play fair in a criminal trial is that it’s absolutely unnatural. We may say that everybody is innocent until proven guilty, but we don’t back it up at all with safeguards against our bone-deep doubt that a truly innocent person would ever wind up in the dock. We’re unwilling to understand that many good police officers, seasoned by experience, will outgrow the essentially adolescent modality of playing fair and turn toward seeking justice instead, procedures be damned. And we tacitly conspire, all of us, to impose the brunt of our sillier laws — Gopnik rightly singles out our marijuana-possession proscriptions — on minorities, permitting white infractors to get off lightly, thus baffling the point that the law itself makes little sense.

I said that we don’t back up our innocent-until-proven-guilty rule with “safeguards,” but this is not true; it’s worse than untrue, because the safeguards provided by our laws of evidence were put in place by an entirely different society, a largely homogeneous one with low social mobility. It’s worth bearing in mind that the original Anglophone witnesses, back in the Middle Ages, were also the jury. Imagine rules for an emergency-health-care system that took no account of ambulances or cell phones. That’s what our jury system is like. It made sense, about a thousand years ago. What the laws of evidence serve to do today is to block a lot of common sense. And they encourage the judge and opposing counsel to engage in all manner of fancy branles and bourrées over what is and what isn’t “admissible,” not to mention the surreal demand that jurors will pretend not to have heard this or that in the courtroom.

The other thing that felt fresh about Gopnik’s essay was the note on which he ended it: “‘Merely chipping away at the problem by the edges’ is often the very best thing to do with a problem; keep chipping away patiently and, eventually, you get to its heart.” This is a kind of conservative optimism, it’s true; it’s dangerously close to believing that “muddling through” will get you through any crisis. But there was nothing muddling about the changes that brought crime rates to a national low in New York City — “just the intercession of a thousand smaller sanities.” Not enough of these sanities were located within the criminal justice system itself, however.

Elsewhere in the issue, Ryan Lizza points out that, thanks to the vast increase in self-segregation in American society since the passage of the Civil Rights Acts, the divide between red and blue is not a bad dream but a political reality. Maybe it’s time for progressive thinkers to explore ways to exploit the divide. The denizens of securely-gated communities have very little reason to fear drug-addled “elements” from the wrong side of town. Why not encourage them to adopt the “live-and-let-live” attitude that their seclusion, once available only to the very wealthy, now allows them?

Gotham Diary:
In Like Flynn
19 December 2011

Monday, December 19th, 2011

There is always so much to be learned about photography. Red-eye is bad enough. Red velvet hands? What I’m really showing off here is the happy accident that Civil Pleasures, my second Web site and still more in development than it ought to be four years after launching, looks just right on the Kindle Fire without any further fiddling.

***

Watching The Private Lives of Elizabeth and Essex last night gave new meaning to the phrase “in like Flynn.” The 1939 Warners classic, which I’d never seen before, turns out to be almost perfectly cast. Just as Elizabeth put the stability of England ahead of personal glory, something that Essex couldn’t seem to imagine doing, so Bette Davis put the dramatic interest of the motion picture ahead of personal vanity, which couldn’t have occurred to Errol Flynn. Ethan Mordden writes that Flynn “was at his best when he let his natural charm show through” — in other words, when he stopped acting. Of Elizabeth and Essex, Mordden writes, “Flynn thinks it’s a Flynn vehicle, and he hurts the film by refusing to respond to Davis.” Just as Essex hurt England with his vainglorious march on London. Well, “hurt” is perhaps overstatement. Neither the aristocrat nor the actor was a truly significant personage in his line of work, although both were of course very popular for a spell.

I’ve been re-reading The Hollywood Studios: House Style in the Golden Age of the Movies (Knopf, 1988), and enjoying it to pieces. Beginning with Paramount and MGM, Mordden writes engaging, conversational chapters about each of the Majors (and one about the Independents as well), sifting through the moguls, the stars, and the properties to identify the characteristics that distinguished the overall output of each. What, for example, made RKO different? First of all, it was founded in 1928, at the dawn of the Talkies. It couldn’t have learned anything about making movies from the long experience that the other studios had. No wonder the studio was the first to go, bought out by Desilu in 1957.

Among other things, House Style (as I call it) is a very funny book.

Today it is common to think of Hepburn as a natural, even as inevitable. But when she was new she was thought strange-looking, affected, and possibly nutty. Hollywood likes outstanding versions of the norm, not outstanding versions of the outstanding, and the non-conformist Hepburn, blurting out The Oddest Things to the press, dodging photographers, and failing to be spotted on the right arm at the orthodox places, acted as strangely as she looked.

She played strange roles, too, no one like another: and played them not as if the studio made her do so but because she wanted to. How to get a handle on this woman? In Christopher Strong (1933) she is Lady Cynthia Darrington, a world-famous aviatrix. The very noun itself bespeaks a pride of glamour. But Hepburn shows up in a silver lamé sheath with a Dracula collar and antennae. Maybe it’s supposed to suggest Garbo, but it makes Hepburn look like a Martian lounge singer.

As they say, LOL. I don’t know when I’ve enjoyed re-reading a book so much. Of course I feel terribly guilty, indulging in such pleasures when the house is bursting with unread new books. I can’t have known, back in 1988, that House Style would be one of the most important books in my collection, to me I mean, but that’s unfortunately how libraries work. You have to hold on to everything, because you don’t know what you’ll regret letting go.

Rereading the book prompted me to have another look at Grand Hotel and Dinner at Eight (1932 and 1933 respectively, and both MGM). They were both signature offerings, the one of Irving Thalberg and the other of David O Selznick, and they are both haunted by silent-screen habits that won’t go away. Lionel Barrymore plays dying men in both films, but that’s all the characters have in common; Dinner‘s Oliver Jordan is an admirably modest gent, but Grand Hotel‘s Otto Kringelein is a whining, wheedling clerk who never shuts up. He would go over much better, and in fact be the figure of sorrow and pity that he is, if we couldn’t hear him. In the same film, there are times when it would be better if we couldn’t hear Joan Crawford, too. She’s still a pretty girl here, but she sounds like a defective Eliza Doolittle, too much of this and too little of that. Too many of her takes seem designed to announce winning poker hands. As for Garbo, she doesn’t need the silver lamé or the antennae to look like a Martian lounge singer on the verge of a nervous breakdown. Only John Barrymore, ham that he was, seems to know where movies were going, and is a match for film’s natural tendency to overstate everything.

I’ve never cared much for Jean Harlow, possibly because, like Joan Crawford, she’d have done better at Warner’s (as Crawford certainly did). What I learned from Ethan Mordden is that the studios’ different styles could be the making of an actor. It took MGM to make an outstanding normal woman of Katharine Hepburn, for example. Jean Harlow might have been funnier if she’d made more movies with James Cagney, say. Instead, at brightly-lighted MGM, she’s just vulgar, a mannequin for bias-cut satin nightgowns. And she’s sad, too — she died so very young (26). Marie Dressler, on the other hand, is a revelation: now I know where Angela Lansbury comes from.

***

Now, to finish Daniel Kahneman. A blurb on the dust jacket, contributed by Nassim Nicholas Taleb, ranks Thinking, Fast and Slow with The Wealth of Nations and Freud’s The Interpretation of Dreams, and I wholeheartedly agree. Like the earlier books, Thinking completely upsets a widely-held idea, in this case that “man is a rational animal.” I hope that someone is already at work on an elementary-school curriculum that is based on Kahneman’s conclusions. For one thing, we all need much more basic training in statistics, and the whole field of arithmetic ought to be reconceived accordingly. Second, young minds ought to be shaped, to the extent that they can be, by an awareness of the biases toward overconfidence and bad decisions that are Kahneman’s book’s crown of thorns.

Gotham Diary:
Tech Style
16 December 2011

Friday, December 16th, 2011

Typical. The minute I feel restored to 3D by Remicade, I run around like a crazy person trying to do everything that was left undone during the previous fortnight. The result is as much a part of the rhythm of my life as the infusion itself: a day in bed. Ordinarily, I’m someone who likes to get out of bed. I may not be so keen on standing up and thinking, but staying in bed has become unappealing. This morning, I did not so much wake up as drift into a remake of Greta Garbo’s bedroom scenes in Grand Hotel (which I watched yesterday), only I was happy and perfectly content. While Kathleen read the paper, I sank in and out of dreams that were alarming simply because of their alterity: at one point, I was pushing a grocery cart, clueless, at Fairway. What was I shopping for? What was I doing in Fairway? Even less pleasant was trying to take a sip from my water bottle: only in my dream was I holding it. I came to with a shudder.

I scratched my plans for the day, which were pretty ambitious. I was going to go to the movies, visit a toy store, and round up the holiday paraphernalia at the storage unit. Instead, I think that I’ll go back to bed.

***

But first, a word or two about Kurt Andersen’s Vanity Fair piece about the failure of style to change over the past twenty-plus years. This is something that I’d noticed myself. I came to the conclusion, voiced but not fully endorsed by Andersen, that we’ve been so preoccupied by the overhaul in our personal lives wrought by digital technology that we haven’t had much appetite for superficial change.

In some large measure, I think, it’s an unconscious collective reaction to all the profound nonstop newness we’re experiencing on the tech and geopolitical and economic fronts. People have a limited capacity to embrace flux and strangeness and dissatisfaction, and right now we’re maxed out. So as the Web and artificially intelligent smartphones and the rise of China and 9/11 and the winners-take-all American economy and the Great Recession disrupt and transform our lives and hopes and dreams, we are clinging as never before to the familiar in matters of style and culture.

But Andersen is happier, it seems to me, with a declinist reading of the matter: “After all, such a sensibility shift has happened again and again over the last several thousand years, that moment when all great cultures—Egyptian, Roman, Mayan, Islamic, French, Ottoman, British—slide irrevocably into an enervated late middle age.” There’s a fallacy here that is only beginning to be noticed by historians, who have come to see it as part of their déformation professionnelle: the inclination to anthropomorphize cultures, to speak of them in terms of youth, vigorous prime, and decrepitude. What’s really being done is this: slices of the past are being weighed for their interest to us. Were the affluent families of the later Roman Empire and the dawning European kingdoms sensible of living in fallen times? I rather doubt it; on the contrary, they were preoccupied by the big new thing, which was Christianity, not only as a personal faith (a new idea in itself) but as a social network. Until very recently, historians have not found anything about Christianity as a social network to be interesting, not least because most of them have grown up in an era of Christian retreat from intellectual life. But now we’re learning that we have a tendency to identify as robust those cultures that get to push other cultures around. Not so great. And while there are certainly periods in history of great catastrophe, they don’t appear on cue, in the order proposed by Thomas Cole’s suite of Course of Empire paintings.

Anyway, it struck me this morning that the simplest explanation is that the locus of change has shifted, from stuff to circuits, and that we have been taken through several booming cycles of style change by the guys who set style today: tech nerds. Why, they’re not even nerds anymore! They’re usually pretty hip. But they’re private about their stuff. (And, as closet libertarians, they’re fairly apolitical as well.) The only thing that they want to share is the newest wrinkle in the technological fabric. Because this is their time in the sun — well, we’ve put them there — they make sure that we’re all caught up in the frenzy. Our social network congregates at the Apple Store.

Gotham Diary:
Just a Thought
14 December 2011

Wednesday, December 14th, 2011

Nick Carr, the superhero who goes by the name of Scout, bringing the brick-and-mortar mysteries of New York City to light even when he can’t solve them, has long been interested in the abandoned headquarters of the New York Architectural Terra Cotta Works, built in 1892 and left to rot during the 1920s. He has recently discovered that the building is being restored by its current owner, Silvercup Studios — even though no actual use for the structure has yet been decided.

May I suggest that, whatever interior configuration Silvercup settles upon, the Works Office ought to be occupied by a foundation — a foundation devoted to the nurturing of Internet journalism. I’ll just call it that: Internet journalism. Preferably journalism supported by anything but advertising. Whether bloggers of the future work there or have offices elsewhere — well, that seems beside the point. The Works Office would serve as an archive, not just of information about New York, but of how to find it. A fellowship at the Works Office would make a decent contemporary journalist out of almost anybody upon whom it was bestowed.

The Works Office isn’t very large, but, as we all know, you can do a lot on the Internet without taking up much square footage. I’m sure that Nick Carr (who would have to be a director!) can figure it out. There’s certainly room for plenty of bicycles — can’t you just see them lined up beneath those great big windows?

Just a thought.

Gotham Diary:
Here will we sit…
25 October 2011

Tuesday, October 25th, 2011

I could say that Kathleen beguiled me into staying up late, listening to a new Schubert CD and an old favorite by Vaughan Williams, but I didn’t actually go to bed at any very late hour. Rather, I didn’t want to get up this morning. I was more comfortable in bed than I knew I’d be when I got up, and had to begin the day, as I always begin the day now, with the selection of a photograph for this entry. I had chosen a picture last night, but it was a case of making do, and I wasn’t happy with it. So I stayed in bed until all hours (after nine!), and when I got up the first thing I did was mount the camera on the tripod and look for a subject.

The croton plants in the living room took me by surprise. I was interested in a tree behind a building on First Avenue; it is the only tree whose leaves have turned color so far. But the morning light was all wrong; I’ll have to wait until this afternoon, and even then it may be too sunny. I cast my gaze around the balcony, but nothing tempted me, and I kept turning, and voilà, there, right at my side, was this profusion of shiny, variegated leaves, begging to have their picture taken.

That was hours ago. My apologies.

***

The Epicurean Dealmaker has written a great piece about the Volcker Rule mess. It prompted me to resume my serious thinking on regulation and how to go about it. Our standard model regulatory agency, with its commissions, staffs, rule-making procedures, enforcement provisions and fees, is about as up-to-date as the rotary-dial telephone. The model itself is arteriosclerotic; decades of usage and abusage have fermented a Washington culture of bureaucrats who make the Middle Kingdom’s mandarins look like Silicon Valley entrepreneurs. We need a new game.

While Kathleen got dressed, I sketched a few ideas. One of the Epicurean Dealmaker’s points for further discussion is the possibility that our systems are too complex to be managed. One way to simplify systems without abandoning the benefits of complexity is to break things up into smaller, self-supporting units. For example, regulatory commissioners could be drawn from a college of professional regulators without portfolio. They would be tasked to address a current problem: are markets fair? Is corporate governance suitably transparent? Has an industry or economic sector succumbed to monopolistic practices? These commissioners would appoint free-standing, independent-contracting teams of investigators, and vest them with subpoena power. The teams would consist of accountants, lawyers, economists, psychologists, environmentalists — not too many of any, but enough to get to the bottom of things and to produce an enlightening report. The commissioners would then empanel hearings, appointing a senator, a congressman, their state- and local-level correlatives, and any other appropriate judges. This panel would issue a legislative mandate, calling for the implementation of specific amendments to existing laws. It would also, where appropriate, hand over its findings to district attorneys.

As I spoke, it seemed that these ideas were coming to me out of the blue, but in writing them down I see that, with the exception of the last part — the bit about the panels and the hearings and the mandates — what I’ve described is the modern complex surgery, much like the operation performed on my neck in 2007 at the Hospital for Special Surgery. I remember being wheeled into a room that looked more like Steve Jobs’s sometime garage than an “operating room,” and it was full of people. Two of them, I understood, were neurological consultants who never touched me but constantly monitored my responsiveness. I don’t know about the rest, aside from the head surgeon and the anesthesiologist. It was a complex, one-shot attempt to solve a serious problem, and I have to say that the team did a great job.

The regulator in the case was my internist, who didn’t even look at the X-rays. He knew whom to call, though.

Gotham Diary:
The Second Time
24 October 2011

Monday, October 24th, 2011

This time, it counted for sure. Will was wide awake throughout his second haircut, and although he eyed the two of us warily when we sat down in the barber’s chair before the plate-glass mirror, he on my knee, and held on to his bottle, he cooperated more often than he didn’t, sitting still as Tito gathered swatches of hair between his fingers and cut them centimeters at a time. Tito worked very quickly, obviously aware that his client wasn’t going to sit still (or still-ish) indefinitely. I’d have been happier if he’d taken a little more off, but he quite rightly worked around Will’s head so that the result, when time was up, would be even.

Will’s parents were very pleased.

***

I haven’t been a fan of the Times Magazine for quite a long time — I like the new, Monocle-esque design, but there is something brutish about the cover editing, and this week’s close-up of Haruki Murakami is no exception — but as does happen in almost any magazine from time to time, I found not one but two valuable pieces in this weekend’s edition. One was Mark Bittman’s master recipe for pounded cutlets. Bittman rarely has anything new to tell me, but he’s invaluable because he makes me remember and re-prioritize what I already know. My formation culinaire, like that of so many ambitious autodidacts, was designed (by me) to enable me to impress dinner guests; if something was easy to make, it didn’t count. Bittman’s columns are balm for the recovering would-be master chef. His recipe for Chicken Scallopine al Limone was almost completely familiar — improved by replacing chicken breasts with more flavorful thighs — but I somehow needed to read it and then to follow it. I marched across the street for the boned thighs and got to work. I prepped the dish and then waited for two potatoes to be done baking. Execution was a snap. I must put the recipe in the list of 25 weeknight recipes that I’m compiling. Why, you may ask? Because I rarely think about food these days when I’m in the kitchen, and when the late-afternoon moment of decision arrives (what are we having tonight?), I’m a deer in headlights. I need lists — and Mark Bittman’s variations on everyday themes — to remind me of the possibilities.

The other piece was Stephen Marche’s indictment of Roland Emmerich’s madly irresponsible new movie, Anonymous. What makes Anonymous so regrettable is the way it harmonizes with and glorifies the junk thinking that is choking Western civilization to death.  Ordinarily, making a movie in which it is posited that somebody other than Shakespeare wrote “Shakespeare” would be just a movie. But there’s more to Anonymous than counterfactual fantasy. It’s an apotheosis of our snobbish anti-elitism, as Marche observes; but it’s also a craven tribute to our hunger for celebrity self-exposure. The trouble with Shakespeare is that he doesn’t seem to have had much of an offstage life. His record, aside from a handful of legal documents, is in his work. What he thought about his work — well, that’s in there, too. What’s maddening about Shakespeare is the absence of secrecy in his life. We want to catch him keeping a private diary, an alternative to the published poems and plays; we want to see him relax and admit that “work sucks.” His exemplary discretion ought to inspire us; instead, it enrages us. And Anonymous is nothing if not a tantrum. We can only hope that brighter young viewers will be prompted to take a look at John Madden’s equally fictitious, but more reality-based, 1998 biopic, Shakespeare in Love.

In his Op-Ed piece on Anonymous, James Shapiro explained how today’s idiot conspiracy theories work. They begin with an exciting but unintelligent proposition (“How could a hick from Warwickshire have written all those great plays?”) and end by asserting that the best evidence is the complete absence of evidence. Just goes to show how dangerous the secret really is! Until now, that is. Somehow (and this is never explained), the lid has been lifted, and ordinary civilians have been given a glimpse of the “truth.” I don’t know why, but I’m reminded of Linda Colley’s blithe dismissal of Freemasonry, the popularity of which throughout the Eighteenth Century (the period covered in her magisterial book, Britons: Forging the Nation 1707-1837) she attributes to “the male delight in secret rituals and dressing-up.”

Which reminds me of a third good piece in the Magazine: an excerpt (or something like it) from Daniel Kahneman’s new book, Thinking, Fast and Slow. Kahneman’s contributions to what I promise never again to call Wrongology are nothing less than seminal, but what caught my eye was his approving summary of someone else’s work.

[Terry] Odean and his colleague Brad Barber showed that, on average, the most active traders had the poorest results, while those who traded the least earned the highest returns. In another paper, “Boys Will Be Boys,” they reported that men act on their useless ideas significantly more often than women do, and that as a result women achieve better investment results than men.

“Useless ideas” — would that there were no more to Anonymous than that. But, hey, it is “only” a movie. A spectacular, action-filled, delusion-infused movie.

***

Later, after dinner on Saturday night, we learned that the very inexpensive yardstick that I bought at the hardware store isn’t going to help us to measure Will’s height anymore: he tops out at a hair over three feet. This means, as if there were any doubt, that he will almost certainly grow to be six feet tall at least; the rule of thumb is that you double a child’s height at 23 months. Or is it 25 months? Nobody can ever remember — except that 24 months isn’t it. Will will be 22 months old next week.

I missed the best part, though. While I was out of the room, Will asked Kathleen, “Where are Mama and Daddy?” His first complete sentence. That he asked her couldn’t surprise me less. If he was a little anxious about the answer, Kathleen was obviously the right person to soothe his fears. Just as his mother had known when, older to be sure, she leaned up to Kathleen as they were walking on the beach at Fire Island, and asked why “those two men over there” were kissing. Will was reassured by Kathleen’s answer, and went right back to whatever it was that he’d been doing.  

 

Gotham Diary:
An Education
12 September 2011

Monday, September 12th, 2011

Reading Rachel Brownstein’s Why Jane Austen? is great fun, but it’s also doing a fine job of reminding me that I am not cut out for teaching — teaching in an American classroom, that is, in which students are encouraged to air their personal opinions without having been given much guidance in the formation of useful ones. Brownstein’s summary of the “questions” that her broadly unread college students will “ask” about Pride and Prejudice includes, for example, this gem: “Doesn’t canonical English literature, don’t the novels of Jane Austen especially, coercively instill ruling-class attitudes in her readers — the law that the West is best?”

They have not read Edward Said [she notes wryly], but they do watch television, and ideology trickles down.

Brownstein has a wonderful time unpacking the famous first line of Pride and Prejudice: what does it really mean?

In the famous first paragraphs of Austen’s most-read and best-loved novel, which raise questions of truth and universality and what kind of truth gets acknowledged and what kind remains unsaid, her irony is palpable, thick. But it is not clear what it is directed at. Are we meant to read the first sentence of Pride and Prejudice as calling attention to the opposite of what it seems to be saying, that society is concerned about the fate of unmarried women, not men?

And so on. None of these thoughts would ever occur to me; general questions about abstract issues do not naturally rise in my mind, certainly not when reading agreeable novels; thinking about “truth” and “universality” wastes a lot of time and does more harm than good. (In that, I am as anti-intellectual as the most red-blooded American.) To me, the famous first line of Pride and Prejudice is nothing other than Mr Bennet’s way of putting the kind of nonsense that Mrs Bennet believes. Need one say more about it than that? Well, if you’re in “the academy,” it seems that you’re going to have to. Rachel Brownstein is nothing less than engaging about the business, and if I have to sit through discussions of truth and universality, I want her to conduct them. But I’m so relieved that I never pursued the academic life that it feels like having dodged a falling safe.

In the Spring issue of the Wilson Quarterly — one of those periodicals that I regularly consider canceling, until I finally get caught up with them and rediscover their indispensability — august emeritus professor Daniel Walker Howe mourns “Classical Education in America.” He quotes Garry Wills:

Learning classical Greek is the most economical intellectual investment one can make. On many things that might interest one — law and politics, philosophy, oratory, history, lyric poetry, epic poetry, drama — there will be constant reference back to the founts of those forms in our civilization.

A bit starchy (especially for Garry Wills), but a home truth nonetheless. The problem with Classical education, it seems to me, is that it begins as a foreign language course, taught by grammarians who are sticklers for proper declension. This is not only beside the point but insulting. I would have benefited greatly from a course in readings from Loeb Classics, with English translations on facing pages. A good teacher would have been able to persuade me, I’m sure, that it would be worth the trouble to learn Latin well enough to know, understand, and feel that

linquenda tellus et domus et placens
uxor, neque harum, quas colis, arborum
 te praeter invisas cupressos
  ulla brevem dominum sequetur.

is inescapably better than

The earth, your house, the wife that you love so well,
Must be abandoned, and, of these trees you tend
 So fondly, none except the hated
  Cypress will follow its short-lived master.

Big Ideas:
Second-Class Catholics
Thursday, 14 July 2011

Thursday, July 14th, 2011

Ordinarily, I reserve the “Big Ideas” heading for bright ideas, my own or someone else’s, that, in my opinion, it would do everyone good just to think about. Today, however, I want to call attention to a big idea from a millennium ago. It has become a very bad idea for its host institution — the Roman Catholic Church — but, owing to a couple of other very bad ideas that have cropped up since (particularly the relatively recent notion of papal infallibility), it will be very difficult to pull the Church out of its pickle.

In the latest news about the Church’s ongoing pedophile crisis, critics are calling for the resignation of John Magee, the Bishop of Cloyne in Ireland. Magee admits that he may have failed to implement new anti-coverup policies in his diocese, but there’s more to it than that; he is said to have caressed a would-be seminarian. We’re not talking ancient history here. Meanwhile, in Germany,

[T]he Catholic Church’s decision to open its personnel files was an effort to restore some of the public trust in the church that the scandals have eroded. Record numbers of Catholics left the church in Germany last year after hundreds of cases of previously unreported child abuse came to light, including a case of a priest with a history of molesting boys who was returned to pastoral duties by the archbishop of Munich, Joseph Ratzinger, now Pope Benedict XVI. The priest was later convicted of molesting more boys.

Actually, I did have a bright idea, and it overlays the one that energized a series of popes at the end of the Eleventh Century. I asked myself, what does the Roman Catholic Church really stand for? What is the doctrine that it would be least likely to alter or abandon? I quickly discarded the possibility that this most vital tenet would be purely theological. The Magisterium might announce a new understanding of the Trinity tomorrow, and nothing would change. In fact, Catholic theology has undergone a largely healthy evolution during the past two thousand years. (Consider those Reformation bugaboos, Purgatory and Mariology.) There is only one matter that has not evolved, or that, rather, has devolved, or done whatever recessive evolution would be called. If you think that “celibacy” is the answer to my question, you’d be largely right, but not right enough. Celibacy is just an aspect of the core Catholic doctrine.

Thanks to Diarmaid MacCulloch’s Christianity, I can cite this dogma as it was pronounced in a papal encyclical, Vehementer, issued by Pius X in 1906. It comes at the end of the following passage, taken from Christianity, that describes the first formal compilation of what we now call “canon law,” the Concordia discordantium canonum (known as the Decretum) attributed to the Bolognese jurist Gratian.

The Decretum and canon law in general also specifically embodied that principle of the Gregorian Revolution that there were two classes of Christians, clerical celibates and laypeople. Only a century ago, this could still be pithily spelled out in an official papal pronouncement: “The Church is essentially an unequal society, that is, a society comprising two categories of persons, the pastors and the flock, those who occupy a rank in the different degrees of the hierarchy and the multitude of the faithful.”

The Roman Catholic Church is, at heart, a confraternity of celibate males who subordinate themselves to a well-established hierarchy, with the Pope at the top. Nuns, and friars who have not been ordained as priests, do not, although celibate, figure in this scheme. Neither do the parishioners whose financial contributions support the organization. Why these churchgoers expect the Church hierarchy to put their needs (and the needs of their children) first is beyond me. What club has ever prioritized non-members over members?

We’d like to know more about where those “record numbers” of Germans went. We think that they’ll be having plenty of company.

The Gregorian Revolution took place less than half the Church’s lifetime ago. It seems eternal now, but a thousand years ago it bore no more than a bogus patina of venerability.

Big Ideas:
Varsity Housekeeping
Thursday, 7 July 2011

Thursday, July 7th, 2011

This afternoon at lunch, I was reading Ken Auletta’s profile of Sheryl Sandberg, Facebook’s chief operating officer, a woman who feels passionately that “The Nº 1 impediment to women succeeding in the workforce is now in the home…” A page or so later, something else gave me what I think is a great idea. 

Many women in the room were among the rotating cast of two hundred whom Sandberg invites to her home each month for a buffet dinner and to listen to and question special guests, who have included Steinem, the playwright Eve Ensler, Microsoft CEO Steve Ballmer, the educator Geoffrey Canada, and Mayor Michael Bloomberg. 

Being me, I asked myself what I would have to say to these high-powered career women. Maybe not much — but perhaps I could talk to their husbands. My wife may not be a Silicon Valley billionaire, but she’s a leader in her professional field. True, we don’t have children; and it’s also true that I don’t have to show up for work anywhere. But let’s face it, it’s not the pressing demands of a top job that prevent men from doing more than desultorily helping out at home. Most men haven’t got a clue about what goes into keeping house. Their mothers, in all likelihood, saw to it that they don’t. This saddles wives with a problem.

We will all agree that marriage is a learning experience for everyone, but there are perhaps a few courses that might be taught as prerequisites, and housekeeping is an obvious candidate. A woman oughtn’t to have to teach the man she loves how to empty a dishwasher any more than she has to teach him how to kiss. It may never be as easy to identify good vacuuming skills as it is to spot a good kisser, but that’s where my idea comes in. There is no need for kissing credentials — to borrow a fashion term, they’re self-degreed. But wouldn’t it be nice if a woman could know that not only did her fiancé attend a great college but also that he passed its housekeeping course? If mothers aren’t going to do the job, then perhaps higher education can make itself useful for a change.

Here’s how it would work. Undergraduates electing the program — which would be non-academic but as rigorous as participation in any athletic team — would be assigned to four- or six-man suites. The suites would include kitchens and laundry facilities. Household duties would rotate among the suitemates. At first, you might only have to cook once a week, and do the laundry once every two weeks. Eventually, though, you’d have to feed and clean a suite for two weeks at a time — all the time carrying your courseload. Forget all that hogwash about “nurturing”; the food would have to be tasty and “hot food hot,” and the clothes would have to be clean and neatly folded. Suites would be very regularly examined by advisers — drafted from the ROTC program, perhaps. (Who might themselves learn a thing or two. There is a lot to be said for military tidiness, but of course it stresses personal responsibility, not picking up after others.) The designated housekeeper would have to stock the suite with groceries and cleaning products from a school commissary, observing some kind of budgetary constraint. The course would be pass-fail, with the pass-fail rate geared more toward Navy SEAL selectivity than rocks for jocks.

No one would be thrown into this program without a little advance training. In domestic boot camp, students would learn how to wash clothes properly and how to prepare a variety of basic foods, especially soups, salads, and stews. Nutrition and utility would be stressed. Grilling steaks would be discouraged. In a really well-run program, students would be forbidden to import snack foods from the outside world. Hey, this is school we’re talking about, not an amusement park! In the advanced programs, suites would be divided between men and women — but the women would not contribute to the housekeeping. At all. Ever. This would also be a training program, in its way, for them.

Readers inclined to dismiss my proposal as jocular ought to reflect for a moment. Housekeeping has undergone great changes in the past 50 years, but coming resource constraints and increased preventive-health awareness are going to require an even greater transformation. Men who are clueless about the day-to-day basics of feeding a household and keeping it safely clean aren’t going to be of much help. Quite the reverse.

To avoid confusion with Harvard, Haverford, and the University of Houston, varsity housekeepers would be awarded minuscule h’s to sew onto their pristine butchers’ aprons.

Big Ideas:
The Museum of Cognitive Monuments
Monday, 27 June 2011

Monday, June 27th, 2011

Toward the end of Incognito, David Eagleman discusses the strange case of Phineas Gage. The name first appears at the beginning of a section entitled “What It Does and Doesn’t Mean To Be Constructed of Physical Parts,” and, as is my habit, I glanced at it even as I decided to take a short break from the book. Before Eagleman reminded me, I recalled that Phineas Gage was a nineteenth-century laborer who survived a ghastly head injury, which was so wondrous and strange in itself that it took his doctors a while to note a personality change for the worse. I had read about the case somewhere else, not too long ago. But where?

This was only the last of several such experiences while reading Incognito. An earlier example was Eagleman’s summary of a “time discounting” experiment by Daniel Kahneman and Amos Tversky. Not only had I read about that experiment, but I recalled that Tversky died before Kahneman won a Nobel Prize for their work together. Where did I read that? Not in Paul Bloom’s How Pleasure Works, which mentions Kahneman and his Prize, but in connection with another experiment altogether, all in one succinct paragraph on page 206. Not in Eduardo Porter’s The Price of Everything, which mentions other work by Kahneman several times, but not Tversky. Not in Kathryn Schulz’s Being Wrong, which doesn’t mention either scientist.

Frustrated, I conceived the idea of the Museum of Cognitive Monuments, a Web site devoted to curating the cases and experiments that pop up again and again in books on the most exciting of current topics, the cognitive revolution — in which the notion of man as a rational animal &c &c is trounced and trashed. The Museum of Cognitive Monuments would be a concordance, collecting references to discoveries in the field (both in psychology and in neurobiology, which are approaching a state of lamination) and offering brief précis of different writers’ handling of the material. In addition to indexing the growing library of books in the field, the MCM would work as a prolegomenon to it, allowing writers to assume that readers were already familiar with the relevant cases and experiments, cutting down on instances of entertaining but distracting pops of magazine-style introduction.

Incognito is altogether an introduction to “the secret lives of the brain” (the book’s subtitle), clearly aimed at readers who have never concerned themselves with the cognitive revolution — otherwise, there would be no need for the first three chapters, which address a reader who, assuming that the conscious mind controls behavior, appears to be unaware that a cognitive revolution is underway. Eagleman’s original contribution begins when he borrows Doris Kearns Goodwin’s phrase about the Lincoln Cabinet: a “team of rivals.” This metaphor serves Eagleman well, although, as I wrote last week, I find the overall tone of his business- and sports-flavored language distastefully complacent. Not to mention his reliance on the term “zombie” to describe more or less automatic and unconscious behavior patterns.

As long as the zombie subroutines are running smoothly, the CEO [the conscious mind] can sleep. It is only when something goes wrong … that the CEO is rung up. Think about when your conscious awareness comes online: in those situations where events in the world violate your expectations. When everything is going according to the needs and skills of your zombie systems, you are not consciously aware of most of what’s in front of you; when suddenly they cannot handle the task, you become consciously aware of the problem. The CEO scrambles around, looking for fast solutions, dialing up everyone to find who can address the problem best.

This is so guy that it’s embarrassing. If the passage has one inadvertent virtue, it’s that it silhouettes the unsleepingly resourceful nature of artistic consciousness. Eagleman seems unwilling to propose that the health of the modern mind depends on its ability to register a fair current of internal contradiction. But he lays out the evidence for such a conclusion.

Eagleman is far more interested in the assessment of criminal behavior, which he all but defines as deviant — another complacency. The book’s final two chapters constitute a mini-treatise on the overhaul of criminal law. His argument on behalf of legal reform that would bypass considerations of “blameworthiness” is interesting and persuasive, and it will undoubtedly be taken up again by Eagleman and others in bolder form elsewhere. By then, I hope, the author will have outgrown the boyish tendency to associate the normal mind with inattentiveness.  

Gotham Diary:
Braincoolio
Wednesday, 22 June 2011

Wednesday, June 22nd, 2011

David Eagleman’s Incognito: The Secret Lives of the Brain isn’t a disappointment, exactly — which is to say that it is a disappointment, in being rather less amusing to read than I expected it to be. I have the awful feeling that I’m reading a book that is aimed at guys. Worse, it might even have been written by one. Eagleman’s tone is that of the sharp guy who gets a kick out of showing you that your intuitions and unexamined assumptions are way off base. The prevailing imagery is drawn from business and sports. There’s a sense of wonder at all the trouble that the brain takes to make our lives simple and efficient — to make it possible for us to pay minimal attention.

In other words, it’s a very different book from Kathryn Schulz’s Being Wrong. Schulz approaches the brain as an error-prone organ whose bad habits we have to bear more or less constantly in mind as we navigate the complexities of social life, correcting for bias and prejudice even when — especially when — we think that we’re free of them. Eagleman thinks that the brain is cool. “Sometimes it is tempting to think that seeing is easy despite the complicated neural machinery that underlies it,” runs a characteristic observation. “To the contrary, it is easy because of the complicated neural machinery.” Thinking and consciousness are not necessarily good things, especially when the brain can perform difficult tasks on autopilot. 

The handwriting on the wall appears early, on page 6. 

Consider the activity that characterizes a nation at any moment. Factories churn, telecommunications lines buzz with activity, businesses ship products. People eat constantly. Sewer lines direct waste. All across the great stretches of the land, police chase criminals. Handshakes secure deals. Lovers rendezvous. Secretaries field calls, teachers profess, athletes compete, doctors operate, and bus drivers navigate. You may wish to know what’s happening at any moment in your great nation, but you can’t possibly take in all the information at once. Nor would it be useful, even if you could. You want a summary. So you pick up a newspaper — not a dense paper like the New York Times but lighter fare such as USA Today. You won’t be surprised that none of the details of the activity are listed in the paper; after all, you want to know the bottom line. You want to know that Congress just signed a new tax law that affects your family, but the detailed origin of the idea — involving lawyers and corporations and filibusters — isn’t especially important to that new bottom line. And you certainly wouldn’t want to know all the details of the food supply of the nation — how the cows are eating and how many are being eaten — you only want to be alerted if there’s a spike of mad cow disease. You don’t care how the garbage is produced and packed away; you only care if it’s going to end up in your backyard. You don’t care about the wiring and infrastructure of the factories, you only care if the workers are going on strike. That’s what you get from reading the newspaper.

Your conscious mind is that newspaper. 

This imaginary “you” whom Eagleman is addressing, this solipsistic USA Today glancer, is precisely the sort of person whom one would have expected a front-liner in the cognitive revolution to disdain. Instead, Eagleman adopts the fawning peppiness of a car dealer. What does this “you” want to do with all the free time that simplistic summaries open up? From what I can tell, all “you” wants to do is to play Tetris. 

I understand that the importance of Incognito is not its presentation of the psychology experiments and fMRI analyses that have become almost familiar in recent years, thanks to books like Being Wrong — indeed, Eagleman writes for readers who haven’t been following this issue (who haven’t, for example, been reading Malcolm Gladwell) — but rather its insistence that we need to reconsider our ideas of conventional and legal responsibility. If Charles Whitman, the Texas Tower shooter, had survived his orgy of death, and if it had been possible to detect the tumor that was compressing his amygdala, would it have been correct to hold him criminally liable for his acts? How do we manage the problems that ensue when otherwise effective medication sparks the irrepressible urge to gamble in Parkinson’s victims? What is the culpability of drug addiction? These are all important questions, and working out practical answers — refashioning our criminal legal system in the process — is going to be a tough slog. What I’ve seen of Eagleman’s thinking on these points seems thoughtful and grounded, and I’m looking forward to seeing more. But I’m disappointed to see Eagleman giving a pass to vernacular masculine inattentiveness.

At one point, Eagleman refers to what I’ve come to call the paradox of the centipede: the centipede managed its hundred feet just fine until it was asked how it managed, whereupon it was paralysed by second-guessing. If you think “too much,” you can screw up your golf swing or your sex life, and you can become awfully familiar with insomnia. But I don’t think that thinking is the problem. Thinking is the symptom. Centipedes, we may trust, never actually stop to consider their articulatory powers, but when we do, it’s usually a sign that they’re not working. When we toss in bed, it’s a sign that our wiring is faulty; whatever the cure might be (medication, life-style modification), it is consciousness that alerts us to the dysfunction. It’s too bad that more of our fallible parts don’t do the same. 

And it’s too bad, I suppose, that David Eagleman comes from Texas, and not the Northeast Corridor.

Big Ideas:
Business and Pleasure
Monday, 20 June 2011

Monday, June 20th, 2011

A week or so after reading In Search of Civilization, I’m still surprised by John Armstrong’s suggestion that business can come to the aid of civilization by providing “desire leadership.” And that business will learn how to do this from the study of the humanities. Earlier in the book, Armstrong discusses C P Snow’s “Two Cultures” lecture, and maybe it’s simply the fact that we have been familiar with Snow’s challenge for fifty years that makes the chasm dividing science and the humanities (basically, the numerate from the innumerate) look much easier to bridge than the gulf between the humanities and business.

It is, however, perhaps the same gap; what business and science share is the determined reduction of phenomena to figures. But the very possibility of a discussion between businessmen and humanists seems outlandish. The two groups have such a long history of mutual contempt! We’re educated to flinch at the claim that genuine happiness — Armstrong, very interestingly, is more interested in “flourishing” than in happiness — might require purchases and acquisitions. And yet of course it does require them, at least for most people. At a minimum, we require reliable electric power to remain connected to the Internet, which has already transformed the nature of public discussion to an extent from which there can be no going back. 

In the biographical note at the end of In Search of Civilization, John Armstrong is identified as Philosopher in Residence at the Melbourne Business School. Now, what sort of job is that? Since when did business schools take on resident philosophers, and what do they expect of them? Whatever the answer, the job certainly throws light on the point that Armstrong has to make about “desire leadership.” (Desire leadership, by the way, replaces the old, false relationship between business and consumers, which was desire creation.) And it explains, to no small degree, why of all the figures in the history of civilization Armstrong chooses as his model “desire leader” the Abbé Suger of St-Denis, the twelfth-century adviser to Capetian kings and, in some accounts, the personal inventor of the Gothic style of architecture.

Suger — at St-Denis — was such an important pioneer for civilization because of his way of combining, and yet keeping apart, idealist and realist attitudes. His idealism was evident in the way he held on to a vision of perfection: he wanted people to love what was fine and beautiful and intensely serious. His realism was evident in the way he recognized what people are often like (feckless, greedy, status seeking). He did not use his realism about what people are like to undercut his vision of where he wanted them to go. He did not end up saying that since people are like this, this is fine and who am I to say they should be any different? His idealism — and the gap it opens between perfection and the way things are — did not lead him to hate or despise people. He shows us how to link generosity and the pursuit of perfection. 

A hero of civilization — like Suger or Cicero or Matthew Arnold — is someone who is teaching us how to combine devotion to noble values with an acceptance of the ways of the world. They are heroes in my eyes because they do not seek to exploit whatever authority they might have; they accept that they have to do the work if they are to convince other people; they stand for kindness as well as wisdom.

To speak of the fabrication of the Gothic ideal at St-Denis mere paragraphs after extolling the civilizing propensities of commercial transactions is to rub against a stubborn grain in Western thought, which has, from the dawn in which the great poets and the great industrialists first walked the earth (at the same time, if not in company), shrugged helplessly and hopelessly at the two camps’ hostile styles, instead of trying to articulate a connection between the creation of wealth and the benefits of prosperity. 

As usual, I believe that the computer will solve many tensions. The big fight between poets and industrialists concerns the importance of details, with the industrialists insisting upon the obvious importance of paying attention to facts and figures and the poets complaining that attending to figures and facts crushes the soul. The computer certainly has the potential to reduce the soul-crushing tendencies of accounting and balancing budgets, freeing industrialists to read more poetry. Dwarfing that issue, however, is the cognitive revolution that is transforming the way we think about ourselves. I often suspect that it was the computer’s hyperrationality that allowed human beings to overcome their vanity on this point, and concede that we are not, after all, rational creatures. This ought to make business much more interesting, if only because it deprives business of the power to be boring.

I hope that Armstrong is alert to the biggest problem facing business today, which is the pre-emption of capital by financiers (who make nothing except private fortunes). 

Business is not only to do with making profits. It is to do with facing competition, understanding the needs of your clients and customers and knowing what your strengths (and potential weaknesses) are. 

That’s all very well, but too much modern business, especially at the global level, is only to do with making profits. The only competition in view seems to be among workforces, not their employers; increasingly, sovereign governments have been persuaded to eliminate competition (formerly with regulation and tariffs, now with tax breaks and other subsidies) and to compensate for those “potential weaknesses” (by supporting organizations that are “too big to fail”). And almost everyone I know would agree that, far from understanding the needs of clients, today’s businesses insist that clients accept their desires. What we need today is more business as Armstrong understands it. To me, this means more small businesses. Computers help here, too, both by denaturing the advantages of economy of scale and by enabling the proliferation of goods and services that will, by means of desire leadership, put an end to mass production. I’m optimistic, but I’d like to hear some of this from the Philosopher in Residence.