Gotham Diary:
The Second Step
October 2015 (IV)

Monday 26th

Loose change: Let’s begin with a bit of lighthearted fun. In the current issue of the London Review of Books, Deborah Friedell takes a look at Michael D’Antonio’s Never Enough: Donald Trump and the Pursuit of Success. She’s very entertained by it.

‘I have made myself very rich,’ Trump says (over and over again). ‘I would make this country very rich.’ That’s why he should be president. He insists that he’s the ‘most successful man ever to run’, never mind the drafters of the constitution or the supreme commander of the allied forces. Bloomberg puts Trump’s current net worth at $2.9 billion, Forbes at $4.1 billion. The National Journal has worked out that if Trump had just put his father’s money in a mutual fund that tracked the S&P 500 and spent his career finger-painting, he’d have $8 billion.

I laughed and laughed and laughed. Finger-painting is the exact infantile correlative of what Donald Trump seems to do when he is not actually writing or cashing checks.


Dept of Mission Statements: “Needs work, to be sure.” I’ll say. All weekend, I tried to remember how I’d put it (without cheating), and I couldn’t. All I could summon was a sense of its inadequacy. Just now, I had a peek. “…rover on the network of human connections.” I do recall thinking, last Friday, that, while this describes a sensation that I should like to convey here, what it’s like to do what I do — and not, as the tradition of objective writing ordains, to cover the tracks that take me to my little discoveries — it falls short as a statement of purpose. In fact, it doesn’t even begin to fall short.

Having just read an interesting and important essay that I wasn’t expecting to encounter, I am perhaps shocked by it into taking a very different stab at answering the question: This blog, what is it about? What am I after? What am I trying to achieve? What is important to me?

What I am trying to achieve is a comprehensive and coherent but also new way of thinking and writing about the human world, which may arguably be the only world that can be thought and written about. Why “new”? Because I believe that critical thinking — weighing and considering what you are about to say before you say it — is in danger of becoming a lost art, sunk in a sea of power points and jargon. We are in danger of running out of voices that do not tell us what we want to hear.

Indeed, the very instruments of critical thinking are a bit rusty. That became clear to me over the weekend as I read another essay, one that I was looking forward to reading enough to have ordered the book that contains it. These two essays may keep me busy for the rest of the week — by which I do not mean that I’m not going to write about anything else. One essay inspires me, the other irritates me.

The essay that irritates me is Roger Scruton’s “The Aesthetic Gaze.” The one that inspires me is Marilynne Robinson’s “Humanism,” which surprised me when I opened this week’s issue of The Nation to the back pages. I don’t read much of the political stuff at the heart of The Nation, but the “Books and the Arts” section at the rear presents me with the most demanding criticism that I read. To find an essay by this writer with this title was an indescribable treat, meant just for me.


To begin with Marilynne Robinson, I read the piece with great excitement — hey, this is what I’m working on! — but also great caution, the caution occasioned by two brow-raisers. The lesser startler is a statement to the effect that Mozart was not famous when he died — that he was out of vogue and forgotten. This is not true. Mozart may not have been as hot as he had been five years earlier, but there is no way that a man who, within a three-week period and in two different cities, had had two new operas premiered can be said to have died, a couple of months later, in a “period of eclipse.” It’s a small point, but a stark one.

The second cause for caution is a somewhat unseemly haste that provokes Robinson into firing critical shots at a group that she calls “the neuroscientists.” Had I been her editor at The Nation (or at FSG, which publishes the essay tomorrow in a new collection entitled The Givenness of Things), I should have urged her to find and replace “neuroscientists” with “neuroscience journalists.” For one thing, the reading public rarely comes into contact with statements by neuroscientists that it is prepared to understand. For another, most of what the reading public does understand about neuroscience is written by journalists, not scientists. Third, and sufficient unto itself, those neuroscientists who do address the reading public about their work tend to stress the provisional nature of their findings, the crudeness of their measuring sticks, and the almost incomprehensible strides that will have to be made before we can so much as track a thought in the brain. Maybe it’s just the neuroscientists that I bump into — and I’m sorry that I don’t have any names — but they’re a humble lot. They may not be so humble about Robinson’s imputations.

If you perform the find-and-replace for yourself, however, Robinson’s essay becomes unexceptionable — indeed, magnificent. She begins by asking why it is that the humanities are being neglected today despite having been the boon companions of all the many thinkers who brought forth the modern world from the “rebirth” of “pagan learning” in the Fourteenth Century, and very much “at the center of learning throughout the period of the spectacular material and intellectual flourishing of Western civilization.” She observes that the humanities are indeed “poor preparation for economic servitude.” It becomes momentarily tricky to distinguish her sincere statements from the sarcastic ones, but only momentarily; presently, Robinson summons science itself to rescue the humanities. This would be the very latest science, the study of “entangled particles,” of “a cosmos that unfolds or emerges on principles that bear scant analogy to the universe of common sense.” She says a few words about this cosmos, in which “mathematics, ontology, and metaphysics have become one thing.”

Great questions may be as open now as they have been since Babylonians began watching the stars, but certain disciplines are still deeply invested in a model of reality that is as simple and narrow as ideological reductionism can make it. I could mention a dominant school of economics with its anthropology. But I will instead consider science of a kind.

Id est, neuroscience. (For clarity’s sake, I should have altered the first sentence to read “more open now than.”)

What follows is a sometimes intricate argument against the exhaustively materialist claims of neuroscience — or what I would call the claims of neuroscience journalists, among whom I should include Richard Dawkins. People who write about these things for the reading public tend to indulge in atheistical attacks that, Robinson rightly points out, also challenge the idea of personal individuality. How, however, in a world that acknowledges string theory and quantum physics, can anyone claim to know that individuality is unimportant (because only those characteristics that enable an organism “to establish and maintain homeostasis in given environments, to live and propagate” matter), or to know that the self and the soul do not exist? The genius of the essay is the case that it makes against “neuroscientists,” finding that they are not up to date in their scientific thinking and perhaps not really scientists at all.

One might reasonably suspect that the large and costly machines that do the imaging are very crude tools whose main virtue is that they provide the kind of data their users desire and no more.

I believe that this is very unfair. There are surely real scientists who are using this data to design better, less “crude” machines, and who don’t believe for an instant that they know anything final about the brain or about the mind sustained by it. But Robinson is correct, I think, to insist that science has not so much as scratched the validity of claims on behalf of individuality, the self, and the soul. (I ought to note here that, while I do make claims for individuality, simply because it is so materially obvious in everyday life, as any walk through Manhattan will demonstrate, I have nothing to say about the self or the soul except “I don’t know.”) What most interested me, as I read her dismissal of “neuroscience as essentially neo-Darwinist,” and therefore as pursuing “a model of reality that has not gone through any meaningful change in a century,” is a surreptitious critique, launched with the mention of “Darwinian cost-benefit analysis,” of that good old “dominant school of economics.”

The worst thing about any kind of success is that it begets inappropriate emulations. As Joan Cusack’s character in Working Girl wisely observes, a passion for dancing around in your underwear does not make you Madonna. I have yet to read an argument holding that the material success of businessmen who kept strict accounts ever inspired any “natural historians,” as scientists used to call themselves, to do the same (but we might make note of those scientists, such as Boyle and Lavoisier, who grew up in somewhat entrepreneurial families), but it has certainly taken a long time for us to see that certain lines of inquiry, especially those concerning human beings, are only very partially amenable to metric analysis. I have always been disturbed by the air of post hoc, propter hoc that hangs over talk about natural selection. “Survival of the fittest” is a circular statement with little currency among biologists but great clout among free-market economists. What could be a more vivid example of “creative destruction” than the bombing of Europe in World War II, occasioning as it did what the French call Les trente glorieuses — the thirty years of booming prosperity that did so much more than just put the continent back together (and that turned out to be anomalous)? “Cost-benefit” analysis will always fail wherever benefits cannot be priced accurately. Human benefits appear to be measurable, at least roughly, in the aggregate (this makes insurance practicable), but as the focus approaches the individual, the information becomes less and less useful.

In other words, one of my aims here is to identify important areas of human concern in which keeping strict accounts is not useful, and possibly damaging. Unless I’m mistaken, an alarming number of people, many of whom consider themselves to be successful, do not believe that there are any such areas, and they do a great deal of harm to those who are not successful. Another aim is to identify similarly dubious terms and phrases, many of which come from the world of commerce. Commerce has been the big success story of modern times. Commerce and science have thrived together. The engineer’s determination of how long and strong a screw needs to be flows ineluctably to the businessman’s determination of how much that screw ought to cost. Thus we sweep from science to salary. What’s not quite right about this is that the two determinations are not equally objective, much as they might seem to be. For the engineer is disinterested; he only wants to know something. But the businessman is not only determining a salary. He is also concerned with paying himself. I really do not see much possibility of objectivity there. It follows that the very language of science and of commerce might be inappropriate — certainly inadequate — for considering such areas of human concern.


Tuesday 27th

Now for Roger Scruton. I have here a copy of An Intelligent Person’s Guide to Modern Culture (2000), the American edition of what was originally just Modern Culture. I have it because my friend Eric photographed a page from “The Aesthetic Gaze,” the fourth essay in the book, and shared it at Facebook. My eye was drawn to this statement:

Without tradition, originality cannot exist: for it is only against a tradition that it becomes perceivable.

I wholly agreed with this. Without tradition, or a sense of “normal,” originality is indistinguishable from the random and chaotic. The best examples of originality carry within them the bit of tradition upon which they are working a variation.

Eric’s Facebook entry was tantamount to a recommendation, so, as I make a point of following recommendations (unless I’ve a reason not to), I tracked down the author and his book, and here, as I say, it is.

Reading “The Aesthetic Gaze,” I was unable to find much else to agree with. It wasn’t that I disagreed with Scruton’s statements so much as that I disagreed with his way of thinking. Then I went back to the beginning of the book and read the first three essays, the second of which, “Culture and Cult,” is also quite substantial. Now I don’t know which to address first, “The Aesthetic Gaze,” with its antique philosophical apparatus (for which I have no use), or “Culture and Cult,” which ignited a cerebral explosion.

Before I’d read very far in “Culture and Cult,” I noticed that Scruton was talking an awful lot about the dead and death. What he was saying made sense — religion probably does originate in respect for the dead. But the focus on the dead, on the past, on judgment and atonement, and on alienation from the tribe made me impatient, because, to me, they are simply not that important. The dead are dead. They leave behind memories that die with the rememberers. In some cases, they leave independent material traces, artifacts that, made by them when they were alive, merit preservation, at least for the time being. In those cases, it is the artifacts that are of interest. Mozart’s music, and his rather interesting biography, remain, but it is something of a relief to me that no one knows where his bones lie.

Then I came to the following passage.

The cult of ancestors is the surest motive for sacrifice, and for the “readiness to die” on which the future of a society depends. It goes hand in hand with caring for offspring, and for offspring’s offspring, who come into being as a sacred pledge to those who have departed. The desecration of a grave is, on this account, a primary form of sacrilege…

No, no! I broke off. Enough with the dead! Let’s get back to those offspring! They’re what matters, and regarding them as “a sacred pledge” to the dead is ghoulish. It was at this point that the explosion occurred. Or perhaps it was simply a moment of supreme illumination. In any case, new connections were made and fused in an instant. Unfortunately for some regular readers, these new connections were centered on a strange aspect of the thinking of Hannah Arendt.

When I was reading my way through Arendt, two years ago, I was regularly jarred by her references to “newborns.” Occasionally (perhaps only once), she referred to newborn children as invaders, which sounded bizarre, at least at first. I got what she was saying, but it still seemed out of place, like laughing in church, to talk about birth, infants, and children in a philosophical, or at least seriously thoughtful, discussion of contemporary crises. Once upon a time, thinkers didn’t talk about children at all, ever, except to call them innocent and ignorant. Then, thinkers developed a way of talking about children that highlighted their development, and new ideas about pedagogy were generated. But children were still bracketed apart, talked of separately. The weird thing about Arendt was her constant incorporation of them as invaders (whether she used the term or not) in her analysis of the human condition.

In the course of the explosion that occurred in my mind, Arendt’s talk about newborns instantly ceased to be weird. Newborns became, for me, the central, the original issue in any talk about culture, society, life — whatever. And then, a few lines later, Scruton said something that clinched it.

The sexual revolution of modern times has disenchanted the sexual act. Sex has been finally removed from the sacred realm: it has become “my” affair, in which “we” no longer show an interest. This de-consecration of the reproduction process is the leading fact of modern culture.

That’s as may be, but how can we regret it? The sexual act is of no interest, except to the actors. Whatever ought to be sacred (and therefore common knowledge), sex ought not. The reproductive act is significant only if and when it results in reproduction: it is birth that is significant. Now that we all know what birth looks like, from movies if not from personal experience — an idea that in my youth was regarded as nothing less than obscene — we understand, as never before, the miracle that the birth of a healthy child really is. We may understand the science (or think we do), but birth remains an obvious, self-proving miracle.

It was one of those intellectual reconfigurations that appears to have occurred in a great rush but which, as the excitement recedes, turns out to have already taken place, awaiting only illumination to be perceived. I saw that almost everything that I have written about here for the past couple of years has concerned, in some way or other, the transformation of rudimentarily human infants into contented but responsible adults. The common word for this transformation is “education,” but education covers, at best, only the things that adults can do to try to help the transformation along. Nor does education grasp the scope of “contentment” or “responsibility.” For, as we now know, the children of today will have to teach their children how to live in the world much more frugally than we do.

In short, we don’t know very much about the transformation of children into adults — into the adults that human beings are going to have to become if they are going to preserve their global homeland. All we know is that, while children do seem to need love, they don’t care to dilate on this subject, but prefer to behave as if they hated all authority. The management of children is an oblique, if not an occult, business.


Suddenly I understood why there are so few women among the philosophers — they’re not men. How like a man to focus on “the sexual act” in a guilty way. That was the fun part; and that is what chains him to responsibility for the ensuing child. And that’s what limits his interest to the welfare of his own children. I suddenly saw why the child care and the increased salaries for teachers that women have been crying out for since before the first bra was burned have not been forthcoming. Men remain profoundly unconvinced that they personally have any responsibility for the children of other men. (Liberal politicians are indeed simply giving other people’s money away.) Women certainly appear to believe that their own children are the best, but only disturbed women want their children to live apart from other children. It is my impression that women understand that the health of the community is a not negligible factor in the health of their children. Men seem inclined to view the community as an interference with their authority. I know plenty of elite men who believe otherwise, who share their wives’ view of the community, but no leader has emerged to change the mind of the general public.

You might argue that a man’s preoccupation with his own death is a function of his imaginative inability to engage with children.


Let me be perfectly clear: I am not trying to suggest that everybody ought to have children. Not at all! But beyond observing that some people seem to have more aptitude for raising children than others, and that some people don’t appear to be cut out for parenthood at all, I have little to say: this is one of the many aspects of child-centered culture that requires a better understanding. Nor do I have any great insights about younger adolescents, except to recommend that they be housed away from home for a considerable stretch of this painful period.

I do want to point out that a culture that is focused on its children is in a state of constant self-review. What do we know that our children need to know? What do they need to find out that we don’t yet know? How can we lead them away from our mistakes?


The little patch that I’m working on is how to deal with the past. What to take from it, what to set aside. What, in rare cases, to lose. “The past” seems immense, and indeed, it has left us more books than anyone can read; but in fact very little of the past survives. Very little. In the area of writing alone, consider all the vanished shopping lists. Looking at a thick metropolitan telephone directory from the middle of the last century, consider all the conversations conducted via all those listed numbers. The oldest dresses that survive date to the late Seventeenth Century; everything older has disintegrated. We have almost no idea of what everyday Latin sounded like in the Roman Forum. Those cave paintings — what and why? Were there lots of such paintings, and only the ones buried in deep caves survived? Or was there a craze of adolescent derring-do? No, there isn’t much left.

But what do we do with what has survived? I hope that reading Roger Scruton will provoke a few ideas.


Wednesday 28th

And now for something completely different. (A Monty Python tag that I don’t know how to punctuate, but an agreeably ironic way of introducing the subject of this paragraph.) I went to the doctor yesterday. I went to two doctors, actually. I had that talk about scleritis with the rheumatologist. He agreed that my Remicade infusions can no longer be so widely spaced. (The protocol calls for eight weeks between infusions; for a few years, I was managing thirteen weeks nicely. The most recent infusion, which was preceded not only by the spontaneous inflammation of my eye but by a depression blacker than any I have ever experienced — only when I was writing here did life feel light enough a load to carry — followed the last one by eleven weeks.)

What I did not discuss with the rheumatologist — although I’d been prepared to — was my leg. My left leg was distinctly red and a little bit warm. There was no swelling, but the skin was tight. It did not hurt to walk, but it did hurt when I was just sitting down. The condition presented itself last Thursday, and there were several moments of horror over the weekend when another trip to the Emergency Room seemed to be in the offing. First thing Monday, I made an appointment with the internist, who is without a doubt my master doctor, the physician at the center of my many cases, even though endocrinology is not a specialty with any bearing on what ails me. I saw the good doctor yesterday. He examined me and sent me on my way without so much as a prescription. The leg still hurts, but the pain is unaccompanied by anxious uncertainty. And I knew all along that I was in for something.

I knew that I was “in for something” when — well, here’s what happened. The woman who cleans our kitchen and bathrooms had just been (this would be a week ago Friday), and when I went to take a shower after she left, I neglected to make sure that the bathmat was firmly pressed to the bottom of the tub. When I stepped in with my right foot, and the bathmat slid a bit, my left foot hastened to regain balance by closing the distance between feet. Unfortunately, my left foot was still outside the bathtub. There was an enormous thwack as it slammed against the tub. I steadied myself somehow; I don’t think that falling was ever a danger. (Except that falling is always a danger.) But I could tell from the ache in my calf that I was probably in for a big bruise.

There was a great deal of swelling, a mound almost big enough to enclose an egg. But the bruising never materialized. The swelling receded, and by Tuesday (this would be a week ago yesterday), I was experiencing nothing worse than an occasional twinge.

Then, Thursday night, while Kathleen was flying home from a quick trip to California, I felt the tight skin at the bottom of the calf, and a soreness at the instep, and I saw the blush of red. As I say, there was no swelling, and that was somewhat reassuring. Curiously, the redness did not surround the area where there had been swelling; it was all below that. I began to wonder if there was even a connection between the bang in the tub and this new irritation. I imagined all sorts of things, and each involved the administration of massive antibiotics in the Emergency Room, along with, no doubt, another unnecessary fuss about my blood pressure, heart rate, &c &c, which are under medical supervision at the moment and of no genuine ER concern.

I walked to the doctor’s office, on 72nd Street. I was a bit early. He took me into his office as he always does. I told him what had happened, and I told him about the scleritis for good measure, for, by this time, I was almost hoping that the leg business was another crazy spontaneous inflammation, even though this would mean that Remicade had stopped being effective. The scleritis made a certain sense if I was overdue for an infusion (as I evidently was), but my red calf made no sense two weeks after an infusion. When I had done talking, the doctor led me to an examining room and took a look for himself.

His conclusion was that the collision had ruptured blood vessels close to the bone. He looked at me and told me something that I really didn’t know: “Tissues don’t like blood.” When blood gets into places where it doesn’t belong, inflammation results. Eventually, the haemorrhaged blood (my term, not the doctor’s) gets flushed out. In the meantime, I ought to elevate the leg whenever possible, and wrap it in warm cloths, because heat, and not ice, is what’s called for in this situation.

I walked home from the rheumatologist’s office, which is only a few blocks from the internist’s. My leg was rather red. Walking is kind of the opposite of elevation with warm cloths.

Like the scleritis, a big nothing. But until yesterday, I lived with this big nothing against the lurid backdrop of the Emergency Room, a trauma unit that I found utterly traumatizing. On top of all the discomforts of an inadequate cot and no convenient loo, there was the din. Imagine a crowded subway car in which every passenger is speaking in an outside voice. Top that off with perfectly maddening beeps and bells. Add the occasional cry of pain. The only certainty was uncertainty. When I try to remember what the Emergency Room actually looked like, I see instead those medieval illuminations of the Last Judgment, or perhaps of the moment right after that, when crowds of agonized naked people are being stuffed into the maws of two-mouthed monsters. Just as the rich owners of those apocalyptic visions were terrified by the thought of eternal torment, so I dread, a little bit every day, another trip to the Emergency Room. This dread enhances the alertness with which I take care of myself.


In the late afternoon, dead tired after my medical adventures, I read the last couple of pages of Simon Winder’s Danubia. As the friend who gave it to me promised, I found it thoroughly amusing. But it was also far more substantial than I expected it to be. It was substantial in an unexpected way. Winder is amusingly apologetic about the tale that he wants to tell — the careers of the Hapsburg Emperors — and frequently promises to tell us as little of it as is absolutely necessary, since he expects that we’d be easily bored by too much detail. Danubia is not a very demanding history in that sense of the word. But for anyone who has done a modicum of demanding reading already, Danubia is something more than a plain history. It is rich in judgment and reflection. Because I’d read Miklós Bánffy’s Transylvanian Trilogy, for example, I felt that, in the passage snipped below, Winder was reminding me of an experience that I myself had had, having entered the world of Balint Abady.

As the First World War approached it became ever more complicated to be a Hungarian politician — not only did many suffer from a pathological aversion to the Austrians and their relentless attempts to undermine the Compromise or at least revise its terms, but there was upheaval across much of the kingdom. However many Magyarized [adopted Hungarian language and customs], it was never enough.

The Dual Monarchy that constituted the Hapsburgs’ next act, after the deconstruction of the Holy Roman Empire, was always doomed to be a farce, so long as the Emperor ruled always from Vienna and came to Budapest for ceremonial purposes only. It was also a folly because there were more than two major constituencies within it, and Archduke Franz Ferdinand (the heir who was assassinated at Sarajevo) dreamed of beginning his reign with an invasion of Hungary and a proclamation of “Trialism,” which would enlarge upon the duality, to recognize the new Empire’s many (and many different kinds of) Slavs. If you don’t know much about the Hapsburgs or Central Europe, Danubia is an entertaining introduction. If you do know a thing or two, Danubia becomes quite thought-provoking.

For example, I was impressed by the simple power of Winder’s explanation for the failure of most of 1848’s revolutions.

A very broad spectrum of people could agree with the statement “It’s disgusting and embarrassing to be ruled by King Ferdinand II of the Two Sicilies,” but a decision on what to do next was much harder.

It was the failure to agree on the next step that allowed the military their chance, and the results were ferocious.

The failure to agree on the second step. Oh, how true! My mind spun a bit, and, when it stopped, I saw that the most successful revolution in history may well have been the real American Revolution. Not the one that began in 1776 and ended in 1783. No, the one that ran from 1787 to 1789, the revolution that limited itself to clarifying that second step, which it did in the document that we call the Constitution. The Confederation of the new United States had quickly shown itself to be a failure; everyone could agree that it needed to be junked. But nothing happened until the second step was in place, and elections could be set for Congress and for the Presidency. There seems to have been not an iota of violence. Such revolutions are unlikely to be possible, however, as revolutions are rarely engineered by the grand and the great.

One insight that emerged from reading Danubia was the awareness that the Hapsburg arrangement was doomed by the Enlightenment not because either of its empires was a confused, irrational entity but because increased literacy led directly to increased nationalism, and it was nationalism that made the Austrian Empire look anachronistic. Nationalism was new and good, back in those innocent days of the early Nineteenth Century. From the moment that Herder concocted the formula of Kultur, European cosmopolitanism was endangered.

To teach people to read, you have to settle on a language to teach them; you will find it easiest to teach them to read the language that they already know. The imaginations of the newly literate are quickly swollen with new ideas and undreamed-of vistas, and languages oblige by singing the praises — praising the “national virtues” — of their speakers. That literacy should be so closely connected to demagoguery is very depressing, but this has been known since ancient times. For centuries, it was a positive conservative defense of restricting literacy.

But that is not our problem now.


Thursday 29th

Last night, having sipped perhaps a bit too much wine, and at loose ends about what to read, I fell into a dim meditation on power. The tone, although unvoiced, was incantatory. I was telling myself things in the ringing tones of an Orson Welles. I was challenging myself with overlooked truths. And of course I was exaggerating. I could imagine Conrad’s Marlow as, having heard me out, he regaled a company of after-dinner sailors with an account of my astonishing remarks. “‘I was supposed to have been a mighty man,’ said Keefe, quite as if the thought had not occurred to him before.”

Power has always puzzled me. I have always had enough for my purposes, and so I haven’t had to think much about it. Although I had no idea, when I was young, of what I would “do” when I grew up, what my career would be, it can’t have been a serious problem, because here I am at nearly seventy, still without a career but pretty good at what I want to do, which is to read and write. What this has to do with my being a physically imposing man remains mysterious to me, but I am sure that there is a connection. I do not think of myself as particularly confident — I am always fretting about something — but my body is very confident, in an indolent sort of way. It is overweight and stiff, this body, but it is big. It is often frowning — perhaps scowling would be the word (because of the fretting) — but it is usually composed, still. Many people respond to this body of mine by surrounding it with a margin of empty space. When I walk through a crowd, the crowd has a way of parting like the Red Sea. I was unaware of this for most of my life (which is also part of the puzzle), and when it was pointed out to me, I was mortified to learn that I apparently behave with obnoxious entitlement. But it isn’t I. It’s this body that I’ve been stuck with. My body is entitled, and it knows it. Other people have always told it so.

Just think — as I suspect my mother never stopped thinking — what it would have been like had I been the sort of man to make use of this body! This body was meant to be inhabited by someone of importance! When I was getting to be big, as distinct from just taller than the other boys, I excited a lot of disappointment in teachers and other men in charge. I felt that they wanted me to be something that I wasn’t — athletic and commanding — but they were only asking me to be what they saw. They wanted me to live up, as they might have put it, to my God-given body. I didn’t pay any of this much attention, because there was always a bigger battle raging on the subject of my intelligence, which, it was felt, I also wasn’t using as I ought to do. I was ashamed of my inability to be smart in the right way, but it did strike me as a genuine inability. My problem, as I saw it, was to figure out what to do with my intelligence, and I poured everything into that. I ignored the body problem, and, long before I made much progress on the brains front, it went away. I didn’t notice that it had gone from being a problem to being an asset. I’m somewhat ashamed these days to realize how much and for how long I’ve taken advantage of it.

Or so I say. Then I snap out of it and realize that my body is me, not a suit of clothes on a hanger in the closet. It has a lot to do with the way I think. It has given me an unusual amount of freedom in the world. It has taught me, for example, that the exercise of power, to the extent that it is not also an exercise of authority, is very wicked, if only because it is so toxic for the person doing the exercising, and not unlike an addictive drug. How can such coercion be prevented? I have no idea. I feel lucky for having been too distracted by reading and writing to play power games. And for being big enough not to have had to fight for my autonomy. But the very fact that I’ve been lucky is dismaying. What about everybody else?

Why can’t everybody be big? Why can’t we create childhoods in which big is a quality, and not a quantity?


I thought that I’d be writing more about Roger Scruton than I have done. I keep putting off re-reading “The Aesthetic Gaze,” the essay that got me interested in him in the first place. I put it off because reading it the first time stirred up so many ideas that, while perfectly familiar to me, were now, suddenly, ideas that I not only hadn’t entertained recently but in fact rejected. I want to deal a bit with some of those ideas before going back and finding out that I misread Scruton, that he wasn’t saying what I thought he was saying. I also want to have a clearer idea of what I do think instead.

In “Culture and Cult,” I believe, Scruton attributes to Matthew Arnold the first full articulation of the fine arts as a replacement for lost religion. If we can no longer sincerely worship the God of our fathers, then we can at least worship the beauty of a painting by Raphael. There is something desperate about this replacement, and I have never believed in it. Easy for me to say, as I’ve never had any kind of faith in that way; but what I believed was that it couldn’t work — art could not take the place of God for people who had lost faith in God. This, I think, is what Adorno nailed when he said that there could be no poetry after Auschwitz. Mere art, all-too-human-made art, could never present the ennobling, inspiring mirror of God’s love. Art has no ethical content.

But Scruton seems to believe that, as long as we go about it in the right way, art can provide our lives with meaning — with the kind of meaning that would guard us against the commission of horrible deeds. I have a lot of trouble with the search for meaning. I don’t really get it. If you asked me, “Why are you writing?”, meaning what is the purpose of what you’re doing, I shouldn’t have an answer for that, either, because the question makes no sense to me. But what interests me about Scruton’s thinking (if “interests” is the word) is his description of the right way to make art yield meaning. The way to do it is to pursue art as an end in itself, and not as a source of meaning. (Consciousness, as for so many conservatives, is often deadly for Scruton.) Using art as a means to meaning would be fatal. I can’t say that I don’t understand this language of objective and instrument — of ends and means — but I do believe that it amounts to no more than a puddle of words. You can talk about means and ends — you can profess to find an urgent difference between the one and the other — but you can talk about Ptolemaic epicycles, too, without making much less sense to me.

In human life, everything is means to an ever-unrealized end. I think that it’s because the end is endlessly unfolding that some people develop a hankering for purposelessness, for a withdrawal, however momentary, from the onrush of living action. Perhaps I can’t tell you what is meaningful about my writing, but I can tell you about the satisfaction that it brings and the hopes that I nurture for its readers. You could say that my satisfaction is an end, but does it really make any sense to apply the solid, product-y end to something as ephemeral as satisfaction? You could say, baldly, that my hopes are a means for making the world a better place, but I would probably talk about them as if they were ends in themselves. Whether my writing is a matter of “means” or a matter of “ends” depends upon how the light hits the discussion. To fuss over the distinction is to play one of those boys’ games that consume intelligence in pursuit of posturing.

I don’t believe that human beings are capable of the absolute disinterest that would be required for treating any thing or any action as “an end in itself.” We ourselves, our children and our children’s children are the ever-changing, never-changing end of everything that we do. If you want to feel that you are doing something as an end in itself, take a nap.


Friday 30th

It must seem that I’m not paying attention to the political news, which in fact I follow assiduously. I can’t bring myself to write about it, though; I would only repeat myself. Whether the candidates and voters (or fans) are behaving exactly as I expect them to, or whether I simply can’t see what’s actually going on through the jungle of my old analyses, is hard to tell. But it’s boring either way. It’s hard to believe that anything new is going to happen in this country anytime soon, or that it is going to do anything but slowly fall apart.

Of course I blame the South. That is, I blame Abraham Lincoln. Nicholas Lemann has a fine piece in the current New Yorker about the “Southernization” of federal politics, and in it he quotes Lincoln.

“It will become all one thing or all the other,” Abraham Lincoln declared of the beleaguered, slavery-stressed Union, in his “House Divided” speech. In fact, the South and the rest of the nation have one of those hot-blooded relationships — the major one, in American history — which never settle into either trustful intimacy or polite distance.

In other words, Lincoln was wrong. The United States is neither one thing nor the other, but simply a mess. Southerners are no longer confined to the South, and ever since Nixon’s “Southern strategy,” people of small minds everywhere vote Republican as a way of resisting attacks on the fantasy of American exceptionalism. By the way, as long as I’ve got Lemann on the page, I ought to finish a thought that he leaves dangling when he cites Sven Beckert’s Empire of Cotton. Slavery certainly seemed to be essential to the Industrial Revolution, but when it was abolished, the quickly evolved new model, sharecropping, was exported throughout the cotton-growing world, a development that Beckert covers in detail. So slavery wasn’t, after all, essential; it’s possible that it never made economic sense. Slavery was essential in ancillary ways — the massive kidnapping of Africans created a working population out of nothing; the sexual entitlement of white slave-owners degraded even as it generated a defenseless African-American community — but not in industrial ways.

When I say that the North ought to have let the South secede (and I do say that), I’m nonetheless aware that there would have been some kind of war, if only to keep the South and its slavery out of the West. But that didn’t happen, so there’s no point dwelling on it (except to take a much, much harder look at Lincoln, great and noble man though he were). The South is like a metastasized cancer, its music and its militarism flowing freely throughout the land. It sometimes seems to me that the Southern drawl has inflected not only the American vulgate but the Anglophone vulgate. Yadda yadda yadda; I’ve said all this before.

The truly awful thing about the “Southernization” of American politics is that it is fundamentally anti-political. In the traditional South, power is exercised through hierarchical processes; elections, where they are not confrontations between unitary blocs (blacks vs whites, for example), are token affairs. The guy who ought to win is anointed in the back room, from which issues a word to the wise. Women, God love ’em, serve almost as mechanical governors, making sure that hegemonic complacency doesn’t stray too far from rough ideas of justice. But that complacency does ensure a monumental stability, and a homogenization that either marginalizes or criminalizes social deviations. Nobody, anywhere, really likes political action; it is impossible without compromise and it often requires the alliance of enemies, characteristics that expose politics to the charges, however simple-minded, of dishonesty and hypocrisy. It is easy to see why some might prefer the “politics” of the firearm.

The disregard for political integrity — yes, there is such a thing — coupled with economic adversities that no one seems to understand (and that make some people filthy rich!), has produced the clown car of Republican hopefuls among whom only one, Marco Rubio, is both attractive and viable. (As David Brooks points out, Jeb Bush would be a great candidate — if it were 1956.) We shall see if party operatives can ensure that he prevails as the Republican nominee. On the Democratic side, voters are always demanding more political integrity — whatever that is — than Hillary Clinton can provide. Clinton is competent, certainly, but she cuts corners — corners that too often turn out to be the wrong corners. I pity the candidate her renewed exposure to the undying wrath of Maureen Dowd, but I do agree with Dowd that Clinton ought to have followed the Benghazi situation more closely and certainly more directly. Claiming that she had 270 ambassadors to worry about was precisely the typical Clinton gaffe. Having taken the first step, getting rid of Libya’s infamous dictator, she was responsible for the second step, what next, and this included, at the minimum, arranging for diplomatic security. So while perhaps Hillary Clinton didn’t do anything wrong in the Benghazi disaster, she didn’t do anything particularly right, either. And this is, as I say, typical. Clinton is a manager by nature, not a leader; her idea of leadership is for everybody to get out of her way. So it’s no wonder that genuine centrists are unhappy with Clinton, while those to the left are understandably wild about Bernie Sanders. Smart centrists will cast passionate ballots for Clinton, but only for the sake of the Supreme Court nominating process. And they’re worried that too many of the voters who aren’t so smart, and for whom the Supreme Court is something of an abstraction, will stay at home, throwing the election to the Republican cutie-pie. Or the Donald, as the case may be.

It would be a sin to let all this invective fly by without aiming a few shots at the media. We will take as read into the record the stupefying impact of network television news. In one of my fantasy variations on the Kingsman theme, anyone who watches more than an hour of television news a week is hypnotized into sleeping through the election, and a thousand and twenty-nine voters show up to vote, the twenty-nine being Republicans. What interests me somewhat more is television’s parasitic dependence on politics. Where does all that campaign financing go? But a parasite doesn’t just suck your blood; it makes you sick. Television serves up the likes of Ben Carson and Carly Fiorina, both of whom, say what you will, make for good television. Not to mention the tycoon who just fired Iowa. I should be mystified by the persistence of the extravagantly unattractive Ted Cruz, if it weren’t for the prominence, even in the Times, of news about zombie shows.

I wonder who told Jeb! that a diet would do it?


Bon weekend à tous!