Gotham Diary:
Fail Better Still
September 2016 (II)

12, 13, 15 and 16 September

Monday 12th

The disgrace is almost asphyxiating. It seems that a number of networks and cable channels are vying for ratings by celebrating the twentieth anniversary of the murder of JonBenét Ramsey, the publicity of which was grotesque when the news was fresh. The little girl’s world of precocious beauty pageants was grotesque in itself (it was quite beautifully satirized in Little Miss Sunshine), but the media hugely amplified the lubricious element, tantalizing onlookers with the possibility that a sex crime might be involved. What kind of people are we?

That wasn’t how I intended to begin this entry, and it has nothing to do with what follows, except perhaps this: I want to ask you to use your imagination as intensely as you can, but I sense that the American imagination — the imagination of the liberal West, actually — has been so degraded by disgusting spectacles that it cannot be expected to respond to questions that lack a salacious charge. That means, I know, that I’m worrying about whether you’re up to my challenge, and I apologize for that, because I don’t really doubt that you are. It’s just that the sludge of recycled brainlessness gets so thick sometimes that it’s hard to stand up in it.

Over the weekend, I finally read Stuart Firestein’s Failure. I have had the book since it came out, earlier this year, but the right moment for reading it never seemed to come round. But then it did, and I swallowed it whole. To tell the truth, I couldn’t read two pages together without pausing for a revival-service affirmation; quite unlike even the most congenial reading matter, Failure often provoked moments of ecstatic clarity. I am not going to talk about it right now; its aftermath remains turbulent. I am going to talk about a tangent that it sent me off on.

I could label this tangent with the deadly term, “phlogiston theory,” but I’d rather not, even though that theory will play an important role in my challenge. The challenge began as one to myself: despite reading Herbert Butterfield’s chapter on the subject in The Origins of Modern Science three times, I could not explain “phlogiston theory” in a nutshell. For those of you who are unfamiliar with phlogiston theory, I will say at the outset that it was always, by our lights, completely wrong, so that it is difficult now, knowing what we educated people know, to imagine how anyone could ever have subscribed to it. That is one part of the difficulty. The other is the overthrow of the phlogiston theory. This is difficult to imagine, too, and for much the same reasons, but it occurred in stages, as discoveries were made by men who nonetheless failed to grasp the implications of their findings for the reigning theory. Although I was able to follow Butterfield’s narrative, I could not seem to hold it in my mind. So I resolved to read the chapter once again, and this time get to the bottom of my imaginative problems.

At the risk of fatuity, I will joke that the difficulty is elementary. What you have to do, before trying to understand phlogiston theory and the huge importance of its overthrow, is to see the world as every educated mind did circa 1600. It was still a world composed of the four elements, earth, water, air and fire. By 1800, as a result of the overthrow of phlogiston theory, belief in the old four elements was impossible; new elements, the ones that we are familiar with, had begun to take their place.

All four elements were involved in phlogiston theory and its overthrow, but earth not so much. The element of fire was no longer regarded as the flame itself but rather as a substance — this is the earthy part — contained in all combustible materials that was released, as Butterfield puts it, “in the flutter of flame.” Somebody proposed that this substance was an oily kind of earth, and called it terra pinguis. Somebody else saw the need to go Greek: phlogiston means, roughly, “inflammable.” Phlogiston was this inflammable substance that, although it could not be isolated, inhered in combustible things and was released by combustion. It was the element of fire, somehow — while also, somehow, an earthy substance.

This inconsistency might seem damning to you, an indication that even scientists in the Seventeenth Century weren’t very bright. But that’s why I want you to exercise your imagination. I want you to imagine how the world could be explained if you believed that both air and water were elements, irreducible substances. Next, I want you to imagine what it would be like to try to solve the problems raised by this elementary status, given the interesting twist that air and water fail to be elements in different ways. Water is a compound of elements. Air is but a mixture.

Water is created by the explosive combination of hydrogen and oxygen, as I suppose many of you were reminded by The Martian. In this compounded form, oxygen is no longer available for breathing, even by fish. Fish breathe free oxygen that has been dissolved in water; their gills extract it. Land animals don’t need gills because atmospheric oxygen is not compounded, but free alongside the other elementary gases (mostly nitrogen) that constitute “air.” What you learn in the course of demythologizing water, in short, is not going to help you to demythologize air, and vice versa. Worse, air and its constituent gases are invisible. Worse still, you have to have reason to believe that the elementary status of air and water is a myth in the first place.

The virtue of imaginary phlogiston was that it offered a relatively simple explanation for a common phenomenon, couched in terminology rooted in the doctrine of the four elements, that had the effect of organizing what might have been unrelated developments in scientific inquiry. Cavendish, Black and Priestley all made discoveries that were crucial to the overthrow of phlogiston theory, but their belief in the theory persisted nonetheless. Cavendish, for example, concluded that “common air” was four parts of phlogisticated air — a compound, as it were, of air and phlogiston released by combustion, and not to be confused — then! — with something called “fixed air,” or what we know as carbon dioxide — and one part of dephlogisticated air. Cavendish had the right idea, but the wrong terminology. His phlogisticated air turned out to be elementary nitrogen, which is not the product of combustion. It was Lavoisier who gathered together everyone’s findings, for the purpose of debunking phlogiston theory.

Why did Lavoisier want to do this? Because phlogiston theory was failing to make sense in the light of replicable discoveries. Oxygen and hydrogen were isolated (if not understood), but phlogiston never was. In order to account for mounting discrepancies between fact and theory, scientists did what they always do: they patched. They claimed that phlogiston worked differently in exceptional circumstances. Phlogiston theory explained x, except when it didn’t. Sixty or seventy years after its formulation, the theory was in tatters, but most scientists continued to work under its banner. Lavoisier, the rich, elegant tax farmer, resolved to give the theory the boot. What I ought to have said was that it was the overthrow of phlogiston theory that required the organization of widespread experimental findings. These were coming so fast and free at the time that it is not possible to say with much finality who discovered what, and Lavoisier tarnished his own great work by claiming credit that was not his due — he was a synthesizer, not a discoverer. But the result was that the battle against phlogiston produced modern chemistry.

I believe that “the invisible hand,” which has come to mean something that Adam Smith didn’t quite have in mind, is the phlogiston of today. If I were a trained economist, and half my age, I should devote my life to attempting to repeat Lavoisier’s success.

But wait: did I just say “success”? Was Lavoisier’s overthrow of phlogiston theory a success? Stuart Firestein doesn’t say much about success in Failure, but I think that he makes an implicit case against its usefulness, and perhaps even against its existence. Success may be just as bogus as the four elements. I’ll come back to this tomorrow.

***

Tuesday 13th

What is success? Let’s not bother with that question. Everybody knows what success is. The better question is, can success (or its negative, failure) inhere in the character of a human being? Is it reasonable to speak of successful people?

One of the old Greeks — Solon, perhaps? — counseled against regarding anyone as a success until he died. Then you could draw the line under his achievements and shortcomings and make a permanent calculation. This sounds very prudent — don’t count your chickens, &c — but it is actually short-sighted, because it assumes that success is an immediately post-mortem assessment. It overlooks the possibility that the next generation, or the generation after that, may revisit the dead man’s life, and come to a conclusion that differs from the one reached by his survivors. Modern history involves constant re-evaluation. Jeremy Bentham’s corporeal remains may be (more or less) permanently preserved, as an “auto-icon,” at University College London, but his reputation is no less fluid than anyone else’s.

We all love two kinds of stories about success. The first one is about the outwardly successful person who is inwardly miserable — or who ought to be. The second story is about the person who touches the lives of everyone who knows her — this sort of successful person is a bit more likely to be a woman — with love and inspiration, but whose success as a human being goes unsung, because it is too local and complicated. The stories of Dorian Gray and Dorothea Brooke suggest that success does not really attach itself to people. If anything, it flows away from them, either turning to dust in the hand or spreading generously among truly loved ones. Not a few fairy tales insist that true success lies in letting it go.

This is all very high-minded; what about good old-fashioned money, pots of money? Isn’t the man who has lots of money, who has earned it, one lawful way or another, a success? There are plenty of people who think so. I would bet, though, that many such successful men and women would, upon the application of some gentle pressure, admit that their success is really a matter of controlling that money, of knowing what to do with it. The man with a gazillion dollars in the bank who spends his life sipping umbrella cocktails in a hammock is not likely to inspire the admiration that success deserves. Letting money sit in a vault is just another way of losing it — everybody knows that.

***

Kathleen, my wife, is very skeptical about success. “I’m supposed to be a success,” she sighs. And she is supposed to be a success; she wouldn’t have been profiled by the Wall Street Journal earlier this year if she weren’t. “But it’s really just one thing after another. You go on to the next thing.” Sometimes, Kathleen forgets how bored she would be if she didn’t have the next thing to go on to, but talk about success does invite dreams of hammocks. If successful people have to go on to the next thing, just like people who aren’t successful, then what difference does it make? The difference, I point out, is that Kathleen, as a success — or, as I prefer to put it, as someone associated with success — is engaged to go on to the very small number of possible next things that will continue her success, or her association with success. The person who fails must try something altogether different. Movie stars keep having to prove their stardom, in film after film. That happens to be the proof of their stardom. Actors who don’t establish stardom quickly aren’t permitted to make a second or a third bid.

Success is never attained, never achieved. A very good thing, too, say I, mindful of the French s’achever, which looks like “to be achieved” but means to come to an end. It is said of those whose lives are over.

What Stuart Firestein appears to be arguing in Failure is that we diminish the bounty of success by trying too hard to avoid failure. It is easy to see why failure is avoided, at least in the world of scientific investigation that is his métier. Science is expensive. Laboratories require recurrent infusions of grant money, and grant money is not awarded to scientists who openly plan to design experiments that will fail. His nutshell advice is to reform the grant-award process so that decisions are made by other scientists, not necessarily in a related field, who look for proposals that are interesting and credible, rather than by administrators with a check-list of predictors of success. Firestein critiques the vogue for citing Beckett’s line, “Fail better.” Beckett is not cleverly suggesting that there is a way to fail that is tantamount to success. “Failing better,” Firestein writes, “meant leaving the circle of what he knows. Failing better meant discovering his ignorance, where his mysteries still reside.”

It is this unordinary meaning of failure that I suggest scientists should embrace. One must try to fail because it is the only strategy to avoid repeating the obvious, to get beyond what you know and beyond what you know how to do. Failing better happens when we ask questions, when we doubt results, when we allow ourselves to be immersed in uncertainty. (27)

“Too often you fail until you succeed,” he continues, “and then you are expected to stop failing.” He might have added that this comes to the same thing as being expected to play dead.

***

Scientific failures are expensive in money; properly conducted — as clinical trials sometimes manage not to be — they are not expensive in health or happiness. It is different in most other fields. We all can learn from our mistakes, but mistakes made by engineers, central bankers, or judges can be costly in very undesirable ways. I read somewhere that the passengers who died in the early days of commercial aviation ought to be regarded as heroes for having contributed, so to speak, to the database that has made flying much safer than driving. Maybe so, but I’m not inclined to encourage experiments that kill people. (It might have been better, if such human sacrifice were going to be sanctioned, for them to offer themselves up to medical experimentation.) The moral of the aviation story as I see it is that there ought to have been more funding. And for my part, I can say that, to the best of my knowledge, no one has ever been made sick by my cooking.

But science at Firestein’s level is a branch of intellectual history — the proudest growth in the Western world. Not only does it cost nothing but money; it also requires failure in order to grow. One of the reasons for Firestein’s advocating the publication (on a low-cost Web site) of failed experiments is that other people’s failures may very well inspire your success. He urges his students to revisit failed experiments that were reported in Science fifteen or twenty years ago, when technological resources were vastly more constrained. Failure, like success, can be reconsidered later. Revisited failures may be transformed into successes. But first you have to have the failures.

***

By a stroke of luck, I read a story by William Trevor yesterday that couldn’t be more on point. It’s called “Traditions.” It is set at an English public school. A group of boys have been capturing jackdaws and teaching them to speak (sort of) in a barn that is strictly off-limits. One morning, the boys discover that the birds’ necks have been broken. All but one of the boys suspect another student of committing this atrocity. The exception, a boy called Olivier, has another idea, one that he keeps to himself. It so happens that Olivier is in hot water with the headmaster, because he is doing poorly in his science classes — classes that he elected to take. Olivier offers to drop the science course, but this makes the headmaster even angrier: you don’t quit. If you sign up for science courses, you commit to doing well at them. You don’t fail, whether by doing poorly or quitting; you succeed, because success is a tradition at this school. The headmaster is incapable of grasping that Olivier has already succeeded in his science classes. He was curious about things, and so he learned about them. He could not be bothered with boring laboratory procedures. This unorthodox cast of mind is what has alerted Olivier to the identity of the culprit in the jackdaw case — and in other unsolved mysteries at the school.

Many a time in school, especially in college, did I drive teachers mad by seeming to play the dilettante, by taking what I needed from a course and flunking the rest. Even I was not particularly at ease about this habit, but there was no changing it. My curriculum was dictated by an inner voice that overrode official criteria. No doubt that inner voice required a seven-year spell in the desert before confronting the requirements of law school, to which it deferred. I should not recommend that Stuart Firestein take on Oliviers as grad students, but I think that he would agree with me that we need to open up undergraduate education to more freewheeling minds, especially if the direction in which those minds tend is toward the heart of the traditions, and not away from them. No matter how firmly I insisted on the relevance of coursework to my self-directed inquiries, I was the last to argue that “relevance” ought to shape the curriculum. I’m certainly not saying that colleges ought to be overhauled for the likes of me. But there ought to be more room for failing better.

***

Thursday 15th

Last night, Kathleen brought home two sets of print-outs of the proofed first draft of the writing project. 187 pages, 83 thousand words. A good beginning, I think — but also an uncomfortable ending, as this first stage of the work comes to a halt. For weeks, it was the center of my everyday life, even on the three days each week when I did not write. It felt “organized,” whatever that means in this context, from the start, and it quickly established its own rhythm. There were other things to worry about, but they were unusually easy to overlook, as I focused on the project. Now all of that is over.

Kathleen will read the first draft on a flight to California on Sunday; whether or not she finishes by touchdown (she probably will), the point is that she won’t be reading it here, with me hovering in the background. The timing of her business trip to Dana Point could not be more providential. It gives a term to the fallow period that must in any case, I think, follow the long burst of thinking and writing and (in proofing) thinking further that produced the draft. I have to set the whole thing aside for a few days — not that that’s entirely possible; I’m already thinking very hard about enlarging the shortest section — but, thanks to Kathleen’s trip, I don’t have to wonder when it will be time to get going again. Coming all at once, her comments will change everything.

For I have been very careful to make sure that, when Kathleen does read the first draft, it will be fresh. I have resisted the impulse to read her the great little bits that seem so striking when they’re new but that, with time, settle into their texts. (If they don’t, it’s a problem.) I did, fairly early on, read three paragraphs from the second section that I thought were very funny. Kathleen thought they were funny, too, but my reading kept being interrupted by the need to make sense of typos and to fill in missing words. I decided not to repeat the performance. So I have shared what I have written with no one. I have not even described it to anyone but Kathleen. I have waited until it can be read as a coherent whole, a text that, while not perfect by any means, is fluent and comprehensible.

It occurs to me that this would be a good time to take a holiday here, as well. We still plan to spend the last week of the month in San Francisco, so I shall certainly be silent then. But I may begin tapering off before then. While Kathleen is away, I may set up the card table in the foyer and pile it with all of the extraneous stuff — in bags, in piles, and in desk drawers — that hasn’t found its place in this small book room. It seems that I’m the only person who ever walks in here freely; Kathleen won’t enter unless asked, and no one even comes to the “back half” of the apartment except to use the bathroom. I’ve taken advantage of this atmospheric privacy to make up for the absence of adequate closet space (the apartment’s one real drawback), but the joke is that the only person who’s bothered by the bags and the piles is me. To me, they’re very noisy. They’re also in the way of the bookcases. Getting rid of them (how?) would not be a fun pastime, but this might be the time to have a go at it.

It seems to me, and to everyone that I know, that the United States is on the precipice of a national disaster. Every day, it appears just a little more possible that Donald Trump will win the presidential election in November. Why? Because he is the “honest” candidate. Charged with a wide array of failings, some of them arguably criminal, he simply shrugs, as if to tell his supporters, “If you don’t care, I don’t care.” And of course they don’t care. But what’s awful is that this comes across as candor, and candor appeals to many voters, not just to his supporters, as the key virtue, because it has come to be seen as the virtue so lacking in Hillary Clinton’s makeup. Don’t look now, but Hillary Clinton has foot-in-mouth disease; everything that she says, including “and” and “the,” sounds like a prevarication. She ought to stop touting her abilities and simply throw herself on the voters as —

As what? As a non-reality-TV-star? This is where Trump’s kind of candor highlights Clinton as the worst possible opponent — from her standpoint. She has only two ways of challenging him. Presenting herself as a capable politician and administrator plays into his hands; most people don’t really care about politics and administration right now. And to respond to Trump’s disparagements in kind is always going to be a losing battle. She’s a woman in an America that still wants to think of itself as a white Christianist homeland, and that is quick to take offense at language such as “basket of deplorables.” There is no good reason to regard Clinton’s remark as a gaffe, but the mere fact that it was questioned shows how sick the country’s political culture really is. Had Dwight Eisenhower said it, he would have been applauded.

In the end, it’s a contest between someone who wants to lead a gang and someone who doesn’t understand leadership, at least not with the visceral capability of Lincoln or FDR. Ordinarily, this would not be a great failing; in naming two presidents for comparison, I have not named most of them. But there is nothing ordinary about Donald Trump. Wanting to lead a gang isn’t “leadership” either, but it looks like it now, when appearances are all that matter.

Reagan, Bush, and now Trump: it’s impossible not to see an arc of mutation, as telegenic shams replace warty professionals in the top job. I’d really have to include Bill Clinton in this arc, too: he won because he was better at flim-flam than Bush’s father was. President Obama has disappointed many of his supporters by replacing the hope of his campaign with the rigor of fighting a recalcitrant Congress. How wonderful it would have been, had Hillary won in 2008, so that her vice president, Barack Obama, could battle Donald Trump now. Mind you, we’re talking only about campaigns here. But campaigns have been devouring administrations for forty years or more, as television’s broadcasting standards have become ever more dementedly sensational. I don’t know when I began to suspect that television might be more than just a terrible waste of time, that it might actually kill liberal democracy. But if Donald Trump wins in November, we’ll have had proof of its capacity to deal possibly mortal blows.

***

Friday 16th

While I was working on the first draft of the writing project, I was protected from chill winds and swampy miasmas. Bad news didn’t really get to me. Now, it’s different. Now, I’m overwhelmed by the awfulness of social failures. David Denby, in the London Review of Books, writes about the videos of white police shooting black men without objectively reasonable provocation, and then treating the dead or wounded body as if it were still resisting arrest — handcuffing it, just to be on the safe side! James Surowiecki, in The New Yorker, explains why: police unions depend upon crime committed by black Americans to justify their budget demands and their refusal to reform police procedures. Is there a way out of this? Yes, according to Patrick Phillips, author of Blood at the Root and a native of Forsyth County, Georgia, from which, in 1912, the black population was driven away by every kind of force. That’s one solution.

And then there are two reviews of The Girls, Emma Cline’s novel based on the Manson Family murders, one in the LRB, one in the New York Review of Books. Both reviewers, like the author, are American. Both say much the same thing about the novel. But novelist Diane Johnson is far more enthusiastic than Emily Witt. Johnson complains, at the end of her piece, that the literature of California is “the Canada of American regionalism.” Witt gives a demonstration of this treatment by collapsing Cline into Didion, as if to say that nothing has been added. Johnson, of course, raised a family in Los Angeles; Witt appears to be a New Yorker — there may be nothing more to their different takes than that. Both Johnson and Witt regret the almost vacant impotence of fourteen-year-old girls in a consumer society, as they wait for boys to notice them and make them real. Cline’s heroine, it seems, gives up on men.

In a William Trevor story that I read last night, “Bravado,” a very pretty girl, Aisling, walks home from a Dublin nightclub to her affluent neighborhood. Her boyfriend, Manning, calls her “drop-dead gorgeous,” which, without comment, she rather likes. Manning is the alpha dog of his pack, and Aisling likes that, too, although she thinks it’s silly. As they climb the suburban hills, Manning’s group spots a nerdy kid whom Manning dislikes. While the kid, also walking home from the nightclub, finishes peeing on an old lady’s house, Manning swoops down on him, knocks him over and kicks him. It turns out that the nerd has an unusually weak heart, and he dies. Manning goes to jail. Aisling visits the dead boy’s grave, ever more clearly aware that, although she was horrified by the violence of Manning’s attack, she was pleased by the obvious tribute — he did it to show off to her. And now she cannot bear this acquiescence.

These patterns of contempt and inferiority — I was sure, when I was young, that I would live to see them broken forever. I believed that consciousness would be raised, and that people would see these horrible follies for what they are. I now understand that my expectations were not reasonable. They betrayed, pretty clearly, a desperate optimism. If racism and sexism were not overcome, then American society would collapse from within. And that seems to be what is happening. Feminism and the fight for equal civil rights have wounded the old patriarchy, perhaps mortally, depriving it of the strength to restore the status quo ante. But beleaguered white men will believe that it is heroic to pull down the whole structure in the death-agony of their self-importance. Cops will continue to persecute the black drivers of automobiles with defective taillights until everyone else begins to see the police as an oppressive occupying force. Boys will go on badgering girls to show their breasts so that the quality of these features can be judged, until mothers realize that they have raised their sons to be depraved. And all of it will be cycled into televised entertainment.

David Denby writes of the shootings,

Something more than ineptitude and panic is there in these acts: refusing to accept that a man is dead may be a way of refusing to acknowledge that one bears any responsibility for his death. Feelings of pity have been chased away, as far as we can see, by fear.

Are we still in the Sixties?

***

I feel that I learned a few things from writing the first draft. I put it that way because I only sense them; they are not very clear. And some of them are negative: I’m learning that there are things that I not only don’t know but don’t know how to talk about. One of these things is the human mind. The mind is something that we ought to be able to talk about, because each of us has one and many of us are reflective enough to have a sense of how minds differ from one person to another. The differences that I’m thinking of are not pathological; they have little to do with the health of the brain. They are not moral considerations, either, because morality is, or purports to be, standard, and we are shy of the systematic and legalistic standards that characterize traditional morality.

I’m thinking of differences that, while annoying, are harmless. Am I thinking of worldviews? We use “worldview” fairly freely, but do we analyze it with any rigor? Isn’t it the case that most talk about “worldview” boils down to an idea of what moral standards ought to prevail? There’s more to worldview than that, a lot more. Surely a worldview is literally shaped by the views that one has had of the world: I know that my worldview was changed, and not insignificantly, by a week spent in Istanbul. Although I had visited Guangzhou (Canton) very briefly, Istanbul was really my first experience of a Western-seasoned city outside of Christendom. Most of the impressions that I can talk about were either touristic or “curious,” the latter being notes of correspondence with the world I already knew (such as the pastry shop in Istiklal Street called “Markiz”), but I am haunted by inarticulate recollections of the very old city that Orhan Pamuk has struggled to commit to paper. I read My Name Is Red years before Istanbul, and was perplexed by much of it; Snow, which I read while I was in Istanbul, was far more intelligible, even if I couldn’t tell you how. I also know that my view of Europe shifted perceptibly when I stood at the water gate of the Dolmabahçe Palace, looking through the palings out onto the Bosphorus, and all the ships that were on their way to or from Black Sea ports.

“Mindset” is an equally vague word. Is there a way to give it substance? I am on the verge here of that hoary old psalm, “the life of the mind.” Strictly speaking, the phrase is ridiculous, because there is no other kind of life. “The life of the mind” is an ignorant stab at guessing what it must be like to read a lot of books and to think a lot about equations and syllogisms. Or, in the alternative, the poet’s life of words. The life of the mind is something that other people have. One might pretend to want it, too, but not very sincerely.

Our minds are all different, and we are forever misunderstanding each other. It’s annoying, but potentially enlightening. There is something wrong with the way we work together (most of us), because the differences between us too often get in the way instead of sparking greater understanding. Is it prudence or a lack of intelligence that makes us cling to what we know how to deal with and dismiss everything else? I like to think that it is a rectifiable ignorance, but how hopeful can I be about that, given the hopes with which I began this entry — hopes that ought to have withered by now?

Bon week-end à tous!