Saturday, December 30, 2006

INTRODUCTION

"With the philosophy of Nicolai Hartmann we once again enter a world of sober, objective and impartial inquiry, which presses beyond man's self and seeks to grasp the universe of being so far as it is revealed to our limited capacity to know. The basic mood of Existence philosophy, as might be expected, is altogether missing from this universal way of viewing matters. (...).

The true concern of his philosophy is to discover the structural laws of the real world, of the world of being, not of some 'world of mere appearances' set out in front of the real world. Traditional philosophy, according to Hartmann, has sinned a great deal in this connection and in a double manner. First, it has always believed that it faced two basic alternatives - to accept an absolute knowledge of being, or else to assume the total unknowability of the 'things in themselves'. The latter course means rejecting the possibility altogether of objective knowledge of being; the former results in closed metaphysical systems that dismiss the irrational aspects of being and hold that the whole of being may in principle be grasped rationally. What has been overlooked is the middle possibility, namely, that being may be partially comprehensible conceptually despite the irrationality of the infinite portion that remains.

The second error of traditional philosophy is the propensity, stemming from the monistic need for unity, to transfer the categories or principles of one province to another that differs from it in kind. Illustrations are the application of mechanistic principles to the sphere of the organic, of organic relationships to social and political life, and, conversely, of mental and spiritual structures to the inanimate world. This infringement of categorial boundaries, as Hartmann calls the theoretical encroachment of one province of being upon another, must be eliminated by rigorous critical analysis; yet the categories must preserve their relative validity for the domain from which they were taken originally. From the standpoint of a critical ontology, the totality of beings then turns out to be a far more complicated structure than finds expression in the traditional metaphysical formulas of unity.

Knowledge belongs to the highest stratum with which we are acquainted, that of spirit or culture. Consequently only an ontology of spiritual being (geistiges Sein) can comprehend the essence of knowledge. At the same time, however, the problem of cognition must already have obtained at least a partial solution if ontological inquiry is to be admissible at all. For to begin with we do not even know whether there is any such thing as objective knowledge of being or a transcendent object independent of the subject of cognition. This fact necessarily places epistemology in a dual position. On the one hand, it must create the foundation for all ontological inquiry; but at the same time it can reach its goal only within the framework of an ontology of spiritual being. Hartmann attempts to do justice to this twofold aspect of knowledge by prefacing his works in ethics and ontology with an investigation of knowledge, by including in this investigation the ontological viewpoint, and by discussing in his ontology the consequences of his findings for the phenomenon of cognition."

From: Wolfgang Stegmüller - Main currents in contemporary German, British, and American philosophy - Dordrecht, Reidel Publishing Co., 1969. pp. 220-221.



"It is not easy to tell what exactly Hartmann understood by his 'ontology,' which he wanted to oppose to the old Pre-Kantian form of ontology. He certainly did not identify it with metaphysics. In this respect Hartmann's enterprise differed fundamentally from the many more or less fashionable attempts to resurrect metaphysics, attempts which have rarely led to more than tentative and precarious results. Superficially Hartmann's 'ontology' may seem to be nothing but what it meant to Aristotle: the science of being qua being in its most general characteristics. In order to determine its actual content, however, it will be best to look first at the type of topics and problems which Hartmann took up under the time-honored name. They comprise not only being qua being, i.e., the most general concept of what is (das Seiende), but existence (Dasein) and essence (Sosein), which he calls Seinsmomente, and the types of being designated by the adjectives 'real' and 'ideal,' named Seinsweisen, all of which are discussed in the first volume of the ontological tetralogy. The second volume deals with the modes of being (Seinsmodi) such as possibility and actuality, necessity and contingency, impossibility and unreality -- particularly impressive and perhaps the most original part of the set. The next major theme is the categories, first the general ones applying to all the strata (Schichten) of the real world and explored in the third volume (Der Aufbau der realen Welt), then the special categories pertaining only to limited areas, such as nature, which Hartmann takes op in the final work.

Finally, there are the categories peculiar to the realm of cultural entities (geistiges Sein) which he discussed in a work whose publication actually preceded the ontological tetralogy.

The mere mention of these topics will make it clear that such an ontology differs considerably from what had passed as ontology before Hartmann. It covers more and less. It adds the spheres of being which have been opened up by the sciences and the new cultural studies as well as by the theory of values. But it omits the traditional metaphysical problems, i.e., the ultimate questions dealing with God and immortality, which were the prize pieces of speculative metaphysics. The fact that Hartmann abandoned this earlier metaphysics did not mean that he denied its problems. Their insolubility even provides the very background for his new ontology. Hence we have no right to simply ignore them.

Ontology thus conceived constitutes really a segment of a metaphysics which is no longer simply a field for speculative treatment by a priori methods. To Hartmann metaphysical problems are those which form the horizon of scientific knowledge, and which are inescapable because of their connection with what we can know scientifically, yet which cannot be solved by the methods of science alone. Some of these problems he considered to be impenetrable and 'irrational' on principle, even though they too contain an ingredient (Einschlag) which can be explored by the rational methods of critical ontology. This 'least metaphysical part' of metaphysics is the proper field of the new ontology."

From: Herbert Spiegelberg - The Phenomenological Movement. A historical introduction - Martinus Nijhoff - The Hague, 1963 (third edition). pp. 309-310.

Sunday, December 24, 2006

CORMAC MCCARTHY'S VENOMOUS FICTION: THE ONLY INTERVIEW EVER GIVEN

"You know about Mojave rattlesnakes?" Cormac McCarthy asks. The question has come up over lunch in Mesilla, N.M., because the hermitic author, who may be the best unknown novelist in America, wants to steer conversation away from himself, and he seems to think that a story about a recent trip he took near the Texas-Mexico border will offer some camouflage. A writer who renders the brutal actions of men in excruciating detail, seldom applying the anesthetic of psychology, McCarthy would much rather orate than confide. And he is the sort of silver-tongued raconteur who relishes peculiar sidetracks; he leans over his plate and fairly croons the particulars in his soft Tennessee accent.


"Mojave rattlesnakes have a neurotoxic poison, almost like a cobra's," he explains, giving a natural-history lesson on the animal's two color phases and its map of distribution in the West. He had come upon the creature while traveling along an empty road in his 1978 Ford pickup near Big Bend National Park. McCarthy doesn't write about places he hasn't visited, and he has made dozens of similar scouting forays to Texas, New Mexico, Arizona and across the Rio Grande into Chihuahua, Sonora and Coahuila. The vast blankness of the Southwest desert served as a metaphor for the nihilistic violence in his last novel, "Blood Meridian," published in 1985. And this unpopulated, scuffed-up terrain again dominates the background in "All the Pretty Horses," which will appear next month from Knopf.


"It's very interesting to see an animal out in the wild that can kill you graveyard dead," he says with a smile. "The only thing I had seen that answered that description was a grizzly bear in Alaska. And that's an odd feeling, because there's no fence, and you know that after he gets tired of chasing marmots he's going to move in some other direction, which could be yours."


Keeping a respectful distance from the rattlesnake, poking it with a stick, he coaxed it into the grass and drove off. Two park rangers he met later that day seemed reluctant to discuss lethal vipers among the backpackers. But another, clearly McCarthy's kind of man, put the matter in perspective. "We don't know how dangerous they are," he said. "We've never had anyone bitten. We just assume you wouldn't survive."


Finished off with one of his twinkly-eyed laughs, this mealtime anecdote has a more jocular tone than McCarthy's venomous fiction, but the same elements are there. The tense encounter in a forbidding landscape, the dark humor in the face of facts, the good chance of a painful quietus. Each of his five previous novels has been marked by intense natural observation, a kind of morbid realism. His characters are often outcasts -- destitute or criminals, or both. Homeless or squatting in hovels without electricity, they scrape by in the backwoods of East Tennessee or on horseback in the dry, vacant spaces of the desert. Death, which announces itself often, reaches down from the open sky, abruptly, with a slashed throat or a bullet in the face. The abyss opens up at any misstep.


McCarthy appreciates wildness -- in animals, landscapes and people -- and although he is a well-born, well-spoken, well-read man of 58 years, he has spent most of his adult life outside the ring of the campfire. It would be hard to think of a major American writer who has participated less in literary life. He has never taught or written journalism, given readings, blurbed a book, granted an interview. None of his novels have sold more than 5,000 copies in hardcover. For most of his career, he did not even have an agent.


But among a small fraternity of writers and academics, McCarthy has a standing second to none, far out of proportion to his name recognition or sales. A cult figure with a reputation as a writer's writer, especially in the South and in England, McCarthy has sometimes been compared with Joyce and Faulkner. Saul Bellow, who sat on the committee that in 1981 awarded him a MacArthur Fellowship, the so-called genius grant, exclaims over his "absolutely overpowering use of language, his life-giving and death-dealing sentences." Says the historian and novelist Shelby Foote: "McCarthy is the one writer younger than myself who has excited me. I told the MacArthur people that he would be honoring them as much as they were honoring him."


A man's novelist whose apocalyptic vision rarely focuses on women, McCarthy doesn't write about sex, love or domestic issues. "All the Pretty Horses," an adventure story about a Texas boy who rides off to Mexico with his buddy, is unusually sweet-tempered for him -- like Huck Finn and Tom Sawyer on horseback. The earnest nature of the young characters and the lean, swift story, reminiscent of early Hemingway, should bring McCarthy a wider audience at the same time it secures his masculine mystique.


But whatever it has lacked in thematic range, McCarthy's prose restores the terror and grandeur of the physical world with a biblical gravity that can shatter a reader. A page from any of his books -- minimally punctuated, without quotation marks, avoiding apostrophes, colons or semicolons -- has a stylized spareness that magnifies the force and precision of his words. Unimaginable cruelty and the simplest things, the sound of a tap on a door, exist side by side, as in this typical passage from "Blood Meridian" on the unmourned death of a pack animal:


"The following evening as they rode up onto the western rim they lost one of the mules. It went skittering off down the canyon wall with the contents of the panniers exploding soundlessly in the hot dry air and it fell through sunlight and through shade, turning in that lonely void until it fell from sight into a sink of cold blue space that absolved it forever of memory in the mind of any living thing that was."


Rightful heir to the Southern Gothic tradition, McCarthy is a radical conservative who still believes that the novel can, in his words, "encompass all the various disciplines and interests of humanity." And with his recent forays into the history of the United States and Mexico, he has cut a solitary path into the violent heart of the Old West. There isn't anyone remotely like him in contemporary American literature.

A COMPACT UNIT, SHY OF 6 feet even in cowboy boots, McCarthy walks with a bounce, like someone who is also a good dancer. Clean-cut and handsome as he grays, he has a Celt's blue-green eyes set deep into a high-domed forehead. "He gives an impression of strength and vitality and poetry," says Bellow, who describes him as "crammed into his own person."


For such an obstinate loner, McCarthy is an engaging figure, a world-class talker, funny, opinionated, quick to laugh. Unlike his illiterate characters, who tend to be terse and crude, he speaks with an amused, ironic manner. His involved syntax has a relaxed elegance, as if he had easy control over the direction and agreement of his thoughts. Once he had agreed to an interview -- after long negotiations with his agent in New York, Amanda Urban of International Creative Management, who promised he wouldn't have to do another for many years -- he seemed happy to entertain company for a few days.


Since 1976 he has lived mainly in El Paso, which sprawls along the concrete-lined Rio Grande, across the border from Juarez, Mexico. A gregarious recluse, McCarthy has lots of friends who know that he likes to be left alone. A few years ago The El Paso Herald-Post held a dinner in his honor. He politely warned them that he wouldn't attend, and didn't. The plaque now hangs in the office of his lawyer.


For many years he had no walls to hang anything on. When he heard the news about his MacArthur, he was living in a motel in Knoxville, Tenn. Such accommodations have been his home so routinely that he has learned to travel with a high-watt light bulb in a lens case to assure better illumination for reading and writing. In 1982 he bought a tiny, whitewashed stone cottage behind a shopping center in El Paso. But he wouldn't take me inside. Renovation, which began a few years ago, has stopped for lack of funds. "It's barely habitable," he says. He cuts his own hair, eats his meals off a hot plate or in cafeterias and does his wash at the Laundromat.


McCarthy estimates that he owns about 7,000 books, nearly all of them in storage lockers. "He has more intellectual interests than anyone I've ever met," says the director Richard Pearce, who tracked down McCarthy in 1974 and remains one of his few "artistic" friends. Pearce asked him to write the screenplay for "The Gardener's Son," a television drama about the murder of a South Carolina mill owner in the 1870's by a disturbed boy with a wooden leg. In typical McCarthy style, the amputation of the boy's leg and his slow execution by hanging are the moments from the show that linger in the mind.


McCarthy has never shown interest in a steady job, a trait that seems to have annoyed both his ex-wives. "We lived in total poverty," says the second, Annie DeLisle, now a restaurateur in Florida. For nearly eight years they lived in a dairy barn outside Knoxville. "We were bathing in the lake," she says with some nostalgia. "Someone would call up and offer him $2,000 to come speak at a university about his books. And he would tell them that everything he had to say was there on the page. So we would eat beans for another week."


McCarthy would rather talk about rattlesnakes, molecular computers, country music, Wittgenstein -- anything -- than himself or his books. "Of all the subjects I'm interested in, it would be extremely difficult to find one I wasn't," he growls. "Writing is way, way down at the bottom of the list."


His hostility to the literary world seems both genuine ("teaching writing is a hustle") and a tactic to screen out distractions. At the MacArthur reunions he spends his time with scientists, like the physicist Murray Gell-Mann and the whale biologist Roger Payne, rather than other writers. One of the few he acknowledges having known at all was the novelist and ecological crusader Edward Abbey. Shortly before Abbey's death in 1989, they discussed a covert operation to reintroduce the wolf to southern Arizona.


McCarthy's silence about himself has spawned a host of legends about his background and whereabouts. Esquire magazine recently printed a list of rumors, including one that had him living under an oil derrick. For many years the sum of hard-core information about his early life could be found in an author's note to his first novel, "The Orchard Keeper," published in 1965. It stated that he was born in Rhode Island in 1933; grew up outside Knoxville; attended parochial schools; entered the University of Tennessee, which he dropped out of; joined the Air Force in 1953 for four years; returned to the university, which he dropped out of again, and began to write novels in 1959. Add the publication dates of his books and awards, the marriages and divorces, a son born in 1962 and the move to the Southwest in 1974, and the relevant facts of his biography are complete.


The oldest son of an eminent lawyer, formerly with the Tennessee Valley Authority, McCarthy is Charles Jr., with five brothers and sisters. Cormac, the Gaelic equivalent of Charles, was an old family nickname bestowed on his father by Irish aunts.


It seems to have been a comfortable upbringing that bears no resemblance to the wretched lives of his characters. The large white house of his youth had acreage and woods nearby, and was staffed with maids. "We were considered rich because all the people around us were living in one- or two-room shacks," he says. What went on in these shacks, and in Knoxville's nether world, seems to have fueled his imagination more than anything that happened inside his own family. Only his novel "Suttree," which has a paralyzing father-son conflict, seems strongly autobiographical.


"I was not what they had in mind," McCarthy says of childhood discord with his parents. "I felt early on I wasn't going to be a respectable citizen. I hated school from the day I set foot in it." Pressed to explain his sense of alienation, he has an odd moment of heated reflection. "I remember in grammar school the teacher asked if anyone had any hobbies. I was the only one with any hobbies, and I had every hobby there was. There was no hobby I didn't have, name anything, no matter how esoteric, I had found it and dabbled in it. I could have given everyone a hobby and still had 40 or 50 to take home." WRITING AND READING seem to be the only interests that the teen-age McCarthy never considered. Not until he was about 23, during his second quarrel with schooling, did he discover literature. To kill the tedium of the Air Force, which sent him to Alaska, he began reading in the barracks. "I read a lot of books very quickly," he says, vague about his self-administered syllabus.


McCarthy's style owes much to Faulkner's -- in its recondite vocabulary, punctuation, portentous rhetoric, use of dialect and concrete sense of the world -- a debt McCarthy doesn't dispute. "The ugly fact is books are made out of books," he says. "The novel depends for its life on the novels that have been written." His list of those whom he calls the "good writers" -- Melville, Dostoyevsky, Faulkner -- precludes anyone who doesn't "deal with issues of life and death." Proust and Henry James don't make the cut. "I don't understand them," he says. "To me, that's not literature. A lot of writers who are considered good I consider strange."


"The Orchard Keeper," however Faulknerian in its themes, characters, language and structure, is no pastiche. The story of a boy and two old men who weave in and out of his young life, it has a gnarliness and a gloom all its own. Set in the hill country of Tennessee, the allusive narrative memorializes, without a trace of sentimentality, a vanishing way of life in the woods. An affection for coon hounds binds the fate of the characters, who wander unaware of any kinship. The boy never learns that a decomposing body he sees in a leafy pit may be his father.


McCarthy began the book in college and finished it in Chicago, where he worked part time in an auto-parts warehouse. "I never had any doubts about my abilities," he says. "I knew I could write. I just had to figure out how to eat while doing this." In 1961 he married Lee Holleman, whom he had met at college; they had a son, Cullen (now an architecture student at Princeton), and quickly divorced, the yet-unpublished writer taking off for Asheville, N.C., and New Orleans. Asked if he had ever paid alimony, McCarthy snorts. "With what?" He recalls his expulsion from a $40-a-month room in the French Quarter for nonpayment of rent.


After three years of writing, he packed off the manuscript to Random House -- "it was the only publisher I had heard of." Eventually it reached the desk of the legendary Albert Erskine, who had been Faulkner's last editor as well as the sponsor for "Under the Volcano" by Malcolm Lowry and "Invisible Man" by Ralph Ellison. Erskine recognized McCarthy as a writer of the same caliber and, in the sort of relationship that scarcely exists anymore in American publishing, edited him for the next 20 years. "There is a father-son feeling," says Erskine, despite the fact, as he sheepishly admits, that "we never sold any of his books."


For years McCarthy seems to have subsisted on awards money he earned for "The Orchard Keeper" -- including grants from the American Academy of Arts and Letters, the William Faulkner Foundation and the Rockefeller Foundation. Some of these funds went toward a trip to Europe in 1967, where he met DeLisle, an English pop singer, who became his second wife. They settled for many months on the island of Ibiza in the Mediterranean, where he wrote "Outer Dark," published in 1968, a twisted Nativity story about a girl's search for her baby, the product of incest with her brother. At the end of their independent wanderings through the rural South the brother witnesses, in one of McCarthy's most appalling scenes, the death of his child at the hands of three mysterious killers around a campfire: "Holme saw the blade wink in the light like a long cat's eye slant and malevolent and a dark smile erupted on the child's throat and went all broken down the front of it. The child made no sound. It hung there with its one eye glazing over like a wet stone and the black blood pumping down its naked belly."


"Child of God," published in 1973 after he and DeLisle returned to Tennessee, tested new extremes. The main character, Lester Ballard -- a mass murderer and necrophiliac -- lives with his victims in a series of underground caves. He is based on newspaper reports of such a figure in Sevier County, Tenn. Somehow, McCarthy finds compassion for and humor in Ballard, while never asking the reader to forgive his crimes. No social or psychological theory is offered that might explain him away.


In a long review of the book in The New Yorker, Robert Coles called McCarthy a "novelist of religious feeling," comparing him with the Greek dramatists and medieval moralists. And in a prescient observation he noted the novelist's "stubborn refusal to bend his writing to the literary and intellectual demands of our era," calling him a writer "whose fate is to be relatively unknown and often misinterpreted."


"MOST OF MY FRIENDS FROM those days are dead," McCarthy says. We are sitting in a bar in Juarez, discussing "Suttree," his longest, funniest book, a celebration of the crazies and ne'er-do-wells he knew in Knoxville's dirty bars and poolrooms. McCarthy doesn't drink anymore -- he quit 16 years ago in El Paso, with one of his young girlfriends -- and "Suttree" reads like a farewell to that life. "The friends I do have are simply those who quit drinking," he says. "If there is an occupational hazard to writing, it's drinking."


Written over about 20 years and published in 1979, "Suttree" has a sensitive and mature protagonist, unlike any other in McCarthy's work, who ekes out a living on a houseboat, fishing in the polluted city river, in defiance of his stern, successful father. A literary conceit -- part Stephen Dedalus, part Prince Hal -- he is also McCarthy, the willful outcast. Many of the brawlers and drunkards in the book are his former real-life companions. "I was always attracted to people who enjoyed a perilous life style," he says. Residents of the city are said to compete to find themselves in the text, which has displaced "A Death in the Family" by James Agee as Knoxville's novel.


McCarthy began "Blood Meridian" after he had moved to the Southwest, without DeLisle. "He always thought he would write the great American western," says a still-smarting DeLisle, who typed "Suttree" for him -- "twice, all 800 pages." Against all odds, they remain friends. If "Suttree" strives to be "Ulysses," "Blood Meridian" has distinct echoes of "Moby-Dick," McCarthy's favorite book. A mad hairless giant named Judge Holden makes florid speeches not unlike Captain Ahab's. Based on historical events in the Southwest in 1849-50 (McCarthy learned Spanish to research it), the book follows the life of a mythic character called "the kid" as he rides around with John Glanton, who was the leader of a ferocious gang of scalp hunters. The collision between the inflated prose of the 19th-century novel and nasty reality gives "Blood Meridian" its strange, hellish character. It may be the bloodiest book since "The Iliad."


"I've always been interested in the Southwest," McCarthy says blandly. "There isn't a place in the world you can go where they don't know about cowboys and Indians and the myth of the West."


More profoundly, the book explores the nature of evil and the allure of violence. Page after page, it presents the regular, and often senseless, slaughter that went on among white, Hispanic and Indian groups. There are no heroes in this vision of the American frontier.


"There's no such thing as life without bloodshed," McCarthy says philosophically. "I think the notion that the species can be improved in some way, that everyone could live in harmony, is a really dangerous idea. Those who are afflicted with this notion are the first ones to give up their souls, their freedom. Your desire that it be that way will enslave you and make your life vacuous."


This tooth-and-claw view of reality would seem not to accept the largesse of philanthropies. Then again, McCarthy is no typical reactionary. Like Flannery O'Connor, he sides with the misfits and anachronisms of modern life against "progress." His play, "The Stonemason," written a few years ago and scheduled to be performed this fall at the Arena Stage in Washington, is based on a Southern black family he worked with for many months. The breakdown of the family in the play mirrors the recent disappearance of stoneworking as a craft.


"Stacking up stone is the oldest trade there is," he says, sipping a Coke. "Not even prostitution can come close to its antiquity. It's older than anything, older than fire. And in the last 50 years, with hydraulic cement, it's vanishing. I find that rather interesting."


BY COMPARISON WITH the sonority and carnage of "Blood Meridian," the world of "All the Pretty Horses" is less risky -- repressed but sane. The main character, a teen-ager named John Grady Cole, leaves his home in West Texas in 1949 after the death of his grandfather and during his parents' divorce, convincing his friend Lacey Rawlins they should ride off to Mexico.


Dialogue rather than description predominates, and the comical exchanges between the young men have a bleak music, as though their words had been whittled down by the wind off the desert:


They rode. You ever get ill at ease? said Rawlins.
About what?
I dont know. About anything. Just ill at ease.
Sometimes. If you're someplace you aint supposed to be I guess you'd be ill at ease. Should be anyways.
Well suppose you were ill at ease and didnt know why. Would that mean that you might be someplace you wasn't supposed to be and didnt know it?
What the hell's wrong with you?
I dont know. Nothin. I believe I'll sing.
He did.


A linear tale of boyish episodes -- they meet vaqueros, are joined by a hapless companion, break horses on a hacienda and are thrown in jail -- the book has a sustained innocence and a lucidity new in McCarthy's work. There is even a budding love story.


"You haven't come to the end yet," says McCarthy, when asked about the low body count. "This may be nothing but a snare and a delusion to draw you in, thinking that all will be well."


The book is, in fact, the first volume of a trilogy; the third part has existed for more than 10 years as a screenplay. He and Richard Pearce have come close to making the film -- Sean Penn was interested -- but producers always became skittish about the plot, which has as its central relationship John Grady Cole's love for a teen-age Mexican prostitute.


Knopf is revving up the publicity engines for a campaign that they hope will bring McCarthy his overdue recognition. Vintage will reissue "Suttree" and "Blood Meridian" next month, and the rest of his work shortly thereafter. McCarthy, however, won't be making the book-signing circuit. During my visit he was at work in the mornings on Volume 2 of the trilogy, which will require another extended trip through Mexico.


"The great thing about Cormac is that he's in no rush," Pearce says. "He is absolutely at peace with his own rhythms and has complete confidence in his own powers."


In a pool hall one afternoon, a loud and youthful establishment in one of El Paso's ubiquitous malls, McCarthy ignores the video games and rock-and-roll and patiently runs out the table. A skillful player, he was a member of a team at this place, an incongruous setting for a man of his conservative demeanor. But more than one of his friends describes McCarthy as a "chameleon, able to adjust easily to any surroundings and company because he seems so secure in what he will and will not do."


"Everything's interesting," McCarthy says. "I don't think I've been bored in 50 years. I've forgotten what it was like."


He bangs away in his stone house or in motels on an Olivetti manual. "It's a messy business," he says about his novel-building. "You wind up with shoe boxes of scrap paper." He likes computers. "But not to write on." That's about all he will discuss about his process of writing. Who types his final drafts now he doesn't say.


Having saved enough money to leave El Paso, McCarthy may take off again soon, probably for several years in Spain. His son, with whom he has lately re-established a strong bond, is to be married there this year. "Three moves is as good as a fire," he says in praise of homelessness.


The psychic cost of such an independent life, to himself and others, is tough to gauge. Aware that gifted American writers don't have to endure the kind of neglect and hardship that have been his, McCarthy has chosen to be hardheaded about the terms of his success. As he commemorates what is passing from memory -- the lore, people and language of a pre-modern age -- he seems immensely proud to be the kind of writer who has almost ceased to exist.

APPROXIMATELY NINE MILLION REVIEWS OF CORMAC MCCARTHY'S BOOK THE ROAD

THE NEW YORK TIMES

In “The Road” a boy and his father lurch across the cold, wretched, wet, corpse-strewn, ashen landscape of a post-apocalyptic world. The imagery is brutal even by Cormac McCarthy’s high standards for despair. This parable is also trenchant and terrifying, written with stripped-down urgency and fueled by the force of a universal nightmare. “The Road” would be pure misery if not for its stunning, savage beauty.

This is an exquisitely bleak incantation — pure poetic brimstone. Mr. McCarthy has summoned his fiercest visions to invoke the devastation. He gives voice to the unspeakable in a terse cautionary tale that is too potent to be numbing, despite the stupefying ravages it describes. Mr. McCarthy brings an almost biblical fury as he bears witness to sights man was never meant to see.

“There is no prophet in the earth’s long chronicle who is not honored here today,” the father says, trying to make his son understand why they inhabit a gray moonscape. “Whatever form you spoke of you were right.” Thus “The Road” keeps pace with the most enterprising doomsayers as death and desperation manifest themselves on every page. And in a perverse miracle it yields one last calamity when it seems that things cannot possibly get worse.

Yet as the boy and man wander, encountering remnants of the lost world and providing the reader with more and more clues about what destroyed it, this narrative is also illuminated by extraordinary tenderness. “He knew only that the child was his warrant,” it says of the father and his mission. “He said: if he is not the word of God God never spoke.”

The father’s loving efforts to shepherd his son are made that much more wrenching by the unavailability of food, shelter, safety, companionship or hope in most places where they scavenge to subsist.

Keeping memory alive is difficult, since the past grows increasingly remote. It is as if these lonely characters are experiencing “the onset of some cold glaucoma dimming away the world.” The past has become like a place inhabited by the newly blind, all of it slowly slipping away. As for looking toward the future, “there is no later,” the book says starkly. “This is later.”

The ruined setting of “The Road” is strewn with terrible, revealing artifacts. There are old newspapers. (“The curious news. The quaint concerns.”) There is one lone bottle of Coca-Cola, still absurdly fizzy when all else is dust. There are charred corpses frozen in their final postures, like the long-dead man who sits on a porch like “a straw man set out to announce some holiday.” Sometimes these prompt the father to recall “a dull rose glow in the windowglass” at 1:17 in the morning, the moment when the clocks stopped forever.

“The Road” is not concerned with explaining what caused this cataclysm. It is more abstract than that. Instead it becomes a relentless cautionary tale with “Lord of the Flies”-style symbolic impact, marked by a dark fascination with the primal laws of survival. Much of its impact comes from the absolute lawlessness of its backdrop as it undermines the father’s only remaining certitude: that he must keep his boy alive no matter what danger befalls them.

As they move down the metaphorical road of the title, father and son encounter all manner of perils. The weather is bitter, the landscape colorless, the threat of starvation imminent. There is also the occasional interloper or ominous relic, since the road is not entirely abandoned.

The sight of a scorched, shuffling man prompts the boy to ask what is wrong with him; the father simply replies that the man has been struck by lightning. Spear-carrying marchers on the road offer other hints about recent history. Groups of people are stowed away in hidden places as if they were other people’s food supply. In a book filled with virtual zombies and fixated on the living dead, it turns out that they are.

Since the cataclysm has presumably incinerated all dictionaries, Mr. McCarthy’s affinity for words like rachitic and crozzled has as much visceral, atmospheric power as precise meaning. His use of language is as exultant as his imaginings are hellish, a hint that “The Road” will ultimately be more radiant than it is punishing. Somehow Mr. McCarthy is able to hold firm to his pessimism while allowing the reader to see beyond it. This is art that both frightens and inspires.

Although “The Road” is entirely unsentimental, it gives father and son a memory to keep them moving, even if it is the memory of how and why the boy’s mother chose to die. She was pregnant when the world exploded, and the boy was born a few days after she and the man “watched distant cities burn.”

Ultimately she gave up and took a bullet: “She was gone and the coldness of it was her final gift.” In a book whose events are isolated and carefully chosen, the appearance of a flare gun late in the story is filled with echoes of her final decision.

The mother’s suicide is one more reason for astonishment at Mr. McCarthy’s final gesture here: an embrace of faith in the face of no hope whatsoever. Coming as it does after such intense moments of despondency, this faith is even more of a leap than it might be in a more forgiving story. It adds immeasurably to the staying power of a book that is simple yet mysterious, simultaneously cryptic and crystal clear.

“The Road” offers nothing in the way of escape or comfort. But its fearless wisdom is more indelible than reassurance could ever be.

NEWSWEEK

For a more than decent summary of the plot of Cormac McCarthy’s latest novel, “The Road,” consult the Library of Congress boilerplate that follows the book’s title page: “1. Fathers and sons-Fiction. 2. Voyages and travels-United States-Fiction. 3. Regression (Civilization)-Fiction. 4. Survival skills-Fiction.” For that matter, it’s not a bad imitation of the novel’s style. Using the stripped-down prose that he employed so effectively in his last book, “No Country for Old Men,” McCarthy spins an entire novel around two people, a father and his young son fighting their way through a post-apocalyptic world reduced to cold ashes and ruins. The action is equally minimal. The man and boy are traveling out of the mountains and toward the coast, searching for warmer weather and hoping to find someone neither malign nor crazy with whom they can join forces. McCarthy never says what happened to bring the world to cinders. Nor does he name his characters, or tell us how old the boy is or where they are exactly. He merely posits a world where everything is bombed out and broken beyond repair, soon to be populated by “men who would eat your children in front of your eyes” and looters who look like “shoppers in the commissaries of hell.” Darkness is a perennial McCarthy theme, but here it is in full flower. “The Road” is the logical culmination of everything he’s written.

It is also, paradoxically, his most humane and compassionate book. Father and son are genuinely affectionate toward each other. Each would give his life if it meant the other could live. This is as far as McCarthy has ever gone to acknowledge the goodness in people. And in the light of that relationship, the question that the novel implicitly poses—how much can you subtract from human existence before it ceases to be human?—takes on heartbreaking force.

“The Road” could have been a novella. Almost everything in the story—scrounging for food, hiding from the “Road Warrior”-like evildoers who haunt the highway—happens more than once. But the tedium that creeps in from time to time is integral to the narrative. Hunger and danger and cold are not just one-time obstacles for these pilgrims but things they must confront again and again; their courage lies in their refusal to give in. The boy and his father call themselves “the good guys.” It’s something a father would say to a son he wanted to guide and protect, but the more you see of these two, the more you want to remove the quotes from those words. They’re not ironic. The characters’ lives are gnawed down to the bone: all they have is their love for each other. And that, in the end, suffices.

One measure of a good writer is the ability to surprise. Terse, unsentimental, bleak—McCarthy’s readers have been down that road before. But who would ever have thought you’d call him touching?

VILLAGE VOICE

Have all of Cormac McCarthy's fictional odysseys been leading to this, a world blasted gray and featureless by human folly and cosmic indifference, inhabited only by pitiless predators and (arguably) lucky survivors? Or is The Road just further rumination from a man who, metaphorically or otherwise, finds himself on unrecognizable terrain in the final years of his life?

Take your pick. The genius of McCarthy's work, whether you find it risible or profound, is in its bold, seamless melding of private revelation, cultural insight, and unabashed philosophizing. Sci-fi divination is new for him, though, and the freshness he brings to this end-of-the-world narrative is quite stunning: It may be the saddest, most haunting book he's ever written, or that you'll ever read.

His previous novel, No Country for Old Men, was nothing if not pre-apocalyptic, and The Road fulfills that bitter promise in spades. Its stripped-down story, however, couldn't be more removed from the doggedly elliptical No Country: A man and his young son trek southwesterly through an unnamed, nuclear-winterized landscape in search of warmth and on the run from bands of cannibalistic outlaws. As the pair scavenge for food and comfort among eerily abandoned towns and withered forests, they provide each other with—just barely—a reason not to lie down and die.

Never one to indulge in explosive action (he's more the propulsive type—"they went on" is this tale's Blood Meridian–like mantra), McCarthy holds back even more than usual here. The milieu—a sprawling, horizonless vale of drifting ash and spindly rubble, "the ponderous counterspectacle of things ceasing to be"—is startling for its lack of customary descriptive detail, and the book is all the more wrenching for it; the degree of ruin might make even Judge Holden blanch. McCarthy underplays the familiar last-man-on-earth pulp accoutrements as well, making The Road more Time of the Wolf than Mad Max, and more Kuroi Ame than either of those (devoid of that novel's debatable reassurance that the world was more or less intact after Hiroshima's incineration).

It's also McCarthy's purest fable yet. The troubled bond that links the man, a compulsive isolationist tyrannized by his fading memories ("The names of things slowly following those things into oblivion. Colors. The names of birds. Things to eat. Finally the names of things one believed to be true"), and the boy, who longs to stop and make contact with other "good guys" on the road, is key to the book's mythic scope: Its argument exists in the tension between the rank self-centeredness necessary to survive as an individual and the altruism required to survive as a species. As such, it seems as much McCarthy's second response to the West's accelerated social erosion (the frankly bewildered No Country being the first) as a heartsick accounting of irretrievable extinction.

The Road also represents a more personal reckoning, albeit a less angry one than its predecessor. Despite the apocalyptic setting, McCarthy lets down his cynical guard enough to suggest that the future—to say nothing of the present—invariably resembles a wasteland when viewed from the vantage point of someone with an abundance of past. (Not that he's lost his edge; there are plenty of robust allusions to Western lit's better-known Father and Son act here, too.) It's a gentle, compassionate gesture, and hints that this could well be McCarthy's swan song—potential bad news for his fans.

Whether or not that's the case, they should be satisfied with the current offering's characteristic helpings of hypnotic, gut-punching prose and bracing depictions of emotional longing ("She held his hand in her lap and he could feel the tops of her stockings through the thin stuff of her summer dress. Freeze this frame. Now call down your dark and your cold and be damned" )—qualities McCarthy's detractors seem bizarrely content to underestimate or overlook. Indeed, for all its allegorical underpinnings and stark grandeur, the tender precariousness of The Road's human relationships is what finally makes it such a beautiful, difficult, near perfect work.

THE GUARDIAN

Shorn of history and context, Cormac McCarthy's other nine novels could be cast as rungs, with The Road as a pinnacle. This is a very great novel, but one that needs a context in both the past and in so-called post-9/11 America.

We can divide the contemporary American novel into two traditions, or two social classes. The Tough Guy tradition comes up from Fenimore Cooper, with a touch of Poe, through Melville, Faulkner and Hemingway. The Savant tradition comes from Hawthorne, especially through Henry James, Edith Wharton and Scott Fitzgerald. You could argue that the latter is liberal, east coast/New York, while the Tough Guys are gothic, reactionary, nihilistic, openly religious, southern or fundamentally rural.

The Savants' blood line (curiously unrepresentative of Americans generally) has gained undoubted ascendancy in the literary firmament of the US. Upper middle class, urban and cosmopolitan, they or their own species review themselves. The current Tough Guys are a murder of great, hopelessly masculine, undomesticated writers, whose critical reputations have been and still are today cruelly divergent, adrift and largely unrewarded compared to the contemporary Savant school. In literature as in American life, success must be total and contrasted "failure" fatally dispiriting.

But in both content and technical riches, the Tough Guys are the true legislators of tortured American souls. They could include novelists Thomas McGuane, William Gaddis, Barry Hannah, Leon Rooke, Harry Crews, Jim Harrison, Mark Richard, James Welch and Denis Johnson. Cormac McCarthy is granddaddy to them all. New York critics may prefer their perfidy to be ignored, comforting themselves with the superlatives for All the Pretty Horses, but we should remember that the history of Cormac McCarthy and his achievement is not an American dream but near on 30 years of neglect for a writer who, since The Orchard Keeper in 1965, produced only masterworks in elegant succession. Now he has given us his great American nightmare.

The Road is a novel of transforming power and formal risk. Abandoning gruff but profound male camaraderie, McCarthy instead sounds the limits of imaginable love and despair between a diligent father and his timid young son, "each other's world entire". The initial experience of the novel is sobering and oppressive, its final effect is emotionally shattering.

America - and presumably the world - has suffered an apocalypse the nature of which is unclear and, faced with such loss, irrelevant. The centre of the world is sickened. Earthquakes shunt, fire storms smear a "cauterised terrain", the ash-filled air requires slipshod veils to cover the mouth. Nature revolts. The ruined world is long plundered, with canned food and good shoes the ultimate aspiration. Almost all have plunged into complete Conradian savagery: murdering convoys of road agents, marauders and "bloodcults" plunder these wastes. Most have resorted to cannibalism. One passing brigade is fearfully glimpsed: "Bearded, their breath smoking through their masks. The phalanx following carried spears or lances ... and lastly a supplementary consort of catamites illclothed against the cold and fitted in dogcollars and yoked each to each." Despite this soul desert, the end of God and ethics, the father still defines and endangers himself by trying to instil moral values in his son, by refusing to abandon all belief.

All of this is utterly convincing and physically chilling. The father is coughing blood, which forces him and his son, "in their rags like mendicant friars sent forth to find their keep", on to the treacherous road southward, towards a sea and - possibly - survivable, milder winters. They push their salvage in a shopping cart, wryly fitted with a motorcycle mirror to keep sentinel over that road behind. The father has a pistol, with two bullets only. He faces the nadir of human and parental existence; his wife, the boy's mother, has already committed suicide. If caught, the multifarious reavers will obviously rape his son, then slaughter and eat them both. He plans to shoot his son - though he questions his ability to do so - if they are caught. Occasionally, between nightmares, the father seeks refuge in dangerously needy and exquisite recollections of our lost world.

They move south through nuclear grey winter, "like the onset of some cold glaucoma dimming away the world", sleeping badly beneath filthy tarpaulin, setting hidden campfires, exploring ruined houses, scavenging shrivelled apples. We feel and pity their starving dereliction as, despite the profound challenge to the imaginative contemporary novelist, McCarthy completely achieves this physical and metaphysical hell for us. "The world shrinking down to a raw core of parsible entities. The names of things slowly following those things into oblivion. Colours. The names of birds. Things to eat. Finally the names of things one believed to be true."

Such a scenario allows McCarthy finally to foreground only the very basics of physical human survival and the intimate evocation of a destroyed landscape drawn with such precision and beauty. He makes us ache with nostalgia for restored normality. The Road also encapsulates the usual cold violence, the biblical tincture of male masochism, of wounds and rites of passage. His central character can adopt a universal belligerence and misanthropy. In this damnation, rightly so, everyone, finally, is the enemy. He tells his son: "My job is to take care of you. I was appointed by God to do that ... We are the good guys." The other uncomfortable, tellingly national moment comes when the father salvages perhaps the last can of Coke in the world. This is truly an American apocalypse.

The vulnerable cultural references for this daring scenario obviously come from science fiction. But what propels The Road far beyond its progenitors are the diverted poetic heights of McCarthy's late-English prose; the simple declamation and plainsong of his rendered dialect, as perfect as early Hemingway; and the adamantine surety and utter aptness of every chiselled description. As has been said before, McCarthy is worthy of his biblical themes, and with some deeply nuanced paragraphs retriggering verbs and nouns that are surprising and delightful to the ear, Shakespeare is evoked. The way McCarthy sails close to the prose of late Beckett is also remarkable; the novel proceeds in Beckett-like, varied paragraphs. They are unlikely relatives, these two artists in old age, cornered by bleak experience and the rich limits of an English pulverised down through despair to a pleasingly wry perfection. "He rose and stood tottering in that cold autistic dark with his arms out-held for balance while the vestibular calculations in his skull cranked out their reckonings. An old chronicle."

Set piece after set piece, you will read on, absolutely convinced, thrilled, mesmerised with disgust and the fascinating novelty of it all: breathtakingly lucky escapes; a complete train, abandoned and alone on an embankment; a sudden liberating, joyous discovery or a cellar of incarcerated amputees being slowly eaten. And everywhere the mummified dead, "shrivelled and drawn like latterday bogfolk, their faces of boiled sheeting, the yellowed palings of their teeth".

All the modern novel can do is done here. After the great historical fictions of the American west, Blood Meridian and The Border Trilogy, The Road is no artistic pinnacle for McCarthy but instead a masterly reclamation of those midnight-black, gothic worlds of Outer Dark (1968) and the similarly terrifying but beautiful Child of God (1973). How will this vital novel be positioned in today's America by Savants, Tough Guys or worse? Could its nightmare vistas reinforce those in the US who are determined to manipulate its people into believing that terror came into being only in 2001? This text, in its fragility, exists uneasily within such ill times. It's perverse that the scorched earth which The Road depicts often brings to mind those real apocalypses of southern Iraq beneath black oil smoke, or New Orleans - vistas not unconnected with the contemporary American regime.

One night, when the father thinks that he and his son will starve to death, he weeps, not about the obvious but about beauty and goodness, "things he'd no longer any way to think about". Camus wrote that the world is ugly and cruel, but it is only by adding to that ugliness and cruelty that we sin most gravely. The Road affirms belief in the tender pricelessness of the here and now. In creating an exquisite nightmare, it does not add to the cruelty and ugliness of our times; it warns us now how much we have to lose. It makes the novels of the contemporary Savants seem infantile and horribly over-rated. Beauty and goodness are here aplenty and we should think about them. While we can.

CORMAC MCCARTHY WEBSITE

Saturday, December 23, 2006

NEO CULPA
from Vanity Fair

I: About That Cakewalk …

I remember sitting with Richard Perle in his suite at London's Grosvenor House hotel and receiving a private lecture on the importance of securing victory in Iraq. "Iraq is a very good candidate for democratic reform," he said. "It won't be Westminster overnight, but the great democracies of the world didn't achieve the full, rich structure of democratic governance overnight. The Iraqis have a decent chance of succeeding."

In addition to a whiff of gunpowder, Perle seemed to exude the scent of liberation—not only for Iraqis, but for all the Middle East. After the fall of Saddam Hussein, Perle suggested, Iranian reformers would feel emboldened to change their own regime, while Syria would take seriously American demands to cease its support for terrorists.

Perle had spent much of the 1990s urging the ouster of Saddam Hussein. He was aligned with the Project for the New American Century, a neoconservative think tank that agitated for Saddam's removal, and he had helped to engineer the 1998 Iraq Liberation Act, which established regime change as formal U.S. policy. After the accession of George W. Bush, in 2001, Perle was appointed chairman of the Pentagon's Defense Policy Board Advisory Committee, and at its first meeting after 9/11—attended by Defense Secretary Donald Rumsfeld; his deputy, Paul Wolfowitz; and Rumsfeld's No. 3, Douglas Feith—Perle arranged a presentation from the exiled Iraqi dissident Ahmad Chalabi. Perle wanted to shut down terrorist havens—not only in Afghanistan but also in Iraq. When we spoke at Grosvenor House, it was late February 2003, and the culmination of all this effort—Operation Iraqi Freedom—was less than a month away.

Three years later, Perle and I meet again, at his home outside Washington, D.C. It is October 2006, the worst month for U.S. casualties in Iraq in nearly two years, and Republicans are bracing for what will prove to be sweeping losses in the upcoming midterm elections. As he looks into my eyes, speaking slowly and with obvious deliberation, Perle is unrecognizable as the confident hawk I once knew. "The levels of brutality that we've seen are truly horrifying, and I have to say, I underestimated the depravity," Perle says, adding that total defeat—an American withdrawal that leaves Iraq as an anarchic "failed state"—is not yet inevitable, but is becoming more likely. "And then," he says, "you'll get all the mayhem that the world is capable of creating."

According to Perle, who left the Defense Policy Board in 2004, this unfolding catastrophe has a central cause: devastating dysfunction within the Bush administration. The policy process has been nothing short of "disastrous," he says. "The decisions did not get made that should have been. They didn't get made in a timely fashion, and the differences were argued out endlessly. At the end of the day, you have to hold the president responsible.… I think he was led to believe that things were chugging along far more purposefully and coherently than in fact they were. I think he didn't realize the depth of the disputes underneath. I don't think he realized the extent of the opposition within his own administration, and the disloyalty."

Perle goes as far as to say that, if he had his time over, he would not advocate an invasion of Iraq: "I think if I had been delphic, and had seen where we are today, and people had said, 'Should we go into Iraq?,' I think now I probably would have said, 'No, let's consider other strategies for dealing with the thing that concerns us most, which is Saddam supplying weapons of mass destruction to terrorists.' … I don't say that because I no longer believe that Saddam had the capability to produce weapons of mass destruction, or that he was not in contact with terrorists. I believe those two premises were both correct. Could we have managed that threat by means other than a direct military intervention? Well, maybe we could have."

Having spoken with Perle, I wonder: What do the rest of the war's neoconservative proponents think? If the much-caricatured "Prince of Darkness" is now plagued with doubt, how do his comrades-in-arms feel? I am particularly interested in finding out because I interviewed some of the neocons before the invasion and, like many people, found much to admire in their vision of spreading democracy in the Middle East.

I expect to encounter disappointment. What I find instead is despair, and fury at the incompetence of the Bush administration many neocons once saw as their brightest hope.

David Frum, the former White House speechwriter who co-wrote Bush's 2002 State of the Union address, accusing Iraq of being part of an "axis of evil," says it now looks as if defeat may be inescapable, because "the insurgency has proven it can kill anyone who cooperates, and the United States and its friends have failed to prove that it can protect them. If you are your typical, human non-hero, then it's very hard at this point to justify to yourself and your family taking any risks at all on behalf of the coalition." This situation, he says, must ultimately be blamed on "failure at the center."

Kenneth Adelman, a longtime neocon activist and Pentagon insider who has served on the Defense Policy Board, wrote a famous op-ed article in The Washington Post in February 2002, arguing, "I believe that demolishing Hussein's military power and liberating Iraq would be a cakewalk." Now he says, "I am extremely disappointed by the outcome in Iraq, because I just presumed that what I considered to be the most competent national-security team since Truman was indeed going to be competent. They turned out to be among the most incompetent teams in the postwar era. Not only did each of them, individually, have enormous flaws, but together they were deadly, dysfunctional."

Fearing that worse is still to come, Adelman believes that neoconservatism itself—what he defines as "the idea of a tough foreign policy on behalf of morality, the idea of using our power for moral good in the world"—is dead, at least for a generation. After Iraq, he says, "it's not going to sell." And if he, too, had his time over, Adelman says, "I would write an article that would be skeptical over whether there would be a performance that would be good enough to implement our policy. The policy can be absolutely right, and noble, beneficial, but if you can't execute it, it's useless, just useless. I guess that's what I would have said: that Bush's arguments are absolutely right, but you know what? You just have to put them in the drawer marked CAN'T DO. And that's very different from LET'S GO."

James Woolsey, another Defense Policy Board member, who served as director of the C.I.A. under President Clinton, lobbied for an Iraq invasion with a prodigious output of articles, speeches, and television interviews. At a public debate hosted by Vanity Fair in September 2004, he was still happy to argue for the motion that "George W. Bush has made the world a safer place." Now he draws explicit parallels between Iraq and Vietnam, aghast at what he sees as profound American errors that have ignored the lessons learned so painfully 40 years ago. He has not given up hope: "As of mid-October of '06, the outcome isn't clear yet." But if, says Woolsey, as now seems quite possible, the Iraqi adventure ends with American defeat, the consequences will be "awful, awful.… It will convince the jihadis and al-Qaeda-in-Iraq types as well as the residual Ba'thists that we are a paper tiger, and they or anybody they want to help can take us on anywhere and anytime they want and be effective, that we don't have the stomach to stay and fight."

Professor Eliot Cohen of Johns Hopkins University's School of Advanced International Studies, yet another Defense Policy Board member and longtime advocate of ousting Saddam Hussein, is even more pessimistic: "People sometimes ask me, 'If you knew then what you know now, would you still have been in favor of the war?' Usually they're thinking about the W.M.D. stuff. My response is that the thing I know now that I did not know then is just how incredibly incompetent we would be, which is the most sobering part of all this. I'm pretty grim. I think we're heading for a very dark world, because the long-term consequences of this are very large, not just for Iraq, not just for the region, but globally—for our reputation, for what the Iranians do, all kinds of stuff."

II: Let the Finger-Pointing Begin

I turn in my piece on Thursday, November 2—five days before the midterm elections. The following day, the editors phone to say that its contents—especially the comments by Perle, Adelman, and Frum—are so significant and unexpected that they have decided to post an excerpt that afternoon on the magazine's Web site, vanityfair.com.

The abridged article goes up at about 4:45 P.M., eastern standard time. Its impact is almost immediate. Within minutes, George Stephanopoulos confronts Vice President Dick Cheney with Perle's and Adelman's criticisms during an on-camera interview. Cheney blanches and declines to comment, other than to say that the administration remains committed to its Iraq policy and will continue to pursue it, "full speed ahead." By the next morning, news of the neocons' about-face has been picked up by papers, broadcasters, and blogs around the world, despite a White House spokesperson's attempt to dismiss it as "Monday-morning quarterbacking."

Some of my interviewees, Richard Perle included, protest in a forum on National Review Online that they were misled, because they believed that their words would not be published until V.F.'s January issue hit newsstands—after the midterms. Posting a preview on the Web, they say, was a "partisan" attempt to score political points. In response, the magazine issues a statement: "At a time when Vice President Dick Cheney is saying that the administration is going 'full speed ahead' with its policy in Iraq and that 'we've got the basic strategy right,' and the president is stating that Defense Secretary Rumsfeld's job is secure, we felt that it was in the public's interest to hear now, before the election, what the architects of the Iraq war are saying about its mission and execution."

Some of the neocons also claim that the Web excerpt quotes them out of context—implying, perhaps, that in other parts of their interviews they had praised the performance of Bush and his administration. That charge is untrue. Meanwhile, not all the neocons are unhappy. On Wednesday, November 8, with news of the Democratic takeover of Congress still fresh and Rumsfeld's resignation still hours away, I receive an e-mail from Adelman. "I totally agree with you," he writes. "Why keep Issue #1 behind closed doors until the American people have a chance to vote? That's why I was (among the only ones) not giving any 'rebuttal' to the [Web] release, despite being asked and pressured to do so, since I think it's just fine to get word out when it could make a difference to people.

"Plus I personally had no rebuttal. I thought the words I read from you were fair and right on target."

A cynic might argue that, since the Iraqi disaster has become so palpably overwhelming, the neocons are trashing what is left of Bush's reputation in the hope of retaining theirs. Given the outcome of the midterms, it also seems likely that these interviews are the first salvos in a battle to influence how history will judge the war. The implications will be profound—not only for American conservatism but also for the future direction and ambitions of American foreign policy. The neocons' position in this debate starts with an unprovable assertion: that when the war began, Iraq was "a doable do," to use a military planner's phrase cited by David Frum. If not for the administration's incompetence, they say, Saddam's tyranny could have been replaced with something not only better but also secure. "Huge mistakes were made," Richard Perle says, "and I want to be very clear on this: they were not made by neoconservatives, who had almost no voice in what happened, and certainly almost no voice in what happened after the downfall of the regime in Baghdad. I'm getting damn tired of being described as an architect of the war. I was in favor of bringing down Saddam. Nobody said, 'Go design the campaign to do that.' I had no responsibility for that."

Some of those who did have responsibility, and were once the most gung-ho, are also losing heart. In December 2005, I spoke with Douglas Feith, the former undersecretary of defense for policy, whose Office of Special Plans was reportedly in charge of policy planning for the invasion and its aftermath. He told me then, "I have confidence that in 20 to 30 years people will be happy we removed Saddam Hussein from power and will say we did the right thing. They will look back and say that our strategic rationale was sound, and that through doing this we won a victory in the war on terror."

When we talk again, in October 2006, Feith sounds less certain. It is beginning to seem possible that America will withdraw before Iraq achieves stability, he says, and if that happens his previous statement would no longer be justified. "There would be a lot of negative consequences," he says, adding that America's enemies, including Osama bin Laden, have attacked when they perceived weakness. Leaving Iraq as a failed state, Feith concludes, "would wind up hurting the United States and the interests of the civilized world." In 2005, Feith thought failure unimaginable. Now he broods on how it may occur, and envisions its results.

At the end of 2003, Richard Perle and David Frum published a book, An End to Evil: How to Win the War on Terror. Neoconservatives do not make up an organized bloc—much less a "cabal," as is sometimes alleged—but the book ends with a handy summary of their ideas. Foreign policy, write Perle and Frum, should attempt to achieve not only the realist goals of American wealth and security but also less tangible ends that benefit mankind. The neoconservative dream, they say, is similar to that which inspired the founders of the United Nations after World War II: "A world at peace; a world governed by law; a world in which all peoples are free to find their own destinies." But in Perle and Frum's view, the U.N. and similar bodies have failed, leaving "American armed might" as the only force capable of bringing this Utopian world into being. "Our vocation is to support justice with power," they write. "It is a vocation that has earned us terrible enemies. It is a vocation that has made us, at our best moments, the hope of the world."

Although Perle was one of the first to frame the case for toppling Saddam in realist terms of the threat of W.M.D.—in a letter he sent to Clinton in February 1998 whose 40 signatories included Rumsfeld, Wolfowitz, and Feith—he insists that the idealist values outlined in his book shaped the way he and his allies always believed the war should be fought. At the heart of their program was an insistence that, no matter how Saddam was deposed, Iraqis had to be allowed to take charge of their destiny immediately afterward.

In the 1990s, the neocons tried to secure American air and logistical support for an assault on Saddam by a "provisional government" based in Kurdistan—a plan derided by former CentCom chief General Anthony Zinni as a recipe for a "Bay of Goats." After 9/11, as America embarked on the path to war in earnest, they pushed again for the recognition of a provisional Iraqi government composed of former exiles, including Chalabi. In addition to acting as a magnet for new defectors from the Iraqi military and government, they argued, this government-in-exile could assume power as soon as Baghdad fell. The neocons, represented inside the administration by Feith and Wolfowitz, also unsuccessfully demanded the training of thousands of Iraqis to go in with coalition troops.

The failure to adopt these proposals, neocons outside the administration now say, was the first big American error, and it meant that Iraqis saw their invaders as occupiers, rather than liberators, from the outset. "Had they gone in with even just a brigade or two of well-trained Iraqis, I think things could have been a good deal different," James Woolsey tells me at his law office, in McLean, Virginia. "That should have been an Iraqi that toppled that statue of Saddam." Drawing a comparison to the liberation of France in World War II, he recalls how "we stood aside and saw the wisdom of having [the Free French leaders] de Gaulle and Leclerc lead the victory parade through Paris in the summer of '44." The coalition, he says, should have seen the symbolic value of allowing Iraqis to "take" Baghdad in 2003. He draws another historical parallel, to the U.S. campaigns against Native Americans in the 19th century, to make another point: that the absence of Iraqi auxiliaries deprived coalition soldiers of invaluable local intelligence. "Without the trained Iraqis, it was like the Seventh Cavalry going into the heart of Apache country in Arizona in the 1870s with no scouts. No Apache scouts. I mean, hello?"

If the administration loaded the dice against success with its pre-war decisions, Kenneth Adelman says, it made an even greater blunder when Saddam's regime fell. "The looting was the decisive moment," Adelman says. "The moment this administration was lost was when Donald Rumsfeld took to the podium and said, 'Stuff happens. This is what free people do.' It's not what free people do at all: it's what barbarians do. Rumsfeld said something about free people being free to make mistakes. But the Iraqis were making 'mistakes' by ruining their country while the U.S. Army stood there watching!" Once Rumsfeld and General Tommy Franks failed to order their forces to intervene—something Adelman says they could have done—several terrible consequences became inevitable. Among them, he tells me over lunch at a downtown-D.C. restaurant, was the destruction of Iraq's infrastructure, the loss of documents that might have shed light on Saddam's weapons capabilities, and the theft from Iraq's huge munitions stores of tons of explosives "that they're still using to kill our kids." The looting, he adds, "totally discredited the idea of democracy, since this 'democracy' came in tandem with chaos." Worst of all, "it demolished the sense of the invincibility of American military power. That sense of invincibility is enormously valuable when you're trying to control a country. It means, 'You fuck with this guy, you get your head blown off.' All that was destroyed when the looting began and was not stopped."

According to Frum, there was a final ingredient fueling the wildfire spread of violence in the second half of 2003: intelligence failures that were, in terms of their effects, even "grosser" than those associated with the vanishing weapons. "The failure to understand the way in which the state was held together was more total," he tells me in his office at the neoconservative think tank the American Enterprise Institute (A.E.I.). America assumed it was invading a functional, secular state, whose institutions and lines of control would carry on functioning under new leadership. Instead, partly as a result of the 1990s sanctions, it turned out to be a quasi-medieval society where Saddam had secured the loyalty of tribal sheikhs and imams with treasure and S.U.V.'s. Here, Frum says, another disadvantage of not having an Iraqi provisional government made itself felt: "There's no books, there's no treasury, and he's distributing. One guy gets a Land Rover, another guy gets five Land Rovers, somebody else gets a sack of gold.… That is information that only an Iraqi is going to have, and this is something I said at the time: that Iraq is going to be ruled either through terror or through corruption. Saddam knew how to do it through terror. Ahmad Chalabi would have known how to do it through corruption. What we are now trying to do, in the absence of the knowledge of who has to be rewarded, is to do a lot of things through force." The state had ceased to "deliver" rewards to loyalists, and in that vacuum the insurgency began to flourish.

III: The Trouble with Bush and Rice

As V.F. first revealed, in the May 2004 issue, Bush was talking about invading Iraq less than two weeks after 9/11, broaching the subject at a private White House dinner with British prime minister Tony Blair on September 20, 2001. With so much time to prepare, how could the aftermath have begun so badly? "People were aware in February or March of 2003 that the planning was not finished," Frum says. "There was not a coherent plan, and in the knowledge that there was not a coherent plan, there was not the decision made to wait." The emphasis here needs to be on the word "coherent." In fact, as Frum points out, there were several plans: the neocons' ideas outlined above, a British proposal to install their client Iyad Allawi, and suggestions from the State Department for a government led by the octogenarian Adnan Pachachi. To hear Frum tell it, the State Department was determined to block the neocons' anointed candidate, Ahmad Chalabi, and therefore resisted both Iraqi training and a provisional government, fearing that these measures would boost his prospects.

It would have been one thing, the neocons say, if their plan had been passed over in favor of another. But what really crippled the war effort was the administration's failure, even as its soldiers went to war, to make a decision. Less than three weeks before the invasion, Bush said in a rousing, pro-democracy speech to the A.E.I., "The United States has no intention of determining the precise form of Iraq's new government. That choice belongs to the Iraqi people." But with the administration unable to decide among Allawi, Pachachi, and Chalabi, the Iraqis ultimately were given no say. Instead, L. Paul Bremer III soon assumed almost unlimited powers as America's proconsul, assisted by a so-called Governing Council, which he was free to ignore and which, to judge by Bremer's memoir, he regarded as a contemptible irritant.

The place where such interagency disputes are meant to be resolved is the National Security Council, chaired during Bush's first term by Condoleezza Rice, who was national-security adviser at the time. A.E.I. Freedom Scholar Michael Ledeen—whose son, Gabriel, a lieutenant in the Marines, recently returned from a tour of duty in Iraq—served as a consultant to the N.S.C. under Ronald Reagan and says the council saw its role as "defining the disagreement" for the president, who would then make up his mind. "After that, we'd move on to the next fight." But Rice, says Ledeen, saw her job as "conflict resolution, so that when [then secretary of state Colin] Powell and Rumsfeld disagreed, which did happen from time to time, she would say to [then deputy national-security adviser Stephen] Hadley or whomever, 'O.K., try to find some middle ground where they can both agree.' So then it would descend at least one level in the bureaucracy, and people would be asked to draft new memos." By this process, Ledeen complains, "thousands of hours were wasted by searching for middle ground, which most of the time will not exist." Sometimes—as with the many vital questions about postwar Iraq—"it may well have been too late" by the time decisions emerged.

"The National Security Council was not serving [Bush] properly," says Richard Perle, who believes that the president failed to tackle this shortcoming because of his personal friendship with Rice. "He regarded her as part of the family." (Rice has spent weekends and holidays with the Bushes.) The best way to understand this aspect of the Bush administration, says Ledeen, is to ask, Who are the most powerful people in the White House? "They are women who are in love with the president: Laura [Bush], Condi, Harriet Miers, and Karen Hughes." He cites the peculiar comment Rice reportedly made at a dinner party in 2004, when she referred to Bush as "my husb—" before catching herself. "That's what we used to call a Freudian slip," Ledeen remarks.

Whatever the N.S.C.'s deficiencies, say the neocons, the buck has to stop with the president. "In the administration that I served," says Perle, who was an assistant secretary of defense under Reagan, there was a "one-sentence description of the decision-making process when consensus could not be reached among disputatious departments: 'The president makes the decision.'" Yet Bush "did not make decisions, in part because the machinery of government that he nominally ran was actually running him." That, I suggest, is a terrible indictment. Perle does not demur: "It is." Accepting that, he adds, is "painful," because on the occasions he got an insight into Bush's thinking Perle felt "he understood the basic issues and was pursuing policies that had a reasonable prospect of success." Somehow, those instincts did not translate into actions.

On the question of Bush, the judgments of some of Perle's ideological allies are harsher. Frank Gaffney also served under Reagan as an assistant secretary of defense; he is now president of the hawkish Center for Security Policy, which has close ties with the upper echelons of the Pentagon. Gaffney describes the administration as "riven," arguing that "the drift, the incoherence, the mixed signals, the failure to plan this thing [Iraq] rigorously were the end product of that internal dynamic." His greatest disappointment has been the lack of resolution displayed by Bush himself: "This president has tolerated, and the people around him have tolerated, active, ongoing, palpable insubordination and skulduggery that translates into subversion of his policies.… He doesn't in fact seem to be a man of principle who's steadfastly pursuing what he thinks is the right course," Gaffney says. "He talks about it, but the policy doesn't track with the rhetoric, and that's what creates the incoherence that causes us problems around the world and at home. It also creates the sense that you can take him on with impunity."

In 2002 and '03, Danielle Pletka, a Middle East expert at the A.E.I., arranged a series of conferences on the future of Iraq. At one I attended, in October 2002, Perle and Chalabi were on the platform, while in the audience were a Who's Who of Iraq policymakers from the Pentagon and the vice president's office. Pletka's bitterness now is unrestrained. "I think that even though the president remains rhetorically committed to the idea of what he calls his 'freedom agenda,' it's over," she says. "It turns out we stink at it. And we don't just stink at it in Iraq. We stink at it in Egypt. And in Lebanon. And in the Palestinian territories. And in Jordan. And in Yemen. And in Algeria. And everywhere else we try at it. Because, fundamentally, the message hasn't gotten out to the people on the ground.… There is no one out there saying, 'These are the marching orders. Follow them or go and find a new job.' That was what those fights were about. And the true believers lost. Now, that's not to say had they won, everything would be coming up roses. But I do think that we had a window of opportunity to avert a lot of problems that we now see."

For Kenneth Adelman, "the most dispiriting and awful moment of the whole administration was the day that Bush gave the Presidential Medal of Freedom to [former C.I.A. director] George Tenet, General Tommy Franks, and Jerry [Paul] Bremer—three of the most incompetent people who've ever served in such key spots. And they get the highest civilian honor a president can bestow on anyone! That was the day I checked out of this administration. It was then I thought, There's no seriousness here. These are not serious people. If he had been serious, the president would have realized that those three are each directly responsible for the disaster of Iraq."

The most damning assessment of all comes from David Frum: "I always believed as a speechwriter that if you could persuade the president to commit himself to certain words, he would feel himself committed to the ideas that underlay those words. And the big shock to me has been that, although the president said the words, he just did not absorb the ideas. And that is the root of, maybe, everything."

IV: Was Rumsfeld Lousy? You Bet!

Having started so badly, the neocons say, America's occupation of Iraq soon got worse. Michael Rubin is a speaker of Persian and Arabic who worked for Feith's Office of Special Plans and, after the invasion, for the Coalition Provisional Authority (C.P.A.), in Baghdad. Rubin, who is now back at the A.E.I., points to several developments that undermined the prospects for anything resembling democracy. First was the decision to grant vast powers to Bremer, thus depriving Iraqis of both influence and accountability. "You can't have democracy without accountability," says Rubin, and in that vital first year the only Iraqi leaders with the ability to make a difference were those who controlled armed militias.

The creation of the fortified Green Zone, says Rubin, who chose to live outside it during his year in Baghdad, was "a disaster waiting to happen." It soon became a "bubble," where Bremer and the senior C.P.A. staff were almost completely detached from the worsening realities beyond—including the swelling insurgency. "The guys outside—for example, the civil-affairs officers, some of the USAID [United States Agency for International Development] workers, and so forth—had a much better sense of what was going on outside, but weren't able to get that word inside," Rubin says. Because Bremer was their main source of information, Rumsfeld and other administration spokesmen were out of touch with reality and soon "lost way too much credibility" by repeatedly claiming that the insurgents were not a serious problem.

Meanwhile, waste, corruption, and grotesque mismanagement were rife. Perle tells me a story he heard from an Iraqi cabinet minister, about a friend who was asked to lease a warehouse in Baghdad to a contractor for the Americans in the Green Zone. It turned out they were looking for someplace to store ice for their drinks. But, the man asked, wouldn't storing ice in Iraq's hot climate be expensive? Weren't the Americans making ice as and when they needed it? Thus he learned the extraordinary truth: that the ice was trucked in from Kuwait, 300 miles away, in regular convoys. The convoys, says Perle, "came under fire all the time. So we were sending American forces in harm's way, with full combat capability to support them, helicopters overhead, to move goddamn ice from Kuwait to Baghdad."

Perle cites another example: the mishandling of a contract to build 20 health clinics. While it is certainly "a good thing for the U.S. to be building clinics, and paying for it," Perle says, "the prime contractor never left the Green Zone. So there were subcontractors, and the way in which the prime contractor superintended the project was by asking the subcontractors to take videos of their progress and send them into the Green Zone. Now, you've got to expect projects to go wrong if that's the way you manage them, and indeed they did go wrong, and they ran out of money, and the contract was canceled. A complete fiasco." He knows, he says, "dozens" of similar stories. At their root, he adds, is America's misguided policy of awarding contracts to U.S. multi-nationals instead of Iraqi companies.

To former C.I.A. director Woolsey, one of this saga's most baffling features has been the persistent use of military tactics that were discredited in Vietnam. Since 2003, U.S. forces have "fought 'search-and-destroy' instead of 'clear-and-hold,'" he says, contrasting the ineffective strategy of hunting down insurgents to the proven one of taking territory and defending it. "There's never been a successful anti-insurgency campaign that operated according to search-and-destroy, because bad guys just come back in after you've passed through and kill the people that supported you," Woolsey explains. "How the U.S. government's post-fall-of-Baghdad planning could have ignored that history of Vietnam is stunning to me." But Rumsfeld and Bush were never willing to provide the high troop levels that Woolsey says are necessary for clear-and-hold.

Adelman's dismay at the handling of the insurgency is one reason he now criticizes Rumsfeld so severely. He is also disgusted by the former defense secretary's claims that the mayhem has been exaggerated by the media, and that all the war needs is better P.R. "The problem here is not a selling job. The problem is a performance job," Adelman says. "Rumsfeld has said that the war could never be lost in Iraq; it could only be lost in Washington. I don't think that's true at all. We're losing in Iraq."

As we leave the restaurant together, Adelman points to an office on the corner of Washington's 18th Street Northwest where he and Rumsfeld first worked together, during the Nixon administration, in 1972. "I've worked with him three times in my life. I have great respect for him. I'm extremely fond of him. I've been to each of his houses, in Chicago, Taos, Santa Fe, Santo Domingo, and Las Vegas. We've spent a lot of vacations together, been around the world together, spent a week together in Vietnam. I'm very, very fond of him, but I'm crushed by his performance. Did he change, or were we wrong in the past? Or is it that he was never really challenged before? I don't know. He certainly fooled me."
V: "A Huge Strategic Defeat"

Though some, such as James Woolsey, still hope against hope for success in Iraq, most of the neocons I speak with are braced for defeat. Even if the worst is avoided, the outcome will bear no resemblance to the scenarios they and their friends inside the administration laid out back in the glad, confident morning of 2003. "I think we're faced with a range of pretty bad alternatives," says Eliot Cohen. "The problem you're now dealing with is sectarian violence, and a lot of Iranian activity, and those I'm not sure can be rolled back—certainly not without quite a substantial use of force that I'm not sure we have the stomach for. In any case, the things that were possible in '03, '04, are no longer possible." Cohen says his best hope now is not something on the way toward democracy but renewed dictatorship, perhaps led by a former Ba'thist: "I think probably the least bad alternative that we come to sooner or later is a government of national salvation that will be a thinly disguised coup." However, he adds, "I wouldn't be surprised if what we end up drifting toward is some sort of withdrawal on some sort of timetable and leaving the place in a pretty ghastly mess." And that, he believes, would be "about as bad an outcome as one could imagine.… Our choices now are between bad and awful."

In the short run, Cohen believes, the main beneficiary of America's intervention in Iraq is the mullahs' regime in Iran, along with its extremist president, Mahmoud Ahmadinejad. And far from heralding the hoped-for era of liberal Middle East reform, he says, "I do think it's going to end up encouraging various strands of Islamism, both Shia and Sunni, and probably will bring de-stabilization of some regimes of a more traditional kind, which already have their problems." The risk of terrorism on American soil may well increase, too, he fears. "The best news is that the United States remains a healthy, vibrant, vigorous society. So, in a real pinch, we can still pull ourselves together. Unfortunately, it will probably take another big hit. And a very different quality of leadership. Maybe we'll get it."

Frank Gaffney, of the Center for Security Policy, is more pessimistic. While defeat in Iraq is not certain, he regards it as increasingly likely. "It's not a perfect parallel here, but I would say it would approximate to losing the Battle of Britain in World War II," he says. "Our enemies will be emboldened and will re-double their efforts. Our friends will be demoralized and disassociate themselves from us. The delusion is to think that the war is confined to Iraq, and that America can walk away. Failure in Iraq would be a huge strategic defeat." It may already be too late to stop Iran from acquiring nuclear weapons, Gaffney says, pointing out that the Manhattan Project managed to build them in less than four years from a far smaller base of knowledge. "I would say that the likelihood of military action against Iran is 100 percent," he concludes. "I just don't know when or under what circumstances. My guess is that it will be in circumstances of their choosing and not ours."

Richard Perle is almost as apocalyptic. Without some way to turn impending defeat in Iraq to victory, "there will continue to be turbulence and instability in the region. The Sunni in the Gulf, who are already terrified of the Iranians, will become even more terrified of the Iranians. We will be less able to stop an Iranian nuclear program, or Iran's support for terrorism. The Saudis will go nuclear. They will not want to sit there with Ahmadinejad having the nuclear weapon." This is not a cheering prospect: a Sunni-Shia civil war raging in Iraq, while its Sunni and Shia neighbors face each other across the Persian Gulf armed with nukes. As for the great diplomatic hope—that the Iraq Study Group, led by George Bush Sr.'s secretary of state James Baker III, can pull off a deal with Syria and Iran to pacify Iraq—Perle is dismissive: "This is a total illusion. Total illusion. What kind of grand deal? The Iranians are not on our side. They're going to switch over and adopt our side? What can we offer them?"

If the neocon project is not quite dead, it has evidently suffered a crippling blow, from which it may not recover. After our lunch, Adelman sends me an e-mail saying that he now understands the Soviet marshal Sergei Akhromeyev, who committed suicide in the Kremlin when it became clear that the last-ditch Communist coup of 1991 was going to fail. A note he left behind stated, "Everything I have devoted my life to building is in ruins." "I do not share that level of desperation," Adelman writes. "Nevertheless, I feel that the incompetence of the Bush team means that most everything we ever stood for now also lies in ruins."

Frum admits that the optimistic vision he and Perle set out in their book will not now come to pass. "One of the things that we were talking about in that last chapter was the hope that fairly easily this world governed by law, the world of the North Atlantic, can be extended to include the Arab and Muslim Middle East," he says. "I think, coming away from Iraq, people are going to say that's not true, and that the world governed by law will be only a portion of the world. The aftermath of Iraq is that walls are going to go up, and the belief that this is a deep cultural divide is going to deepen." This is already happening in Europe, he adds, citing the British government's campaign against the wearing of veils by women and the Pope's recent critical comments about Islam. As neoconservative optimism withers, Frum fears, the only winner of the debate over Iraq will be Samuel Huntington, whose 1996 book famously forecast a "clash of civilizations" between the West and Islam.

Reading these interviews, those who always opposed the war would be justified in feeling a sense of vindication. Yet even if the future turns out to be brighter than the neocons now fear, the depth and intractability of the Iraqi quagmire allow precious little room for Schadenfreude. Besides the soldiers who continue to die, there are the Iraqis, especially the reformers, whose hopes were so cruelly raised. "Where I most blame George Bush," says the A.E.I.'s Michael Rubin, "is that, through his rhetoric, people trusted him, people believed him. Reformists came out of the woodwork and exposed themselves." By failing to match his rhetoric with action, Bush has betrayed them in a way that is "not much different from what his father did on February 15, 1991, when he called the Iraqi people to rise up, and then had second thoughts and didn't do anything once they did." Those who answered the elder Bush's call were massacred.

All the neocons are adamant that, however hard it may be, stabilizing Iraq is the only option. The consequences of a precipitous withdrawal, they say, would be far worse. Listening to them make this argument, I cannot avoid drawing a deeply disturbing conclusion. One of the reasons we are in this mess is that the neocons' gleaming pre-war promises turned out to be wrong. The truly horrifying possibility is that, this time, they may be right.

MIND VERSUS GÖDEL
by Damjan Bojadziev, from here

Formal self-reference in Gödel's theorems has various features in common with self-reference in minds and computers. The theorems do not imply that there can be no formal, computational model of the mind, but on the contrary, suggest the existence of such models within a conception of mind as subject to similar limitations as formal systems. If reflexive theories do not themselves suffice as models of mind-like reflection, reflexive sequences of reflexive theories could be used.

* Introduction
* Self-reference in Gödel's theorems
o Implications of Gödel's theorems
o Non-implications of Gödel's theorems
* Formal models of the mind
o The basic incompleteness argument
o Mind over machine omega:1 ?
o Reflexive sequences of reflexive theories
* Self-reference in computers
* Self-reference in minds
* Conclusion
* References

Introduction
At first sight, the designation of the topic of this special issue, "MIND <> COMPUTER", also transcribed as "Mind NOT EQUAL Computer", looks like a piece of computer ideology, a line of some dogmatic code. But there are as yet no convincing artificial animals, much less androids, and computers are not yet ready for the unrestricted Turing test. Although they show a high degree of proficiency in some very specific tasks, computers are still far behind humans in their general cognitive abilities. Much more, and in much more technical detail, is known about computers than about humans and their minds. Thus, the required comparison between minds and computers does not even seem possible, much less capable of being stated in such a simple formula.

On the other hand, it could be argued that it is precisely because we do not know enough about ourselves and our minds that we can make comparisons with computers and try to design computational models. This is especially so because we also do not know exactly what computers are incapable of, although we have some abstract, general results about their limitations, such as Turing's theorem about the inability of an idealized computer to determine for itself whether its computation terminates or not. This theorem, and related results by Gödel and Church, are frequently used in arguments about the existence of formal models of the mind; interestingly enough, they have been used to argue both for and against that possibility. As a preliminary observation, it can be noted that the "negative" use of limitative theorems, as these meta-mathematical results are called, is less productive in the sense that the faculty by which mind is supposed to transcend "mere" computation remains essentially mysterious. The "positive" use of the theorems promotes a more definite, less exalted view of the mind as something which has its own limitations, similar to those which formal systems have. The present paper argues for this latter view, exploring the common feature of all these theorems, namely self-reference, and focusing on Gödel's theorems.

Self-reference in Gödel's theorems
The application of Gödel's theorems to fields outside meta-mathematics, notably the philosophy of mind, was initiated by Gödel himself. He had a strong philosophical bent towards realism/platonism which also motivated his (meta)mathematical discoveries [Feferman 88, p. 96], [Weibel & Schimanovich 86]. Gödel first thought that his theorems established the superiority of mind over machine [Wang 90, pp. 28-9]. Later, he came to a less decisive, conditional view: if machine can equal mind, the fact that it does cannot be proved [Weibel & Schimanovich 86], [Casti 89, p. 321]. This view also parallels the logical form of Gödel's second theorem: if a formal system of a certain kind is consistent, the fact that it is cannot be proved within the system. Gödel's more famous first theorem says that if a formal system (of a certain kind) is consistent, a specific sentence of the system cannot be proved in it.

Gödel's theorems are actually special, self-referential consequences of the requirement of consistency: in a consistent system, something must remain unprovable. One unprovable statement is the statement of that very fact, namely the statement which says of itself that it is unprovable (first theorem): you cannot prove a sentence which says that it can't be proved (and remain consistent). Another unprovable statement in a consistent system is the statement of consistency itself (second theorem). In addition, if the formal system has a certain stronger form of consistency, the sentence which asserts its own unprovability, called the Gödel sentence, is also not refutable in the system. Rosser later constructed a more complicated sentence for which simple consistency is sufficient both for its unprovability and for its unrefutability. Similar sentences were constructed by others (Rogers, Jeroslow [Boolos 79, pp. 65-6]), showing that consistent formal systems cannot prove many things about themselves. On the other hand, a formal system can retain all the insight into itself that is compatible with consistency: thus, although it cannot prove its Gödel sentence, if it is to remain consistent, it can prove that very fact, namely the fact that it cannot prove its Gödel sentence if it is consistent [Robbin 69, p. 114].
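
In standard notation, the facts just described can be summarized as follows (this summary is added for orientation and is not quoted from the paper; T is a consistent formal system of the relevant kind, Prov_T its provability predicate, G_T its Gödel sentence and Con(T) its consistency statement):

\[
\begin{aligned}
& T \vdash G_T \leftrightarrow \neg\mathrm{Prov}_T(\ulcorner G_T \urcorner) && \text{(construction of the Gödel sentence)} \\
& T \nvdash G_T && \text{(first theorem)} \\
& T \nvdash \mathrm{Con}(T) && \text{(second theorem)} \\
& T \vdash \mathrm{Con}(T) \rightarrow \neg\mathrm{Prov}_T(\ulcorner G_T \urcorner) && \text{(the formalized first theorem, provable in } T \text{ itself)}
\end{aligned}
\]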

Implications of Gödel's theorems
The fact that a particular sentence is neither provable nor disprovable within a system only means that it is logically independent of the axioms: they are not strong enough to either establish or refute it - they don't say enough about it one way or the other. Saying more, by adding further axioms (or rules of inference), might make the sentence provable. But in Gödel's cases, this does not work: even if Gödel's sentence is added as an additional axiom, the new system would contain another unprovable sentence, saying of itself that it is not provable in the new system. This form of self-perpetuating incompleteness might be called, following [Hofstadter 79, p. 468] and [Mendelson 64, p. 147], essential incompleteness.
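
In the same notation (again a standard formulation added for orientation, not a quotation from the paper): if G_T is the Gödel sentence of a consistent system T, the extended system

\[
T' = T + G_T
\]

trivially proves G_T, but it has a Gödel sentence G_{T'} of its own, asserting its unprovability in T', and, provided T' is consistent, T' \nvdash G_{T'}. The incompleteness is not removed; it merely reappears one step further on.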

Gödel's theorems uncover a fundamental limitation of formalization, but they say that this limitation could be overcome only at the price of consistency; we might thus say that the limitation is so fundamental as to be no limitation at all. The theorems do not reveal any weakness or deficiency of formalization, but only show that the supposed ideal of formalization - proving all and only the true sentences - is self-contradictory and actually undesirable:

* what good is a formalization that can prove a sentence which says that it is not provable (first theorem)?
* what good is a formalization that can prove its consistency when it would follow that it is not consistent (second theorem)?

On the positive side, the theorems show that certain formal systems have a much more intricate, reflexive structure than formerly suspected, containing much of their own meta-theory.

Gödel's theorems show that the notions of truth and provability cannot coincide completely, which at first appears disturbing, since, as Quine says,

we used to think that mathematical truth consisted in provability [Ways of Paradox, p. 17].

Gödel's theorems undermine the customary identification of truth with provability by connecting truth with unprovability: the first theorem presents a case of
(*)  not provable -> true

(if the sentence asserting its own unprovability is not provable, then it is true); the second theorem presents a case of
true -> not provable

(if the sentence asserting the consistency of the system is true, then it is not provable). However, the notion of truth has a problem of its own, namely the liar paradox, of which Gödel's sentence is a restatement in proof-theoretic terms. Thus, Gödel's theorems do not actually establish any disturbing discrepancy between provability and truth. Furthermore, the implication (*) above is an oversimplification: assuming consistency, Gödel's sentence is not simply true, because it is not always true, i.e. not true in all interpretations. If it were, it would be provable, by the completeness theorem (also proved by Gödel), as noted in [Penrose 94, p. 116, note 5] (provability is truth in all interpretations). The first theorem shows that if the system is consistent, it can be consistently extended with the negation of the Gödel sentence, which means that the sentence is actually false in some models of the system. Intuitively, without going into details, this could be explained by saying that in those models the Gödel sentence acquires a certain stronger sense of unprovability which those models do not support [Bojadziev 95a, p. 391]. Gödel's theorem thus shows that there must always exist such unusual, unintended interpretations of the system; as Henkin says, quoted in [Turquette 50]:

We tend to reinterpret Gödel's incompleteness result as asserting not primarily a limitation on our ability to prove but rather on our ability to specify what we mean ... when we use a symbolic system in accordance with recursive rules [Gödel & the synthetic a priori].

Similarly, Polanyi says, though only in connection with the second theorem:

we never know altogether what our axioms mean [Personal Knowledge, p. 259]. We must commit ourselves to the risk of talking complete nonsense if we are to say anything at all within any such system [p. 94].

This characterization of formal language sounds more like something that might be said about ordinary, natural language. Thus, if we take as a characteristic of ordinary language its peculiar inexhaustibility and the frequent discrepancy between intended and expressed meaning ("we never know altogether what our sentences mean; we must risk talking nonsense if we are to say anything at all"), Gödel's theorems would show that, in this respect, some formal languages are not so far removed from natural ones. Certain similarities between the self-reference in natural language and in Gödel's sentence and theorems have also been noticed at the lexical and pragmatic level (indexicals [Smullyan 84], performatives [Hofstadter 79, p. 709]). This line of thought, namely that the self-reference which leads to Gödel's theorems makes a formal system more human, so to speak, will be followed here to the conclusion that such systems are indeed suitable for modelling the mind.

Non-implications of Gödel's theorems
Certain authors, especially some of those who attempt to apply Gödel's theorems to disciplines other than meta-mathematics, are handicapped by a more or less severe misunderstanding of the theorems. For example, Watzlawick, Beavin and Jackson state:

Gödel was able to show that it is possible to construct a sentence G which

1. is provable from the premises and axioms of the system, but which
2. proclaims of itself to be unprovable.

This means that if G be provable in the system, its unprovability (which is what it says of itself) would also be provable. But if both provability and unprovability can be derived from the axioms of the system, and the axioms themselves are consistent (which is part of Gödel's proof), then G is undecidable in terms of the system [Pragmatics of Human Communication, p. 269].

Of course, this is completely garbled, but the authors nevertheless have very interesting ideas about applications of Gödel's theorems.

Formal models of the mind
Gödel's (first) incompleteness theorem can be expressed in the form: a sufficiently expressive formal system cannot be both consistent and complete. With this form, the attempt to use such formal systems as models of the mind invites the following brush off:

Since human beings are neither complete nor consistent, proving that computers can't be both doesn't really help [Roger B. Jones, "quoted" from sci.logic, May 1995].

A different intuition was followed by Wandschneider: the limitations of formalization revealed by Gödel's theorems prevent the use of formal systems as models of the mind [Wandschneider 75]. Most authors, however, accept the comparison between mind and formal systems of the kind considered by Gödel, but reach different conclusions. For example, according to Haugeland

most people are agreed ... that [Gödel's] result does not make any difference to cognitive science [Mind Design, p. 23].

According to [Kirk 86], arguments against mechanism based on Gödel's theorems are agreed to be mistaken, though for different reasons; cf. [Dennett 72] and especially [Webb 80]. These arguments try to establish the superiority of mind by suggesting that mind can reach conclusions which a formal system cannot, such as Gödel's sentence.

The basic incompleteness argument
Arguments about the relative cognitive strength of minds and machines usually invoke only the first Gödel theorem, although the second theorem also establishes the existence of a sentence which, if true, is not provable. The premise of both theorems is consistency, and it frequently appears neglected in the basic version of the argument from incompleteness: since any formal system (of a certain kind) contains a true but unprovable sentence, mind transcends formalism because mind can "see" that the unprovable sentence is true. This conviction can be traced, in various forms, from [Penrose 94], [Penrose 89], through [Lucas 61] back to [Nagel & Newman 58, pp. 100-1]. For example, Lucas says:

However complicated a machine we construct, it will ... correspond to a formal system, which in turn will be liable to the Gödel procedure for finding a formula unprovable-in-that-system. This formula the machine will be unable to produce as being true, although a mind can see it is true. And so the machine will still not be an adequate model of the mind [Minds, Machines and Gödel].

The consistency premise is not very prominent here, but some suspicious phrasing is: 'producing as being true', 'seeing to be true', instead of the simpler and more to the point 'proving'. This way of comparing cognitive strength in men and machines leaves out an obvious symmetry while emphasizing a dubious asymmetry. The symmetry is that, just as a formal system cannot prove a sentence asserting its own unprovability unless it is inconsistent, neither can a mind, if it is consistent; cf. [Casti 89, p. 321]. The doubtful asymmetry between mind and machine concerns their possession of the notion of truth. The mind is supposed to have this notion in addition to the notion of provability, and is supposed to have no problems with it (but it does, namely the liar paradox). On the other hand, the machine is only supposed to be able to prove things (as its only means of establishing truth) without having, and apparently without being able to have, an additional notion of truth. But this is not so: for expressing the truth of the Gödel sentence (as opposed to proving it), even the most restricted definition of the truth predicate true_1(x), covering sentences containing at most one quantifier, is sufficient [Webb 80, p. 197].
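
Webb's point about restricted truth predicates can be put in symbols (a standard fact about partial truth definitions, stated here for orientation rather than quoted from [Webb 80]): for suitable arithmetical systems T there is a formula true_1(x) of the system itself such that

\[
T \vdash \mathrm{true}_1(\ulcorner \sigma \urcorner) \leftrightarrow \sigma
\]

for every sentence \sigma containing at most one quantifier. Since the Gödel sentence belongs to such a class, the system can express, though of course not prove, its truth.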

Mind over machine omega:1 ?
A more intricate version of the argument from incompleteness considers adding a "Gödelizing operator" to the system. This form of the incompleteness argument was also first advanced by Lucas:

The procedure whereby the Gödel formula is constructed is a standard procedure ... then a machine should be able to be programmed to carry it out too ... This would correspond to having a system with an additional rule of inference which allowed one to add, as a theorem, the Gödel formula of the rest of the formal system, and then the Gödel formula of this new, strengthened, formal system, and so on ... We might expect a mind, faced with a machine that possessed a Gödelizing operator, to take this into account, and out-Gödel the new machine, Gödelizing operator and all [Minds, Machines and Gödel].

The sound part of this argument is already contained in the notion of essential incompleteness: a Gödel operator only fills a deductive "lack" of the system by creating a new one. Adding the Gödel sentence of a system as a new axiom extends the notion of provability and thereby sets the stage for a new Gödel sentence, and so on. Thus, a Gödel operator only shifts the original "lack" of the system through a series of displacements, without ever completing the system.

The Lucas argument, especially in the form advanced by Penrose [Penrose 94], now centers on how far into the transfinite a Gödel operator can follow the mind's ability to produce the Gödel sentence of any system in the sequence
S_0
S_1 = S_0 + G(S_0)
S_2 = S_1 + G(S_1)
...
S_omega
S_omega+1 = S_omega + G(S_omega)
...
A relevant result here is the Church-Kleene theorem which says that there is no recursive way of naming the constructive ordinals [Hofstadter 79, p. 476]. This would mean that a Gödel operator could only follow the mind's ability to produce Gödel sentences through the recursively nameable infinite; cf. [Penrose 94, p. 114]. [...] On the other hand, as Webb says,

there is not the slightest reason to suppose that ... a machine could not model the 'ingenuity' displayed by a mind in getting as far as it can [Mechanism, Mentalism and Metamathematics, p.173].

But for the purposes of this paper it is more interesting to observe that it does not seem plausible that the argument about the formalizability of mind should be decided by the outcome of the race between mind and machine over remote reaches of transfinite ordinality. And even if it makes sense to conceive of mind as always being able to out-reflect a reflective formal model, it would seem that the ability to perform the self-reflection is more important than the question of how far this ability (has to) reach.

Reflexive sequences of reflexive theories [update]
A further possibility in the direction of making reflexive formal models is to make the progression of reflexive theories itself reflexive. The usual ways of extending a reflexive theory by adding its Gödel sentence, or the statement of consistency (Turing), or other reflection principles (Feferman) are themselves not reflexive: what is added to a theory only says something about that theory, and nothing about the one which its addition produces. Thus, what is usually added to a theory does not take into account the effect of that very addition, which is to shift the incompleteness of the original theory to the extended one. Of course, certain things about the extended theory cannot be consistently stated; for example, the sentence stating that its addition to a theory produces a consistent theory would lead to contradiction, by the second Gödel theorem. But the sentence which is added to a theory could make some other, weaker statement about the theory which its addition produces. If the procedure of theory extension operated not only on the theory it is to extend but also on a representation of itself, it could build on its own action and improve its effects. It might thus produce in a single step an extension which is much further down the basic sequence of extensions, produced by linear additions of Gödel sentences; the size of this ordinal jump could then be taken as a measure of the reflexivity of the procedure. This kind of procedure, operating on something which contains a representation of that procedure itself, is already familiar from the construction of the Gödel sentence: the process of diagonalization operates on a formula containing a representation of (the result of) that very process [Hofstadter 79, p. 446], [Bojadziev 90]. Another example of a reflexive procedure of this kind would be the Prolog meta-circular interpreter, which can execute itself, though only to produce statements of iterated provability [Bratko 90, p. 536].
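
The diagonalization invoked above can be stated compactly as the diagonal (fixed-point) lemma; the following is the standard formulation, added here for reference rather than taken from the paper. For every formula \varphi(x) with one free variable there is a sentence \psi such that

\[
T \vdash \psi \leftrightarrow \varphi(\ulcorner \psi \urcorner),
\]

and choosing \varphi(x) to be \neg\mathrm{Prov}_T(x) yields the Gödel sentence: a sentence provably equivalent to the statement of its own unprovability. The construction of \psi applies a substitution operation to a formula that itself describes that substitution operation, which is the sense in which the procedure operates on a representation of itself.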

Self-reference in computers
In saying of itself that it is not provable, the Gödel sentence combines three elements: the representation of provability, self-reference and negation. In computer science, self-reference is more productive in a positive form, and more in programs, programming systems and languages than in individual sentences. The first ingredient in Gödel's sentence, the representation of provability, corresponds to the explicit definition of the provability predicate of a logic programming language in that same language. In the simplest case, specifying Prolog provability in Prolog, the definition consists of just a few clauses [Bratko 90, p. 536], comparable to those which express the conditions on the provability predicate under which Gödel's theorems apply. This definition of Prolog provability is then used as a meta-circular interpreter to extend the deductive power of the basic interpreter, for example by detecting loops in its proof attempts. This use of the meta-circular interpreter can be compared to the work of the Gödel operator in extending the basic, incomplete theory. Meta-circular interpretation is also applicable to other programming languages, notably LISP [Smith 82].
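As an illustration, here is the standard 'vanilla' version of such a definition (a sketch of the same idea rather than Bratko's exact clauses; built-in predicates are not handled):

prove(true).                          % the empty goal is provable
prove((A, B)) :- prove(A), prove(B).  % a conjunction is proved by proving both conjuncts
prove(A) :- clause(A, B), prove(B).   % a goal is proved by resolving it against a program clause A :- B

If prove/1 is itself accessible to clause/2 (strictly ISO-conforming systems require declaring the inspected predicates dynamic), the interpreter can also execute itself, so that a query of the form prove(prove(Goal)) yields the iterated provability mentioned above.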

Generalizing meta-circular interpretation, provability can be specified in a separate meta-language, and reflection principles can be defined for relating and mixing proofs in the two languages. Such meta-level architectures [Yonezawa & Smith 92] can be used to implement reflective or introspective systems, which also include an internal representation of themselves and can use it to shift from normal computation about a domain to computation about themselves [Maes & Nardi 88], in order to achieve greater flexibility. Meta-level architectures are useful for knowledge representation, allowing the expression and use of meta-knowledge, and opening the possibility of a computational treatment of introspection and self-consciousness [Giunchiglia & Smaill 89, p. 128]. For example, Perry suggested an architecture of self-knowledge and self in which indexicals mediate between bottom-level representations, in which the organism is not itself represented, and higher levels at which it is represented generically, like any other individual [Perry 85].
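A minimal sketch of this shift, staying with the Prolog illustration above (again an illustration, not a particular system from the literature): a meta-interpreter which not only proves a goal but also returns a term representing its own proof, an internal representation of the computation which the program can then inspect:

% prove_tree(Goal, Proof): prove Goal and return Proof, a term recording how it was proved
prove_tree(true, true).
prove_tree((A, B), (ProofA, ProofB)) :- prove_tree(A, ProofA), prove_tree(B, ProofB).
prove_tree(A, (A :- ProofB)) :- clause(A, B), prove_tree(B, ProofB).

A query prove_tree(Goal, Proof) binds Proof to a structured trace of the derivation of Goal; further clauses operating on Proof can then explain, check or criticize the system's own reasoning, which is the shift from object-level to meta-level computation that reflective architectures make systematic.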

Self-reference in minds
The basic lesson of Gödel's theorems, namely that the ability for self-reflection has certain limits imposed by consistency, does not seem to be less true of minds than it is of formal systems. Applied to minds, it would translate into some principled limitation of the reflexive cognitive abilities of the subject: certain truths about oneself must remain unrecognized if the self-image is to remain consistent [Hofstadter 79, p. 696]. This formulation recalls the old philosophical imperative which admonishes the subject to know himself. If this were simple, or possible to do completely, there would be no point to it; the same goes for the explicit interrogative forms: who am I, where am I going, what do I want, ... Hofstadter rhetorically asks:

Are there highly repetitious situations which occur in our lives time and time again, and which we handle in the identical stupid way each time, because we don't have enough of an overview to perceive their sameness? [Gödel, Escher, Bach, p. 614].

Such an overview can be hard to achieve, especially in regard to oneself, as Laing's 'knots', in which minds get entangled, show [Laing 70]. In a similar vein, Watzlawick, Beavin and Jackson suggest that the limitative theorems exhibit the mathematical form of the pragmatic paradoxes to which humans are susceptible in communication [Watzlawick et al 67, p. 221]. [update]

It may be that, as Webb says, the phrase 'the Gödel sentence of a man' is an implausible construction [Webb 80, p. x], but certain interpretations might be imagined, such as self-falsifying beliefs. On a humorous note, the Gödel sentence for a human could work like a recipe for self-destruction, activated in the process of its comprehension or articulation; by analogy with the terminology of performatives, we might call it a (self-)convulsive, asphyxiative, combustive, ... A more elaborate interpretation, as the paralysing effect of some self-referential cognitive structure, is presented in Cherniak's story [Hofstadter & Dennett 81, p. 269]. The history of logic itself records lethal cases (Philetas) and cases of multiple hospitalization (Cantor, Gödel). Of course, this is all anecdotal, speculative and inconclusive, but it does suggest that the apparent gap between minds and machines could be bridged at two related points:

* the vulnerability of minds to paradoxes of self-reference
* the implementation of self-referential structures in machines

The mind-machine gap could thus be reduced by emphasizing the formal, machine-like aspects of the mind and/or by building mind-like machines.

Finally, taking speculation one literal step further, the self-reference in Gödel's sentence can be compared to a formal way of self-recognition in the mirror, by noticing the parallelism between things (posture, gesture, movement) and their mirror images. The basis for this comparison is the way the Gödel code functions as a numerical mirror in which sentences can refer to, "see" themselves or other sentences "through" their Gödel numbers. The comparison, developed in [Bojadziev 95b], covers the stages of construction of Gödel's sentence and relates them to the irreflexivity of vision and the ways of overcoming it. The comparison attempts to turn arithmetical self-reference into an idealized formal model of self-recognition and the conception(s) of self based on that capacity. The motivation for this is the cognitive significance of the capacity for self-recognition, in mirrors and otherwise. The ability to recognize one's mirror image, present in varying degrees in higher primates and human infants, has been proposed as an objective test of self-awareness [Gregory 87, p. 493]. Self-recognition in the mirror is a basic, even paradigmatic case of self-recognition, the general case being the recognition of the effects of our own presence on the environment. Self-recognition in this wider sense is the common theme of Dennett's conditions for ascribing and having a self-concept and consciousness [Hofstadter & Dennett 81, p. 267]. Self-recognition is also the common theme of the self-referential mechanisms which, according to [Smith 86], constitute the self:

* indexicality (self-relativity of representations)
* autonymy (recognizing one's own name)
* introspection (recognizing one's own internal structure)
* reflection (recognizing one's place in the world)

The comparison between formal and specular self-reference and self-recognition might also connect these contemporary attempts to base the formation of a self(-concept) on the capacity for self-recognition with the long philosophical tradition of thinking about the subject in optical terms.

Conclusion

It is not possible to see oneself completely, in the literal, metaphorical ("see" = "understand"), formal and computational senses of the word. Gödel's theorems do not prevent the construction of formal models of the mind, but they do support a conception of mind (self, consciousness) as something which has a special relation to itself, marked by specific limitations.