Thursday, July 14, 2005

A bottle in front of me, or a frontal lobotomy?

As the brief editorial below suggests, the book is terrific in that it is as much about the business of medicine as it is about a flawed, tragic figure.


Volume 353:119-121 July 14, 2005 Number 2

Last-Ditch Medical Therapy — Revisiting Lobotomy

Barron H. Lerner, M.D., Ph.D.

Desperate times call for desperate measures. So thought Walter J. Freeman, a neurologist who became the country's staunchest advocate of the lobotomy between the 1930s and the 1970s. A new book, The Lobotomist, by journalist Jack El-Hai,1 chronicles Freeman's advocacy of a procedure that was viewed by many, and continues to be viewed, as barbaric. In exploring the ways in which lobotomy became part of common medical practice, El-Hai raises questions not only about how we should judge the procedure in retrospect, but also about what lobotomy teaches us about last-ditch medical interventions.

In the early 1900s, relatives frequently committed their loved ones to long stays in understaffed, overcrowded, and often filthy mental institutions. The therapeutic options for severe mental illness were quite limited.

One option, the lobotomy, also known as leucotomy, was devised in 1935 by the Portuguese neurologist Egas Moniz. It involved drilling holes in the skull and using a blade to sever nerve fibers running from the frontal lobes to the rest of the brain. Moniz believed that psychiatric symptoms were caused by faulty nerve connections established over a period of years. If these nerves were severed and new connections were allowed to form, he postulated, patients' symptoms would improve. Lobotomies were originally used to treat patients with depression but were later often performed to treat schizophrenic patients suffering from agitation and paranoid delusions.

The principal U.S. proponent of lobotomy was Freeman, of George Washington University Medical School. In June 1937, at the annual meeting of the American Medical Association, Freeman and his colleague James W. Watts, a neurosurgeon, presented data on 20 patients who had undergone lobotomy.2 Their paper launched a fierce debate on the procedure. On the one hand, certain members of the medical profession consistently condemned it as brutal, unscientific, and harmful; at times, moreover, the operation caused severe and lasting brain damage. This appears to have been the case with the 1941 lobotomy performed on Rosemary Kennedy, the mildly retarded sister of John F. Kennedy, whose cognitive functions were severely worsened by the operation. The negative image of the lobotomy entered the popular culture through Ken Kesey's 1962 novel One Flew Over the Cuckoo's Nest and the movie based on it, in which the rebellious hero becomes nearly catatonic after undergoing the operation.

On the other hand, Freeman's data painted quite a different picture. The condition of 13 of the 20 patients, he and Watts claimed, had improved. In one case, a 63-year-old housewife who had had increasing anxiety and agitation for a year, they said, "now manages home and household accounts, enjoys people, attends theater, drives her own car."2

Bolstered by such results, which were confirmed by later studies, Freeman's enthusiasm for lobotomy increased. In 1946, he devised the so-called transorbital lobotomy, in which he used a mallet to pound an ice pick through the patient's eye socket into the brain, then moved the pick around blindly to sever the nerve fibers. He traveled the world promoting his new procedure.

Certain physicians, especially those who treated the roughly 400,000 patients in state mental hospitals, embraced the lobotomy. So did the media, thanks in part to Freeman's showmanship. Tens of thousands of lobotomies were performed in the United States before the introduction of chlorpromazine and other neuroleptic medications made the operation all but obsolete by the 1960s. In 1949, Moniz was awarded the Nobel Prize in Physiology or Medicine for inventing the procedure.

One of the virtues of historical scholarship is its dynamism: each scholar, building on new information and insights, can revise the conclusions of earlier works. The first book to evaluate lobotomy, Elliot S. Valenstein's Great and Desperate Cures,3 was highly critical of Freeman and his operation, which Valenstein saw as providing a cautionary tale about overzealous physicians. Joel Braslow's Mental Ills and Bodily Cures argued that a major motivation for lobotomies was to create "apathetic, indifferent, and docile" patients who would be more compliant than they had been.4 But Jack D. Pressman, in Last Resort, emphasized the importance of evaluating historical events within the context of their own time.5 Although the notion of cutting brain tissue in order to make people submissive is repugnant from our modern perspective, the ability to discharge psychiatric patients even to a limited existence at home was perceived as a therapeutic triumph in the 1940s and 1950s.

Having immersed himself in Freeman's papers, El-Hai found himself, much to his surprise, siding with Pressman. The physician who had been compared to the Nazi doctor Josef Mengele actually appeared to have helped many people. For example, Harry Dannecker, an Indiana man with a long history of anxiety and depression, had been suicidal before he underwent a lobotomy in 1937; during World War II, completely recovered, he worked long hours in a war-materials plant. Among the pieces of evidence stressed by El-Hai are thousands of letters from grateful patients. Freeman and Watts, one wrote, "saved my mind and set my spirit free."

So, was lobotomy a reasonable intervention for a desperate problem or a routine cause of harm, as Christine Johnson, whose grandmother had a lobotomy in 1954, charges? Johnson has founded a Web site that is sponsoring a petition to have Moniz's Nobel Prize revoked.

One difficulty in assessing the procedure arises from the nature of Freeman's research. He kept in touch with as many patients as possible, even traveling across the country to find them. Yet since he conducted no controlled studies, interpreting his data is difficult. For example, since mental illness in any particular patient may wax and wane, it is possible that some patients' symptoms might have improved even if portions of their brains had not been cut away. And grateful letters may represent a skewed sample. Still, it is hard to deny that some patients who had been institutionalized for years lived apparently satisfactory lives after undergoing lobotomy — even, in rare cases, becoming lawyers or physicians, according to El-Hai.

Surely the most disturbing aspect of Freeman's story was his decision to perform lobotomies on unwilling patients. Some of the stories El-Hai recounts are positively gruesome. In 1950, for example, Freeman did a transorbital procedure in a motel room while police held the agitated patient down. As late as the 1960s, he performed lobotomies on otherwise healthy adolescent boys who had been diagnosed with anxiety — an act that surely violated medicine's admonition to "do no harm." To the extent that Freeman's fellow physicians knew about and tolerated such activities, this episode represents a blot on the history of the medical profession.

But whereas Freeman's later excesses raise obvious red flags, his earlier efforts on behalf of a population of very ill patients pose a more complicated question. To what degree should physicians and researchers "push the envelope" in search of an effective remedy? Here the history of lobotomy offers a somewhat surprising answer. Lobotomy was not, as it was long considered, an aberrant and cruel therapy promulgated by fringe practitioners. Rather, it exemplified a common characteristic of medical practice, in which doctors and patients have often felt the need to "do something" in the face of seemingly hopeless situations. In such cases, some patients have inevitably served as guinea pigs. Radical cancer surgery, artificial-heart implantation, and the early organ transplantations come to mind. Sometimes, the interventions are the first step toward a successful remedy; in other instances, they prove worthless.

In this sense, Freeman's story is less a cautionary tale of a doctor gone wrong than a cautionary tale of business as usual in medicine. Last-ditch medical interventions will probably always be with us. We must therefore continue to scrutinize them, not only in retrospect but as they are being conceptualized, publicized, and carried out.

Source Information

Dr. Lerner is an associate professor of medicine and public health at the Columbia University Medical Center, New York.


1. El-Hai J. The lobotomist: a maverick medical genius and his tragic quest to rid the world of mental illness. New York: Wiley, 2005.
2. Laurence WL. Surgery used on the soul-sick: relief of obsessions is reported. New York Times. June 7, 1937:1, 10.
3. Valenstein ES. Great and desperate cures: the rise and decline of psychosurgery and other radical treatments for mental illness. New York: Basic Books, 1986.
4. Braslow J. Mental ills and bodily cures: psychiatric treatment in the first half of the twentieth century. Berkeley: University of California Press, 1997.
5. Pressman JD. Last resort: psychosurgery and the limits of medicine. Cambridge, England: Cambridge University Press, 1998.

The New England Journal of Medicine is owned, published, and copyrighted © 2005 Massachusetts Medical Society. All rights reserved.

