Surgical Specialties Emerging from World War I (Part One of Two)

In mid-November, I was to give the paper captioned above at a World War I symposium entitled “Imperial Implosions: World War One and Its Implications” at California State University Channel Islands in Camarillo. About an hour before I was to go on, the campus was evacuated because of the advancing Woolsey Fire. The university was physically safe, but persistent smoke forced the symposium’s sponsors to cancel, and the campus didn’t reopen until the Monday after Thanksgiving.

Here is that paper, presented in two parts; the second part will follow next week. I could not get the footnotes to transfer, so if you want to see them, let me know and I’ll send you a copy of the paper.

At the very worst, this should help you fall asleep tonight!

Echoes of World War One in the Surgical World

Thomas L Snyder

The Hippocratic admonition that “a man who would become a surgeon should join an army and follow it” has long been a cliché in medical circles. The ancients generally attributed disease to the anger of unappeased gods, but they took a distinctly practical view when it came to the injuries of combat: sword and arrow wounds were visibly the result of human agency, and called for active human intervention rather than appeals to distant deities. The desire to help their comrades eventually saw the emergence of men who “specialized” in handling the wounds of war. The Sumerian / Akkadian (4000-1000 BCE) Asu was an empirical operator who wielded sharp (surgical) instruments, in contradistinction to the more spiritual sorcerer (Baru) and priestly (Ashipu) healers of disease. The Assyrians (900-600 BCE) formalized the Asu’s role as a military official, responsible for wound care, field hygiene (including burial of the dead), and health assessment of prisoners of war (who were prospective slaves). The Egyptian swnw was similarly appointed to serve in the army in war and peace; he was expected to be skilled in the management of war wounds and other injuries. In ancient India, Hindu men who practiced surgery were given the sobriquet shalyahara – remover of arrows. Similarly, the ancient Greek word for “physician”, iatros, translates from Ionian Greek as “arrow extractor”. While physicians may have been contracted by their generals, Roman legionaries received field treatment from men called capsarii, binders of wounds. Roman military hospitals – an innovation that arose from the need to care for soldiers at the frontiers of the empire, a long and hazardous distance from home – were sophisticated permanent structures that featured an elaborate surgical set-up and a protected interior “pulse” space designed for the care of an influx of fresh combat injuries should local battles break out.

Not much in the way of surgical advances occurred in the 5th through the 15th centuries, following the Roman era. The Byzantines, military and medical successors to the Romans, who referred to themselves as Rhomaioi (“Romans”, but in Greek), merely perpetuated the Roman way of combat casualty care. Medieval Arab physicians produced advances in eye surgery and translated ancient Greek and Roman writings on medicine and surgery. Being a largely nomadic people, they also developed a form of mobile hospital for use in military and civil settings. The teaching of surgery was dropped from the curricula of French universities of the early Renaissance (the notion being that surgeons, who worked with their hands, were “laborers” and not worthy of the scholarly tradition of medicine), though the Italians maintained a robust academic surgical tradition. Once the Catholic Church’s abhorrence of the shedding of blood in surgery (especially by educated priests) and of the dissection of cadavers for the study of anatomy was overcome, and once the rigid conservatism of the Scholastic tradition yielded to the humanism of the Renaissance, surgery began to make advances in both theory and practice.

Even so, until the advent of effective anesthesia in the mid-19th century, major surgery of any type was a fraught affair, undertaken by brave surgeons for desperate patients and done as rapidly as possible so as to finish the operation before the pain and blood loss sent the patient into shock. European surgeons led the way, and 19th-century doctors from America and other nations typically toured the famous hospitals of England, Scotland, France, Germany, Austria and Italy to learn the most up-to-date techniques. Once the bacterial cause of surgical infections was elucidated, techniques to check infection (“antiseptic surgery”) and, later, to prevent it (“aseptic surgery”) finally made the major kinds of surgery we think of today, particularly orthopedic and abdominal operations, safe. Surgery on the organs of the chest came much later. Meanwhile, advances in medicine, particularly vaccination (especially against smallpox), nutrition (scurvy, for instance, was a major cause of death among siege armies in the 13th through the 15th centuries) and hygiene (for instance, the understanding that cholera, an often-fatal infection of the digestive tract, came from water contaminated by excrement provided scientific justification for careful regulation of field latrines in relation to water supplies), meant that by the time of the Franco-Prussian War, for the first time in human history, deaths from combat injuries actually outnumbered those caused by disease and contagion. By the outbreak of World War I, the only apparent significant gaps in our understanding and tools of combat casualty care involved the prevention and treatment of shock, and the treatment of infection in contaminated wounds.

In the Great War, artillery barrages and mass infantry attacks produced the expected extremity, chest and abdominal wounds. But trench warfare had a unique feature: men stood in trenches peering out at the enemy with just their heads showing, and they suffered brain and facial injuries in unexpected numbers. Wounds of the head and neck accounted for 15 – 20% of all combat wounds during the Great War. Grievous wounds of the brain and of the face drove the evolution of two new surgical specialties – neurosurgery and plastic reconstructive surgery. This is necessarily an example of “great man” history, because almost no one had practiced (and certainly no one had been trained) in these areas of the surgical art prior to the war. Brave pioneering surgeons thus played an outsized role, wading in where no man had gone before to establish principles of practice that largely persist to this day.

As regards neurosurgery, the great man is the American Harvey Cushing. Until the advent of good anesthesia and aseptic (infection-preventing) surgical technique, few men had ventured into the cranium, and when they did, the complications of hemorrhage and infection, almost invariably fatal, discouraged further efforts. Cushing undertook to study and practice brain surgery at Harvard starting around 1908. While individual surgeons had written about their pioneering forays into neurosurgery, it was by no means an established specialty, and no formal training programs existed. As Cushing himself put it, “[a]nything classified as neurological is looked upon by many of us as baffling and difficult, and a feeling prevails that the ultimate functional results after recovery from serious cranial injuries are, to say the least, forlorn. Few medical officers had received training in the surgery of the central nervous system before the war, no organized instruction has been given in the subject since; and the tools provided for the work have been inadequate and antiquated.” Thus it was that, when Cushing arrived in Europe as a volunteer in 1915, he had the opportunity to observe the work of just a few individuals who were making pioneering efforts to respond to the wounds that modern warfare had wrought. The wounds sustained by soldiers in Europe carried special risks because the fields in which they fought had for centuries been well fertilized with manure and therefore bore a rich variety of bacteria, many of which were carried by projectiles or shrapnel into the brain, along with fragments of filthy clothing. The combination of physical damage and contamination demanded a vigorous surgical response. Early in the war, individual French, German, Austrian, Russian and British surgeons took up the gauntlet. Through experience, they gradually established guidelines and techniques that improved outcomes, but much of this work went unpublished.

When Cushing returned to Europe in March 1917 with a Harvard team of fourteen surgeons and four nurses, as director of American Base Hospital #5, he was soon detached to a BEF receiving hospital, where he and his team operated full time on neurosurgical cases. By late April, they had begun consolidating the experience of their European predecessors, carefully and systematically applying, then adjusting, their techniques to lay down principles of traumatic brain surgery. As he and his associates gained experience, their results steadily improved, so that by war’s end the survival rate for brain surgery after war wounds had increased from around 45% to 71%. One of his earliest lessons was that sticking a finger into the brain to find a bullet or fragment was a bad idea (he referred to this as “Little Jack Horner” surgery); instead, Cushing adopted soft rubber tubes snaked into the wound track. By applying gentle suction, he could remove damaged brain tissue, bone fragments and other wound debris. He even adopted a technique using a magnetized steel nail to extract metal fragments from deep inside the brain. Another surgical innovation Cushing adopted was to layer Dichloramine-T, referred to at the time as a chlorine antiseptic but really an early precursor of the sulfa antibiotics, into brain wounds. Combined with careful surgical technique and an insistence on operating as soon after wounding as possible, this approach reduced the rate of brain infections to near zero by war’s end.

Cushing’s Illustration of the Use of a Soft Catheter and Gentle Suction to Debride and Irrigate Brain Wounds

Before the U.S. entered the war, and based on Allied experience, Army Surgeon General Gorgas concluded that we would need something like 200 neurosurgeons. In response to a national survey, about 50 men stepped forward claiming experience. At this point Gorgas established crash 70-day programs in Philadelphia, Chicago, New York, St. Louis and Camp Greenleaf, GA to train selected surgeons in the art of brain surgery. Ultimately, about 190 neurosurgeons served in Europe, though only a few of them continued in the specialty after the war. Cushing returned to Harvard, published his findings and expanded a training program in neurosurgery that he had started before the war. Perhaps in part because of an ongoing debate between non-surgeon neurologists and neurosurgeons over their respective bailiwicks, training of the surgical specialists languished in the United States, and an official certifying body, the American Board of Neurological Surgery, did not come into existence until 1940. Only a few training programs operated in the interwar period: in New York, Virginia, San Francisco, St Louis and Cleveland, at the Mayo Clinic in Rochester, Minnesota, at the Johns Hopkins Hospital in Baltimore, at the University of Pennsylvania, and perhaps a few others. A similarly desultory effort at neurosurgical training appears to have obtained in France and Britain, while the dictatorial regimes of Germany and the Soviet Union seem to have done a better job of planning for the contingencies of war.

As a result of this dearth of residency programs, the Army could count on a pool of only about 200 trained neurosurgeons at the beginning of World War II. Once again, short training programs were established to teach the rudiments of brain surgery to promising young general surgeons; these produced about 250 brain surgeons and went far to meet the demand. By war’s end, there had been nearly 61,000 neurosurgical admissions to U.S. Army hospitals.

After the war, training programs proliferated both in the United States (110 in 2018) and abroad, so that today most major medical schools train neurosurgeons, of whom about 3,500 practice in the United States. Postwar neurosurgery in a divided Germany presents an interesting story: robust development in West Germany produced a growth in neurosurgical centers from 18 in 1950 to 85 by the early 21st century, while in East Germany a struggling economy and regressive regime limited neurosurgical progress to just a few talented individuals who nonetheless gained worldwide renown. By 2006, 1,200 fully trained neurosurgeons were serving the entire German population, performing nearly a quarter million neurological operations yearly. Many advances in the years since World War I, including antibiotics, CT scanning, electrocautery for control of bleeding, and medications to reduce the brain swelling that accompanies brain injuries, have made brain surgery ever safer and more successful. Today, surgery for traumatic brain injuries represents about 18% of all brain operations in the U.S. (2011 statistics, the most recent available). Eighteen percent of traumatic brain injuries are caused by firearms (the majority being suicides); of those, 90% are fatal, the patients usually dying even before reaching hospital. Cancer surgery, such as that offered to the late Senator McCain, represents another 21% of brain operations.

Next week, Part II, the story of plastic / reconstructive surgery.

© 2018 Thomas L Snyder


Comments

  • Kay Flavell  On 03 Dec 2018 at 10:37

    A very interesting read, with a most impressive historical span.

    • thomaslsnyder  On 03 Dec 2018 at 12:03

      Thanks, Kay. As a non-historian writing medical history, I always worry that the work isn’t up to par, despite (or perhaps because of) the PhDs’ mantra at the American Association for the History of Medicine (once dominated by old white physicians and now dominated by (mostly) old white PhDs), that “MDs provide the verisimilitude while PhDs provide veracity” (or something like that…).
