There has never been greater or more widespread suffering in America than in the years of our Civil War, 1861-1865, and a huge proportion of it came from disease. The concentration of vast armies gave U.S. and Confederate soldiers and physicians a medical situation unprecedented in America—which they were ill-prepared to deal with, despite European lessons from the Crimea and the Napoleonic period. Men in the armies faced the massive threat of epidemic disease, whether by treating others or by contracting disease themselves. While inexperience and medical ineptitude prolonged or even caused much suffering from such diseases, general knowledge of how to limit them did increase during the war years, as did knowledge of how to treat the suffering patients. In fact, the process of dealing with such problems better prepared Americans to embrace and apply the theoretical advances of Pasteur and others in the years and decades to come.
As this study seeks to understand how Americans in the military perceived and dealt with disease in this period, it will examine both contagious diseases (such as smallpox) and environmentally-caused diseases (such as diarrhea). While the vectors are different, physicians and soldiers generally considered them linked and treated all such diseases in the same ways—though types of treatment varied widely. These treatments produced failures as well as successes, both intended and accidental.
A STATE OF UNREADINESS
The state of medical knowledge in America in 1861 was mediocre at best, compared to that of Europe (Houck 1986, 50). Some students learned by apprenticeship from a single practitioner; a student at a medical school often required only four to nine months of study to graduate (Cunningham 1958, 11). The studies usually consisted of large lecture classes which merely observed dissections and other demonstrations, rather than engaging in hands-on work (Schroeder-Lein 1994, 30-31); the administrations of these schools often cared more about the recruitment of students to pay tuition than the skill or dedication of the prospective doctors (Rutkow 2005, 51). Much of the doctrine taught at such schools was heavily grounded in the work of physicians like Dr. Benjamin Rush (1745-1813), an American physician highly regarded for his prolix and definitive writings on many areas of medicine. Unfortunately his work consisted of little more than strong reaffirmations of Greco-Roman medical practices, with little or no acknowledgement of any advances since then; his death ironically resulted from complications of his own bleeding and purging regime (Rutkow 2005, 53). In spite of this, his professionalism and conviction caused many to regard him as “the American Hippocrates” (Rutkow 2005, 41-44).
Armed with often outdated principles, most school-trained physicians graduated into a world of small-town medical isolation, where few could or would compare notes with other physicians to find better treatments and identify unsatisfactory ones. Others learned their trade as apprentices, producing practitioners Rutkow calls “unbeholden to guidelines or principles”; he encapsulates the problem by noting that “the unscientific basis of medical practice made it impossible to prove or disprove the soundness of specific claims” (2005, 49). For better or for worse, the old and new physicians who would fight the war on disease during the greater conflict generally gained what experience they had through practical work, rather than study. Chance decided whether a doctor’s experiences caused him to progress beyond or regress into the fallacies circulating in the country’s cumulative medical consciousness.
The existing knowledge relating to epidemic disease was largely empirical; Thomas P. Lowry described well the “crucial medical verities: [T]he use of quinine in the prevention and treatment of malaria; vaccination as a preventive of smallpox; [and] opium and its derivatives in the treatment of pain and diarrhea” (2001, 131). Overuse of drugs in some of these cases will be discussed below, but properly controlled methods were effective. Civil War physicians did possess some progressive and useful knowledge going into the conflict.
Conversely, some long-held, long-debunked medical ideas still persisted among physicians with little or no modern, formal training. Some still practiced the ancient techniques of bloodletting and blistering (Houck 1986, 135). Medical historian Ira Rutkow notes that “the mid-nineteenth century physician’s materia medica (the drugs and other therapeutic substances used in medicine) consisted of herbal and mineral concoctions similar to those used since the time of Hippocrates” (2005, 40). A semipopular belief among physicians held that diseases could transform from one to another over the course of illness, which allowed multiple conflicting diagnoses to take place. This could and sometimes did result in dangerous combinations of treatments. Some students recognized the folly of their elders: historian H.H. Cunningham states that “Some young practitioners . . . undoubtedly felt that the sick ‘would be better off if they trusted entirely to nature rather than to the haphazard empiricism of the doctors, with their blistering, bleeding, and monumental dosing’” (1958, 13). Indeed, in 1866 surgeon Francis Peyre Porcher would admonish the former “Confederate physicians on their overzealous use of certain drugs” (Houck 1986, 134). Then as now, overdoses could kill patients, and often did.
America had relatively few cities and city-dwellers in the 1860s, which shaped the medical destinies of both physicians and soldiers. Soldiers who grew up on farms or in small towns had never been exposed to many “childhood diseases” such as measles and mumps; thus their chances of catching them (and consequently developing immunities) were low. Such diseases can be much more dangerous when contracted by adults. Large musters of nonimmune and unvaccinated troops needed only one case of such diseases before many were laid low—as the examples in the following section shall indicate. Most physicians had experience in small towns, or in rural beats with comparably small populations. Just as the recruits’ bodies were unprepared for the diseases bred by huge concentrations of men, the physicians were ill-prepared to deal with the sheer magnitude of the problems that would arise. Huge numbers of contagious sick men, unparalleled sanitation problems, and the vagaries of the elements all confronted the Civil War physician—and only through success and failure of response would he learn from it all.
With the onset of war in 1861, soldiers and physicians flocked to the colors on both sides to offer their services. But while the rank and file would have months of time and training before their first battles, the trials began for the doctors almost as soon as they rode into camp. The aforementioned childhood diseases began to take their toll almost immediately and other diseases followed. Peter Houck describes the recurring problem of disease in warfare: “The cramming together of humanity at its worst, soldiers in close quarters, wetness, exposure, mental and physical exhaustion, all engendered an ideal culture media [sic] for opportunistic bacteria and viruses” (1986, 40). The soldiers would now endure nature’s worst.
Horace Cunningham points out that Southerners were more likely to suffer from these because the South had fewer cities and thus a higher proportion of its recruits lacked immunities (1968, 25). Nonetheless, both sides had numerous cases of measles, mumps, whooping cough, and chicken pox. A typical representative of the units mustering North and South, the Third Tennessee had half to three-quarters of its members down with childhood diseases, especially measles, in its first two months of service: about 500-750 sick out of 1,000 men (Schroeder-Lein 1994, 42). Peter Houck estimates that measles attacked one in seven Confederates in summer 1861 (1986, 41); the disease rendered 67,763 Union men disabled and 4,246 dead over the course of the war (Houck 1986, 45). Similar tolls are estimated for the effects of other childhood diseases.
Even as these childhood diseases cut their swathes through the rural soldiery, other contagious diseases ravaged the camps regardless of the recruits’ home origins. The deadliest of these, by far, was smallpox. The aptly-named “Pest House” in Lynchburg, Virginia, the quarantine location for all the city’s smallpox patients, provides a good example of the horrors caused by this disease:
“Soldiers bedridden with the draining sores of smallpox were crowded into the two-story frame house, incubating a mixture of epidemic pestilence. The stench from the sores and poor hygiene was so pungent that drunkenness was the only way patients and staff could tolerate it . . . It was no wonder that this ‘smallpox hospital’ was conveniently located next to the Confederate graveyard” (Houck 1986, 45).
Union troops suffered 12,236 cases of smallpox throughout the war, causing 4,717 deaths. Though most Confederate records are incomplete due to the burning of Richmond at war’s end, it is known that in one 16-month period, 1,020 smallpox victims died in Virginia alone (Houck 1986, 41).
Edward Jenner’s discovery of a vaccination process for smallpox had occurred at the end of the 18th century, and (some) doctors on both sides of the Atlantic knew well how to implement it. Tragically, however, vaccination was not considered a requirement for recruits upon initial enlistment (Houck 1986, 40), and many susceptible individuals entered the ranks, especially in the first year of the war. This was a general problem in recruiting stations across both sections of America—officers eager to fill their quotas allowed many men of weak constitutions to be sent to the camps. Many of these contracted diseases and spread them to their fellows. Equally harmful was the laxity of standards for army physicians—the weakness in their training caused more deaths through incompetence (Cunningham 1968, 24).
Even before the diseases brought in by the men had finished taking their toll, the environmentally-caused diseases began to arise: diarrhea, dysentery, cholera, pneumonia, malaria, and typhoid fever, among others. While pneumonia was attributed to exposure to the elements, and malaria (or “marsh miasm,” a coinage which demonstrates physicians’ lack of understanding of the vector (Rutkow 2005, 16)) was spread by mosquitoes, the majority of these were caused or magnified by the problems of sanitation in camps—which deserve some discussion.
The absence of the germ theory of disease, which was gradually coalescing in Europe but unheard-of in America, meant that sanitation was little understood. The illiteracy and ignorance of most recruits and doctors meant that sanitary practices were not implemented, especially in the first years of the war. Hospitals, supposedly bastions of health, were sometimes quite the opposite. Sterilization was not practiced, and medical personnel often disagreed on standards of cleanliness (Schroeder-Lein 1994, 109). Bedding and instruments were reused in hospitals without washing, thus continuing the spread of infection.
Views on sanitation differed, but one doctor spoke for many in writing that “the ambition of superior cleanliness, whether it be permanent or a spasmodic feeling, should not be pushed to extremes, or be considered the one thing needed in our hospital” (Houck 1986, 45). With the masses of sick and wounded to deal with, even physicians who desired cleanliness did not usually have the experience or the power to effect it. Without proper sanitation, hospitals concentrating large numbers of sick and wounded often helped more to spread diseases than to assuage them. This was also true for hospital staff; author Louisa May Alcott’s stint as a Northern hospital nurse was cut short by typhoid she caught from patients (Schroeder-Lein 1994, 76). Epidemic diseases made their mark on Alcott’s memory of her brief stay at the hospital: “I spent my hours . . . with pneumonia on one side, diphtheria on the other, [and] two typhoids opposite” (Rutkow 2005, 225).
Conditions in the camps were often even worse. Food, trash, and feces were left everywhere, and “bathers, cattle, cooks, defecators, drinkers and launderers would all share a campsite’s water stores” (Rutkow 2005, 123). Latrines, when they were dug and used, were often still located near the supplies of drinking water, or near hospitals, against the wishes of the doctors, who certainly objected to the smell if not the sanitation problem (Schroeder-Lein 1994, 44). Cholera, diarrhea, dysentery, and typhoid are all transmitted through the excretions of sick individuals—so it is no wonder camps were a breeding ground.
Doctors tried many varied solutions, some of which failed miserably. Some believed diarrhea was caused by “limestone water,” while others believed it to be a cure (Schroeder-Lein 1994, 83). A crucial goal for many physicians in the treatment of typhoid and dysentery was “keeping the bowels open” until the harmful liquids and solids had passed out of the body; to this end, purging with mercury was accepted (Schroeder-Lein 1994, 32-3). However, many such treatments did more harm than good.
The combined effects of epidemic diseases and medical ignorance took a harsh toll on the fighting forces. Some examples illustrate the point: in the campaign leading up to the Battle of First Manassas in July 1861, two untried armies maneuvered toward each other with many illness-prone recruits. The Confederate Army of the Shenandoah, moving to reinforce other forces at Manassas, numbered nearly 11,000 men in total—but left 1,700 of them behind due to sickness (Cunningham 1968, 27). The next month, General Joseph Johnston (CSA) reported that out of 18,178 men present at Manassas, 4,809 were sick (Cunningham 1968, 25).
In spring 1862, the 85th New York sailed to the Virginia Peninsula with the rest of the Army of the Potomac. In one month’s delay in the wet low-lying terrain, it lost 97 men, dead or incapacitated, to malaria alone—one ninth of its initial strength upon enlistment in late 1861, not counting any deaths from disease prior to the Peninsula Campaign. The entire army suffered similarly, with the added misery of diseases brought on by human waste seeping through the marshy ground; within two months 20% of its men were incapacitated (Rutkow 2005, 117). The regiment’s surgeon, William Smith, angrily wrote in his diary: “One of the largest, best and most munificently appointed armies ever led into battle has been wasted in the poisonous swamps of the Peninsula . . . Forty percent of the [army] was hors de combat before the series of battles commencing with the 26th of June [three months after the army made landfall]” (Lowry 2001, 25). Yet only four percent of those incapacitated were rendered so by combat wounds—disease caused the rest (Rutkow 2005, 147). During the same campaign General Henry Wise (CSA) reported to his superiors that nearly 25% of his 1,700 men were ill from various causes (Cunningham 1968, 73). Malaria confers no immunity after infection, so it continued to plague the armed forces throughout the war: Northern records count nearly 1.4 million sick and 15,000 dead from malaria over the course of the conflict (Rutkow 2005, 15).
William Smith’s principal rancor came from the commanding general’s neglect to move the army promptly out of such disease-infested areas—even knowledge the doctors did possess could not always be applied adequately because of such factors. Planning for disease became a part of military physician life: Samuel Stout, medical director for the Army of Tennessee (CSA), allowed for the presence of 4,000 sick in each of the major hospitals he planned (Schroeder-Lein 1994, 124). These allocations did not even begin to accommodate the battle casualties once they began to arrive. Civil War physicians frequently found their hospitals filled with diseased men, leaving little room for the wounded.
Worse than the temporary absences, the fatality rates of the diseases were devastating.
In the prison camps of both sides disease was especially appalling; an average of 18.4% of all Union prisoners in Confederate camps died from infectious disease (Houck 1986, 127). Dysentery and diarrhea killed 6,000 men from both sides in the first 19 months of the war, and physician Bedford Brown stated that “nine-tenths of all recruits were attacked by this condition and so weakened physically that they became easy victims for other ailments” (Cunningham 1958, 185). Joseph Jones, a Confederate surgeon, estimated that of deaths from all causes between January 1862 and August 1863, 17,209 were caused by typhoid (or 25% of the total), and around 19,000 from pneumonia in the same period (Cunningham 1958, 194-202). At Andersonville, widely acknowledged as the worst Confederate prison, nearly 13,000 of its total of 49,000 prisoners died over the eight months it operated. 11,086 of these were reported from disease, out of 15,987 reported cases of disease at the prison hospital (Cunningham 1958, 7)—a 69% mortality rate among those reported sick.
A Southern newspaper correspondent at Manassas in the fall of 1861 wrote: “Disease is by long odds too common and too fatal in our camps . . . a painful degree of mortality has prevailed . . . on Sunday I visited the ‘Junction’ to procure a coffin, and found thirteen orders ahead of me” (Cunningham 1968, 69-70). This testifies to the fate of many Civil War soldiers—death by childhood diseases in the early months, and death by other epidemic diseases for many of those who survived the first onslaught.
IMPROVEMENTS OVER THE COURSE OF THE WAR
Luckily for many more people, soldiers and civilians alike, the army physicians who did not crack under the terrible strain began to learn from their mistakes. Amidst the many failures of Civil War medicine, medical improvements would be made through empirical discoveries in the millions of cases treated, and progressive ideas validated by success were disseminated among American physicians. These would affect both the latter course of the war and the decades to come, though the scope of this essay covers only the former.
Doctor J.J. Terrell, a Confederate physician at Lynchburg, Virginia, was one of the progressive physicians who made a difference in patients’ lives with his methods. Observant of the news from Europe concerning the connection between cleanliness and the spread of disease, he took to boiling his instruments in hot water (Houck 1986, 55). Transferred to the aforementioned Pest House by his own request, he experimented and found that three inches of dry white sand piled on the floor would expel the smell of smallpox sores and improve the atmosphere for patients and doctors alike, allowing better health and treatment. He also expanded the facility and improved sanitation and nutrition; all of the above helped him bring the smallpox death rate at Pest House from 50% (double the average of that time) to 5% (Houck 1986, 56-7). Better sanitation gradually became more and more generally accepted as its effects began to show in the hospitals. The advancement of ideas about sanitary issues in hospitals bore particular fruit at the hospital complex on Richmond’s Chimborazo Hill, by far the largest and best-organized of any military hospital during the war, or any previous American hospital (see below).
Sanitation principles conceived by the doctors made their way into regular camp life as well. Dr. Jonathan Letterman, in addition to revolutionizing the army’s hospital organization, handed down general cleanliness policies which eventually became standard. Tents were to be relocated every week, soldiers were compelled to bathe at least once a week, and stricter regulations of latrine location were instituted. Daily sanitation inspections of each camp were instituted to ensure the measures were rigorously enforced (Rutkow 2005, 126-7). Enough mistakes had been made that soldiers and physicians alike finally began to recognize the patterns of unsanitary conditions, and to embrace empirically tested methods to stop them.
The advances discovered in the armies, especially those in hygiene, began to filter down to civilian life even before the war was over. Rutkow points out that the efforts of civilian groups such as the U.S. Sanitary Commission to render medical aid to soldiers caused rippling effects on the civilian population. Several members of the Sanitary Commission who were residents of New York launched a campaign of sanitation reform which led to the city’s implementation of new cleanliness laws and a citywide sanitation survey in 1863-1864. The key players, of course, were espousing reforms they learned while tending to sick Federal soldiers down south (2005, 268-70).
Nutrition was often overlooked as a factor in patients’ health while combating their diseases, but became recognized and gradually refined. By 1863, the Confederate Surgeon General had issued an order concerning diet amounts and types for hospital patients, and further regulations followed requiring hospital personnel to prescribe a proper diet via a “diet roll” for each patient in each ward (Cunningham 1958, 82). Hospitals were allowed first priority on foods beyond the ken of normal military rations, like chicken, eggs, vegetables, and milk—and as the war went on more were established with their own gardens and pasture land to sustain the facilities (Schroeder-Lein 1994, 103). Though supplies were often limited, doctors evidently associated different foods with different types of disease and recovery and tried to apply that knowledge, as Horace Cunningham states: “Surgeons in charge of hospitals generally attempted to vary the diet as much as possible and to provide special foods for those who required them” (1958, 83).
Both sides undertook hospital construction to care for sick and wounded more effectively, and in better locations, in the years following 1861. In 1864, Confederate Surgeon General Samuel Moore ordered research into proper methods of hospital construction, to be gathered from physicians across the South. In particular he asked for information on “the proper proportion of patients to cubic capacity; necessary hygienic measures to ensure rapid recovery; origin, communication, prevention, and cure of infectious maladies peculiar to military hospitals; and the employment of disinfectants” (Cunningham 1958, 63). Though many of these issues could not be adequately addressed in the failing South’s medical military structure, the mere fact of their mention shows the new sophistication beginning to permeate that structure; they were at least learning to ask the right questions. Moore’s shining example was Chimborazo Hospital, established in Richmond in 1861 and expansive and active throughout the war. With large, uncrowded, and varied facilities; a good situation for drainage and water supply; fine ventilation; and numerous bakeries, icehouses, and pastures to support its capacity of 8,000 patients, Chimborazo became a model for future military and civilian hospitals alike (Denney 1994, 49-50).
Rutkow’s study of medical improvements over the course of the war notes that the Federal military medical administration finally began to draw from the example of the Crimean War. He comments that Nightingalian principles of sanitation reform became so prevalent that pavilion-style hospitals (providing appropriate ventilation and light) “were now considered sacrosanct.” At least one Federal physician declared that no shelter at all for sick men was better than hospitals in improper, unhealthy buildings (2005, 163).
Treatment of diseases themselves improved as well. Malaria, which had long troubled the South in the antebellum years, was treated on a large scale for the first time with quinine, due to success with that treatment (Houck 1986, 47). Though the childhood diseases slowed their rampage late in the war for the simple reason that most surviving soldiers were now immune to them, more advanced quarantine procedures helped as well. By 1865, the rate of measles on the Union side had fallen from 77 per 1,000 men in 1861 to two per 1,000 (Houck 1986, 47).
The other major improvement in the American treatment of epidemic disease as a result of the war concerned smallpox. This took place in the development of measures to deal with large smallpox outbreaks in the military, and in the recognition of the importance of smallpox vaccination. Medical officers on both sides ordered smallpox patients to be quarantined in separate facilities and canceled furloughs for enlisted men or officers who had been recently exposed (Houck 1986, 43). Though the microbial nature of smallpox was not yet known, these directives showed that some practical knowledge was gained in understanding how to limit the spread of the disease.
After smallpox epidemics in 1862 and 1863, the Confederate Medical Department directed that all hospital patients be vaccinated so they would not contract the disease from others in the wards. The scarcity of the vaccine made it all the more valuable—a good smallpox scab from a child (the best source from which a vaccine could be generated) was worth $5 at that time, no small sum. The military necessity even generated benefits for the civilian population: the Confederate Surgeon General sent a message to Lynchburg in 1862 which instructed physicians there to “insure you have a fresh crust [of] vaccine virus with which you will vaccinate gratuitously the young children of the neighborhood and in this way a sufficient quantity of vaccine will be obtained” (Houck 1986, 47). The process had been suggested, even required, in some states in the 15 years before the war, but the concentration of men and the deadly toll exacted upon them by smallpox during the war gave renewed impetus to, and experience in, eradicating the disease through vaccination. Although many soldiers died from smallpox, Ira Rutkow points out that smallpox inoculations were eventually given to enough fighting men that its incidence did not reach epidemic levels (2005, 16)—the mid-war changes in combating the disease kept things from getting worse. Many of the younger physicians who saw their first major trials during the Civil War would spearhead the major American smallpox eradication efforts in the latter half of the nineteenth century—under one government.
When peace emerged in 1865, much of the American South had been destroyed and every section of the country had lost a substantial number of its citizens to the maelstrom of war. Over 620,000 Americans had died, and while accounts differ (and are complicated by missing records on the Confederate side), it is apparent that from 300,000 to 400,000 of those deaths were caused by disease—in all likelihood more than half of the total deaths in the war. Ironically, many, if not most, of the deaths in America’s bloodiest war happened in hospital tents far from the battlefield.
The graves at Andersonville bear mute testimony to those who succumbed to illness during America’s war.
Throughout the turmoil physicians fought their own war against epidemic disease, and, as in the shooting war, there were successes and failures, errors in judgment and flashes of inspiration, which would resound for years to come. Ira Rutkow rightly points out that “there were no astounding medical breakthroughs during the American Civil War” (2005, 318)—the silver lining amidst the terrible cost was the concentration of physicians in a context which allowed them to widely test and disseminate the knowledge they did have. Houck states it well: numerous doctors who had been working in a vacuum were cloistered “into a crucible where they had to communicate, share ideas, and set standards” (1986, 135). The minds and shared consciousness of veteran doctors emerging from the conflict could now prove fertile ground for the astounding medical breakthroughs which would burst upon the scene in the next few decades. Its deadliest war over, America would now benefit from their experience in understanding and applying the wave of medical discoveries of the late nineteenth century.
Bush, Vivian Karen. “The History Time-Line of the 85th New York Volunteers.” Alleghany County, New York GenWeb site. <http://www.rootsweb.ancestry.com/~nyallega/>.
Cunningham, Horace H. Doctors in Gray: The Confederate Medical Service. Baton Rouge: Louisiana State University Press, 1958.
- - -. Field Medical Services at the Battles of Manassas (Bull Run). Athens: University of Georgia Press, 1968.
Denney, Robert E. Civil War Medicine: Care and Comfort of the Wounded. New York: Sterling Publishing Company, Inc., 1994.
Houck, Peter W. A Prototype of a Confederate Hospital Center in Lynchburg, Virginia. Lynchburg: Warwick House Publishing, 1986.
Lowry, Thomas P., ed. Swamp Doctor: The Diary of a Union Surgeon in the Virginia and North Carolina Marshes. Mechanicsburg: Stackpole Books, 2001.
Rutkow, Ira M. Bleeding Blue and Gray: Civil War Surgery and the Evolution of American Medicine. New York: Random House, 2005.
Schroeder-Lein, Glenna R. Confederate Hospitals on the Move: Samuel H. Stout and the Army of Tennessee. Columbia: University of South Carolina Press, 1994.
Welsh, Jack D. Medical Histories of Confederate Generals. Kent: Kent State University Press, 1995.
- - -. Medical Histories of Union Generals. Kent: Kent State University Press, 1996.