
Obama Builds Environmental Legacy With 1970 Law (New York Times)

WASHINGTON — President Obama could leave office with the most aggressive, far-reaching environmental legacy of any occupant of the White House. Yet it is very possible that not a single major environmental law will have passed during his two terms in Washington.

Instead, Mr. Obama has turned to the vast reach of the Clean Air Act of 1970, which some legal experts call the most powerful environmental law in the world. Faced with a Congress that has shut down his attempts to push through an environmental agenda, Mr. Obama is using the authority of the act passed at the birth of the environmental movement to issue a series of landmark regulations on air pollution, from soot to smog, to mercury and planet-warming carbon dioxide.

The Supreme Court could still overturn much of Mr. Obama’s environmental legacy, although the justices so far have upheld the regulations in three significant cases. More challenges are expected, the most recent of which was taken up by the court on Tuesday. The act, however, was designed by lawmakers in a Democratic Congress to give the Environmental Protection Agency, which was created at the same time, great flexibility in its interpretation of the law.

Gina McCarthy, the Environmental Protection Agency administrator, credits the Clean Air Act of 1970 for giving the president the authority to make new, far-reaching environmental policy. Credit: Manuel Balce Ceneta/Associated Press

“It’s the granddaddy of public health and environmental legislation,” said Paul Billings, a vice president of the American Lung Association. “It empowers the E.P.A. and states to be bold and creative.”

Gina McCarthy, the E.P.A. administrator, credits the act for the authority that Mr. Obama claims in setting environmental policy. “The administration is relying very heavily on this tool that Congress provided us 44 years ago,” she said.

Jody Freeman, director of Harvard University’s environmental law program, and a former counselor to the president, said Mr. Obama was using the Clean Air Act “to push forward in a way that no president ever has.”

Taken together, the Clean Air Act regulations issued during the Obama administration have led to the creation of America’s first national policy for combating global warming and a fundamental reshaping of major sectors of the economy, specifically auto manufacturing and electric utilities. The regulations could ultimately shut down existing coal-fired power plants, freeze construction of new coal plants and end demand for the nation’s most polluting fuel.

Republicans and the coal industry have attacked the new rules as a “war on coal.”

Mr. Obama’s most recent regulation, proposed on Wednesday, would reduce ozone, a smog-causing pollutant that is created by emissions from factories and coal plants and is linked to asthma, heart disease and premature death. That regulation is the latest of six new rules intended to rein in emissions of hazardous pollutants from factory and power-plant smokestacks, including soot, mercury, sulfur and nitrogen oxide.

The most consequential regulations are those that cut emissions of carbon dioxide, the gas dispersed from automobile tailpipes and coal plants, which contributes to global warming.

More rules are on the way: By the end of the year, the E.P.A. is expected to announce plans for regulating the emission of methane at natural gas production facilities.

Republicans and industry leaders have fought back against the rules, attacking them as “job-killing” regulations. “The Clean Air Act is a direct threat,” said Hal Quinn, president of the National Mining Association.

Among the fiercest critics is Senator Mitch McConnell, Republican of Kentucky, who is expected to take over as majority leader in the next Congressional term and whose home state is a major producer of coal. Mr. McConnell has vowed to put forth legislation to block or delay the administration’s regulations.

Although the E.P.A. regulations are today the target of Republican ire, in 1970 the Clean Air Act passed with overwhelming bipartisan support, clearing the Senate with a vote of 73 to 0. President Richard M. Nixon, a Republican, signed the bill into law. “The idea was to give E.P.A. broad authority, making sure that it had tools to exercise this authority,” said Robert Nordhaus, an environmental lawyer who, as a staff lawyer in the House legislative counsel’s office, helped draft the law. Today Mr. Nordhaus is a senior partner at the environmental law firm Van Ness Feldman.

Another Republican president, the first George Bush, enacted a 1990 update to the Clean Air Act, which strengthened the E.P.A.’s authority to issue regulations. Mr. McConnell was among the 89 senators who voted for passage of the 1990 law. “I had to choose between cleaner air and the status quo,” Mr. McConnell said at the time. “I chose cleaner air.”

The 1990 iteration of the Clean Air Act also included requirements that the E.P.A. issue, and periodically update, regulations on pollutants such as ozone and mercury. Some of Mr. Obama’s new regulations are a result of that requirement.

Mr. Obama, however, is the first president to use the law to fight global warming. After trying and failing to push a new climate-change law through Congress aimed at curbing greenhouse gas pollution, the president went back to the Clean Air Act.

The E.P.A. issued a Clean Air Act regulation in Mr. Obama’s first term. The agency required automakers to comply with tough new vehicle fuel-economy standards of 54.5 miles per gallon by 2025. The regulations compelled the auto industry to research and develop hybrid and electric vehicles. Those requirements alone are expected to lead to a major reduction of carbon pollution in the coming decades.

Next year, the E.P.A. is to finalize two regulations aimed at limiting pollution from new and existing coal-fired power plants. Once they are enacted, the regulations could eventually transform the way electricity is produced, transmitted and consumed in the United States, leading to more power generation from alternative sources like wind, solar and nuclear.

But the regulations could also cause costly disruptions in power reliability and transmission, forcing companies to look for breakthroughs in technology to meet the requirements.

Officials at the Edison Electric Institute, which lobbies for privately owned electric utilities, said the regulations were forcing the industry to drastically reshape the way it does business. “He’ll have dozens of these rules under his watch,” Quin Shea, vice president of the institute, said of the president. “Taken together, they will have a far-reaching effect of transforming the electric power sector for the next 20 years.”

Correction: December 2, 2014
An article on Thursday about President Obama’s new environmental regulations misstated how ozone gets into the air. Ozone is a smog-causing pollutant created by emissions from factories and coal plants; it is not itself emitted into the air. The error also occurred in an article and headline on Wednesday about the announcement of the regulations.

The History of Pain (The Appendix)


How should we write the history of that most fundamental but subjective characteristic of sentience: pain?


I recently came across a fascinating article in The Appendix by Ph.D. candidate Lindsay Keiter entitled Interpreting “Physick”: The Familiar and Foreign Eighteenth-Century Body, which law professor Frank Pasquale excerpted on Twitter.

Because I am an historian of pain, this excerpt naturally piqued my interest, and I went to examine the entire article. Now, I must confess straight off that I study 19th and early 20th century America. But I spend an awful lot of time interloping in early modern and medieval studies of pain, in part because my work addresses changing ideas of pain in the 19th century. If you really want to understand how ideas about pain change in the modern era, you need to know at least something about the ideas that preceded them.

I was, I confess, quite skeptical about the excerpt, but I wanted to read the article from start to finish. Here is what Ms. Keiter has to say about pain:

Most visitors are mildly alarmed to learn that there was nothing available for mild, systemic pain relief in the eighteenth century. You’d have to come back next century for aspirin. Potent pain management was available via opium latex, often mixed with wine and brandy to make laudanum. In the eighteenth century, small amounts were used as a narcotic, a sedative, a cough suppressant, or to stop up the bowels, but not for headaches.

This is (appropriately) carefully qualified, but even so, I do not think it is quite right. I think there are two points that are really important to clarify when thinking about the use of medicinal therapies for the relief of pain.

First, it has long been argued that professional healers, at least as far back as the Middle Ages, generally were not focused on alleviating their patients’ pain. Of course, then, as now, pain is a multivalent, rich, and highly ambiguous phenomenon, one that lends itself easily to metaphor and account in a wide variety of social domains. So, as Esther Cohen shows, most discussions of pain in Western medieval culture tend to appear in theological contexts, whereas early modern and modern expressions of pain often appear more in literary formats. It is actually surprisingly difficult to find people discussing their own phenomenologies of pain specifically in therapeutic contexts.

A mid-17th century depiction of a quack doctor and his assistants performing public surgery on an unfortunate young man. Various medicinal liquors and unguents are on display to his right, and a recently treated man is hastily downing a post-op beer while being wheeled away from the scene. (Jan Steen, “The Quack Doctor,” c. 1660, Rijksmuseum Amsterdam)

But both medievalists and early modernists have set about revising, or at least complicating, some aspects of the long-held belief that analgesia was not a major priority. Cohen shows beyond doubt that in late medieval culture both lay sufferers and healers focused a great deal on pain, and that there is ample evidence from which to conclude that healers believed in the importance of alleviating their charges’ pain and strove, where possible, to do so. She notes:

Surcease might not have been the primary goal of physicians, who often considered pain an ancillary phenomenon, but in the end, the recommended cure was meant to also bring freedom from pain. It is important to remember that the great majority of the suffering sick agreed with this point of view. People turned to saints, physicians, or simple healers to have their pain eased, not increased. No matter how vociferous the literature in praise of pain is, it cannot silence the evidence for the basic human search for painlessness.

The evidence, as I understand it, suggests that a medieval emphasis on the redemptive qualities of pain, and on the difficulties of ameliorating it, existed alongside a fairly intense and significant focus on the need to alleviate it.

On Twitter, historian of medicine Samantha Sandassie noted that one can find many recipes for analgesic remedies in early modern casebooks and treatises:

Daniel Goldberg @prof_goldberg .@FrankPasquale @ArsScripta I’m a modern historian of pain, not EM, but this strikes me as not quite right. Relief of pain was major [1]

Samantha Sandassie @medhistorian .@FrankPasquale @ArsScripta agreeing with @prof_goldberg on this one; surgical casebooks & treatises contain fair bit of pain mngment info.

In a follow-up email, Ms. Sandassie suggests examining primary sources such as The Diary of Elizabeth Freke and The Diary of the Rev. Ralph Josselin.

In early modern contexts, historians such as Lisa Smith and Hannah Newton have documented overt, and in some contexts (the pain and suffering of children) even overwhelming, medical and healing attention to experiences of pain and the need for its alleviation in illness scenarios.

So, I would want to suggest that we lack a lot of good evidence for the claim that even “mild” and “systemic” pain did not occupy the attention of healers in the West during the 18th c., and we have an increasing historiography suggesting that in fact such pain occupied a good deal of attention both in those who experienced it and in those from whom the pain sufferers sought relief.

The second point I want to make here is a larger claim regarding thinking about how well medicines may have “worked” in past contexts. And here I’d like to emphasize the flip side of Ms. Sandassie’s excellent point above: that while the past may not be incommensurable, it is nevertheless at times so very different from our contemporary world that presentism is an ever-present danger. The past, as L.P. Hartley famously observed, is a foreign country, and sometimes we benefit from treating it that way.

What does it mean for a remedy to “work”? And in answering this question as responsible historians, we cannot supply an answer that provides the criteria for what “works” in our contemporary contexts — and of course there are vibrant debates on exactly what it means for a medicine to “work” even among contemporaries. For example, does a medicine work for the relief of pain if it fails to surpass placebo in a relevant clinical trial? Given that placebos can be quite effective in relieving pain, at least temporarily, do we conclude from the failure that the medicine does not “work”?

In historical context, historians of medicine should IMO aim to answer this question by asking what it meant for people in the periods in which we are interested for a remedy to “work.” Although I am an historian of ideas rather than a social historian, I take the lessons of the New Social Turn seriously. If we really want to gain insight into the phenomenology of pain and illness in the past, we have to inquire as to the social meaning of medicines and remedies in their own contexts.

In his classic 1977 paper “The Therapeutic Revolution: Medicine, Meaning and Social Change in Nineteenth‑Century America,” Charles Rosenberg argued that remedies that “worked” in the early 19th century were those that had visible effects consistent with what one would expect and desire in a humoral system:

The American physician in 1800 had no diagnostic tools beyond his senses and it is hardly surprising that he would find congenial a framework of explanation which emphasized the importance of intake and outgo, of the significance of perspiration, of pulse, of urination and menstruation, of defecation, of the surface eruptions which might accompany fevers or other internal ills. These were phenomena which he as physician, the patient, and the patient’s family could see, evaluate, and scrutinize for clues as to the sick person’s fate.

But if diagnosis for the physician, the illness sufferer, and the family depended in pertinent part on visible signs that signified morbid changes in humoral balance, one would predict that remedies which also operated on this semiotic basis would be favored. Rosenberg states:

The effectiveness of the system hinged to a significant extent on the fact that all the weapons in the physician’s normal armamentarium worked — “worked,” that is, by providing visible and predictable physiological effects: purges purged, emetics induced vomiting, opium soothed pain and moderated diarrhea. Bleeding, too, seemed obviously to alter the body’s internal balance, as evidenced both by a changed pulse and the very quantity of blood drawn. Blisters and other purposefully induced local irritations certainly produced visible effects — and presumably internal consequences corresponding to their pain and location and to the nature and extent of the matter discharged.

This, then, is the point. It is not productive, in my view, to think about whether or not early 19th c. or 18th c. remedies for pain “worked” by applying present notions of efficacy. Such an approach does not make sense of how those who used and received the remedies for pain would have understood those remedies — it obfuscates both their own phenomenologies of pain and their own efforts and the efforts of their caregivers, intimates, and healers to alleviate that pain. What we would have to do to understand the extent to which 18th c. remedies for pain “worked” is to understand what it meant for such a remedy to work for people of that time.

Rosenberg, of course, emphasizes the importance of understanding the “biological and social realities” of therapeutics in early 19th c. America. And in a follow-up Twitter exchange, Benjamin Breen was quick to point out (correctly, I think) that

Benjamin Breen @ResObscura @prof_goldberg @allenshotwell @medhistorian But on the other hand, the biological efficacy of drugs had a real historical role, no? I.e.
Daniel Goldberg @prof_goldberg @ResObscura @AllenShotwell @medhistorian *writing feverishly* — hope to have post up soon . . .
Benjamin Breen @ResObscura @prof_goldberg @allenshotwell @medhistorian cinchona RX for malaria was major event, precisely because it “worked” where others failed.

This is an important corrective. Acknowledging what anthropologists have termed the “social lives of medicines” is not equivalent to denying “biological reality.” However, I reject a neat distinction between biological action and cultural factors. This is not to deny the reality of the former, but to argue instead that the distinction is not particularly helpful in making sense of the history of medicine.

*   *   *

Interpreting “Physick”: The Familiar and Foreign Eighteenth-Century Body

A recreated apothecary shop in Alexandria, Virginia, is representative of the style and organization of eighteenth-century shops. (Wikimedia Commons)

Aspirin. It’s inevitable.

When asked what medicine they’d want most if they lived in the eighteenth century, visitors will struggle in silence for a few moments. Then, someone will offer, “Aspirin?” Sometimes it’s delivered with a note of authority, on the mistaken notion that aspirin is derived from Native American remedies: “Aspirin.” I modulate my answer depending on the tone of the question. If the mood feels right, especially if the speaker answers with a note of condescension, I’ve been known to reply, “Do you know anyone who’s died of a mild headache?”

I work as an interpreter in the apothecary shop at the largest living history museum in the US.

If visitors leave with new knowledge, I’m satisfied. Another hundred people will remember the next time they make a meringue that cream of tartar is a laxative. But what I really want them to come away with is an idea. I want visitors to understand that our eighteenth-century forbears weren’t stupid. In the absence of key pieces of information—for example, germ theory—they developed a model of the body, health, and healing that was fundamentally logical. Some treatments worked, and many didn’t, but there was a method to the apparent madness.

This seventeenth-century engraving shows medicines being compounded and dispensed. Women were not licensed as apothecaries in the eighteenth century, but evidence suggests that in England, at least, they sometimes assisted husbands and fathers. (Wolfgang Helmhard Hohberg’s Georgica Curiosa Aucta, 1697, via Wellcome Images)


Most visitors are mildly alarmed to learn that there was nothing available for mild, systemic pain relief in the eighteenth century. You’d have to come back next century for aspirin. Potent pain management was available via opium latex, often mixed with wine and brandy to make laudanum. In the eighteenth century, small amounts were used as a narcotic, a sedative, a cough suppressant, or to stop up the bowels, but not for headaches.

There were headache treatments, however. Colonial medical practitioners recognized multiple types of headaches based on the perceived cause, each with its own constellation of solutions. As is often the case, the simplest solutions were often effective. For a headache caused by sinus pressure, for example, the treatment was to induce sneezing with powdered tobacco or pepper. Some good, hard sneezing would help expel mucus from the sinuses, thus relieving the pressure. For “nervous headaches”—what we call stress or tension headaches—I uncork a small, clear bottle and invite visitors to sniff the contents and guess what the clear liquid inside could be.

With enough coaxing, someone will recognize it as lavender oil. While eighteenth-century sufferers rubbed it on their temples, those with jangling nerves today can simply smell it—we don’t understand the exact mechanism, but lavender oil has been shown to soothe the nervous system. As a final example, and to introduce the idea that the line between food and medicine was less distinct two hundred years ago, I explain the uses of coffee in treating migraines and the headaches induced after a “debauch of hard liquors.” Caffeine is still used to treat migraines because it helps constrict blood vessels in the head, which can reduce pressure on the brain.

But if your biggest medical concern in the eighteenth century was a headache, you were lucky. Eighteenth-century medical practitioners faced menaces like cholera, dysentery, measles, mumps, rubella, smallpox, syphilis, typhus, typhoid, tuberculosis, and yellow fever. Here are a few.


Malaria

The bark of the cinchona tree, called Peruvian bark or Jesuits’ bark, was an important addition to the European pharmacopeia. (Wikimedia Commons)

In discussing larger threats, I generally choose to focus on an illness that many visitors have heard of before, and for which a treatment was available. The “intermittent fever” also gives visitors a glimpse of one of the difficulties of studying the history of medicine: vague and often multiple names for a single condition. Intermittent fever was so called because of a particular symptom that made it easier to identify among a host of other fevers: sufferers experienced not only the usual fever, chills, and fatigue, but also paroxysms, cycles of intense chills followed by fever and sweating. Severe cases could result in anemia, jaundice, convulsions, and death.


After describing the symptoms to guests, I mention that the disease tended to afflict those living in swampy, hot, low-lying areas—such as Williamsburg. Older visitors often put it together—intermittent fever is what we call malaria. And typically, they know the original treatment for malaria was quinine.

It’s one of the times I can say, “We have that!” rather than, “Give us another hundred years.” I turn to the rows of bottles on the shelf behind me—not the eighteenth-century original apothecary jars that line the walls, but a little army of glass bottles, corked and capped with leather. The one I’m looking for is easy to find—a deep red liquid in a clear glass bottle. As I set it on the front counter, I introduce the contents: “Tincture of Peruvian bark.” I tend to add, “This is something I would have in my eighteenth-century medical cabinet.” Walking to the rear wall, I pull open a drawer and remove a wooden container. I lift the lid to reveal chunks of an unremarkable-looking bark. I explain that the bark comes from the cinchona tree, and, as undistinguished as it looks, it was one of the major medical advances of the seventeenth century.

Also called Jesuits’ bark, cinchona was used as a fever-reducer by native peoples in South America before being exported by the Jesuits to Europe. Its efficacy in fighting fevers soon made it a staple in English medical practice. While eighteenth-century apothecaries were ignorant of quinine, which would not be isolated and named until the 1810s, they were nonetheless prescribing it effectively.

The rings and dark dots are the result of infection by Plasmodium falciparum, one of the species of protozoa that cause malaria. (Wikimedia Commons)

I make a point of explaining to visitors that quinine does not act like modern antibiotics do in killing off infections directly. Malaria is neither bacterial nor viral, but protozoan. Quinine (and more modern drugs derived from it and increasingly from Chinese Artemisia) interrupts the reproductive cycle of the malaria protozoa, halting the waves of offspring that burst forth from infected red blood cells. The protozoa, now rendered impotent, hole up in the sufferer’s liver, often swarming forth later in life in another breeding bid. So technically, once infected, you’ll always have malaria, but can suppress the symptoms.

Peruvian bark was used to treat a wide range of fevers, but it was not the only treatment. In certain instances of fever, it was used in conjunction with bloodletting. Bloodletting is a practice I’m always eager to explain, because it is so revealing of just how much our understanding of the body has changed in two centuries. Plus it freaks people out.


Fevers: A Note on Phlebotomy


Bloodletting, or phlebotomy, dates back to antiquity. In the humoral theory of the body promulgated by Greco-Roman physicians, removing blood promoted health by balancing the humors, or fluids, of the body. This theory prevailed from roughly the fourth through the seventeenth centuries. Medical theorists gradually adopted a more mechanical understanding of the body, inspired by a renewed interest in anatomy and by experiments that explored the behavior of fluids and gases. These new theories provided an updated justification for bloodletting in particular cases.

“Breathing a Vein” by James Gillray, 1804. (Wikimedia Commons)

Whereas bloodletting had been a very widely applied treatment in ancient times, eighteenth-century apothecaries and physicians recommended it in more limited cases. In terms of fevers, it was to be applied only in inflammatory cases, which were associated with the blood, rather than putrid or bilious fevers, which were digestive. In Domestic Medicine, a popular late-eighteenth century home medical guide, physician William Buchan warned that “In most low, nervous, and putrid fevers … bleeding really is harmful … . Bleeding is an excellent medicine when necessary, but should never be wantonly performed.”

Eighteenth-century medical professionals believed that acute overheating often brought on inflammatory fevers. Key symptoms of inflammatory fevers were redness, swelling, pain, heat, and a fast, full pulse. Anything that promoted a rapid change in temperature, such as overexertion or unusually spicy food, could set off a chain reaction that resulted in inflammation. Drawing on mechanical theories of the body and renewed attention to the behavior of fluids, doctors complicated simple humoral explanations of disease. Blood, as a liquid, was presumed to behave as other liquids did. When heated, liquids move more rapidly; within the closed system of the human body, overheated blood coursed too rapidly through the body, generating friction. This friction in turn generated more heat and suppressed perspiration and urination, which compromised the body’s natural means for expelling illness. Removing a small quantity of blood, doctors reasoned, would relieve some of the pressure and friction in the circulatory system and allow the body to regulate itself back to health.

Picking up the lancet, I roll up my sleeve and gesture to the bend of my elbow, where blue veins are faintly visible through my skin. Generally, I explain, blood was let through these veins by venesection, in which the lancet—a small pointed blade that folds neatly into its wooden handle like a tiny Swiss Army knife—is used to make a small incision below a fillet, a looped bandage tightened on the upper arm to control blood flow. The process is akin to blood donation today, except that the blood removed will be discarded. Apothecaries and physicians, striving to be systematic and scientific, often caught the escaping blood in a bleeding bowl—a handled dish engraved with lines indicating the volume of the contents in ounces. The volume of blood removed, Buchan cautioned, “must be in proportion to the strength of the patient and the violence of the disease.” Generally, a single bloodletting sufficed, but if symptoms persisted, repeated bloodlettings might be advised.

Visitors are generally incredulous that the procedure was fairly commonplace, and that people did it in their homes without the supervision of a medical professional. Bloodletting was sometimes recommended to promote menstruation or encourage the production of fresh blood. Both published medical writings and private papers suggest that folk traditions of bloodletting for a variety of reasons persisted throughout the eighteenth century.


Modern guests question both the safety and the efficacy of bloodletting. In terms of safety, it was generally a low-risk procedure; one function of bleeding was to push pathogens out of the body, thus limiting the risk of blood-borne infections. Routine bloodletting was typically limited to six or eight ounces of blood. By comparison, blood donors today give sixteen ounces. The human body is actually fairly resilient and can withstand substantial blood loss, so even in acute cases where blood was repeatedly let, exsanguination was unlikely to be the cause of death. One famous case visitors sometimes bring up is the death of George Washington in December 1799. While it is difficult to know the circumstances precisely, Dr. David Morens of the National Institutes of Health argues that the first President was afflicted with acute bacterial epiglottitis. The epiglottis is the small flap that prevents food from entering the airway or air from entering the stomach; when it becomes infected it swells, making eating, drinking, and breathing increasingly difficult, and eventually impossible. According to notes taken by the trio of physicians who treated Washington, he endured four bloodlettings in twelve hours, removing a total of 80 ounces of blood—the limit of what was survivable. This aggressive treatment presaged the “heroic” medicine of the nineteenth century and was far out of line with the recommendations of earlier physicians such as Buchan. Even so, Morens suspects that asphyxiation, not bloodletting, was the cause of death.
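For a sense of scale on those volumes, here is a rough unit conversion; treating the figures as US fluid ounces and assuming a typical adult blood volume of about five liters are my assumptions, not details given in the article.

\[
80~\text{fl oz} \times 29.6~\frac{\text{mL}}{\text{fl oz}} \approx 2.4~\text{L},
\qquad
\frac{2.4~\text{L}}{5~\text{L}} \approx 0.48
\]

In other words, Washington lost on the order of 40 to 50 percent of his total blood volume in twelve hours, depending on his actual blood volume, which squares with the description of 80 ounces as roughly the limit of what was survivable.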

Thus, while bloodletting probably caused few deaths, it also saved few lives. Aside from a possible placebo effect, bloodletting’s primary efficacy is in treating rare genetic blood disorders such as polycythemia (overproduction of red blood cells) and hemochromatosis (an iron overload disorder). So while the logic behind bloodletting seemed reasonable, that logic rested on the lack of a critical piece of information. “What actually caused most of the diseases doctors tried to treat with bloodletting?” I’ll ask. “Germs!” a visitor calls out. “Unfortunately,” I reply, “it will be another seventy-five years until the medical establishment accepts that we’re all covered in microscopic organisms that can kill us.”


The Common Cold

Most medical recommendations weren’t so seemingly bizarre, however. Eighteenth-century doctors strove to “assist Nature” in battling disease by recommending regimens—modifications of behavior, environment, and diet that were thought to promote recovery. Doctors and caretakers induced vomiting (an “upward purge”), defecation (a “downward purge”), urination, and/or sweating to help the body expel harmful substances and offered diets that were thought to help warm, cool, or strengthen the body. When visitors ask what the most commonly prescribed medicine was, we can’t give them a direct answer—the apothecaries kept track of debts and credits but not what was purchased—but we tell them the most common category of medicine we stocked was a laxative. Keeping one’s digestion regular was a priority in the eighteenth century.

The common cold has been with humanity for a very long time: it was described as far back as 1550 BCE, in the Egyptian medical text known as the Ebers Papyrus. (Wikimedia Commons)

Visitors are often surprised to hear that they unwittingly follow regimens themselves, often for the same common ailments that laid low our colonial and revolutionary forbears. The example I typically use is the common cold, for which there is and never has been, alas, a cure. Looking to the row of children typically pressed up against the counter, I ask, “When you’re sick and can’t go to school, do you get more rest, or more exercise?” “Rest,” they answer in chorus. “And where do you rest?” “In bed.” “And what do you eat a lot of when you’re sick?” “Soup” and “juice” are the usual answers. “You’re behaving much as you would have two hundred and fifty years ago!” I tell them. “Doctors recommended resting someplace warm and dry and eating foods that were light and easy to digest—including broths and soups.”

Visitors are fascinated and often charmed to hear that the treatment of colds has essentially stayed the same. “When you take medicine for your cold,” I continue, “does it make you feel better or make your cold go away?” Most people are dumbfounded when they consider that the majority of medicines, in the eighteenth century as today, are meant to alleviate symptoms. Then as now, individuals and households selected treatments for stuffy noses, coughs, and fevers.


Surgery


While the treatment of disease has aspects both foreign and familiar, our distance from our forbears truly comes across in the comparatively primitive state of surgery and midwifery. Because the squeamishness of guests varies widely, and because interpreters are discouraged from inducing nausea or fainting, we must proceed cautiously.

Surgery, visitors are shocked to hear, was not a prestigious profession until recently. In the eighteenth century, any physical undertaking for medical purposes was surgery—bandaging, brushing teeth, bloodletting. While the professions were separate in England, in the colonies apothecaries often took on surgical duties; low population density generally prevented specialization outside of large cities. In England, surgeons shared a guild (professional organization) with barbers, who pulled teeth and let blood as well as groomed and styled hair. A surgeon’s purview was more expansive—surgeons set broken bones, amputated extremities when necessary, and removed surface tumors, work that required greater knowledge of anatomy.

Simple breaks could be set manually, as they are today. I often use myself as an example—I have an approximately fifteen-degree angle in my wrist from a bad fall several years ago. I explain that my wrist was set manually, with no pain management, very much as it might have been in the eighteenth century. (You know you’re a historian when that’s what you’re thinking on the gurney as a doctor jerks your bones back into alignment.)

Before plaster casts were developed in the nineteenth century, broken bones could only be splinted. This engraving shows more elaborate splints for broken legs. (Wellcome Images)

Two factors limited the scope of surgical operations in the eighteenth century. The first was the lack of antisepsis; with no knowledge of germ theory, and thus little ability to control infection, surgeons avoided guts and kept operations as simple and efficient as possible. The second was pain.

A visitor always asks, “What did they do for pain?” Upon being told, “Nothing,” they blanch and then argue.


“What about opium?”

“Opium makes you vomit, and you’re restrained during operations and often on your back. You wouldn’t want to choke to death during your surgery.”

“They had to have done SOMETHING! A few shots of whiskey at least.”

While we can’t be sure what people did at home or while waiting for the doctor to arrive, doctors opposed the consumption of spirits before surgery because of increased bleeding. Occasionally, a visitor will ask if patients were given a thump on the head to make them lose consciousness.

“Well, the pain will probably wake you up anyhow,” I point out, “and now you have a head injury as well as an amputation to deal with.”


Generally, amputations lasted less than five minutes—minimizing the risk of infection and the chances of the patient going into shock from blood loss and pain. Limbs weren’t simply lopped off, however. Surgeons could tie off large blood vessels to reduce blood loss, and the surgical kit we display shows the specialized knives, saws, and muscle retractors employed by surgeons to make closed stumps around the severed bone.

Removing troublesome tumors, most commonly from breast cancer, was another challenge surgeons faced. This surprises some visitors, who tend to think of cancer as a modern disease. I’ve even had a visitor insist that there couldn’t have been cancer two hundred years ago, when there were no chemical pesticides or preservatives. I informed him that cancer can also arise from naturally occurring mutations or malfunctions in cells—it even showed up in dinosaurs. Mastectomies have been performed for thousands of years. Because there was no means of targeting and controlling tumors, aggressive growth sometimes caused ulcerations through the skin, causing immense pain and generating a foul smell. Medicines such as tincture of myrrh were available to clean the ulcers and reduce the smell but did nothing to limit the cancer’s growth.

When ulceration made the pain unbearable or the tumor’s size interfered with everyday activities, sufferers resorted to surgery. Surgeons sought to remove the entire tumor, believing that if the cancer were not rooted out entirely, it would strike inward where they could not treat it. They were half right; cancer is prone to reappearing elsewhere in the body. Unfortunately, the removal of tumors triggers this—tumors secrete hormones that prevent the proliferation of cancer cells in other areas of the body. Removing tumors unleashes dormant cancer cells that have been distributed throughout the body. Without antisepsis and anesthesia, surgeons could not follow cancer inward.


Midwifery

Childbirth was one mystery partially penetrated in the eighteenth century. Prominent British physicians turned their attention to the anatomy and physiology of fetal development and conducted dissections—perhaps made possible by trade in freshly murdered cadavers in major British cities.

Illustration of fetal development in William Smellie’s Treatise on the Theory and Practice of Midwifery. (Wellcome Images)

William Smellie, a Scottish physician, produced some of the most accurate illustrations and descriptions of birth then available. Smellie’s Treatise on the Theory and Practice of Midwifery promoted the entry of male doctors into the traditionally sex-segregated birthing room. European medical schools began offering lecture series in midwifery leading to a certificate. The vast majority of women, especially in rural areas, continued to be delivered by traditional female midwives, but man-midwives were newly equipped to handle rare emergencies in obstructed delivery. Obstetrical forceps became more widely available over the course of the eighteenth century, though they were still cause for alarm; Smellie recommended that the “operator” carry the disarticulated forceps blades in his side-pockets, arrange himself under a sheet, and only then “take out and dispose the blades on each side of the patient; by which means, he will often be able to deliver with the forceps, without their being perceived by the woman herself, or any other of the assistants.”


You can read more about Smellie’s inventions and early modern birthing devices in Brandy Schillace’s Appendix article “Mother Machine: An Uncanny Valley in the Eighteenth Century.”

In the shop, we rarely talk about the other equipment male doctors carried, for fear of upsetting visitors or creating controversy. Men continued to carry the “destructive instruments” long used to extract fetuses in order to save the mother. With forceps, man-midwifery moved in the direction of delivery over dismemberment, but dismemberment remained an inescapable task before caesarean sections could be performed safely. Despite this avoidance, the subject periodically pops up and forces me as the interpreter to rely on innuendo. One particularly discomfiting instance involved an eleven-year-old girl who asked about babies getting stuck during delivery. After explaining how forceps were used, she asked, “What if that didn’t work?” The best I could come up with was, “Then doctors had to get the baby out by any means necessary so the woman wouldn’t die.” She sensed my evasion and pressed on—“How did they do that?” Unwilling to explain how doctors used scissors and hooks in front of a group including children, I turned a despairing gaze on her mother. Fortunately, she sensed my panic and ushered her daughter outside; what explanation she offered, I do not know.

Examples of some of the “destructive instruments” man-midwives carried. (Wellcome Images)

Most women, fortunately, experienced uncomplicated deliveries and did not require the services of a man-midwife. Birth was not quite so fraught with peril as many visitors believe. While I’ve had one visitor inform me that all women died in childbirth, human reproduction generally works quite well. American colonists enjoyed a remarkably high birthrate. While there were regional variations, maternal mortality was probably about two percent—roughly ten times the maternal mortality rate in the United States (which lags significantly behind other developed countries). Repeated childbearing compounded these risks; approximately 1 in 12 women died as a result of childbearing over the course of their lives. Childbirth was a leading cause of death for women between puberty and menopause.
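As a back-of-envelope check on how the per-birth and lifetime figures above fit together: the roughly two percent risk per delivery and the 1-in-12 lifetime figure come from the paragraph above, while the number of births \(n\) is an assumed illustrative value, not something the article specifies.

\[
P_{\text{lifetime}} \approx 1 - (1 - p)^{n}, \qquad p \approx 0.02
\]
\[
n = 4:\; 1 - 0.98^{4} \approx 0.077 \approx \tfrac{1}{13},
\qquad
n = 6:\; 1 - 0.98^{6} \approx 0.114 \approx \tfrac{1}{9}
\]

Four to six deliveries at roughly two percent risk apiece thus bracket the 1-in-12 estimate, which is the compounding effect of repeated childbearing described above.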

Improvements in antisepsis, prenatal care, fetal and maternal monitoring, and family planning over the past two centuries have pulled birth and death further apart. Fear of death altered how parents related to infants and children, how couples faced childbearing, and the reproductive strategies they adopted. While this fear persists today, it is far more contained than it was two centuries ago.



Americans today live in a world of medical privilege unimaginable to their colonial forbears. It’s not because we are smarter or better than we were two hundred and fifty years ago. We are the beneficiaries of a series of innovations that have fundamentally altered how we conceptualize the body and reduced once-common threats. Guests in the Apothecary Shop today think of headaches as their most frequent medical problem because so many pressing diseases have been taken off the table.


From this privileged perspective, it’s all too easy to look down on those who believed in bloodletting or resorted to amputation for broken limbs. But the drive to do something to treat illness, to seek explanations for disease as a means of control, to strive to hold off death—these impulses haven’t changed.

As I often tell visitors—give it two hundred and fifty years, and we’ll look stupid too.