Medicine and the State

By Nick Houtman

America’s epic struggle with compulsory health insurance began more than 100 years ago. As the nation was entering World War I, a proposal to require workers, employers and the government to pay into a health-care fund stirred discussions among legislators, physicians and labor organizers from Massachusetts to California. The concerns they raised would be familiar to anyone following the modern debate: access to care, physician-patient relationships, public expense, regulation.

Cover from a leaflet promoting the future National Insurance Act in the United Kingdom in 1911. (Source: Publication Department of the Liberal Party)

These often heated arguments serve as a prologue to more recent political turmoil, but they also point to the underlying forces at work in our medical system: the knowledge that provides a basis for effective therapy and the social and political landscapes that shape health care. By delving into personal diaries, official documents, pamphlets and other records archived in the United States and Europe, Oregon State University historians show how government and medicine have been intertwined for centuries.

That relationship was at the heart of the early debate about health insurance. In 1912, former president Theodore Roosevelt and the Progressive Party called for “a system of social insurance” to protect Americans from sickness, unemployment and old age. Two years later, the American Association for Labor Legislation (AALL) followed up with a compulsory health-insurance proposal to provide workers who were injured or sick with medical care for up to six months. People who were self-employed or out of work could participate on a voluntary basis.

It was a radical departure from what amounted to a free-for-all in health care, says Michael Osborne, OSU professor of the history of science. “Science was not the gatekeeper to get into medicine. We had what we might call scientific physicians, but we also had all these healing sects, such as Seventh Day Adventist healers, homeopathy and an odd practice called Thomsonianism (which regarded the human body as a balance of earth, air, fire and water).”

Meanwhile, advances in science and medical practice were reducing death rates and increasing lifespans. Physicians had started using X-rays for diagnosis and vaccinating populations to prevent smallpox, rabies and other diseases. New surgical procedures offered treatments for cancer. However, these services also carried a price tag. For millions, many of the latest treatments were as accessible as a month’s paid vacation.

AALL’s proposal was modeled on health-insurance systems in Germany and England. Despite an initially warm response from physicians, the idea would suffer from political winds set in motion by World War I. “All things German became bad by association,” says Osborne.

As described in Almost Persuaded by University of Wisconsin medical historian Ronald Numbers, opponents of compulsory health insurance seized on its German pedigree. In California, a campaign poster asked, “Made in Germany. Do you want it in California?” In 1918, that state’s voters rejected it in a referendum by more than 70 percent.

Before the United States entered the war, some state medical societies supported compulsory insurance, but eventually the idea also lost favor with doctors. In 1921, the American Medical Association declared its opposition to any form of “state medicine.”

Science and the Sun King

Anita Guerrini

In 17th century France, the path to understanding the structure and function of the human body was paved with the tools of dissection. In her book, The Courtiers’ Anatomists (University of Chicago Press, 2015), Anita Guerrini tells a story of scientists who, with royal approval and support, pursued knowledge of anatomy by meticulously dismembering animals, both living and dead, as well as human cadavers.

“Under the cover of night,” she writes, “the dead of Paris made their journey from the burial grounds to the places of dissection. In this era of recurrent plagues, their numbers never dwindled, and for three centuries from the 1530s, they did not lie quiet in their graves.”

The stage for these investigations had been set by William Harvey, the English scientist whose dissections led him to the momentous discovery of how blood circulates. That finding was a game-changer, says Guerrini, but it was Paris, not London, that became the epicenter for what she justifiably describes as “the most widespread and significant scientific activity of the seventeenth century.”

Dissections were performed privately and for educational purposes, but those at the King’s Garden were open to the public by the king’s decree. “People would come in. There’d be music; there were tickets,” says the Horning Professor in the Humanities who has also authored histories on animal experimentation and the environment. “And people would get kind of upset, especially the students, the teenage surgical apprentices, many of them 14- and 15-year-old boys. There were riots. It was an intensely emotional experience.”

The practice of dissection was one of many forays into a blossoming world of science and art. “Science was part of general cultural activities,” says Guerrini. “People are going to the opera, reading novels, writing poetry, often the same people. Science is one of these new things that people get interested in.”

However, she adds, hard-won anatomical knowledge wouldn’t lead to new medical treatments for many decades. Although science revolutionized the understanding of bodily parts as a mechanical system, diseases were still considered imbalances in the “humors” of blood, phlegm and yellow and black bile. Physicians paid close attention to the color and consistency of everything that came out of their patients’ bodies because it was their only insight, other than the pulse, into what happened in the living body. Diet, bleeding, herbs and sleep were vital to the physician’s tool kit.

“They still used bleeding as a therapy, even if they knew that there’s only a finite amount of blood in the body,” says Guerrini. “It works because it relieves some symptoms. If you have a fever, bleeding can make the fever go away, at least for a while. But the underlying theory of medicine was still conservative.”

Colonial Ills

Ben Mutschler

Before germ theory and other discoveries of the 19th and 20th centuries, health and illness were separated by an invisible veil, which could be pierced at any time. That reality was no different across the Atlantic in colonial America. Ben Mutschler calls New England a “province of affliction,” referring both to the region and to the state of social and political relations.

For an upcoming book of that title, the associate professor of history delved into diaries, petitions, court records and pamphlets to understand how sickness influenced daily life. “The social and political costs of illness radiated outward from the afflicted to their families and towns and connected them, finally, to the highest levels of government,” says Mutschler, who specializes in colonial- and revolutionary-era America.

Keep in mind, he adds, that disease and illness have been a part of governance since colonial times. “Part of the social contract between the powerful and the weak included protection from the consequences of illness. In the colonies, local governments were petitioned by citizens who had fallen into poverty or debt because they suffered from disease.”

Relief for the sick was a costly and demanding enterprise involving churches as well as government. Not surprisingly, officials and elected representatives tried to limit health-care expenses for the poor as well as for war veterans. Even after the Revolution, the federal government tried to avoid paying former soldiers for illnesses that veterans claimed were a consequence of military service.

“Illness was pervasive enough to become subject to legislation in a variety of domains — public health regulations, poor laws, the law of household governance, and provisions for soldiers and their families,” writes Mutschler.

Previewing today’s debate about immigration, local colonial councils invoked residential status in decisions to grant or withhold health-care assistance. Mutschler adds: “Families, towns, and commonwealths were asked to care for their own, which led to struggles over just who rightfully belonged to these entities and what, if anything, could be allowed the ‘stranger’ or ‘foreigner’ in distress within local society.”

Military Medicine

Michael Osborne

In the 18th and 19th centuries, trade and colonialism grew in the face of epidemics that killed indigenous people and Europeans alike. Outbreaks of yellow fever, cholera, malaria and smallpox decimated and reshaped communities on every continent.

Medical men wore masks to avoid the flu at U.S. Army Hospital No. 4 in Fort Porter, New York, during the 1918-19 ‘Spanish’ Influenza pandemic.

In his book, The Emergence of Tropical Medicine in France (University of Chicago Press, 2014), Michael Osborne has traced struggles with these and other diseases in the French empire. At the heart of this story are physicians trained by the French navy, which ran colonies and tried to maintain health on ships and in port cities from France to the Caribbean, West Africa and Southeast Asia. Military physicians were often on the front lines, as healers, patients and victims.

Health care in these places reflected the state of science as well as theories of how race and culture affected immunity. “The germ theory of disease comes online in the 1870s and 1880s. Robert Koch and Pasteur were lucky enough to find anthrax bacteria,” says Osborne. Later, advances in parasitology and virology put an end to debates about whether malaria and yellow fever were separate illnesses or manifestations of a single disease.

“The big thing is the great flu pandemic at the end of World War I — ‘the great teacher,’ as it’s called in the literature of public health,” adds Osborne.

Infectious diseases still challenge governments and health-care organizations. The Zika virus, for example, is carried by the same mosquito (Aedes aegypti) that spreads yellow and dengue fevers. “Think about the (lifetime) cost of caring for the babies born with microcephaly. It’s at least $1.5 million per child,” says Osborne. “When we think about the cost of health care, especially in epicenters of Zika infection such as Brazil, where termination is not an option even if the fetus is determined to have microcephaly, that just leaps out at us.”

As of last spring, Zika had been found in Latin America, the Caribbean, the South Pacific, Southeast Asia and the United States (Florida and south Texas). Innovative prevention measures include releasing mosquitoes bioengineered to trigger a population collapse, but such steps carry risks. Removing one species could open opportunities for others to replace it.

“The fear is that we could still have mosquitoes carrying Zika that bite during the day, and you might have another reservoir of mosquitoes that bite at night. In terms of human epidemiology, it could be much worse than it is now,” says Osborne. “So there are these imponderables.”

What’s clear from these historical examples, however, is that government and health care will remain closely entwined, no matter how the debate about compulsory insurance unfolds.