HSCI 230 — Lesson 1

The History of Epidemiology

Evaluating Epidemiological Research — HSCI 230

Kiffer G. Card, PhD, Faculty of Health Sciences, Simon Fraser University

Learning objectives for this lesson:

  • Trace the historical development of epidemiological thinking from ancient Greece to the modern era
  • Identify key figures and innovations that shaped the discipline
  • Explain how networks of science, technology, culture, and bureaucracy drove progress in public health
  • Describe global improvements in human health using evidence-based perspectives
  • Apply critical theory to interrogate the colonial and racial origins of epidemiological knowledge
  • Evaluate who has benefited from — and who has been harmed by — the advancement of epidemiology


Section 1

From Ancient Plagues to Modern Sanitation

⏱ Estimated reading time: 15 minutes

Learning Objectives

  • Identify the earliest roots of epidemiological thinking in the ancient world.
  • Describe the key innovations in vital statistics, vaccination, and sanitation from the 17th–19th centuries.
  • Explain how networks of actors, technologies, and institutions — not lone geniuses — produced advances in public health.

Ancient Roots: Disease and Environment

Long before the germ theory of disease, humans tried to make sense of why some people fell ill and others did not. The earliest epidemiological thinking emerged from the recognition that disease is not random — it follows patterns connected to the environment, seasons, and ways of life.

Hippocrates (c. 460–377 BC) is often cited as the first epidemiologist. In his treatise Airs, Waters, and Places, he urged physicians to consider the effects of climate, water quality, and geography on health. He introduced the terms epidemic (diseases that visit a community) and endemic (diseases that reside within a community), distinctions that remain foundational today.

Key Concept: Environment and Disease

Hippocrates' insight was radical for its time: rather than attributing disease to the wrath of gods, he proposed that illness had natural causes tied to environmental conditions. This shift — from supernatural to natural explanations — was one of the most consequential intellectual moves in the history of medicine.

For centuries after Hippocrates, the dominant Western theory of disease was miasma theory: the idea that illness was caused by "bad air" rising from rotting organic matter and swamps. Though wrong about the mechanism, miasma theory was productive. It motivated sanitation reforms, clean water infrastructure, and drainage projects that genuinely improved health: the right interventions, justified by the wrong theory.

The 17th Century: Counting the Dead

The emergence of epidemiology as a quantitative discipline required a seemingly simple innovation: the systematic recording of births and deaths.

John Graunt (1620–1674), a London haberdasher with no formal medical training, published Natural and Political Observations Made upon the Bills of Mortality in 1662. Graunt analyzed London's weekly death records and discovered remarkable patterns: the regularity of sex ratios at birth, seasonal variations in mortality, and differences in urban versus rural death rates. He constructed the first known life table, estimating the probability of survival to each age.
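The logic of Graunt's life table can be sketched in a few lines: starting from a cohort of 100 births, subtract the deaths in each age band to see how many survive to the start of the next. The figures below are illustrative, loosely in the spirit of Graunt's 1662 estimates rather than his actual numbers.

```python
# A minimal sketch of Graunt-style life-table logic: from a count of deaths
# in each age band, compute how many of an initial cohort of 100 births
# survive to the start of each band. All numbers are illustrative.

def life_table(deaths_by_band, cohort=100):
    """Return (age_band, survivors_at_start_of_band) pairs."""
    table = []
    alive = cohort
    for band, deaths in deaths_by_band:
        table.append((band, alive))
        alive -= deaths
    return table

# Illustrative age bands (years) and deaths per 100 births.
deaths = [("0-6", 36), ("7-16", 24), ("17-26", 15), ("27-36", 9),
          ("37-46", 6), ("47-56", 4), ("57-66", 3), ("67-76", 2)]

for band, alive in life_table(deaths):
    print(f"ages {band}: {alive} of 100 still alive")
```

The striking feature, then as now, is how steeply the cohort thins in childhood: under these illustrative figures, barely a third of those born survive past adolescence.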

Actor-Network Insight: Bureaucracy as a Tool of Knowledge

Graunt's breakthrough was only possible because of an existing bureaucratic infrastructure: the London Bills of Mortality, which had been compiled since 1532, originally to track plague outbreaks. The administrative apparatus of parish clerks recording deaths, week by week, created a dataset that no individual physician could have assembled. This is a pattern we will see repeatedly: epidemiological advances depend on networks of institutions, technologies, and actors — not just individual brilliance.

The 18th Century: Vaccination and Empirical Observation

Edward Jenner (1749–1823) is celebrated for developing the first vaccine in 1796, when he demonstrated that inoculation with cowpox material protected against smallpox. But Jenner's innovation, too, emerged from a broader network. Dairy workers had long observed that milkmaids who contracted cowpox seemed immune to smallpox — this was folk knowledge, circulating informally for decades before Jenner tested it systematically.

Jenner's contribution was to formalize this observation into an experimental test and to advocate for its widespread adoption. The success of vaccination also depended on state infrastructure: governments needed to organize distribution, manage public trust, and enforce compliance. In 1840, Britain made vaccination free, and in 1853 it became compulsory — an early example of how public health requires the coordination of science, governance, and social institutions.

A Note on Whose Knowledge Counts

Jenner's story also illustrates a recurring tension in the history of science: the folk knowledge of working-class dairymaids was essential to the discovery, yet it is Jenner — the physician who formalized and published the finding — who receives the credit. Throughout the history of epidemiology, we will encounter this pattern of knowledge extraction, where the observations and experiences of ordinary people (often marginalized communities) become the raw material for scientific advances attributed to elite professionals.

The 19th Century: The Golden Age of Sanitation

The 19th century saw the emergence of epidemiology as a recognizable discipline, driven by industrialization, urbanization, and devastating epidemics of cholera, typhus, and tuberculosis.

John Snow (1813–1858)

Often called the "father of modern epidemiology," Snow famously traced a London cholera outbreak in 1854 to a contaminated water pump on Broad Street. By carefully mapping cases and comparing water sources, he demonstrated that cholera was spread through contaminated water — not miasma.

Snow's work was groundbreaking because it used systematic evidence to identify a cause of disease, even without understanding the underlying biological mechanism (the cholera bacterium would not be identified until 1884). His approach — mapping cases, comparing exposed and unexposed groups, identifying a specific source — established the basic logic that epidemiology still follows.

However, Snow's work did not happen in a vacuum. His investigation depended on the Registrar General's mortality records (compiled by William Farr), on the cooperation of local officials, and on a growing public demand for sanitary reform. The removal of the Broad Street pump handle was as much a political act as a scientific one.

Ignaz Semmelweis (1818–1865)

Working in the maternity wards of Vienna General Hospital, Semmelweis observed that the mortality rate from puerperal (childbed) fever was dramatically higher in the ward staffed by physicians than in the ward staffed by midwives. He hypothesized that physicians were transmitting "cadaverous particles" from the autopsy room to laboring women.

In 1847, Semmelweis introduced mandatory handwashing with chlorinated lime solution, and the mortality rate dropped from approximately 10–18% to around 1–2%. Despite this dramatic evidence, his findings were rejected by much of the medical establishment, who were offended by the suggestion that their hands could be instruments of death.

Semmelweis's story illustrates how institutional resistance, professional ego, and entrenched hierarchies can delay the adoption of life-saving knowledge. His ideas were only widely accepted after germ theory was established by Pasteur and Koch decades later.

William Farr (1807–1883)

As the first compiler of statistical abstracts for the Registrar General of England and Wales, Farr developed standardized methods for classifying diseases and analyzing mortality rates. He created techniques for comparing death rates across populations — the forerunner of modern vital statistics and disease surveillance.

Farr's work made epidemiology quantitative and systematic. By standardizing how diseases were categorized and deaths were recorded, he built the infrastructure that allowed Snow and others to do their investigative work. Farr also demonstrated the concept of excess mortality — comparing observed deaths to expected deaths — a technique that remains central to epidemiology today.
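Farr's excess-mortality logic can be sketched directly: compare the deaths observed in a period against the number expected from a historical baseline. The figures below are hypothetical, not historical data.

```python
# A minimal sketch of Farr's excess-mortality comparison: observed deaths
# minus the number expected from a baseline of "normal" years.
# All figures are illustrative.

def excess_mortality(observed, baseline):
    """Excess deaths = observed minus the mean of the baseline years."""
    expected = sum(baseline) / len(baseline)
    return observed - expected

# Hypothetical annual deaths: five ordinary years vs. an epidemic year.
baseline_years = [480, 510, 495, 505, 500]   # expected deaths ~498
epidemic_year = 742

print(f"excess deaths: {excess_mortality(epidemic_year, baseline_years):.0f}")
```

The same comparison of observed against expected deaths was used to estimate the true toll of COVID-19, where many deaths were never attributed to the virus on death certificates.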

Florence Nightingale (1820–1910)

Nightingale is best known as a nursing reformer, but she was also a pioneering statistician who used data visualization to advocate for public health reform. Her famous "coxcomb" (polar area) diagrams showed that far more British soldiers in the Crimean War died from preventable infectious disease than from battle wounds.

After the war, Nightingale turned her statistical methods to studying disease in British India, collaborating with investigators, publishing papers, and developing theories about sanitation and disease transmission. She was among the first to use statistical evidence systematically to drive policy change, demonstrating that data could be a powerful tool for advocacy.

Networks of Progress: An Actor-Network Perspective

A common way to tell the history of epidemiology is as a series of "great men" moments — Hippocrates, Graunt, Jenner, Snow. But this narrative obscures the networks of actors that made each breakthrough possible.

Actor-Network Theory (ANT), developed by sociologists Bruno Latour and Michel Callon, offers a richer framework. ANT proposes that scientific advances emerge from networks that include not just human actors (scientists, bureaucrats, patients) but also non-human actors: technologies (the microscope, the printing press, the water pump), institutions (the Registrar General's office, hospitals, parish churches), documents (bills of mortality, death certificates), and even organisms (cholera bacteria, cowpox virus). No single actor produces knowledge alone; innovation is the product of a network.

Thinking with ANT

Consider the Broad Street pump investigation. Snow's "discovery" required: (1) a bureaucratic system producing mortality records; (2) cartographic technology to map cases spatially; (3) a cultural context of sanitary reform that made people receptive to environmental explanations; (4) a water infrastructure that connected households to identifiable pumps; (5) local knowledge from residents about their water habits; and (6) political will to act on the findings. Remove any node from this network and the "breakthrough" does not happen.

This perspective is important because it helps us see that progress in public health is never inevitable. It depends on the alignment of scientific knowledge, technological capacity, institutional infrastructure, cultural readiness, and political will. When these networks function well, health improves. When they break down — or when they are organized to benefit some populations and not others — people suffer.

Knowledge Check — Section 1

1. Hippocrates' key contribution to early epidemiological thinking was:

Hippocrates shifted thinking about disease from divine punishment to natural causes related to environment, climate, and geography — a foundational move for epidemiology.

2. What infrastructural element made John Graunt's 1662 analysis of mortality patterns possible?

Graunt's work depended on the pre-existing bureaucratic system of parish clerks recording weekly deaths — a reminder that data infrastructure makes epidemiology possible.

3. From an Actor-Network Theory perspective, John Snow's Broad Street pump investigation succeeded because of:

ANT emphasizes that breakthroughs emerge from networks of human and non-human actors. Snow's investigation depended on bureaucratic records, technology, institutions, and political context — not his intellect alone.

References & Further Reading

Hippocrates. Airs, Waters, and Places (c. 400 BC). Translated in Adams, F. (1849). The Genuine Works of Hippocrates. London: Sydenham Society.

Graunt, J. (1662). Natural and Political Observations Made upon the Bills of Mortality. London.

Snow, J. (1855). On the Mode of Communication of Cholera. 2nd ed. London: John Churchill.

Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford University Press.

Nightingale, F. (1858). Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army. London: Harrison and Sons.

Section 2

The Modern Era & the Case for Optimism

⏱ Estimated reading time: 15 minutes

Learning Objectives

  • Describe the major phases of epidemiological development in the 20th and 21st centuries.
  • Explain the shift from infectious to chronic disease epidemiology.
  • Use an evidence-based perspective to evaluate claims about global health progress.

The Evolution of Modern Epidemiology

The 20th century transformed epidemiology from a discipline focused primarily on tracking infectious disease outbreaks into a sophisticated science of population health. This transformation occurred in recognizable phases, each building on the networks of knowledge, technology, and institutions established by predecessors.

Preformal Phase (17th–19th Century): Building the Data Infrastructure

This phase, which we covered in Section 1, established the foundational practices of systematic data collection. The Bills of Mortality, vital registration systems, and census taking created the raw materials that epidemiology would analyze. The key innovation was not any single discovery but the creation of bureaucratic systems that generated population-level data over time.

Early Phase (1900–1945): Refining Investigation Methods

The early 20th century saw the development of formal epidemiological investigation methods. In 1927, Kermack and McKendrick published the SIR (Susceptible-Infected-Recovered) model, establishing mathematical techniques for predicting the spread of infectious diseases through populations. This model — and its many descendants — would later become essential tools during HIV/AIDS, SARS, and COVID-19.
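The SIR model's core logic can be sketched in a few lines of Python: the population is split into susceptible, infected, and recovered fractions, and two rates (transmission and recovery) govern the flows between them. The parameter values below are illustrative, not taken from the 1927 paper.

```python
# A minimal sketch of the Kermack-McKendrick SIR model, integrated with
# simple Euler steps. S, I, R are population fractions; beta is the
# transmission rate and gamma the recovery rate. Parameters are illustrative.

def sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Simulate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Illustrative epidemic with basic reproduction number R0 = beta/gamma = 2.5
s, i, r = sir(beta=0.5, gamma=0.2, s0=0.999, i0=0.001, r0=0.0, days=120)
print(f"after 120 days: S={s:.3f}, I={i:.3f}, R={r:.3f}")
```

Even this toy version reproduces the model's key qualitative insight: the epidemic burns out before everyone is infected, because transmission stalls once the susceptible fraction falls below gamma/beta.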

This era also saw the rise of occupational epidemiology, studying how working conditions affected health, and the beginnings of cohort study methodology.

Classic Epidemiology (1945–1976): The Chronic Disease Revolution

As infectious diseases were brought under better control through antibiotics, vaccination, and sanitation, epidemiology turned its attention to chronic diseases — heart disease, cancer, stroke, diabetes — which became the leading causes of death in industrialized nations.

The landmark study of this era was the British Doctors' Study, launched in 1951 by Richard Doll and Austin Bradford Hill. By following more than 40,000 British physicians over several decades, they established the causal link between cigarette smoking and lung cancer — one of the most consequential findings in public health history. This study also helped establish the prospective cohort study as a powerful tool for investigating chronic disease causation.

In the United States, the Framingham Heart Study, begun in 1948, similarly followed an entire community over time and identified major cardiovascular risk factors including high blood pressure, high cholesterol, smoking, obesity, and diabetes. These are concepts and risk factors you have almost certainly heard of — they became common knowledge because of this era of epidemiological research.

Bradford Hill also formalized the Bradford Hill criteria (1965) for evaluating causation from epidemiological evidence — a set of considerations (strength, consistency, specificity, temporality, biological gradient, plausibility, coherence, experiment, and analogy) that remain widely used today.

Modern Epidemiology (1976–Present): Complexity and Causal Inference

Since the mid-1970s, epidemiology has become increasingly methodologically sophisticated. Key developments include:

  • Causal inference methods: Directed Acyclic Graphs (DAGs), counterfactual reasoning, and potential outcomes frameworks for reasoning about causation more rigorously
  • Molecular epidemiology: Integration of genetic and biomarker data to understand disease mechanisms at the molecular level
  • Social epidemiology: Formal study of how social structures, institutions, and inequalities shape health outcomes
  • Global health epidemiology: Tracking and responding to pandemics, from HIV/AIDS to SARS to COVID-19
  • Data science and computational methods: Machine learning, large-scale administrative data linkage, and real-time disease surveillance systems

The randomized controlled trial (RCT), while first developed in the late 1940s (the 1948 streptomycin trial for tuberculosis is considered the first modern RCT), became established as the "gold standard" for testing interventions during this period.

The Case for Optimism: Evidence-Based Progress

Looking at the arc of history, there is a strong — if sometimes surprising — case to be made that the world has gotten dramatically healthier, wealthier, and safer. This is the argument made powerfully by the late Swedish physician and statistician Hans Rosling in his book Factfulness: Ten Reasons We're Wrong About the World – and Why Things Are Better Than You Think (2018).

Rosling argued that most people, including the highly educated, hold systematically distorted views of the world — believing things are worse than they actually are. His work centered on letting the data tell the story. Consider some of the evidence:

  • Child Mortality: From 43% to 3.7%
  • Maternal Mortality: A Dramatic Decline
  • Extreme Poverty: 85% to Under 10%
  • Life Expectancy: More Than Doubled
  • Violence: The Long Decline
  • Infectious Disease: Smallpox to COVID

Rosling's Core Message

Rosling did not argue that the world is fine or that we should be complacent. He argued that the world is both better than most people think and still deeply flawed — and that recognizing progress is essential for making further progress. If we believe nothing works, we become fatalistic. If we see that vaccination campaigns, sanitation investments, and education programs have produced measurable improvements, we have reason to invest further in evidence-based interventions.

His key insight for epidemiology students: data literacy matters. The ability to read data accurately, resist cognitive biases, and communicate evidence clearly is itself a public health skill.

Networks of Progress in the Modern Era

From an actor-network perspective, the health improvements of the past two centuries were not produced by epidemiology alone. They emerged from the alignment of multiple systems:

  • Science and technology: Germ theory, antibiotics, vaccines, diagnostic tools, statistical methods
  • Infrastructure: Clean water systems, sewage treatment, food safety regulation, cold chain logistics
  • Governance and bureaucracy: Vital registration, disease surveillance, public health agencies (WHO, CDC), regulatory frameworks
  • Culture and education: Literacy, scientific thinking, norms around hygiene and health-seeking behavior
  • Economic development: Rising incomes, improved nutrition, housing, and working conditions
  • Social movements: Sanitary reform, women's suffrage (leading to investment in maternal and child health), labor rights, anti-tobacco advocacy

When these networks operate in concert, health improves — sometimes dramatically. But this optimistic narrative, while grounded in real data, tells only part of the story. In the next section, we will ask: whose health improved, at whose expense, and what knowledge was built on the suffering of the marginalized?

Reflection

Think about the global health progress described above. Are you surprised by any of these statistics? Why do you think most people tend to underestimate how much the world has improved? What might be the consequences of this "negativity bias" for public health policy?

Knowledge Check — Section 2

1. The British Doctors' Study by Doll and Hill was historically significant because it:

The British Doctors' Study (1951 onward) was a landmark prospective cohort study that established smoking as a cause of lung cancer, marking the shift toward chronic disease epidemiology.

2. According to Hans Rosling, a key reason people systematically underestimate global health progress is:

Rosling's central argument is that cognitive biases (negativity instinct, gap instinct, etc.) combined with poor data literacy cause systematic misperceptions about global trends.

3. Which of the following best reflects the actor-network perspective on health improvements?

ANT emphasizes that no single actor or factor produces change. Health improvements result from the alignment of multiple interacting systems — and when those systems fail or exclude groups, disparities result.

References & Further Reading

Doll, R. & Hill, A. B. (1954). The mortality of doctors in relation to their smoking habits. British Medical Journal, 1(4877), 1451–1455.

Rosling, H., Rosling, O. & Rosling Rönnlund, A. (2018). Factfulness: Ten Reasons We're Wrong About the World – and Why Things Are Better Than You Think. Flatiron Books.

Pinker, S. (2011). The Better Angels of Our Nature: Why Violence Has Declined. Viking.

Kermack, W. O. & McKendrick, A. G. (1927). A contribution to the mathematical theory of epidemics. Proceedings of the Royal Society A, 115(772), 700–721.

Hill, A. B. (1965). The environment and disease: Association or causation? Proceedings of the Royal Society of Medicine, 58(5), 295–300.

Dawber, T. R., Meadors, G. F. & Moore, F. E. (1951). Epidemiological approaches to heart disease: The Framingham Study. American Journal of Public Health, 41(3), 279–286.

Section 3

The Shadow History: Colonialism, Race, and the Making of Epidemiology

⏱ Estimated reading time: 20 minutes

Learning Objectives

  • Apply Michel Foucault's concepts of biopower and biopolitics to the history of epidemiology.
  • Describe how colonialism, slavery, and war contributed to the development of epidemiological knowledge.
  • Identify specific historical cases of research exploitation and their lasting consequences.
  • Analyze how historical injustice produces contemporary health disparities.

Complicating the Narrative

In the previous two sections, we told a largely optimistic story: epidemiology emerged from ancient roots, was refined through centuries of innovation, and contributed to dramatic global health improvements. That story is true — but it is incomplete.

A critical examination of the history of epidemiology reveals a shadow history: one in which the accumulation of medical knowledge depended on the suffering, exploitation, and dehumanization of enslaved people, colonized populations, and marginalized communities. The institutions and systems that produced "progress" for some populations simultaneously produced harm for others. Understanding this is not about rejecting epidemiology; it is about understanding the discipline more honestly and building a more just practice going forward.

Foucault, Biopower, and the Politics of Population Health

The French philosopher Michel Foucault (1926–1984) developed a set of concepts that are essential for thinking critically about the history of epidemiology. In The History of Sexuality, Volume 1 (1976) and his lectures at the Collège de France, Foucault introduced two interrelated ideas:

Biopower

Foucault used the term biopower to describe a form of political power that emerged in the 18th century: the power to manage life itself at the level of entire populations. Unlike earlier forms of sovereign power (the king's right to kill or let live), biopower operates through the regulation of birth rates, mortality rates, fertility, reproduction, health, and longevity. Biopower is the "set of mechanisms through which the basic biological features of the human species became the object of a political strategy."

Biopolitics

Biopolitics refers to the specific political practices that emerge when life and population become objects of governance. Census-taking, vital statistics, public health campaigns, quarantine measures, immigration screening, and disease surveillance are all biopolitical practices — they manage populations by monitoring, categorizing, and intervening in their biological existence.

Why does Foucault matter for epidemiology? Because epidemiology is, at its core, a biopolitical science. It emerged alongside — and in service of — the modern state's need to know, count, categorize, and manage populations. This is not inherently sinister; surveillance systems save lives. But Foucault invites us to ask uncomfortable questions:

  • Who gets counted? Whose deaths are recorded and whose are invisible?
  • Who gets studied? Whose bodies become the raw material for scientific knowledge?
  • Who benefits? Do the populations studied also benefit from the knowledge produced?
  • Who decides? Who controls the categories, the surveillance systems, the research agendas?

As we will see, the answers to these questions reveal deep inequities in how epidemiological knowledge was produced and for whom.

Maladies of Empire: Epidemiology's Colonial Origins

In Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine (2021), historian Jim Downs argues that the standard origin story of epidemiology — centered on John Snow and the Broad Street pump — obscures a deeper, more troubling history. The systematic study of disease patterns in populations did not begin in London; it was developed through the infrastructures of colonial empire, the slave trade, and military campaigns.

The Plantation as Laboratory

Colonial plantations created conditions that functioned as unintentional epidemiological laboratories. Large numbers of people, concentrated in defined geographic areas, under systematic observation by plantation owners and their physicians, exposed to identifiable environmental conditions — these were the conditions necessary for studying disease patterns at the population level.

Downs documents how physicians working in slave societies in the Caribbean and American South developed theories about the transmission of yellow fever, cholera, and other infectious diseases by observing patterns of illness among enslaved populations. These observations — made possible by the total control and surveillance that slavery afforded — contributed to medical knowledge that benefited free populations while doing nothing for the enslaved people whose suffering generated it.

Before Snow: Colonial Investigations of Cholera and Yellow Fever

One of Downs' most striking arguments is that the epidemiological methods typically credited to Snow were used earlier in colonial settings. British physician James McWilliam investigated a yellow fever outbreak on the island of Boa Vista (Cape Verde) and aboard the ship Eclair, interviewing more than 100 people and assembling an explanatory framework for yellow fever transmission — years before Snow's Broad Street investigation.

Similarly, Gavin Milroy drew attention to the water supply as a source of cholera in Jamaica before Snow traced cholera to the Broad Street well in London. These investigations are largely absent from standard histories of epidemiology because they took place in colonial settings, involving colonized and enslaved populations whose contributions were not valued or recorded.

War, Empire, and Medical Knowledge

Downs further argues that military campaigns — themselves instruments of colonial expansion — produced bureaucracies that collected health data on unprecedented scales. Between 1756 and 1866, colonialism, slavery, and war created administrative systems that allowed physicians to develop theories about disease causation, transmission, and prevention.

Florence Nightingale's pioneering statistical work on disease in the British military, for instance, drew heavily on data generated by Britain's colonial and military infrastructure. Her study of sanitary conditions in India depended on the colonial bureaucracy's capacity to track sickness and death across a vast colonized territory.

The central argument is that the knowledge systems we celebrate as "modern epidemiology" were built, in significant part, on the suffering of people who were enslaved, colonized, or conscripted into wars of imperial expansion. Acknowledging this is not about discrediting the knowledge itself, but about recognizing whose labor and suffering produced it.

Research as Exploitation: Case Studies

The entanglement of epidemiology with racial exploitation did not end with the colonial era. The 20th century saw some of the most egregious abuses of research ethics, with lasting consequences for trust in medical institutions.

The Tuskegee Syphilis Study (1932–1972)

In one of the most infamous episodes in the history of medical research, the United States Public Health Service conducted a 40-year study on 399 African American men with syphilis in Macon County, Alabama. The men were told they were being treated for "bad blood" but were in fact deliberately denied treatment — even after penicillin became the standard cure in the late 1940s — so that researchers could observe the natural progression of the disease.

By the time the study was exposed by whistleblower Peter Buxtun in 1972, 28 men had died directly of syphilis, 100 had died of related complications, 40 wives had been infected, and 19 children had been born with congenital syphilis.

The Tuskegee Study led directly to the National Research Act (1974) and the Belmont Report (1979), which established the foundational ethical principles of modern research: respect for persons, beneficence, and justice. In 1997, President Bill Clinton issued a formal apology to the surviving participants.

The legacy of Tuskegee is not merely historical. Research consistently shows that African Americans report lower trust in medical institutions and clinical research, and this distrust has been linked to lower rates of participation in clinical trials and, more recently, to COVID-19 vaccine hesitancy. The harm done in Tuskegee reverberates across generations.

Nutritional Experiments on Indigenous Children in Canada (1942–1952)

Documented by historian Ian Mosby (2013), these experiments involved at least 1,300 Indigenous people, approximately 1,000 of whom were children in six residential schools across Alberta, British Columbia, Manitoba, Nova Scotia, and Ontario.

Government researchers, aware that the children were already malnourished, divided them into experimental and control groups. Some received vitamin and mineral supplements; others were deliberately kept on deficient diets to serve as controls. In some cases, dental care that had previously been available was withdrawn so that researchers could observe the progression of dental disease unchecked.

No consent was sought from the children or their families. The children were already confined in residential schools — institutions designed to forcibly assimilate Indigenous peoples by removing children from their families and cultures — making any notion of voluntary participation meaningless.

These experiments were part of a broader pattern. Historian Maureen Lux documented how experimental BCG tuberculosis vaccines were tested on Cree and Nakoda Oyadebi infants in Saskatchewan in the 1930s–40s, partly because vaccines were cheaper than improving the appalling conditions on reserves and in residential schools. In Canada's racially segregated "Indian Hospitals," patients were subjected to experimental surgical and drug treatments for tuberculosis — including lung removal — while being denied standard antibiotics available to non-Indigenous patients.

In 2023, the Canadian Medical Association formally apologized to Indigenous Peoples for its role in medical racism and research misconduct since 1867.

The Long Shadow: From Slavery to Heart Disease

One of the most striking examples of how historical injustice produces contemporary health disparities involves the Black Belt of the American South — and it begins, remarkably, in the Cretaceous period, 100 million years ago.

A 100-Million-Year Chain of Causation

During the Cretaceous period, a shallow sea covered much of what is now the southeastern United States. Over millions of years, the remains of marine organisms were compressed into a crescent-shaped band of unusually rich, dark soil stretching from eastern Mississippi through central Alabama into western Georgia. This soil — the Black Belt, named for its color — was ideal for growing cotton.

Because it was ideal for cotton, it became the region where enslaved Africans were concentrated in the largest numbers. After the Civil War, these counties retained large Black populations but experienced little economic development. Structural racism, Jim Crow laws, disinvestment, and the systematic exclusion of Black communities from economic opportunity created persistent poverty that continues to this day.

The epidemiological consequences are measurable. A landmark study by Kramer et al. (2017), published in SSM – Population Health, found that southern counties with higher concentrations of enslaved people in 1860 experienced significantly slower declines in heart disease mortality in the late 20th century. The mechanisms linking slavery to modern heart disease include persistent poverty, lower educational attainment, limited healthcare access, food deserts, environmental exposures, and the chronic stress of ongoing racial discrimination.

A 2022 study by Rebbeck in Health Equity extended this analysis, finding that Black Belt counties had significantly higher age-adjusted mortality rates (181.8 per 100,000) compared to non-Black Belt counties (171.6 per 100,000). Rebbeck argues that "geohistorical" factors — the chain from ancient geology to slavery to structural racism — represent fundamental causes of health inequity that cannot be addressed by individual-level interventions alone.

Thinking Structurally

The Black Belt example illustrates a key principle of social epidemiology: health disparities are not natural facts. They are produced by historically specific systems of power, exploitation, and exclusion. Geology created the conditions for a particular kind of agriculture; that agriculture depended on slavery; slavery created demographic patterns; those patterns were maintained by structural racism; and structural racism produces the poverty, stress, and lack of access that drive health disparities today. No amount of individual behavior change can undo this chain without also addressing its structural roots.

Biopower Revisited: Who Gets to Be Healthy?

Returning to Foucault, we can now see how the concept of biopower illuminates the full history of epidemiology. The same systems of surveillance, classification, and population management that produced genuine health improvements also functioned as instruments of control and exploitation:

  • Vital statistics counted some populations and ignored others. Indigenous deaths in residential schools were often unrecorded or attributed to "natural" causes.
  • Disease surveillance in colonial settings served the health of colonizers, not colonized peoples.
  • Research subjects were drawn disproportionately from populations that had no power to refuse — enslaved people, institutionalized children, prisoners, and racialized communities.
  • The benefits of knowledge flowed primarily to the populations that controlled the research apparatus, while the harms fell on those who were studied.

This is not an argument against epidemiology. It is an argument for a more self-aware, ethical, and equitable epidemiology — one that asks not only "what causes disease?" but also "who benefits from this knowledge, who is harmed by the research process, and whose priorities shape the research agenda?"

Reflection

Consider the relationship between the "optimistic" narrative of Section 2 and the "critical" narrative of this section. Are they contradictory, or can they both be true simultaneously? How should epidemiologists hold both of these perspectives as they conduct research today?

Knowledge Check — Section 3

1. Foucault's concept of "biopower" refers to:

Biopower, as Foucault defined it, is the "set of mechanisms through which the basic biological features of the human species became the object of a political strategy" — governing life at the population level.

2. Jim Downs argues in Maladies of Empire that:

Downs documents how colonial physicians like McWilliam and Milroy conducted epidemiological investigations in colonial settings before Snow, but these contributions were marginalized because they involved colonized and enslaved populations.

3. The connection between the Black Belt region and contemporary health disparities illustrates:

The Black Belt example demonstrates that health disparities are structurally produced. A chain from ancient geology to cotton agriculture to slavery to structural racism to persistent poverty explains contemporary disparities in heart disease mortality.

4. The Tuskegee Syphilis Study (1932–1972) directly led to which major development in research ethics?

The exposure of the Tuskegee Study led to the National Research Act (1974) and the Belmont Report (1979), which established the ethical principles that govern human subjects research today.

References & Further Reading

Foucault, M. (1978). The History of Sexuality, Volume 1: An Introduction. Translated by R. Hurley. New York: Pantheon Books.

Downs, J. (2021). Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine. Cambridge, MA: Harvard University Press.

Mosby, I. (2013). Administering colonial science: Nutrition research and human biomedical experimentation in Aboriginal communities and residential schools, 1942–1952. Histoire sociale/Social History, 46(91), 145–172.

Kramer, M. R., Black, N. C., Matthews, S. A., & James, S. A. (2017). The legacy of slavery and contemporary declines in heart disease mortality in the U.S. South. SSM – Population Health, 3, 609–617.

Rebbeck, T. R. (2022). What geohistory can teach us about fundamental causes of health inequities. Health Equity, 6(1), 727–733.

Reverby, S. M. (2009). Examining Tuskegee: The Infamous Syphilis Study and Its Legacy. Chapel Hill: University of North Carolina Press.

Lux, M. (2016). Separate Beds: A History of Indian Hospitals in Canada, 1920s–1980s. University of Toronto Press.

Section 4

Lesson 1 — Final Assessment

⏱ Estimated time: 20 minutes

Bringing It All Together

This lesson has taken you through the full arc of epidemiology's history — from Hippocrates' environmental theories to the colonial laboratories of empire, from the optimism of global health gains to the persistent disparities produced by structural racism. As you complete this final assessment, draw on all three sections.

Key Takeaways from Lesson 1

  • Epidemiology is the study of disease patterns, causes, and effects in populations — it emerged over centuries through the convergence of science, technology, and governance.
  • Progress in public health has been driven by networks of actors and institutions, not by individual geniuses alone.
  • Global health has improved dramatically by many measures — child mortality, life expectancy, poverty — and recognizing this progress is essential for sustaining it.
  • The same institutions that produced health improvements also produced exploitation: colonialism, slavery, and war were central to the development of epidemiological methods.
  • Historical injustices — from the Tuskegee Study to nutritional experiments on Indigenous children — have lasting consequences for health, trust, and equity.
  • A critical epidemiology asks not only what causes disease but who benefits from knowledge, who is harmed by research, and whose priorities shape the agenda.

Final Reflection

Epidemiology has been both a tool for improving population health and a tool for managing and exploiting populations. As a student entering this field, how do you think epidemiologists should navigate this tension? What responsibilities do researchers have to the communities they study? How might the history you have learned today shape how you evaluate epidemiological research going forward?

Final Assessment — Lesson 1 (15 Questions)

1. Hippocrates introduced the terms "epidemic" and "endemic" in his treatise:

Airs, Waters, and Places is the Hippocratic text that most directly addresses environmental causes of disease and introduces the terms epidemic and endemic.

2. John Graunt's 1662 contribution to epidemiology was significant because he:

Graunt pioneered quantitative epidemiology by analyzing London's Bills of Mortality, discovering patterns in sex ratios, seasonal mortality, and urban-rural differences, and creating the first life table.

3. The SIR model, developed by Kermack and McKendrick in 1927, is used to:

The SIR (Susceptible-Infected-Recovered) model provides a mathematical framework for predicting infectious disease dynamics in populations.

4. Semmelweis's handwashing intervention was initially rejected by the medical establishment primarily because:

Despite dramatic evidence (mortality rates dropping from roughly 10–18% to 1–2%), Semmelweis's findings were rejected because they challenged the professional identity and authority of physicians.

5. The Framingham Heart Study, begun in 1948, is best described as a:

The Framingham Heart Study followed an entire community over decades and identified risk factors including high blood pressure, high cholesterol, smoking, obesity, and diabetes.

6. Actor-Network Theory (ANT) suggests that scientific advances are best understood as:

ANT holds that innovations emerge from networks of both human and non-human actors. Technologies, documents, institutions, and organisms all play active roles in shaping scientific knowledge.

7. Hans Rosling's Factfulness argues that recognizing global health progress is important because:

Rosling argued that awareness of progress sustains motivation and investment. Believing nothing works leads to fatalism; seeing evidence of effective interventions supports continued effort.

8. Foucault's concept of "biopolitics" describes:

Biopolitics refers to the governance of populations through practices like census-taking, disease surveillance, quarantine, public health campaigns, and immigration screening — managing life at the collective level.

9. In Maladies of Empire, Jim Downs argues that epidemiological investigations in colonial settings:

Downs documents investigations by McWilliam (Cape Verde) and Milroy (Jamaica) that used systematic epidemiological methods before Snow, but these were marginalized because they occurred in colonial settings.

10. The Tuskegee Syphilis Study involved:

The Tuskegee Study (1932–1972) deliberately withheld treatment from African American men with syphilis so researchers could observe the disease's natural progression. Participants were told they were being treated for "bad blood."

11. Ian Mosby's research on nutritional experiments in Canadian residential schools revealed that:

Mosby documented how approximately 1,000 Indigenous children at six residential schools were divided into experimental and control groups, with control children deliberately kept malnourished and dental care withdrawn to observe disease progression.

12. The connection between the Black Belt's geology and contemporary health disparities illustrates:

The Black Belt example demonstrates how geological features shaped agricultural practices, which shaped the geography of slavery, which created persistent structural inequalities that produce measurable health disparities today.

13. The Belmont Report (1979) established which three core principles of research ethics?

The Belmont Report, developed in response to the Tuskegee scandal, established respect for persons (autonomy and informed consent), beneficence (do no harm, maximize benefits), and justice (fair distribution of research burdens and benefits).

14. From Foucault's perspective, epidemiological surveillance is an instrument of biopower because it:

For Foucault, surveillance systems are biopolitical tools — they enable governance by producing knowledge about populations. This can serve beneficial purposes (disease prevention) but also enables control, raising questions about who is surveilled and who benefits.

15. A critical approach to the history of epidemiology suggests that researchers today should:

A critical approach does not reject epidemiology — it enriches it by asking questions about power, equity, and whose interests are served, alongside traditional questions about disease causation.

Cumulative References

Downs, J. (2021). Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine. Cambridge, MA: Harvard University Press.

Foucault, M. (1978). The History of Sexuality, Volume 1. Translated by R. Hurley. New York: Pantheon Books.

Kramer, M. R., Black, N. C., Matthews, S. A., & James, S. A. (2017). The legacy of slavery and contemporary declines in heart disease mortality in the U.S. South. SSM – Population Health, 3, 609–617.

Mosby, I. (2013). Administering colonial science: Nutrition research and human biomedical experimentation in Aboriginal communities and residential schools, 1942–1952. Histoire sociale/Social History, 46(91), 145–172.

Rebbeck, T. R. (2022). What geohistory can teach us about fundamental causes of health inequities. Health Equity, 6(1), 727–733.

Reverby, S. M. (2009). Examining Tuskegee: The Infamous Syphilis Study and Its Legacy. Chapel Hill: University of North Carolina Press.

Rosling, H., Rosling, O., & Rosling Rönnlund, A. (2018). Factfulness. New York: Flatiron Books.

Lux, M. (2016). Separate Beds: A History of Indian Hospitals in Canada, 1920s–1980s. University of Toronto Press.