Improvements in Health and the Organization and Development of Health Care and Health Insurance Markets
This chapter describes the gains in health in the twentieth century and the development of the markets for health care and health insurance. It first provides an overview of the literature documenting the gains in public health that led to the mortality transition in the United States in the late nineteenth and early twentieth centuries. Clean water, sanitation, and electrification helped reduce mortality, as did food and milk inspection, the elimination of parasites such as malaria and hookworm, and food fortification. As the century progressed, advances in science and technology, combined with reforms in physician education and licensing, led to improvements in medical care and health. These changes increased the cost of medical care and led to the development of health insurance markets.
Keywords: Health care and insurance · Public health · Hospitals · Medical education
In the late nineteenth and early twentieth centuries, infant mortality fell and life expectancy improved as the primary causes of death shifted from infectious to chronic diseases, a transition beginning in 1915 (Meeker 1972; Haines 2001). Early in this period, gains in longevity can be largely attributed to gains in public health. For the most part, medical care did not contribute to improved mortality because it was largely ineffective. A large body of research in economic history shows that gains in public health during this period led to significant improvements in health outcomes. Clean water, sanitation, and electrification helped reduce mortality, as did food and milk inspection. The eradication of parasites such as hookworm and malaria, and the fortification of foods to eliminate nutritional deficiencies, also played roles in decreasing mortality and improving other outcomes.
In the first few decades of the twentieth century, advances in science and technology, coupled with improvements in physician education, led to improved health and quality of life. Discoveries such as diphtheria antitoxin, salvarsan for syphilis, and antibiotics raised awareness of the effectiveness of medical care. As these discoveries increased the demand for medical care, they led to rising medical costs and the development of the U.S. health insurance system. Occupational licensing of physicians and other medical professionals evolved both because medical schools sought to improve the quality of new physicians and as a way to protect consumers from unqualified practitioners. In this chapter, we summarize the literature in these different strands of research. We first discuss the literature on the improvements in health generated by public health initiatives, and then summarize studies that examine how changes in physician education and licensure affected the demand for and supply of medical services. Finally, we provide an overview of studies that examine the emergence and impact of health insurance in the United States.
Improvements in Public Health
It is important to emphasize the contributions of economic history to understanding the disease elimination aspects of the early public health movement. One of the main findings of the literature is that much of the improvement in health early in the twentieth century came from public health interventions. Early initiatives that nearly universally improved health included water purification and sanitation. Later in the century, nutritional fortification also improved health for Americans. Other public health efforts focused on eradicating parasites, such as hookworm and malaria.
Water Purification and Sewage Systems
Public health interventions such as water filtration, sanitation, refuse collection, and food inspection played a significant role in the decline of mortality in the early twentieth century. Using city-level data, David Cutler and Grant Miller (2005) find that water filtration and chlorination account for almost half of the mortality reduction in major cities, 75% of the decrease in infant mortality, and two-thirds of the decrease in child mortality. Similarly, Werner Troesken (2004) finds that deaths from typhoid and other waterborne diseases were nearly eliminated by the expansion of public water and sewer services. Looking at Massachusetts, Alsan and Goldin (forthcoming) show that the combination of safe water and sewerage lowered child mortality by 26.6 log points and was more effective than either intervention alone. Joseph P. Ferrie and Werner Troesken (2008) find that reducing typhoid also led to a reduction in nonwaterborne diseases, including death rates from gastroenteritis, tuberculosis, pneumonia, influenza, bronchitis, heart disease, and kidney disease (known as the Mills-Reincke phenomenon).
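For readers unaccustomed to log points, an estimate of this size converts to an approximate percentage decline as follows (our arithmetic, illustrating the 26.6 log-point figure, not a calculation from the original study):

$$e^{-0.266} - 1 \approx -0.234,$$

that is, a reduction in child mortality of roughly 23%.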
While water purification systems reduced deaths from typhoid, they may have increased morbidity and mortality in cities that used lead pipes to deliver the water, depending on the age of the pipes and the acidity of the local water supply. In two studies, Troesken (2006, 2008) reports that lead pipes raised infant mortality and stillbirths by 25–50% in 1900 for the average Massachusetts town. As cities abandoned lead water pipes in the early 1900s, the deleterious effects of lead exposure on infant mortality were reduced, although a lack of data makes it difficult to estimate the size of the reduction. Karen Clay et al. (2014) follow up on this research and examine the effect of waterborne lead exposure on infant mortality in U.S. cities over the first two decades of the twentieth century. Using city-level variation in water acidity and the type of pipe cities used (lead, iron, or concrete), they confirm that lead pipes led to significant increases in infant mortality. Specifically, they find that increasing pH from the 25th percentile to the 50th percentile (less acidic water leaches lead more slowly) would reduce infant mortality by 7–33%.
Early public health improvements occurred nearly exclusively at the state and local levels (Preston and Haines 1991). State and local public health departments operated to reduce mortality in a variety of ways. Louis P. Cain and Elyce Rotella (2001) use data from 48 U.S. cities to examine the impact of sanitation expenditures on death rates. They find that spending on sewer systems and refuse collection reduced death rates from typhoid, dysentery, and diarrhea. Municipalities also engaged in street cleaning and the distribution of diphtheria antitoxin (Condran and Crimmins-Gardner 1978; Meckel 1990). State and local legislation was aimed at protecting food and milk and preventing the spread of disease. Mokyr and Stein (1996) state that by 1905, 32 states had laws preventing the adulteration of milk.
Several different types of laws were also enacted to stem the spread of tuberculosis. Between 1900 and 1917, state and local governments passed laws that required TB reporting. They also enacted disinfection laws, spitting bans, and bans on common drinking cups (Anderson et al. forthcoming). Anderson et al. examine these laws and find that reporting requirements led to a 6% reduction in the TB mortality rate, and that the establishment of a state-run sanatorium reduced TB mortality by 4%. This finding is similar to, but larger in magnitude than, that of Alex Hollingsworth (2013) in his study of sanatoria in North Carolina. However, a paper by Karen Clay et al. (2018) suggests that community-based health interventions targeted at reducing tuberculosis did not reduce tuberculosis relative to control cities over time, although they did reduce infant mortality.
Scholars interested in examining state and local spending and its impact on outcomes can rely on several sources of data. At the state level, Financial Statistics of States provides similar information to that of Financial Statistics of Cities beginning in 1915. Richard E. Sylla et al. (1995) have published relevant data in State and Local Government Sources and Uses of Funds: Twentieth Century Statistics (ICPSR Study 6304). For municipalities, the Bureau of Labor Statistics Bulletins #24, 30, 36, and 42 (1899–1902) provide data on municipal finances for 1899–1902, and Census Bulletin #20 (United States Bureau of the Census 1904) provides similar information for 1902–1903. The United States Census Bureau published Statistics of Cities (1907, various years) to report annual data on municipal-level health-related expenditures for cities with populations over 30,000 between 1905 and 1908. The United States Census Bureau also published Financial Statistics of Cities (1909, various years), which provides similar information for the years 1909–1913, 1915–1919, and 1921–1930. The specific categories related to health include information on health conservation and sanitation cost payments; health conservation and sanitation outlays; charities, corrections, and hospital cost payments; and charities, corrections, and hospital outlays.
Public Health Education and Information
In addition to direct public health spending on refuse collection, infrastructure, and the passage of laws designed to limit the spread of disease, some public health initiatives revolved around educating families about hygiene and infant and child care. Grant Miller (2008) suggests that the enfranchisement of women led to large shifts in public spending on hygiene campaigns that led to increases in child survival. While Miller focuses on data from Financial Statistics of Cities (so that his mechanism for how public expenditures led to reductions in child mortality is a black box), other studies have used data on public health activities to identify the channels that were most effective in improving health outcomes. For example, Carolyn Moehling and Melissa A. Thomasson (2014) examine activities undertaken by public health authorities under the Sheppard-Towner program, in which the federal government gave matching funds to states to engage in infant and child care and hygiene. They find that the most effective interventions were those that provided one-on-one care, such as nurse visits and health centers, compared to activities such as classes, conferences, and demonstrations.
The Eradication of Parasites
In the early 1900s, up to 30% of the population was infected with malaria (Kitchens 2013a). The effects of malaria vary from stunted physical stature and impaired cognitive development to death (Hong 2007; Bleakley 2010). Several New Deal agencies and programs contributed to the decline of malaria. By paying farmers to take land out of cultivation, the Agricultural Adjustment Act (AAA) led farm laborers to leave mosquito-breeding grounds (Humphreys 2001). Alan Barreca et al. (2012) estimate that this out-migration accounts for about 10% of the decline in malaria between 1930 and 1940. Carl Kitchens (2013a, b) examines the impact of other New Deal programs on malaria. Using county-level data from Georgia between 1932 and 1941, Kitchens (2013a) demonstrates that drainage projects constructed under the auspices of the Works Progress Administration (WPA) explain more than 40% of the observed reduction in malaria over the period.
Kitchens (2013b) finds different results when he looks at the construction of dams under the Tennessee Valley Authority (TVA). He uses county-level panel data from Alabama and Tennessee, and finds that the dams created a vast increase in coastline suited for mosquito breeding. Despite subsequent efforts of the TVA to control mosquitos, Kitchens calculates that the TVA led to a significant increase in loss of life due to malaria that reduced the fiscal benefit of dam construction by 24%.
Other public health efforts focused on eradicating another Southern parasite: hookworm. Transmitted through the soil, hookworm eventually lodges in the intestines of its victim. It causes lethargy and anemia but is rarely fatal. Nevertheless, its effects can decrease productivity and make it difficult for children to concentrate in school. For example, Garland Brinkley (1997) shows that the sharp decline in Southern agricultural output after the Civil War can be attributed to increased rates of hookworm infection.
In 1910, the Rockefeller Sanitary Commission (RSC) estimated that 40% of Southern schoolchildren were infected with hookworm. It engaged in an eradication campaign, sending health care workers to dispense de-worming medication. Hoyt Bleakley (2007) finds that children living in areas with greater rates of hookworm infection prior to the RSC’s campaign showed greater gains in school enrollment, attendance, and literacy than those living in areas with lower rates of infection. He estimates that eradicating hookworm could account for closing up to half of the literacy gap between the North and the South and for reducing the income gap by up to 20%.
Improvements in Diet
Nutritional insufficiency – whether stemming from parasitic infection or poor diet – is correlated with reduced economic outcomes. Numerous economists and historians have noted that nutritional improvements (measured by caloric and/or protein intake) are correlated with gains in both income and health (Higgs 1971; Fogel 1994; Steckel 1995; Floud et al. 2011). Most recently, Gregory Niemesh (2015) adds to this literature by measuring the impact of the first federal requirement in 1943 to fortify bread with iron. Iron deficiency in infants and children causes developmental delays and behavioral problems, and reduces productive capacity in adults. Niemesh uses pretreatment variation in iron consumption, and shows that the law led to increases in income and educational attainment in areas with lower levels of iron consumption prior to the mandate. James Feyrer et al. (2017) similarly examine the impact of salt iodization in the U.S. in the 1920s on later cognitive outcomes. Their findings suggest that iodized salt raised IQ for those who were most deficient, but also increased thyroid-related deaths, particularly among older individuals. Karen Clay et al. (2018) examine the rise and fall of pellagra, a disease caused by niacin deficiency in the American South that is characterized by dermatitis, diarrhea, and dementia. They argue that cotton production in the South displaced local food production, leading to increases in pellagra rates. Using a difference-in-differences framework, they leverage the arrival and spread of the boll weevil to identify the relationship between cotton production and pellagra. They find that pellagra death rates fell between 23% and 40% more in high cotton counties than in low cotton counties after the boll weevil arrived. They also find that after 1937, state fortification laws helped to eliminate pellagra.
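The boll weevil design described above follows the standard difference-in-differences logic, which can be sketched generically as follows (a sketch in our notation, not necessarily the authors' exact specification):

$$\text{pellagra}_{ct} = \beta \left( \text{HighCotton}_c \times \text{PostWeevil}_{ct} \right) + \gamma_c + \delta_t + \varepsilon_{ct},$$

where $c$ indexes counties, $t$ indexes years, $\gamma_c$ and $\delta_t$ are county and year fixed effects, and $\beta$ captures how much more pellagra mortality fell in high-cotton counties after the boll weevil's arrival.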
The Growth of the Market for Medical Care
Reforms in Medical Education and the Changing Public Perception of Hospitals
At the turn of the twentieth century, the formal health care market was much simpler than today and made up a smaller portion of economic activity. Consumer expenditures on medical care in 1900 (about $384 million) accounted for around 2% of Gross Domestic Product (Craig 2006; Sutch 2006), compared to around 18% in 2017 (Centers for Medicare & Medicaid Services 2018). Medical care was inexpensive because it was ineffective, so people did not need health insurance to pay for medical expenses (Thomasson 2002). Hospitals, which functioned as almshouses and places for those without families to care for them, were generally shunned by people of means, who instead had physicians visit them in their homes.
At the turn of the twentieth century, physician training in the U.S. was largely substandard compared to medical education in Europe, but several medical schools had already made significant strides in reform. Johns Hopkins, Harvard, Michigan, and Pennsylvania had all increased admissions requirements, lengthened their terms, and moved from apprenticeships to a focus on clinical instruction. By the time Abraham Flexner (Flexner et al. 1910) published his well-known report on the quality of medical education in the United States, the reform movement was well underway (Moehling et al. 2018). A key component of modern medical education was the alliance between medical schools and clinical training in hospitals. As physician training moved to hospitals, so too did patients. The shift was gradual: the public slowly became aware of scientific progress as it encountered physicians trained under the new regime. Although hospitals were considered second-rate and germ-ridden in 1900, by the second decade of the century, some products (such as Lysol) were advertised as being used in hospitals. As news from Europe during World War I showed doctors saving lives on the battlefield, the public perception of hospitals grew still more favorable.
Occupational Licensing of Health Care Providers
While the first physician licensure laws in the United States were passed in the 1870s, most were very lax until the 1910s. For example, even as late as 1906, 13 states still allowed people who had not graduated from medical school to become physicians (Ludmerer 1985). Supporters of medical education reform had long pushed for stricter licensing requirements, and Flexner’s report spurred legislators to raise the standards that would-be physicians had to meet. States continued to ratchet up entry and graduation requirements throughout the 1910s. Terms were lengthened to 4 years, and states began to require premedical education of at least 2 and sometimes 4 years. Many schools did not have the resources to invest in reforms and subsequently closed or merged with other schools. As a result, the number of medical schools fell from 160 in 1900 to 81 in 1922 (American Medical Association, Council on Medical Education 1922, p. 633).
Economic historians have tried to explain the early adoption of occupational licensure laws by the health professions, as well as the effect of the spread of licensing across occupations and states. Their work focuses on the impact of licensing on supply, price, and quality, and they leverage their results to distinguish between two economic theories of why occupational regulation exists. One theory is that professions, motivated by self-interest, lobby governments to enact occupational licensure requirements. As requirements for entry into the profession become stricter, incumbent practitioners face less competition. Supply falls, so prices and wages rise above what would have prevailed in the absence of licensing. A related idea is regulatory capture, which occurs when practitioners take control of the licensing apparatus meant to regulate entry into an occupation. For example, physicians make up the majority of members of state medical boards. With the licensing apparatus captured, boards can enact regulations in the interest of the profession (e.g., restricting supply and increasing wages) at the expense of consumers (e.g., higher prices and less access). Underlying this view is the assumption that licensure provides no benefit to consumers through increased physician quality, or at least that any quality gains do not fully compensate for the welfare loss from higher prices and a lower quantity of health services.
Alternatively, licensure might have emerged to solve a problem of asymmetric information between the sellers of health services and patients. Physician quality may be unobservable. Since patients cannot discern “quacks” from good doctors, occupational licensure can help solve the asymmetric information problem by eliminating low-quality providers from the market: education requirements or an entrance exam certify that a provider meets a minimum standard of quality. With the low-quality competition gone, the remaining high-quality providers may see an increase in wages. However, in contrast to the pure self-interest of regulatory capture, consumers benefit on net from the quality increases in this case.
Law and Kim (2005) document the effects of physician licensure laws on the supply and quality of doctors during the Progressive Era using licensure data spanning 1870–1930. They use cross-state variation in the timing of when specific parts of a licensure law were passed in a state-level decadal panel. In general, passage of a licensure law or an increase in entry requirements reduced entry to the profession and reduced the supply of doctors per capita. The requirements most related to declining entry were the implementation of 2-year and 4-year premedical education requirements; they find little evidence that the initial licensure statutes of the 1870s affected entry.
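Empirical designs of this kind typically take a two-way fixed effects form; a generic sketch (our notation, not Law and Kim's exact specification) is:

$$\ln(\text{physicians per capita})_{st} = \beta \, \text{Requirement}_{st} + \alpha_s + \tau_t + \varepsilon_{st},$$

where $s$ indexes states, $t$ indexes decades, $\text{Requirement}_{st}$ indicates whether a given licensure provision was in force, $\alpha_s$ and $\tau_t$ are state and decade fixed effects, and $\beta$ is identified from cross-state variation in the timing of adoption.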
In addition, they find evidence that physician licensure may have reduced asymmetric information and improved patient outcomes. Using the same empirical framework, they find suggestive evidence that physician quality improved in response to the premedical education requirements of the 1910s. States with premedical requirements experienced relatively larger declines in maternal mortality and appendicitis mortality, outcomes that physician behavior could plausibly affect during this period. However, the authors find no effect of greater licensure requirements on either overall mortality or the infant mortality rate.
Law and Kim (2005) provide additional evidence that licensure reduced information asymmetry and improved quality. Professions that one would a priori expect to suffer more acutely from informational asymmetries (physicians, dentists, and veterinarians, as opposed to plumbers, electricians, and barbers) were more successful in restricting entry. Their results suggest this increased quality came at the expense of reduced entry into the profession.
While these studies look at the effect of licensing on physician supply, focusing solely on state-level supply impacts masks the geographic redistribution of physicians that may occur as the market for physician services adjusts to a new spatial equilibrium. Moehling et al. (2018) examine the rural/urban location choices of physicians in the first decades of the twentieth century to test whether they were related to the changes in medical education during the period. Physicians were becoming a more urban profession over the course of the early twentieth century, as was the population of the country as a whole. However, physicians trained in medical schools that required premedical education were significantly more likely to set up practice in an urban area than physicians who graduated from schools without such requirements. Physicians trained in “modern” medical schools were more strongly attracted to professional amenities concentrated in urban areas, such as hospital beds and larger physician communities, than were physicians trained in lower-quality schools. As states enacted increasingly strict admission requirements for medical school, not only did the total supply of physicians decrease, but the composition of new graduates also shifted toward physicians likely to locate in urban areas. Together, these two factors reduced access to physician services in rural areas, widening an already existing urban-rural divide.
The studies above describe the effect of medical licensure on physicians. Yet during the Progressive Era, midwives also became more highly regulated. At the turn of the twentieth century, half of all births were attended by midwives, with the other half attended by physicians. Only 5% of births (mostly those by indigent women or unwed mothers) took place in hospitals (Wertz and Wertz 1989). Midwives during this period were (almost entirely) women who learned the craft through helping more experienced midwives. Formal training was not provided. Moreover, the market for midwives was wholly unregulated prior to a string of licensure laws enacted in the early twentieth century in an effort to increase the quality of midwifery services.
Anderson et al. (2016) analyze the supply and mortality outcomes associated with 22 states and at least a dozen municipalities enacting licensing requirements over the 1900–1940 period. The stringency of entry requirements varied tremendously across the states. “Applicants for licenses in Mississippi were judged based on their character, cleanliness and intelligence, but were not required to take an exam or graduate from a school of midwifery. By contrast, midwives in California, Washington and Wisconsin were required to graduate from a recognized school of midwifery and to pass an examination administered by their State Board of Medical Examiners” (Anderson et al. 2016, p. 3). The introduction of licensing requirements for midwives was associated with a 6–7% reduction in maternal mortality.
The Impact of Medical Care on Health
As medical education improved and medical science advanced, medical care became more effective. Thomasson and Treber (2008) show that while the medicalization of childbirth did not initially reduce maternal mortality, maternal mortality declined once sulfa drugs became available in 1937. Sulfa provided physicians with their first effective treatment against a range of bacterial infections. Jayachandran et al. (2010) show that in addition to reducing maternal mortality, sulfa use also reduced deaths from pneumonia and scarlet fever. They estimate that sulfa reduced overall mortality by 2–3% and added 0.4–0.7 years to life expectancy.
Medical Costs and the Development of the Health Insurance Market
The fundamental function of any kind of insurance is to reduce the financial uncertainty associated with catastrophic events by pooling risk. People pay a fixed amount of money over time and receive a payout if they experience a loss. The market works because, on average, the premiums collected from the group are sufficient to cover the benefits paid out, while each individual’s premium is small relative to a potential catastrophic loss. Prior to the late 1920s, health insurance in the U.S. did not develop for two reasons. First, the demand for health insurance was low. When medical care was inexpensive, people did not need health insurance to pay unexpectedly high bills (Thomasson 2002). Instead, wage earners needed to cover the wage loss associated with disability, so they obtained disability insurance (called “sickness” insurance) through their firms, unions, or fraternal societies (Emery 1996; Emery and Emery 1999; Murray 2007). The lack of need for actual health insurance is reflected in the failure of the first attempt at health insurance reform during the Progressive Era. In 1916, the American Association for Labor Legislation (AALL) published a draft bill that proposed comprehensive sickness and medical benefits for low-income workers. Under the plan, local mutual insurance companies would manage premium contributions shared by employers, workers, and the state. Employers and workers would each contribute 40% of the plan’s premium, while the state would contribute the remaining 20% (Chasse 1994). Entrenched interests, including insurance companies, druggists, and physicians, opposed the bill, and popular support was lacking, as demonstrated by a 1918 referendum in California in which the plan was defeated with 358,324 votes against and 133,858 in favor (Murray 2007).
The lack of demand for health insurance was joined by a lack of insurance companies willing to underwrite “health.” Commercial insurance companies did not view health as an insurable product. In order for insurance markets to function well, two conditions must hold. First, the losses that insurance companies cover must be measurable and observable. Insurance companies wondered how they would monitor health and pay claims for ill health: would they be able to tell who was really sick versus merely malingering? If losses are hard to measure, insurers face moral hazard – the tendency of having insurance to make it more likely that an insured person will have a claim. Second, insurance companies can only offer insurance if they are able to measure the likelihood that someone will have a claim. If policyholders have private information about their likelihood of filing a claim, insurance companies will not collect enough money in premiums to pay for losses. This problem, known as adverse selection, can prevent insurance markets from functioning. In the early twentieth century (and even today), insurance companies worried that they might not be able to tell who was likely to have a health claim and who was not, and as a result, they did not believe they could profitably offer health insurance coverage.
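A stylized numerical example illustrates why adverse selection worried insurers. All numbers below are hypothetical, chosen only to show the mechanism: if the premium is set at the population-average expected loss but only high-risk individuals buy, premium income falls short of expected claims.

```python
# Stylized adverse-selection example with two risk types.
# All numbers are hypothetical, chosen only to illustrate the mechanism.

def expected_loss(prob_claim, claim_size):
    """Expected annual payout for one policyholder."""
    return prob_claim * claim_size

# Two equal-sized groups: low risk and high risk.
low_risk = expected_loss(prob_claim=0.05, claim_size=1000)   # $50 expected loss
high_risk = expected_loss(prob_claim=0.25, claim_size=1000)  # $250 expected loss

# The insurer sets the premium at the population-average expected loss.
premium = (low_risk + high_risk) / 2  # $150

# Low-risk buyers face a premium ($150) above their expected loss ($50),
# so they opt out; only high-risk individuals insure.
buyers_expected_loss = high_risk

# Expected shortfall per policy once only high-risk individuals remain.
shortfall = buyers_expected_loss - premium

print(f"premium collected per buyer: ${premium:.0f}")
print(f"expected claims per buyer: ${buyers_expected_loss:.0f}")
print(f"expected insurer loss per policy: ${shortfall:.0f}")
```

Group-based sale through employers, as with the Baylor plan discussed below in the chapter's own narrative, sidesteps this unraveling by enrolling people healthy enough to work regardless of their individual risk.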
The demand for health insurance did not rise until the overall cost of medical care increased and became more variable as medical treatment shifted to hospitals. By 1929, a nationwide study by the Committee on the Costs of Medical Care (CCMC) showed that the average American family had medical expenses totaling $108, with hospital expenditures accounting for about 14% of the total (Falk et al. 1933). This average disguised significant variation: urban families with incomes between $2,000 and $3,000 per year had mean medical expenses of $67 without hospitalization, but expenses of $261 if a family member had been admitted to the hospital (Falk et al. 1933).
As the costs associated with hospitalization increased, some families had difficulty paying their bills. In response, hospitals began to organize payment plans, and in so doing, unwittingly mitigated the problems of adverse selection and moral hazard, and set the stage for the broad sale of health insurance. In 1929, Justin Ford Kimball, a former superintendent of schools, was an administrator at Baylor University Hospital. He worked with Dallas teachers to develop a plan, later known as Blue Cross, based on the principles of insurance to help them pay their bills: Baylor would provide each teacher with 21 days of hospital care for an annual fee of $6.00. These plans reduced adverse selection by selling insurance to groups of workers healthy enough to work. They reduced moral hazard because Blue Cross plans reimbursed hospitals directly and patients generally could not admit themselves to hospitals. During the Great Depression, more and more hospitals began to develop these plans as hospital occupancy rates – and revenues – fell.
Blue Cross plans also benefited from state-level legislation called “enabling laws” that allowed them to form as nonprofit corporations and enjoy tax-exempt status, as well as being exempt from certain insurance regulations such as reserve requirements and assessment liability. Thomasson (2002) shows that these laws increased the amount of health insurance at the state level.
Although physicians were somewhat slower than hospitals to develop prepaid plans, the American Medical Association (AMA) feared national health insurance and attempted to forestall its development by encouraging state and local medical societies to form their own prepayment plans, which later became known as Blue Shield (Thomasson 2002). By 1940, 9% of the U.S. population was covered by health insurance, largely through Blue Cross and Blue Shield plans (Thomasson 2003). After Blue Cross demonstrated that it had overcome the problems of adverse selection and moral hazard, commercial, for-profit companies began to move rapidly into the market.
In the 1940s, a series of factors led to the rapid expansion of health insurance. The National War Labor Board limited the ability of firms to raise wages to secure labor, even as the U.S. entry into World War II led to a shortage of workers. Health insurance was exempted from the ruling, and firms began to offer health benefits to attract workers. An administrative tax court ruling in 1943 (later codified under the 1954 Internal Revenue Code) exempted employer contributions to worker health insurance premiums from employee income taxes. Thomasson (2003) finds that the tax change increased the likelihood that a household would have coverage by 9%, and increased the amount of coverage purchased by 9.5%. By 1957, about 76% of the population held some form of private health insurance coverage.
The Impact of the War on Poverty on Health Insurance
The development of employment-based insurance as a means of alleviating adverse selection, and its accommodation by U.S. tax policy, made it difficult for people without jobs to obtain health insurance coverage. The elderly, the disabled, and the unemployed often had a difficult time finding health insurance and paying for medical expenses. Congress recognized their need for financial assistance. In 1950, amendments to the Social Security Act allowed the federal government to provide matching funds to states to pay doctors and hospitals for providing medical care to welfare recipients. In 1960, these “vendor payments” were expanded to include elderly people who were not welfare recipients (Moore and Smith 2005).
Despite the effort to support the elderly who could not afford medical care under the Medical Assistance for the Aged Act, Congressional hearings in the early 1960s concluded that “…increasing numbers of our older people are confronted with financial catastrophe brought on by illness” (U.S. Congress. Senate. Special Committee on Aging 1964). In 1965, Medicare became law. Under Part A, the elderly were automatically enrolled in the compulsory hospital insurance program upon reaching age 65. Part B provided insurance for physicians’ services. Research on Medicare shows both its costs and benefits. David Card et al. (2008) demonstrate that because of Medicare, insurance coverage increases sharply at age 65. By leveraging this discontinuous change, they are able to examine differences in health care utilization and outcomes across different groups (such as educated whites compared to less educated minorities). They find that health insurance coverage does affect utilization of medical services, and they find a small increase in self-reported health status. Looking at the impact of Medicare on mortality, Amy Finkelstein and Robin McKnight (2008) find that while Medicare did not reduce the mortality of elderly individuals, it significantly reduced their out-of-pocket expenses and financial risk.
Medicaid (Title XIX of the Social Security Act) was also enacted with Medicare to provide health insurance coverage to non-elderly populations in need of assistance with medical bills. In contrast to Medicare, which was federally funded and provided uniform benefits to all enrollees, states’ participation in Medicaid was voluntary. States that participated in Medicaid received some federal funds to provide means-tested benefits originally to recipients of public assistance, although legislative changes over the years have expanded eligibility. While the federal government specified minimum standards for eligibility and benefits, states have the option to make eligibility for coverage or benefits more generous. Under the Affordable Care Act of 2010, the federal government provides additional funds to states seeking to expand Medicaid eligibility. There is a very large literature on the impact of Medicaid on both physical and financial health of enrollees. Thomas Buchmueller et al. (2016) provide a summary of the program and a very thorough review of the literature.
Directions for Future Research
This chapter enumerates a wide variety of studies that examine the effect of public health initiatives on health and well-being, as well as the development of health care and health insurance markets. Nevertheless, significant gaps in the literature point to areas where future research is needed. For example, we know little about the impact of the vendor payments made to hospitals and doctors on behalf of welfare recipients under the 1950 amendments to the Social Security Act, yet research suggests the effects may have been substantial: work by Martha Bailey and Andrew Goodman-Bacon (2015) shows that Community Health Centers (rolled out as part of the War on Poverty) significantly lowered age-adjusted mortality among older Americans. More work is also needed on the impact of both de jure and de facto racial segregation and prejudice. Douglas Almond et al. (2006) demonstrate that the elimination of segregation in Southern hospitals after 1964 reduced infant mortality among blacks. A 2018 study found that the underrepresentation of blacks in the medical profession may lead to excess mortality among black men (Alsan et al. 2018), which echoes results from Alsan and Wanamaker (2018), who found that the exploitative Tuskegee Study contributed to racial disparities in health among its victims and those close to them. Both of these studies suggest that more work can be done by economic historians to examine the causes and effects of racial health disparities.
- Alsan M, Goldin C (forthcoming) Watersheds in child mortality: the role of effective water and sewerage infrastructure, 1880 to 1920. J Polit Econ
- American Medical Association, Council on Medical Education (1922) Medical education in the United States. J Am Med Assoc 79
- Anderson DM, Charles KK, Las Heras Olivares C, Rees DI (forthcoming) Was the first public health campaign successful? The Tuberculosis Movement and its effect on mortality. Am Econ J Appl Econ. https://doi.org/10.1257/app.20170411
- Buchmueller T, Ham JC, Shore-Sheppard LD (2016) Economics of means-tested transfer programs in the United States, vol 1. University of Chicago Press, Chicago
- Cain LP, Rotella EJ (2001) Death and spending: urban mortality and municipal expenditure on sanitation. Ann Demogr Hist 1:139
- Centers for Medicare & Medicaid Services (2018) NHE Fact Sheet. https://www.cms.gov/research-statistics-data-and-systems/statistics-trends-and-reports/nationalhealthexpenddata/nhe-fact-sheet.html. Accessed 20 Aug 2018
- Clay K, Egedesø P, Hansen CW, Jensen PS (2018) Controlling tuberculosis? Evidence from the mother of all community-wide health experiments. SSRN Electron J. https://doi.org/10.2139/ssrn.3144355
- Clay K, Schmick E, Troesken W (2018) The rise and fall of pellagra in the American South. National Bureau of Economic Research Working Paper 23730
- Craig L (2006) Consumer expenditures. In: Carter SB, Gartner SS, Haines MR, Olmstead AL, Sutch R, Wright G (eds) Historical statistics of the United States: millennial edition, vol 3. Cambridge University Press, Cambridge
- Duffy J (1992) The sanitarians. University of Illinois Press, Urbana
- Emery G, Emery JCH (1999) A young man’s benefit: the Independent Order of Odd Fellows and sickness insurance in the United States and Canada, 1860–1929. McGill-Queen’s University Press, Montreal & Kingston
- Falk ISC, Rorem R, Ring MD (1933) The cost of medical care. The University of Chicago Press, Chicago
- Flexner A, Carnegie Foundation for the Advancement of Teaching, Pritchett HS (1910) Medical education in the United States and Canada; a report to the Carnegie Foundation for the Advancement of Teaching. New York City
- Fogel RW (1994) Economic growth, population theory, and physiology: the bearing of long-term processes on the making of economic policy. Am Econ Rev 84:369–395
- Higgs R (1971) The transformation of the American economy, 1865–1914: an essay in interpretation. Wiley, New York
- Hollingsworth A (2013) The impact of sanitaria on pulmonary tuberculosis mortality: evidence from North Carolina, 1932–1940. Unpublished working paper
- Hong SC (2007) The health and economic burdens of malaria: the American case. PhD dissertation, The University of Chicago
- Humphreys M (2001) Malaria: poverty, race, and public health in the United States. The Johns Hopkins University Press, Baltimore
- Ludmerer KR (1985) Learning to heal: the development of American medical education. Basic Books, New York
- Meckel RA (1990) Save the babies: American public health reform and the prevention of infant mortality, 1850–1929. The Johns Hopkins University Press, Baltimore
- Moehling CM, Niemesh GT, Thomasson MA, Treber J (2018) Medical education reforms and the origins of the rural physician shortage
- Moehling CM, Niemesh GT, Thomasson MA (2018) Shut down and shut out: women physicians in the era of medical education reform. Unpublished working paper
- Mokyr J, Stein R (1996) Science, health and household technology: the effect of the Pasteur revolution on consumer demand. In: Bresnahan TF, Gordon RJ (eds) The economics of new goods. The University of Chicago Press, Chicago
- Moore JD, Smith DG (2005) Legislating Medicaid: considering Medicaid and its origins. Health Care Financ Rev 27:8
- Murray JE (2007) Origins of American health insurance: a history of industrial sickness funds. Yale University Press, New Haven
- Steckel RH (1995) Stature and the standard of living. J Econ Lit 33:1903–1940
- Sutch R (2006) National income and product. In: Carter SB, Gartner SS, Haines MR, Olmstead AL, Sutch R, Wright G (eds) Historical statistics of the United States: millennial edition, vol 3. Cambridge University Press, Cambridge
- Sylla RE, Legler JB, Wallis J (1993) Sources and uses of funds in state and local governments, 1790–1915: [United States]. Inter-university Consortium for Political and Social Research, Ann Arbor. https://doi.org/10.3886/ICPSR09728.v1
- Troesken W (2006) The great lead water pipe disaster. The MIT Press, Cambridge, MA
- U.S. Congress. Senate. Special Committee on Aging (1964) Blue Cross and private health insurance coverage of older Americans. Government Printing Office, 88th Congress, 2nd Session
- United States Bureau of Labor Statistics (1899) Bureau of Labor Statistics bulletin, #24, 30, 36, and 42
- United States Bureau of the Census (1904) Census bulletin, #20
- United States Bureau of the Census (1907) Statistics of cities having a population of over 30,000, 1905 (and through 1908 annually). United States Government Printing Office, Washington, DC
- United States Bureau of the Census (1909) Financial statistics of cities having a population of over 30,000, 1909 (and 1910–1913, 1915–1919, 1921–1930). United States Government Printing Office, Washington, DC
- United States Commissioner of Education (1895–1900) Report of the Commissioner of Education. United States Government Printing Office, Washington, DC
- Wertz RW, Wertz DC (1989) Lying-in: a history of childbirth in America. Yale University Press, New Haven/London