Asked why Americans are healthier than they were 100 years ago, one might answer: better antibiotics, new cancer and HIV medications, or heart bypasses and neonatal intensive care performed at gleaming new academic centers. One might note the passage of Medicaid and Medicare, then the Affordable Care Act, which expanded financial access to advanced care.
Therapeutic and coverage advances have improved millions of lives. Yet in many ways these mattered less than more mundane practices: proper sanitation, food inspection, tobacco control, universal vaccination, HIV prevention. We take for granted an entire apparatus of infectious disease surveillance and screening, outbreak investigation, and so much more.
These and other essential activities, which we group under the rubric “public health,” are performed through a complex and fragile infrastructure operated by the Centers for Disease Control and Prevention (CDC) and other, lesser-known federal agencies, which act in partnership with a myriad of state and local health departments. The daily operations of public health mostly proceed in the background. They are incredibly boring—until they are not.
Public health prevention activities are boring, under normal circumstances, for many reasons. They mainly help diffuse, unidentified constituencies. When they work properly, they are basically invisible. We don’t call up the health department to thank it when we eat a pork chop without getting sick, or when we have sex without contracting chlamydia. Meanwhile, acute medical services vividly serve identifiable human beings in desperate need, people who receive that help through some of the best-financed and most influential institutions in American life. It’s the way of the world that fancy lung cancer therapies will always receive greater attention and political support (not to mention the most generous insurance reimbursement) than some smoking cessation helpline that costs maybe 1 percent as much.
It’s only when emergency strikes that public health takes center stage, whether it has been adequately funded or not. In that terrifying moment, monies will flow. Yet as Columbia political scientist Larry Brown observes, the accompanying political urgency proves impatient and fleeting. Facing a crisis, we focus on mounting an effective immediate response, not on building a solid infrastructure that will ready us for the next crisis, after the spotlight has faded.
COVID-19 has laid bare the human and economic costs of our neglect. At last count, more than 173,000 Americans have died since February from an epidemiological catastrophe whose specific timing could not have been predicted, but whose broad outline and perils were anticipated with uncanny accuracy by Bill Gates, Barack Obama, and George W. Bush, and many others. We are witnessing, in real time, one of the most lethal and comprehensive governance failures in American history.
The extent of our financing failures is ironically underscored by heartwarming anecdotes of Americans supporting GoFundMe efforts to donate masks, gloves, and other supplies to academic medical centers and local health departments across the country. America spent $3.8 trillion on medical care last year. So how could we fall desperately short in providing basic masks, which can be manufactured for maybe a dollar each? To take one of a million examples, Chicago’s Rush University Medical Center features a gleaming $600 million butterfly-shaped tower, within whose walls one can obtain the latest in high-tech care. How is it even conceivable that such a facility would need or accept donations from local high schools for basic PPE? And why are homeless people, across America, vulnerable to COVID sleeping two feet apart on the floors of repurposed gymnasiums, within eyeshot of gleaming hospitals festooned with million-dollar lobbies and dedicated aquariums?
If one investigates particular reasons for our failure, the mystery resembles the old Murder on the Orient Express: There are many suspects, and they’re all guilty. The Trump Administration’s incompetent and polarizing response helped turn what might have otherwise been a frightening set of fairly localized outbreaks into a national catastrophe. The Centers for Disease Control and Prevention, the government of China, and the World Health Organization all faltered. Many governors and mayors across the country hesitated and made key mistakes, too.
Yet the most damaging failures go beyond any one mistake, any one official or administration. Yes, even this one. They reflect deeper systemic factors, factors that tilt all of our health-care spending toward the individualized and reactive, toward acute care for injury and illness and away from preventive measures and emergency preparedness that advance public health. We must use this moment of national catastrophe and humiliation to craft a politically durable approach to financing public health in America.
Two specific changes would make a big difference:
1. Our two largest health entitlements, Medicaid and Medicare, must adopt public health within their core missions.
By some estimates, aggressive case-finding, contact tracing, and supported isolation for COVID patients through the end of 2021 would cost $75 billion. That’s an eye-popping number in the context of public health: maybe five times the CDC’s total budget for the same period. It’s also a very small number in the overall context of American medical care: about 2 percent of our annual $3.8 trillion health-care bill, about one-tenth of federal and state Medicaid expenditures over the same period, and an even smaller fraction of Medicare’s comparable spending. Each of these programs spends about as much in a single week as the CDC spends all year on public health.
A sustainable public health funding model requires that we deploy Medicare and Medicaid’s huge dollars and administrative capacity more intentionally and more automatically to protect public health beyond acute patient care.
Following this model, Medicare and Medicaid should reimburse health departments and others for public health services that help beneficiaries. These programs should pay when their recipients are involved in case-finding and contact tracing, and whenever recipients require physical isolation or other measures to prevent further transmission. In the same way these programs finance care for cancer patients and newborns, Medicare and Medicaid should automatically cover public health services for whoever needs them, rather than, as we do now, providing those services only until a fixed budget runs out.
A better financing system would let states and localities hit hard by public health challenges proceed with confidence, planning and executing long-term public health strategies in the knowledge that the required interventions will be properly reimbursed.
Medicare and Medicaid’s organizational infrastructure and information technology could also fill critical gaps. Johns Hopkins health services researcher Brendan Saloner notes over email that the companies and non-profit managed-care organizations that administer Medicare, Medicaid, and much private insurance coverage can be more active in identifying and engaging their most vulnerable members. These high-tech organizations are adept at applying big-data predictive analytics to warehouses of electronic patient records, and at identifying the people who face the highest infection risks.
By comparison, health departments remain far behind. Public health is a technologically lagging sector, embarrassingly dependent on obsolete paper- and fax-based reporting systems to identify infected people and to locate others at risk. Medicare and Medicaid must help change that by supporting common data standards and IT investments that help public health authorities catch up.
This model extends beyond infectious diseases to other key services. I was recently on a call discussing how to create a sustainable financial model for trauma recovery services for domestic violence survivors. Here’s one takeaway on how to do that: Medicaid and Medicare could simply reimburse the social service agencies that provide these services, whenever they are needed. That’s how we pay when a Medicare recipient needs a newly approved prostate cancer drug, or a new wheelchair. Are public health activities and services really so different?
Medicaid and Medicare are also entitlements: for patients, for providers, and, no less important, for state and local governments. If more Medicare recipients are diagnosed with lung cancer, we don’t run out of money to pay for radiation treatment. There is no waiting list. We don’t cap enrollment when recessions hit. We automatically spend more to meet the need. Yet if we need public information campaigns warning young people of the dangers of smoking—or if more people need HIV or opioid overdose prevention, or if more children need protection against lead poisoning—well, that’s a different story. Currently, these needs must be addressed within existing budgets, through new appropriations every year, or through politically fragile grant programs such as the Affordable Care Act’s Prevention and Public Health Fund, which takes repeated funding hits. Votes must be taken. The budget must be adequate. And all that had better be true the year after, too, or support will again be yanked away.
All of which calls to mind a more radical suggestion.
2. Public health itself should become an entitlement, financed through a state-federal partnership similar to what we now do with Medicaid.
For all of Medicaid’s shortcomings, the program’s shared spending model encourages states to be more generous than they would otherwise be. The federal government matches regular Medicaid expenditures dollar-for-dollar in wealthy states such as Massachusetts and New York, and three-to-one or better in West Virginia and Mississippi, giving states a big incentive to cover dental care and other optional services. The ACA’s Medicaid expansion is the extreme example: it pays 90 percent of the costs of covered services. And once something is covered, it generally stays covered. We don’t need to re-litigate coverage every fiscal year.
There is no equivalent matching model for public health spending, the main burden of which we have left on state and local governments. When a state or locality considers spending more money on (say) restaurant food inspection or case-finding for infectious disease, it typically bears all the additional costs. In HIV prevention and other public health domains, federal agencies provide fixed-dollar grants for specific activities. States and localities stretch these rather small grants as best they can. When that money runs out, communities have little financial incentive to do much more.
A state looking to finance services for low-income citizens would much rather expand Medicaid services, for which the federal government will pay at least half the bill, than expand public health prevention efforts that bring no additional federal money. That’s a huge disincentive.
State and local health departments also face the tender mercies of annual appropriations processes, which operate under balanced-budget requirements that force painful cuts at precisely the moments of economic stress when public health services are most needed. Only the federal government can deficit-spend on health crises during recessions.
Imagine, therefore, a world in which this were different, in which the federal government paid state governments the same matching rate for qualified public health efforts that it now pays for Medicaid.
The federal government pays 73 cents of every dollar when Alabama provides nursing home care for diabetes, lung cancer, and COVID patients through its Medicaid program. If we think that’s sensible—which it is—maybe the federal government should pay that same 73 cents when Alabama health departments pursue measures to control COVID. The same model should apply when Birmingham deploys public health workers to prevent drug overdoses among the homeless, when it monitors local restaurants to prevent food poisoning, or when it runs social media campaigns to discourage alcohol and tobacco misuse.
At this point, it’s tempting to claim that increased public health spending will pay for itself or even save money. In particular moments and cases—say, if we contain the next pandemic—maybe it will. Most of the time, though, it won’t. But that’s okay. Breakthrough cancer therapies rarely save money. They merely allow millions of people to live longer and healthier lives. When the therapies work, this is good value for the money, even when medical care costs a bit more. We should apply the same principle to public health and prevention.
Right now, states and localities spend an estimated $100 billion on public health and related services every year, about $300 for every American. Let’s suppose hugely ambitious new federal support for public health roughly doubles that, with all of the $100 billion increase coming from the federal government. Let’s ignore the corresponding fiscal subsidy to states and localities that desperately need the money. Let’s further suppose that there are no offsetting spending reductions from improved health, no magic macroeconomic budgetary asterisks. It’s just a pure $100 billion increase in the annual federal deficit.
This seems like a big number, but it’s not. Such a program would be a far smaller contributor to federal debt than the Trump Administration’s 2017 tax bill. It would represent only about one-sixth of the comparable annual increase in Medicare spending we can expect by 2029. Dollar-for-dollar, such public health investments would probably accomplish much more.
America’s model to finance public health is broken. We should use this emergency to enact a more durable and ambitious replacement. No one-time cash infusion will make us ready for the next public health threat.
Public health must become an entitlement. We can’t afford to do anything less.