How can that be? According to public health researchers, the biggest reasons are behavior and environment. Costa Ricans consume about half as many cigarettes per person as we do. Not surprisingly, they are four times less likely to die of lung cancer. The car ownership rate in Costa Rica is a fraction of what it is in the United States. That not only means that fewer Costa Ricans die in auto accidents, but that they do a lot more walking, and hence they get more exercise. Thanks to a much lower McDonald’s-to-citizen ratio, the average Costa Rican thrives on a traditional diet of rice, beans, fruits, vegetables, and a moderate amount of fried food–and therefore enjoys one of the world’s lowest rates of heart disease and other stress-related illnesses.
The simple comparison between the health of Costa Ricans and Americans suggests a whole new way to think about how to fix America’s increasingly dysfunctional health-care system–a system that these days seems to combine spiraling costs, declining coverage, and growing dissatisfaction with the quality of care. But instead of offering new ideas, both political parties in Washington are stuck in a hopeless rut, each trying to hawk plans that essentially expand the current system.
The battle over a Medicare prescription-drug benefit is a classic example. In March, President Bush unveiled a plan to provide partial drug discounts to all seniors, but full discounts only to those who leave traditional “fee-for-service” Medicare and join an HMO. Democrats derided the plan as a stealth attempt to “privatize” Medicare and argued instead for a much more generous plan that would give full discounts to all seniors, including those who remain in traditional Medicare.
Both parties should pause and reflect. For all the additional money we’re throwing into medicine, Americans aren’t getting much healthier. Maybe it’s time to try a different approach. The biggest opportunities for improving the health of Americans–and restraining health-care costs–lie in keeping people healthy, rather than treating them once they become sick. So instead of simply adding more benefits to a health-care system that is already financially unsustainable, or using new benefits to herd people into HMOs, why not offer a more sensible deal: Bribe people into taking better care of themselves. For instance, why not offer seniors who exercise bigger drug discounts than those who don’t?
This may sound radical, and it is. But the more Americans learn about the costs and failings of contemporary medicine and the extraordinary benefits they can reap from simple behavioral changes like exercising, the more such plans will begin to make sense.
To understand the value of this approach, it is important to clarify a common misperception about health care. During the 20th century, the health and life expectancy of the average American improved dramatically. A child born today can expect to live a full 30 years longer than one born in 1900. Improvements in medicine, however, played a surprisingly small role in this achievement. Public health experts agree that it contributed no more than five of those 30 years.
This may seem counterintuitive given the attention society pays to medical breakthroughs. But the changes in living and working conditions over the last century are the real reason. American cities at the turn of the last century stank of coal dust, manure, and rotting garbage. Most people still used latrines and outhouses. As recently as 1913, industrial accidents killed 23,000 Americans annually. Milk and meat were often spoiled; the water supply untreated. Trichinellosis, a dangerous parasitic disease contracted from infected meat, afflicted 16 percent of the population, while food-borne bacteria such as salmonella, clostridium, and staphylococcus killed millions, especially children, 10 percent of whom died before their first birthday.
During the first half of the 20th century, living and working conditions improved vastly for most Americans. Workplace fatalities dropped 90 percent. This, combined with public health measures such as mosquito control, quarantines, and food inspections, led to dramatic declines in premature death. In 1900, 194 of every 100,000 U.S. residents died from tuberculosis. By 1940, before the advent of any effective medical treatment, reductions in overcrowded tenements combined with quarantine efforts had reduced the death rate by three-fourths.
As the century progressed, medical care grew enormously more sophisticated and effective, particularly in managing pain and preventing sudden death from traumatic injury, infection, and heart attack. But the overall gains to public health remained modest. The greatest gains came from strategic vaccination campaigns, which have virtually eliminated once-common diseases, including diphtheria, tetanus, poliomyelitis, smallpox, measles, mumps, rubella, and meningitis. But even these triumphs involved treating people before they became sick. Modern medicine’s ability to actually cure people, by contrast, is depressingly limited. The consensus estimate, accepted by the Centers for Disease Control (CDC), is that medicine has contributed just two of the seven years in added life expectancy achieved since 1950.
The reason is that, strictly speaking, medicine doesn’t “save” lives, but extends them. If you’re like my son, who spent the first 60 days of his life in a neonatal intensive care unit, medical intervention could extend your lifespan by 90 years or more–but that number diminishes if you’re 50, much more so if you’re 90.
This gets at an important truth about the role medicine plays in public health–it is concentrated primarily on the elderly, who consume about 38 percent of all health-care dollars, yet account for just 12.4 percent of the population. By definition, the elderly have fewer years of life to extend than the young. This simple fact goes a long way toward explaining medicine’s modest role in improving life expectancy: It cannot stop aging.
Sure, many best-sellers and newsweeklies tout the “longevity revolution” prompted by advances in cutting-edge medicine. But overall longevity is due more to dramatic reductions in infant mortality, which allow more people to grow old, than to modestly extended lives among the elderly. Since 1950, life expectancy at 65 has increased by just 3.45 years; among women over 65, it has actually declined slightly since 1992.
Another reason for the medical system’s limited role in extending life is that, frankly, it kills so many people. Each year nearly two million patients in U.S. hospitals get an infection, about 90,000 of whom die as a result. According to the CDC, the largest preventable cause is doctors and nurses with dirty hands. Then there is the Institute of Medicine’s well-publicized finding that “more people die in a given year as a result of medical errors than from motor vehicle accidents (43,458), breast cancer (42,297), or AIDS (16,516).” Such errors cause 2 to 4 percent of all deaths and derive not just from doctors’ indecipherable handwriting or mix-ups in the lab, but also from a lack of the same kinds of systematic quality control procedures that are commonplace in workplaces from automakers to Domino’s Pizza chains. Had the Institute considered deaths caused by medical errors outside of hospitals–in doctors’ offices, pharmacies, or outpatient clinics–the fatality rate would be even higher.
Overmedication and adverse reactions to prescription drugs also cause unnecessary deaths. In 1994, these accounted for 106,000 deaths, according to the Journal of the American Medical Association. More people are killed by adverse reactions to prescription drugs than by pulmonary disease or accidents. In fact, prescription drug deaths are surpassed only by heart disease, cancer, and stroke. The elderly, whose bodies often can’t tolerate the dosages and combinations of pills doctors prescribe them, are particularly susceptible.
Moreover, many of the treatments the medical system provides are unnecessary, further limiting their effect. Consider the wide regional disparity in the intensity of care given to patients. In Miami, the average Medicare patient is treated by 25 specialists during the last six months of life; in Minneapolis, such patients see only four specialists. Yet the result is exactly the same: death within six months. Where specialists are abundant, they find elders to treat–and Medicare pays, spending, for example, $50,000 more per patient in Miami than in Minneapolis, as my colleague Shannon Brownlee recently wrote in The Atlantic. But according to John Wennberg of Dartmouth Medical School, elderly persons living in regions where the use of specialists is high have no greater life expectancy than their counterparts in regions where it is low. Wennberg and his colleagues estimate that nearly 20 percent of Medicare expenditures provide no benefit in terms of survival, nor does evidence show improvement in quality of life.
Then there is the growing problem of “pseudo-disease,” defined by medical researchers Elliott S. Fisher and H. Gilbert Welch as “disease that would never become apparent to patients during their lifetime were it not for diagnostic tests.” Most Americans have a binary view of illness: Either you have a disease or you don’t. But the truth is often more subtle. Autopsy studies have shown that a third of adults have cancer cells in their thyroid; up to 40 percent of women in their 40s have ductal carcinoma in situ in their breasts; and half of men in their 60s have adenocarcinoma of the prostate. Yet each of the subjects died of other causes. In other words, they died with their cancer, not from it, suggesting that many who have small cancers will never develop symptoms because they will die of something else before their cancers become noticeable.
Yet if your doctor discovers that you have cancer, there are two likely results: First, you will experience extraordinary and prolonged stress from the diagnosis, along with the attendant risks to health. Second, you and your doctor will try to fight the disease through radiation, chemotherapy, or surgery. Though it is difficult for a doctor and patient to know, even in terms of probability, whether such treatment is necessary, it is clear that for the broader population, the spread of diagnostic testing is causing an epidemic of “pseudo-disease”–and vast commitments of medical resources that result in little, if any, gain in public health.
But what if we could get doctors and nurses to wash their hands, fix the errors in the medical system, and adopt sensible, evidence-based medicine to prevent over-treatment, overmedication, and adverse drug reactions? This would dramatically improve our health-care system and prevent millions of deaths. But the overall effect on the health and life expectancy of Americans, and on the future demand for health care, would remain startlingly small. That’s because the health-care system kicks in after most people are already ill. As the poet Joseph Malins aptly put it, it’s like an ambulance waiting at the bottom of a cliff. By the time most people receive treatment, their bodies are already compromised by stress, indulgent habits, environmental dangers, and injury. As Malins wrote in his poem, “A Fence or an Ambulance”: “If the cliff we will fence, we might almost dispense/ With the ambulance down in the valley.”
In a recent issue of Health Affairs, three researchers from the Robert Wood Johnson Foundation examined scores of studies dating back to the 1970s on what factors cause people to die prematurely. They reported that genetic predispositions account for 30 percent of premature deaths; social circumstances, 15 percent; environmental exposures, 5 percent; behavioral patterns, 40 percent; and shortfalls in medical care, 10 percent. As they note, these proportions are easily misinterpreted. Ultimately, nearly everyone’s health is determined by a combination of factors. For example, while only about 2 percent of human diseases are caused by inherited genetic mutations alone, nearly everyone carries various genetic predispositions that, when combined with a hazardous environment or unhealthy lifestyle, can contribute to ill health. But this only underscores the relatively small role medicine plays in preventing premature death.
Consider the startling difference in mortality between Utah and Nevada. These two contiguous states are similar in demographics, climate, access to health care, and average income. Yet Nevada’s infant mortality rate is 40 percent higher than Utah’s, and Nevada adults face an increased likelihood of premature death. As health-care economists Victor Fuchs and Nathan Rosenberg have pointed out, it’s hard not to attribute much of that difference to the fact that 70 percent of Utah’s population follows the strictures of the Mormon Church, which proscribes tobacco, alcohol, premarital sex, and divorce. Nevada, with its freewheeling, laissez-faire culture, has the highest incidence of smoking-related death in the country; Utah the lowest. Utah has the nation’s highest birthrate, but the lowest incidence of unwed teenage mothers. Culture and behavior seem to trump access to health care in improving human life span.
Similarly, when comparing life expectancy in the United States to other countries, it becomes clear that the vast sums we spend on health care buy very little health. The roughly $4,500 per person the United States spends annually on health care far outpaces any other country. Yet three-fourths of developed countries outrank America in life expectancy and infant mortality. Indeed, for all our high-tech medicine, Jamaican seniors outlive American seniors. According to the World Health Organization, life expectancy at age 65 is roughly equal, and at 85 it’s longer in Jamaica. An argument for medical marijuana? No, it’s an argument for walking. Dr. Denise Eldemire of the University of the West Indies notes that 60 percent of Jamaica’s elderly live in rural areas, where “walking is the only reliable means of transport.” According to her studies, 78 percent of Jamaican elders walk daily. By contrast, just 60 percent of the entire U.S. adult population exercises at all.
Further evidence of medicine’s limited effect is the slow pace of progress against cancer. The percentage of the U.S. population dying of cancer, while modestly improved in recent years, remains higher than in 1973, while the incidence of many specific forms of cancer, including non-Hodgkin’s lymphoma, melanoma, and female breast and lung cancer, has risen. Headlines often celebrate how many more Americans are surviving cancer, but the underlying data offer little to cheer about. The five-year survival rate for men diagnosed with prostate cancer has improved–but mainly because doctors are able to detect it earlier, including cases that may never have proven lethal or been so only at advanced ages. The five-year survival rate for lung cancer is unchanged since the early 1970s. Breast cancer survival rates have improved by a matter of months, but like prostate cancer, much of this is due to earlier diagnosis, not to the success of treatment. Though there has been real progress in detecting and treating cancer, much of the claimed advance in survivability is really just an increase in the incidence of pseudo-disease. Cancer still kills 1,500 Americans a day.
Mortality from diabetes, liver, and kidney disease, meanwhile, has hardly changed since the 1960s–while infectious diseases continue to grow more numerous and deadly. Thirty years ago, the surgeon general declared it time to “close the book” on infectious disease. Since then, at least 20 that were once thought conquered, from tuberculosis to salmonella, have reemerged, while 29 new ones have been identified, including HIV/AIDS, Lyme disease, and hepatitis C. Meanwhile, antibiotic-resistant strains of all sorts of microbes are cropping up, largely because doctors keep dispensing antibiotics to treat what are actually viral infections.
In the face of such trends, even a Cadillac health-insurance plan plays little, if any, measurable role in improving health and life expectancy. A RAND Corporation study compared two groups of families over 15 years, one with full medical coverage, the other with a large deductible. The families with full coverage consumed 40 percent more health-care dollars than the other group, but researchers couldn’t detect any measurable differences in health.
These results may seem odd until one considers that the eight leading causes of death in the United States–heart disease, cancer, stroke, pulmonary diseases, accidents, pneumonia/influenza, diabetes, and suicide–are closely tied to living conditions and behavior. According to the Institute of Medicine, social and behavioral factors such as smoking, diet, alcohol use, and sedentary lifestyles contribute to approximately half of all deaths in this country. Scientists estimate that up to 75 percent of all cancer deaths result from behavior such as smoking, diet, and lack of exercise. Though modern medicine can help stave off death from such behavior, rarely can it mitigate these factors altogether. Chemotherapy, for example, may put a smoker’s lung cancer into remission. But he’ll continue to face the risk of dying from heart disease or other chronic conditions brought on by his behavior and environment–including the damage his body suffers from chemotherapy itself.
In contrast, large-scale changes in social arrangements or the environment do have profound effects on health. There is powerful statistical evidence, for instance, that hierarchy and inequality are among the major contributing causes of premature death. The first hint of this came in a famous 1967 study of British civil service workers, which found that, within a given office, mortality rates increased, step by step, as one moved down the organization chart. Those at the bottom suffered three times the death rate of those at the top. Since everyone had equal access to health care under Britain’s universal, socialized system, the study suggested that one’s socioeconomic status is a key determinant of health.
Since then, a cascade of studies has confirmed the relationship between equality and health. The healthiest states, such as Utah, Iowa, and New Hampshire, are also those with the least disparity of income, while states such as Louisiana, Mississippi, and New York lead the nation in both poor population health and income inequality. Similarly, wealthy nations with low income inequality, such as Sweden and Japan, have higher life expectancy than wealthy countries in which income is less evenly shared, such as the United States and Britain.
This phenomenon isn’t associated simply with extreme concentrations of poverty or wealth. Across nations and races, under both single-payer systems that provide universal care and market-driven systems, life expectancy gradually increases according to socioeconomic status. There is a raging debate over why this is so. Some researchers suggest that a widening gap between the rich and everyone else leads to deepening stress, frustration, and ultimately self-destructive behavior among people struggling unsuccessfully toward the top. (Imagine the unhappy American salesman who relieves his stress with booze, cigarettes, and occasional compulsive unprotected sex with strangers.) Others speculate that political support for government services critical to health, such as clean water and police protection, erodes when too many of a society’s resources are controlled by a narrow elite.
Others turn the question on its head, suggesting that the rich get ahead because they are, on average, healthier than everyone else to begin with and smart enough to know how to stay that way. Or it may be that education plays a key role, too. Those who do well in school may learn a greater awareness of how to lead a healthy life, and they may also have greater discipline and ability to defer gratification. In any event, those with a bright financial future certainly have more to lose, in a monetary sense at least, by indulging in unhealthy behavior.
But there is one point of agreement among all serious students of public health, which is that environment and social conditions play an overwhelming role in determining the prevalence of diseases and premature death. Indeed, a study published in the Journal of the American Medical Association estimates that 40 percent of all deaths are caused by behavior patterns that could be prevented. And yet, approximately 95 percent of the $1 trillion the nation spends on health goes for direct medical care services to individuals. Only 5 percent goes for measures designed to promote healthier behavior among the population as a whole.
Persuading Americans to take better care of themselves is no easy task. As Prohibition and the drug war demonstrate, simply criminalizing unhealthy behavior goes only so far. Moreover, most of the unhealthy behavior we’re talking about–say, eating Big Macs–shouldn’t be criminalized in the first place. Imposing “sin” taxes, while somewhat effective, can only do so much without creating black markets. And most Americans are appropriately resentful of government efforts to penalize them for lifestyle choices. That’s why, instead of punishing citizens for unhealthy behavior, the government should concentrate on reducing the major environmental causes of premature death–not just pollution, but poverty and hazardous living conditions–while also paying you to clean up your act. Here are three ideas on how to do it:
Drugs for Jumping Jacks: The benefits to older people of even moderate exercise are overwhelming. As a report sponsored by the AARP and other health and aging groups concludes: “Scientific evidence increasingly indicates that physical activity can extend years of active independent life, reduce disability, and improve the quality of life for older persons.” And yet approximately 34 percent of those ages 50 and older are sedentary, and fewer than half of older adults report that their physician has suggested exercise.
Meanwhile, with Medicare’s insolvency looming in 2030, both political parties are competing to offer a plan that would subsidize prescription drugs for seniors. These plans attempt to meet a real problem: Higher prescription drug costs are eating away at the economic well-being of many moderate-income seniors. There’s little evidence, however, that such an entitlement would increase longevity. According to the Department of Health and Human Services, only 2 percent of the nation’s elderly report being unable to obtain a needed prescription drug even once in the course of the year. Moreover, an estimated 17 percent of all hospital admissions among persons over 70 result from harmful combinations of prescription drugs. Overmedication in hospitals and nursing homes is a leading form of elder abuse.
So if we’re going to expand Medicare to cover prescription drugs, let’s extract a quid pro quo to help defray the cost while giving seniors more years of active, independent life: offer every American over 50 a voucher to join a gym or exercise program. Those who use it and can demonstrate attendance will become entitled to heavily subsidized prescription drugs, regardless of financial need–think of it as drugs for jumping jacks. So will those too frail to exercise. But let those who are willfully unhealthy pay for their own drugs.
Death by Sprawl: On a statistical basis, what’s most likely to get you killed in the next year: (A) living in Israel during the Intifada; (B) living in crime-ridden, inner-city Baltimore, Chicago, Dallas, Houston, Milwaukee, Minneapolis-St. Paul, Philadelphia, or Pittsburgh; or (C) living in the bucolic outer suburbs of those cities? The answer is overwhelmingly C. A recent study by University of Virginia professor William H. Lucy found that Americans’ migration into sprawling outer suburbs is actually a huge cause of premature death. In the suburbs, you’re less likely to be killed by a stranger–unless you count strangers driving cars. Residents of inner-city Houston, for example, face about a 1.5 in 10,000 chance of being killed in the coming year by either a murderous stranger or in an automobile accident. But in the Houston suburb of Montgomery County, residents are 50 percent more likely to die from one of those two causes because the incidence of automobile accidents is so much higher.
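The arithmetic behind those odds is simple enough to sketch. The figures below are the ones cited above for Houston (a rough back-of-the-envelope illustration, not actuarial data):

```python
# Back-of-the-envelope comparison of combined annual death risk
# (murdered by a stranger OR killed in an auto accident), using
# the figures cited for Houston and suburban Montgomery County.

inner_city = 1.5 / 10_000      # cited combined annual risk, inner-city Houston
suburb = inner_city * 1.5      # "50 percent more likely" in the suburb

print(f"inner-city Houston: {inner_city * 10_000:.2f} per 10,000 per year")
print(f"Montgomery County:  {suburb * 10_000:.2f} per 10,000 per year")
```

The suburban figure works out to 2.25 per 10,000: higher auto-accident risk more than erases the suburb’s lower murder rate.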
Sprawling, auto-dependent suburbs are unhealthy in other ways, too. In such an environment, almost no one walks–and for good reason. In 1999, 4,906 pedestrians died, 873 of them children under 14. Not surprisingly, metro areas marked by sprawling development and a high degree of auto dependency–Orlando, Tampa, West Palm Beach, and Memphis, among others–are the most dangerous regions to walk in.
Rarely walking or riding a bike can also be deadly. Largely because of sprawl, the number of trips people take on foot has dropped by 42 percent in the last 20 years. This is particularly true among children. In 1977, children ages 5 to 15 made 15.8 percent of their trips on foot or by bicycle. By 1995, that share had dropped to only 9.9 percent. Seventy percent of all trips children take today are in the back seats of cars. So sprawl not only substantially increases the odds of dying in an auto crash, it also discourages routine exercise.
This is no small matter. Walking 10 blocks or more per day reduces the chance of heart disease in women by a third. The risks associated with a sedentary lifestyle rival those of hypertension, high cholesterol, diabetes, and even smoking. According to the surgeon general, the economic costs of obesity total $117 billion a year, about 9.4 percent of health-care spending. Americans who never exercise cost the health-care system $76.6 billion a year. Sprawl does not fully account for our increasingly sedentary lives, but it is a major factor, and therefore a leading cause of premature death.
Sprawl also leads to high levels of social isolation, which has its own public-health implications. Lonely individuals who are cut off from regular contact with friends and neighbors face highly elevated risks for heart disease and other disorders. What’s cause and effect is not entirely clear, but Robert Putnam, a professor of public policy at Harvard University, has found that an isolated individual’s chances of dying over the next year fall by half if he joins a group, and by two-thirds if he joins two.
The good news is that reducing subsidies for sprawl is among the biggest policy levers available to improve public health. This includes reforming gas taxes that are currently nowhere near high enough to recoup the environmental costs of driving, let alone to compensate for the losses to the economy caused by auto-related deaths and injuries. And it includes ending overinvestment in new roads and highways, and directing more toward mass transit, bike trails, and sidewalks. Thanks to the surgeon general’s warnings and vastly increased tobacco taxes, millions of Americans have overcome their addiction to nicotine. It’s equally important for the federal government to warn Americans about the health hazards of auto-dependent sprawl and provide financial incentives to encourage a healthier environment and lifestyle.
Instead of paying a fare, for example, transit users should receive a dollar’s credit on their swipe cards for up to three rides a day, financed by drivers who will enjoy less traffic, cleaner air, and a smaller burden on the health care system. The government could also offer greater home mortgage deductions to homeowners who move to cities and developments served by mass transit. These measures might at first seem politically unfeasible, but presented to an aging population as a way to improve public health and fix a failing health-care system, they may gain real political traction.
The Americans Without Disabilities Act: The Americans with Disabilities Act mandates everything from how parking lots and public bathrooms are arranged to how employers organize workplaces. Yet it does nothing to prevent disability. Why not enact parallel legislation that would prevent Americans from becoming disabled in the first place?
For instance, the National Cancer Institute recommends at least five servings of fruits and vegetables a day–but prices for fruits and vegetables have increased more than any other food category in recent years. Expand the Food Stamp program so that everyone is entitled to generous, free weekly allowances of fruits and vegetables. Or how about creating an Interstate Bicycle Highway System using abandoned railroad right-of-way? Instead of charging tolls, pay cyclists according to the number of miles they’ve pedaled. Or how about mandating that companies that employ 25 or more workers provide on-site exercise rooms or tax-free benefits to cover gym membership? Or offer a $200-a-month benefit increase to obese welfare recipients who shed at least 20 pounds, using the subsequent decrease in Medicaid expenditures to meet the cost? The ideas are practically limitless (see sidebar).
How might American life change for the better if we took this approach? Consider the problem of the uninsured. Currently, the cost of health care is outpacing economic growth, so maintaining the number of insured people would seem enough of a challenge. But the question of what health care costs depends overwhelmingly on how much is needed–and that is determined largely by how Americans conduct their lives. How fat are we? How sedentary? How much pollution do we create? How much do we suffer from loneliness, depression, and social isolation? How much do we smoke, drink, or abuse drugs? How productively do we age? What the Costa Rican example shows us is that with the right behavioral changes in lifestyle and social environment, we too could lower health-care costs–maybe not to $273 per person, but low enough to afford universal health-care access. And Americans wouldn’t even need to forgo superfluous treatments; Costa Rica boasts world-class plastic surgeons and cosmetic dentists and still offers free universal health care.
That would, however, require more time walking. And some of us would have to be bribed to take better care of ourselves. And there would be big expenses for building better transit systems, and more compact, socially cohesive, less-polluted communities. But which system seems like the better bargain?
There are clear signs that Americans are becoming fed up with the current health-care system and open to bold new approaches. Marcus Welby would be shocked, for example, to know what Americans think of doctors these days. In the late 1960s, when millions of viewers tuned in to watch the avuncular M.D. offer sage advice to his patients about the root causes of their illnesses, more than 70 percent of Americans had confidence in medical leaders; today, only 40 percent trust doctors. A mere 29 percent of the public agrees with the statement: “The health-care system would work better if doctors had full control of the system.”
And it seems the more people know about health care, the less faith they have in doctors and their remedies. While half the public now says it lacks trust in “scientific solutions” for health care, nearly 80 percent of health-care policy professionals share this doubt. According to a study that appeared recently in the journal Milbank Quarterly, the largest single factor driving down trust in doctors–among the general public, but especially among health-care-policy experts–is mounting concern about the ineffectiveness of modern medicine.
In Greek mythology, the god of medicine, Asclepius, had two daughters. Hygeia was the daughter responsible for prevention, while Panacea was responsible for cure. Today, to the detriment of our nation’s health, we’re fixated on the idea that medicine will produce a panacea. It’s time to listen to her more powerful sister.