On October 21, government vans, chartered buses, and even neighborhood mail trucks delivered a steady stream of postal workers to the ambulance bay of D.C. General Hospital. Nearly 4,000 people lined up to have their noses swabbed for anthrax spores and to pick up Ziploc bags of antibiotics. The parking lot of the 195-year-old public hospital swarmed with TV crews, network reporters, and Salvation Army volunteers in full uniform handing out Gatorade beneath faded signs that read: “D.C. General: Where Miracles Happen Every Day.” Even New York Times columnist Maureen Dowd, who’d never before seen fit to write about the hospital, claimed to have dropped in.

Many of the luminaries seemed oblivious to an irony not lost on the postal workers, who were surprised to have been summoned to D.C. General. “I thought it was closed,” said Anita Burrell, an employee of the Washington Square post office.

Indeed, it was. In June, the District of Columbia government had officially ceased to operate the city hospital, which had been bleeding money for years and, at the end, could barely keep X-ray machines operational due to lack of investment. As with other public hospitals nationwide, D.C. General had always been viewed by the city’s elite as a grimy, mismanaged dump, grudgingly kept open so that less desirable elements could be kept away from the better hospitals, which routinely dumped their own poor patients there. (D.C. General was once the only hospital in the city that would admit blacks.) It’s no coincidence that the hospital shares a compound with the morgue, the sexually transmitted disease clinic, the detox clinic, and the jail.

Yet even in its neglected state, the hospital served a critical function, treating inmates with tuberculosis and drunks with feet broken from staggering in front of cars, and nursing tiny crack-addicted babies. During the early years of the AIDS epidemic, D.C. General took in hundreds of middle-class white gays kicked out of private hospitals when their insurance ran out. Five minutes from Capitol Hill, the old hospital even treated Russell Weston, the schizophrenic man who, in 1998, shot up the Capitol and killed two security guards.

During the recent boom times, though, the city’s congressional overseers decided that D.C. General’s $50-million annual city subsidy was government waste in need of trimming. Members of a Republican Congress led the charge to shut it down. Of course, almost every other hospital in the city was bleeding money, too, even private ones like Georgetown University Medical Center, which lost $90 million last year. Tax-exempt Georgetown was also subsidized by the federal government. But no one suggested closing it down. After all, Georgetown catered to the Establishment. By contrast, D.C. General was a place the Establishment never anticipated needing. Four months after it closed, though, the wisdom of that decision came into serious question: no other hospital in the city had the capacity to handle the influx of postal workers in need of anthrax tests.

A rich society with a viable public health system would have directed the postal workers to a gleaming, modern facility configured to handle afflicted populations, complete with microbiology labs and cafeteria. Instead, the jittery federal employees ended up at a crumbling, closed public hospital where health authorities relied on improvisation and the Salvation Army to battle terrorism.

The makeshift operation exposed the toll that 30 years of neglect has taken on our collective ability to provide everything from standard emergency medical service to disease surveillance. Prior to September 11, people of means never treated the breakdown as a crisis, in large part because they thought they could buy their way around the holes in the public-health system. Those who fell through the holes—the poor—landed at places like D.C. General and were never heard from again.

The recent rash of bioterrorism, however, has made the poor, the middle class, and Maureen Dowd equally vulnerable to the failings of our public health system. Those very same underfunded health services that would have helped the poor before September 11 are exactly what’s needed to save the rest of us now.

Government officials trying to assuage our fears of a bioterrorism attack have frequently pointed to the successful containment of the New York City smallpox outbreak of 1947. Back then, New York hadn’t experienced such an outbreak in almost 50 years, and much like today, few doctors and nurses had ever seen a smallpox case. City health officials lacked the proper tools even to diagnose it.

But the New York public health department had a lot of other things working in its favor, including the remarkable capacity to manufacture its own smallpox vaccine. The city health commissioner sent the lab into 24-hour-a-day production, turned every public building into a vaccination station, and sent nurses out to knock on doors and track down the infected. The city managed to vaccinate six million people in just a few weeks, limiting the outbreak to 11 cases and two deaths.

The containment was a tremendous response by American public health officials, but there’s no guarantee of a repeat performance today. In 1947, the country had not yet launched its hospital construction boom, and it boasted a fairly comprehensive network of neighborhood public health clinics that provided immunizations, disease surveillance, and other medical services, along with well-baby stations where nurses counseled mothers on the proper care and feeding of their infants. David Rosner, a public health historian at Columbia University, says much of the success of the 1947 smallpox campaign stemmed from the critical fact that the public never really panicked. The immunizations were an orderly process, says Rosner, largely because New Yorkers knew a system was in place that would ensure an equitable distribution of medical resources.

Americans today can be pretty sure that any government response to a terrorism-induced epidemic will be anything but orderly. The great public health infrastructure that existed in 1947 has badly eroded. In part, public health has become a victim of its own success, through its tremendous progress in eradicating deadly infectious diseases. But it has also suffered from 30 years of political assaults on health and welfare programs that make up the national safety net. Politicians didn’t believe that voters valued traditional public health services often associated with the problems of the poor. They helped fuel what Rosner calls the “individual culpability movement,” the notion that health is a personal responsibility, not a public one. The government health message, says Rosner, became: “Save yourself, quit smoking.”

Ronald Reagan delivered the biggest blow to public health with his war on federal funding for health-related anti-poverty programs. As he inflated defense budgets, Reagan eliminated the entire U.S. Public Health Service Corps, which provided doctors to rural hospitals, Indian reservations, and other underserved areas, and shuttered Public Health Service Hospitals. The Indian Health Service and Office of Refugee Health were decimated and Medicaid rolls slashed. By 1993, funding for public health programs as a proportion of the American healthcare budget had plunged 25 percent.

Public health took more whacks when Republicans regained control of Congress in 1994. In their campaign against the federal government, lawmakers delivered federal health funds directly to the states in block grants with few strings attached. States seized on the opportunity to divert the money to other needs. For instance, in 1995, Arizona Gov. Fife Symington used health and welfare funds to increase spending on police, prisons, and highways, and almost halved public health, according to Laurie Garrett in her tremendous book on global public health, Betrayal of Trust. The advent of the AIDS epidemic somewhat stanched the bleeding as affluent white gay men discovered firsthand how bad the public health system had become. Their appearance at D.C. General in the early ’90s spurred some brief improvements, but those disappeared when they did.

In the next few months, Health and Human Services Secretary Tommy Thompson may be able to round up enough vaccine to inoculate a few million Americans against smallpox. But he’ll still face the problem of how to distribute it. Without the public network of clinics and nurses that existed in 1947, Rosner says a similar program for smallpox vaccinations today would launch “ugly battles over who gets it.”

The outrage among postal workers who felt neglected, and the run on the anthrax drug Cipro, may be an inkling of what lies ahead. Thanks to our lack of preparedness, the now-famous “Dark Winter” simulations have predicted that a bioterrorism attack in a major urban center would cause civil unrest that would make the 1968 riots look like a frat party. At a 1998 Senate hearing, bioterrorism expert Michael Osterholm warned, “A single case of meningitis in a local high school causes enough fear and panic to bring down a whole community. Now imagine you’re telling people, ‘This is going to unfold for eight weeks, and I can’t tell you if you’re going to die.’”

Infectious disease is a great equalizer. As one expert notes, if smallpox were to break out in New York City, the military would close the bridges before the rich could flee to the Hamptons. The potentially infectious of all classes would be too dangerous to move around, and likely would end up in one place, which, in D.C., could be the parking lot of shuttered D.C. General. This prospect helps explain the sudden and extreme anthrax paranoia among highly educated and privileged people like Georgetown socialite and Washington Post columnist Sally Quinn, who has publicly admitted to buying gas masks and other survivalist equipment. In more peaceful times, Quinn could be reasonably safe in assuming that her money and position would buffer her from infectious disease, or, were she exposed, would ensure her the best private medical care.

Terrorism changes that equation. Why else would Maureen Dowd try, as she has admitted in her column, to hoard Cipro? It’s the fear of competition for limited resources, for which the privileged are accustomed to being the first in line. Terrorism raises the possibility that to get their Cipro or smallpox shots, Quinn and Dowd would have to fight their neighbors along with the great unwashed likely to be piled up already at the local emergency room—an unpalatable situation for anyone, but especially for those accustomed to being above the fray.

Bioterrorists don’t need rare pathogens like smallpox or glanders to inflict misery. In many parts of the United States, old-fashioned whooping cough, measles, or even polio germs would do the trick. Years of healthcare cuts for the poor and the disappearance of public health infrastructure have left the U.S. with childhood immunization rates that, in places, rival those of some Third World countries.

At the beginning of the Clinton administration, former HHS Secretary Donna Shalala begged American doctors to improve childhood vaccination. At the time, only 44 percent of all American children were fully vaccinated, and diseases once thought eradicated were resurfacing with a vengeance. In 1990 in Los Angeles, where fewer than half of the children under five years old were vaccinated, an outbreak of measles hospitalized 107,000 and killed 40, mostly poor, minority, and uninsured kids.

The Clinton administration launched a drive to raise immunization rates, and by 1999, those rates reached record highs. But in cities like Houston, with huge numbers of uninsured, that still meant that, according to the CDC, some 40 percent of children weren’t being fully immunized. Yet attempts to bolster those numbers were undercut by the revival of a serious anti-immunization movement.

This movement included Gulf War veterans who believed, without much evidence, that they had been sickened by a military anthrax vaccine. But the most influential opposition to national immunization programs came from a vocal group of middle-class parents, who claimed (again, without any scientific evidence) that their children had developed autism as a result of vaccinations for measles, mumps, and rubella. The movement received national endorsement from Republican bulldog Rep. Dan Burton of Indiana, whose granddaughter is autistic. Burton, who has held hearings on the purported connection between autism and childhood vaccines, has led a charge in Congress to bash the Centers for Disease Control and Prevention (CDC) about vaccine research and to spread misinformation about the safety of these immunizations.

The publicity prompted many parents to begin opting out of mandatory immunization programs. In 1998, for instance, more than 11,000 children in Colorado legally avoided getting immunized because of their parents’ philosophical and religious opposition. Even more alarming, in the late 1990s, between 60 and 70 percent of all cases of measles, mumps, and whooping cough occurred among white children with health insurance—a completely new phenomenon.

Immunization programs work best when everyone participates, and universal participation especially benefits the poor, who tend to be most vulnerable to outbreaks of infectious disease. By the same token, the anti-immunization movement endangers everyone’s babies: infants too young to be immunized are vulnerable to dying from precisely the diseases (whooping cough, measles) that are easily spread by older, middle-class white kids whose parents opted out of vaccinations.

The movement also undermined the nation’s ability to manufacture critical vaccines. Producing vaccines has never been a very profitable enterprise. Most pharmaceutical companies would prefer to make Rogaine—more money, less risk. Because vaccines do carry some risk of injury, although far lower than the risk of the diseases they protect against, vaccine makers have always been subject to lawsuits. But in 1988, the litigation became so burdensome that the nation’s vaccine manufacturers threatened to shut down altogether. To keep the manufacturers afloat, the government created a special victim-compensation fund to limit their liability.

Though the fund helped maintain some production capacities, anti-immunization activists have continued to pummel vaccine makers with class-action lawsuits alleging that vaccines caused everything from autism to diabetes. Not every suit is covered by the fund. As a result of such pressures (and the lack of financial incentives), there is only a single manufacturer of measles, mumps, and rubella vaccine. Earlier this year, one of the country’s two tetanus-vaccine makers ended production, creating a critical national shortage. Similar problems surround flu vaccines.

Anti-immunization activists have also weakened state mandatory immunization laws and fought a Clinton initiative to create computerized vaccination registries in every state that would help ensure that all kids get their shots. For instance, two years ago in Idaho, which has the second-worst immunization rate of any state, immunization opponents, including the Christian Coalition, fought such a state registry. The Idaho legislature finally passed a bill establishing the registry—and then refused to fund it. In Texas, activists forced public health officials to purge the names of 700,000 kids from that state’s registry.

Such registries would not only raise immunization rates among poor kids, they might also prove useful in containing any outbreak caused by bioterrorism.

Since September 11, Americans demanding new smallpox and anthrax vaccinations have drowned out the anti-immunization activists. (A Montana veterinarian friend of my father’s recently received a call from a woman begging for the anthrax vaccine used on cattle. She wanted to use it on her kids.) But some damage has been done. The movement undoubtedly undermined public health networks that today might serve as the “front line of defense” against bioterrorism.

When those thousands of postal workers showed up at D.C. General for their Cipro, it was fortunate that most weren’t actually sick. A few dozen anthrax cases could have completely overloaded the city’s hospital system. Even on a regular day, if you call 911 for an ambulance in D.C., there’s a good chance you’ll be in for a long ride before it finds some place to deliver you. Since D.C. General closed, city hospitals have seen their emergency rooms overrun by poor and uninsured patients. The wait in some private hospital ERs now stretches to several hours, and they routinely close their ERs to ambulances when patients start to back up, a scene familiar across the country.

Over the past four years, government Medicaid and Medicare cuts, managed care, and an increase in the number of uninsured Americans have combined to shrink the nation’s supply of available hospital beds and left many hospitals financially ailing. In some places, this has been a welcome contraction, since a glut of beds drives up healthcare costs. But market forces, not public health officials, have determined the pattern of downsizing, and the market hasn’t done a very good job of providing for emergency capacity or uninsured clientele. As a result, in nearly every city considered a likely target of bioterrorism, the hospitals that would be on the front lines can barely handle flu season, much less an outbreak of smallpox.

In the last five years, California has closed more than 23 hospitals and 50 emergency rooms, and more than half the remaining hospitals are losing money. When a flu epidemic swept Los Angeles in 1998, the county had to implement disaster plans used for earthquakes and other crises. Massachusetts lost 24 percent of its hospital beds between 1988 and 1998. In a recent one-week period in Boston, the city’s 17 major hospitals were operating at an unheard-of 96.2 percent occupancy rate, and emergency rooms have closed to ambulances on a regular basis. In Cleveland, four of the region’s leading hospitals last year were in bankruptcy; the high-level trauma center at Mt. Sinai was closed and its teaching program shuttered. In May, metro Cleveland’s 22 emergency rooms were simultaneously closed to ambulances for nearly 10 percent of the month due to the lack of space.

All told, American hospitals lost 103,000 staffed beds and 7,800 medical/surgical beds during the last decade, even as the population grew by 10 percent. And the CDC reports that 370 emergency departments disappeared between 1994 and 1999.

This trend is likely to continue. Thomas Prince, a health economist at the Kellogg School of Management at Northwestern University, who has studied the hospital industry’s contraction, predicted in 1999 that 800 health care facilities would close within five years. His predictions are proving remarkably accurate. Needless to say, this doesn’t make the prospect for a public health response to bioterrorism very reassuring.

The hospital industry recently asked Congress for $11 billion for such disaster-relief equipment as radios, bullhorns, gloves, masks, a few hazmat suits, ID badges, decontamination rooms, portable shower systems, outdoor decontamination tents, antibiotics for a 24-hour period, and other supplies like wheelchairs and stretchers. While it may comfort some to know their local hospital has a hazmat suit, stocking bankrupt, understaffed hospitals with such gear won’t much improve their ability to respond to a terrorist strike.

Real homeland defense would require untangling the nation’s emergency room problems, and that can only be accomplished by dealing with the plight of the 43 million uninsured people who rely on emergency rooms for primary medical care. Even as emergency rooms have gone dark across the country, ER visits jumped 14 percent between 1994 and 1999, largely because the number of uninsured Americans grew by 10 million over the past decade. Coupled with a nursing shortage, the overload in the ER has become a life-or-death issue even without a terrorist attack. In Los Angeles, where nearly one in three people is uninsured, and nursing vacancy rates top 20 percent, dozens of emergency rooms in the heart of the city are regularly closed to ambulances. The state is now investigating three deaths in Los Angeles, after emergency dialysis treatments were delayed, according to U.S. News & World Report.

Things are no better on the other coast. Mohammad Akhter, executive director of the American Public Health Association and the former D.C. Public Health director, says the hospital system would be quickly overwhelmed in a terrorist attack. In fact, the only reason New York hospitals could respond to the World Trade Center attack is that there were so few injuries—most of the victims simply died. “There is no community in the U.S. where there is capacity to deal with 500 very sick people all at once,” says Akhter. “We are now really at a point where we have cut our health care to the bone.”

The lack of health insurance and adequate healthcare for all Americans has created another, perhaps more menacing problem. Dozens of studies have shown that people who don’t have health insurance tend to delay seeking treatment when sick—until they’re in critical condition. This fact never moved most lawmakers to address the problem before September 11. But what if some deranged lunatic does unleash smallpox on Washington, D.C.? We know the first people likely to get it will be the poor and uninsured, because they were less likely, when young, to get vaccinated and because poor diet and weakened immune systems leave them most vulnerable. In fact, a smart terrorist wouldn’t release smallpox into the air vents at the State Department, he’d do it at the Community for Creative Non-Violence (CCNV), the 1,400-bed homeless shelter nearby.

CCNV is stocked to the gills with HIV-infected, pneumonia-wracked, drug-addled, and mentally ill poor living in close quarters, and smallpox would spread through it like wildfire just as drug-resistant TB once did. And instead of getting quick medical care (and isolation) the way State Department employees would, those homeless residents would fan out across the city, riding buses and subways, visiting libraries and churches, and even sleeping on the steam grates at the State Department. They would be walking time bombs, nearly impossible to track. Healthcare for the homeless could become a matter of national security. Wouldn’t Reagan love that?

When cholera morbus arrived in East London in 1832, the local townships had little in the way of public health infrastructure to help combat an epidemic. Initially, in a faint echo of the way American leaders reassured postal workers, East London business interests denied that cholera had even arrived in Sunderland, 250 miles away, thereby justifying inaction. Later, the rich and the Whig radicals thought it best to simply ignore the poor suffering the brunt of the epidemic. But after factory workforces were decimated by cholera—800 people in Sunderland eventually died of the disease—and the threat to the middle class finally became evident, the government grudgingly began to act. Officials created public health boards empowered to take action to clean up the slums believed to be the breeding ground of the disease. They ordered proper disposal of infected bodies, the cleaning of houses, and better sewer systems—improvements that became permanent.

Cholera was not actually the main killer in the slums during this time. TB killed more people, and the poor were lucky just to reach adulthood among the horrors of the East London slums. But cholera was mysterious, and, like many of the possible bioweapons we know about today, it produced a quick and nasty death that even the rich could not escape. The hysteria surrounding cholera, even more than the death toll, proved to be a great motivator. The sanitary movement eventually made its way to the U.S., as did the message about the perils of ignoring the health of the poor.

As Mohammad Akhter notes, “Rarely do public health initiatives start because of charity.” As with the response to the cholera epidemics, great public-health developments are always tied to the self-interest of the establishment. In the U.S. during World War I, for instance, the government discovered a tremendous amount of disability among draftees that prevented many from serving. After that, the military engaged in a healthcare campaign to guarantee its potential supply of soldiers. Similarly, after World War II, Congress passed the Hill-Burton Act in 1946, which set off the federal government’s move to spend $4 billion modernizing and building hospitals around the country. The spending spree laid the groundwork for today’s remaining public hospitals. Similar tales describe the creation of health insurance and occupational health clinics, both formed to keep workers on the job. “It wasn’t,” Akhter points out, “because we wanted to do something for the poor.”

Sadly, it’s taken the threat of anthrax in the subway for the Establishment to rediscover the value of those old public hospitals and neighborhood baby stations. Just as few could escape the cholera epidemics of the 19th century, few Americans today will be safe from bioterrorism. Even if the rich and powerful could somehow wall themselves off from disaster, Dick Cheney-style, their status remains heavily dependent on the good health of middle-class workers, who won’t have the luxury of escape. If Maureen Dowd and Sally Quinn really want to protect themselves, they’d do well to forget the gas masks and spend a little less time chronicling the Washington cocktail party scene and a little more time championing the renaissance of D.C. General.

Stephanie Mencimer is an editor of The Washington Monthly.