A few months ago I spent a couple of days watching open-heart surgery being performed. The operating room was full of people: a supervising nurse, a scrub nurse, a few residents who just watched, and three surgeons doing the actual cutting and sewing.

Besides all of them, there were two others. At the head of the operating table, dressed in green surgical scrubs, stood an anaesthesiologist, who kept an eye on the patient’s heartbeat, blood pressure, brain waves, potassium level, and other vital signs, and sometimes, at the suggestion of the lead surgeon, injected one drug or another into his bloodstream. Over to the side, also dressed in green surgical scrubs, sat a perfusionist, whose job it was to operate the heart-lung machine, which pumps and oxygenates a patient’s blood during open-heart surgery so as to allow the heart to lie still for a time. Like the anaesthesiologist, the perfusionist kept an eye on various vital signs and occasionally, at the surgeon’s request, made an adjustment or two.

The perfusionist looked young and wore a beard, so I assumed he was a resident and was impressed that someone in only his seventh or eighth year of post-college training would be given such an important job to do. Once after an operation I approached him outside the operating room and said, “Do they always let residents run the heart-lung machine?”

He looked surprised. “I’m not a resident,” he said.

“Oh, so you’ve finished your training?”

“No, I’m not a doctor. I’m a medical technician. I only went to school for two years.”

“How much money do you make?”

“Fifteen thousand dollars.”

“How much does the anaesthesiologist make?”

“About $150,000.”

Anyone who spends much time around a hospital will see non-doctors and doctors working together so closely that the difference between their functions is sometimes invisible to the naked eye. What you can’t see is that while the work is often similar, the money they make isn’t even in the same ballpark: everybody involved in the medical world who isn’t a doctor earns somewhere between a fifth and a tenth of what doctors earn. Having been to medical school seems to affect one’s income a lot more than it affects one’s ability to treat patients effectively.

It’s difficult for most people to accept the idea that many matters of health care are fairly simple and can be done well by people without much training, but it’s true. In less delicate kinds of surgery than open-heart, one sometimes sees a nurse-anaesthetist, rather than an anaesthesiologist, at the head of the table. If radiologists and pathologists didn’t wear long white coats it would be difficult to tell them apart from x-ray and lab technicians just by looking at what they do. Ordinary registered nurses can be seen giving shots, inserting intravenous lines, and setting splints—and in some remote rural clinics, they do all of that without a doctor even being present. The lordly surgeon has no non-doctor equivalent who is allowed to cut and sew—but in 1977, when it was discovered that William MacKay, a salesman of prosthetic devices, had performed an artificial hip replacement at Smithtown General Hospital on Long Island, the hospital’s defense was that MacKay did this kind of thing all the time. There are some medical arts, like sewing up wounds, that simply don’t correspond to educational attainment—whether or not you can sew up someone is a matter of how good your hands are, not how smart you are. Some people who aren’t doctors in fact do doctors’ work all the time now; those people (indeed, anyone) ought to be legally allowed to perform medical tasks according to their ability to master them, not their educational credentials.

Non-doctors can do many of the doctors’ tasks just as well, but the law reserves responsibility for the work for the doctors. That means it’s very difficult to explore which medical matters can be handled perfectly well by people without an M.D.—that is, where the line should fall between sewing up a cut and doing open-heart surgery. It means non-doctors are often prohibited from doing things that everybody knows they’re competent to do. It means that non-doctors can’t perform medical services independently, directly charging patients their own bargain rates instead of doctors’ sumptuous ones. And that’s one reason why your medical bills are so high.

What doctors know that non-doctors don’t seldom has a direct bearing on what they actually do all day in their practices. Most American doctors have taken two years of chemistry, one of physics, one of biology, and one of math in college; two years of further classroom training in the basic sciences in medical school; two more years of medical school spent rotating among the medical and surgical specialties to get a taste of each one; one horrible, grueling, sleepless year as an intern; two slightly more pleasant years as a resident; and possibly more advanced specialty training after that. The reason for all this training is, first, that doctors provide a more important service than anyone else—and in matters of life and death it’s best to err on the side of overpreparing people. Second, in the same way that knowing Latin and Greek was once thought to subtly enrich the performance of adult occupations completely unrelated to those languages, it’s thought today that the year the obstetrician spent taking organic chemistry, or his three months on a neurology rotation, will give him a breadth of knowledge and understanding that make him do a better job. Most doctors will readily admit that what they actually do all day doesn’t require seven years of training to learn. But they usually say there are four or five moments in the course of a month when having been through all that gives them an appreciation of some delicate shade of meaning that the technician at their side wouldn’t be able to pick up.

So in an ideal world, all that training is probably a good idea, just as it would probably be a good idea to require all government employees to read Remembrance of Things Past. Certainly there are some areas of medicine, like brain surgery, where the idea of someone going into practice after a couple of years of training is pretty terrifying. But for much of the rest, it’s a question, as the economists say, of marginal utility: are the subtle benefits of requiring so much training of those who practice medicine worth the costs?

One of the costs is medical education itself, which is not only enormously expensive but is also almost completely financed by federal, state, and local governments, and by private philanthropy. And that direct expense is only a fraction of the true cost, since having the degree gives a doctor the license to charge so much for his services. There’s also the problem of distribution: the longer a doctor trains, the more likely he is to become a specialist in a big city, and that has left us short on primary care in rural areas—exactly the kind of work that paramedics can do well, and inexpensively. Finally, there’s a problem of opportunity: medicine as it is now structured is open almost exclusively to those who have decided on it as a career by age 18 or 19, and who are mature enough at that point to be able to do well in advanced science courses. Every doctor can tell heartwarming stories of the 32-year-old housewife who went back to school, took her pre-med courses, and became a doctor, but by and large medicine is more closed to the late bloomer than any other field. By the same token, those who do become doctors do so only at a high price in lost breadth of non-medical experience and human contact.

The Flexner Influence

In a society where the reasons for just about everything seem impossibly complex, the reason doctors have to train so long is relatively simple. The structure of medical education and licensing in America today is to an amazing extent the product of the assiduous efforts of one man, Abraham Flexner, who died in 1959 at the age of 92. Understanding what Flexner did helps explain why medicine became so dependent on a long training period and therefore so expensive. And because Flexner was very much a product of his times, it also helps explain why the professions were created in the first place and how they became so powerful and so remunerative.

Abraham Flexner was born in Louisville in 1866, one of nine children of an immigrant German merchant. When he was 17 years old his older brother, a druggist, gave him a thousand dollars and sent him to the Johns Hopkins University in Baltimore to be educated. This was, he later wrote, “the decisive moment of my life.” He went through Hopkins in two years and developed a boundless admiration for the institution, for higher education in general, for the president of Hopkins, Daniel Coit Gilman, and, especially, for Gilman’s proudest creation, the Johns Hopkins Medical School. One of Flexner’s most vivid memories of growing up in Louisville was of the primitive state of medicine. “The word hygiene was not a part of our vocabulary,” he wrote in his memoirs. “Infectious and contagious diseases were rampant and were accepted as matters of course.” At Hopkins, Flexner saw Dr. William H. Welch building a school that would change all that.

After Hopkins Flexner returned to Louisville and started a school; left that and took an extended tour of Europe; and on his return to the United States settled in New York and went to work for the brand-new Carnegie Foundation. His first assignment there was to travel around the United States and Canada and write a report on the state of medical education.

Flexner set out on his journey in 1908, at the height of the Progressive era. The nation’s increasing urbanization and the wild excesses of late nineteenth-century free-market capitalism had created a society that Henry Adams likened to a dynamo spinning out of control. All right-thinking people perceived a desperate need for a greater ordering of society, and a new class of professionals and experts (of whom Flexner himself was certainly one) was rising to meet the need. In the 20 years preceding Flexner’s trip, the great foundations had begun to spring up and to finance and propagandize for a more organized, professionalized system of education. Lawyers founded the American Bar Association and began to administer state bar examinations. Compulsory school attendance laws were passed, and the states began to require specific training of their public school teachers. Academic tenure became popular. Social workers started the National Federation of Settlements. Journalism, once wildly partisan, became objective, and schools were founded to teach the new technical newsgathering skills. The American Medical Association had been founded back in 1847, but between 1900 and 1910 its membership grew from 8,400 to more than 70,000. In the federal government, civil servants came to outnumber political appointees.

In medicine, no less than in all the other fields, the reason for all this systematizing was that without it important technical work would be left in the hands of charlatans and hacks. That meant more general instability, ill-served students, patients, and clients, and less prestige for the professions. In the course of his travels, Flexner visited 155 medical schools in the business of turning out all manner of “doctors”—homeopaths, osteopaths, eclectics, botanics, Hahnemanians, empirics, steamers, hydropaths, Grahamists, and many others. The schools were generally run as a sideline by practicing doctors, who packed them with students because the tuition money went directly into their pockets. They often had no labs and no affiliated hospitals; many had no admission requirements, not even a high school diploma. The length and course of study varied wildly, and the woods were full of self-proclaimed doctors who had never been near even one of these woefully inadequate medical schools.

Flexner moved quickly through the country, for, as he later wrote, “in the course of a few hours a reliable estimate could be made respecting the possibilities of teaching modern medicine in almost any one of the 155 schools I visited.” He would examine the records for the students, and find them unprepared; look at the labs, and find them filthy; inspect the hospital beds, and find them insufficient; and inquire about the faculty, and find it second-rate. All in all, he found American medical education to be “sordid, hideous, unintelligent,” with the single noble exception of his beloved Johns Hopkins Medical School.

His prescription was that 115 of the 155 schools should simply shut down, and that the remaining 40 be assiduously reorganized on scientific principles, with strict admissions requirements, full-time faculties, up-to-date labs, and plenty of affiliated hospital beds. He realized full well that this would mean fewer doctors, and that was fine with him; under his scheme the care would be better, and the profession would be able to offer greater financial inducements in order to attract people of quality. As for the “poor boy” who might be shut out of the medical profession by the requirement of college attendance, “he need only take thought in good season, lay his plans, be prudent, and stick to his purpose,” and he could enter medicine the same as anyone else.

Flexner’s report, published in 1910 as Bulletin Number Four of the Carnegie Foundation, was an immediate success—partly because its findings were so shocking and its suggestions so sensible, and partly because it was music to the ears of the fast-growing AMA, which saw in Flexner’s program better health care, more respect from the public, and, of course, more money. Flexner himself spent much of the next 20 years putting his own recommendations into effect. He must have been a very persuasive man, and he spoke from total conviction in the virtues of professionalism, education and private philanthropy. “No foundation of any importance,” he wrote, “has ever interested itself in perpetuating the educational, social, or economic status quo.” By 1933 there were only 66 medical schools in the country, and those that survived were much richer, thanks in no small part to the fund-raising talents of Flexner. He had a genius for presenting his vision of the professional land of milk and honey to the older generation of robber barons, now finished making their fortunes and ready to do good. His autobiography is full of moments like these:

“What would you do,” asked Mr. Gates, “if you had a million dollars with which to make a start in the work of reorganizing medical education?”

“Well,” said Colonel Ullman, “Can we do anything with $300,000?”

“Very well,” said Mr. Rosenwald, “I will start the subscription with half a million.”

“I’ll give you $5 million, including the dental clinic valued at $1 million, if the Board will give $5 million.”

In his later years, Flexner founded the Institute for Advanced Study and then spent his retirement writing biographies of some of his idols in the medical, educational, and foundation worlds, and being profusely honored. At his ninetieth birthday dinner, a representative of the AMA credited him with having made the greatest single contribution in history to medical education.

In Love With Technology

Until the end of World War II, the world that Flexner made functioned pretty much as it was supposed to: schools produced doctors of high quality who went primarily into general practice and spread themselves fairly evenly around the country, charging generous but hardly ruinous fees.

Then, in the late forties, things began to change very fast. Rich, victorious, optimistic, in love with technology, and determined to provide a better life for its citizens, the government began to pour money into the structure of professions and institutions that had been born in the first decade of the century and that seemed to be working so perfectly. With the help of true believers like Mary Lasker and Senator Lister Hill, medicine received more of the government’s money and attention than any other area. Biomedical research was funded by the new National Institutes of Health. Hospitals were built under the Hill-Burton Act. The passing years brought OEO grants, health manpower grants, Medicare and Medicaid, nursing homes, clinics, funding for medical schools, grants and loans for medical students, and the enormous growth of private health insurance. In 1930 the nation spent $3.6 billion on health, 13 per cent of which came from the government; 40 years later our health bill was $69.2 billion, with 37 per cent coming from the government.

By the end of the war the physicians-per-thousand-population rate was less than half what it was when Flexner wrote his report, and even today it is substantially less; so naturally all this new money vastly increased the earnings of doctors. From 1960 to 1977 the consumer price index for doctors’ fees nearly tripled. Each citizen spent, on the average, $17.52 on doctors’ bills in 1950, $30.57 in 1960, and $145.84 in 1977. Between 1960 and 1976, the average income of doctors more than doubled. More and more doctors coming into practice were choosing specialties over general practice and the cities over small towns and rural areas, partly because specialties are more advanced and prestigious and partly because they’re better paid. In 1965, 22 per cent of American doctors were in general practice; in 1975, 12 per cent. Today there are about 350,000 doctors in the country, averaging about $100,000 a year before expenses and $60,000 after, which makes them by far the highest-paid occupational group in the country.

Those numbers are the reason why it’s now worth wondering whether Flexner’s wonderful system is still the answer. Today there are about a million licensed registered nurses in the country, making an average of about $12,000 a year. If their pay stays even in the same ballpark, obviously the more of what is now reserved for doctors that can be turned over to them, the less health care will cost. Organized medicine always resolutely opposes that kind of measure, and not only out of financial self-interest—most doctors also share Flexner’s horror of charlatans and butchers being allowed to practice, and just as you probably can’t imagine that somebody who hasn’t been to college could do your job, they can’t imagine that somebody who hasn’t been to medical school could do theirs. Of these fears, the fear of charlatans is the most legitimate, but with prices where they are it’s well worth figuring out a way to ensure competence via less training, or less of the kind of training that is an automatic passport to $60,000 a year. The functional difference between doctors and non-M.D. health personnel just isn’t anywhere as great as the economic difference.

For example: obstetricians are doctors who deliver babies, and nurse-midwives are nurses who deliver babies. The obstetricians train seven years, the nurse-midwives considerably less, and as a result the obstetricians think the nurse-midwives don’t know what they’re doing. But at one California hospital, when two nurse-midwives came in, the infant mortality rate dropped from 23.9 to 10.3 per thousand. When the doctors succeeded in kicking out the nurse-midwives, the rate went up to 32.1 per thousand. Similarly, lab technicians (with no M.D.) and pathologists (with an M.D.) often do the same work. So do x-ray technicians and radiologists. Psychiatrists and psychoanalysts go through four years of medical school and three of internship and residency, and then use almost none of what they learned there in their daily work. Optometrists (without an M.D.) have shown that they can do much of what ophthalmologists (with an M.D.) were for years successful in reserving for themselves. Nurse-anaesthetists can do much of the work of anaesthesiologists. Rural and inner-city clinics have had happy experiences using nurses to run the show completely, including prescribing some drugs.

More generally, the internship, which is so horribly unpleasant that it usually convinces doctors that no amount they’re later paid could be too much, need not be so pointedly torturous. There is some point to doctors spending a year seeing a lot of patients, and even some point to doctors having to stay up all night two or three times. But every night for a year? This is a case where the older doctors ought to be given more, not less, work: they should have to share the burden of late-night hospital duty so that everyone would do it occasionally rather than a few doing it constantly.

Of course, there will always be difficult and specialized cases that the non-doctors really aren’t equipped to handle—the high-risk mother’s delivery, the elderly patient’s anaesthesia, the delicate eye surgery, the psychiatric emergency. It’s essential that non-doctors in medicine know how to spot a difficult case and send it on to someone who knows how to treat it, just as general practitioners send on their difficult cases to specialists. But to do the bread and butter, we just don’t have to pay what we’re paying now. Most doctors admit that they learned how to do their jobs mostly by standing around and watching other people do it. People who haven’t gone to medical school can learn in the same way.

In the 70 years since the Flexner report was published, our society has done an admirable job on all fronts of getting rid of the hacks and the charlatans. We live in a charlatan-free society—or anyway there is no charlatan who is not at least properly credentialed. But in those years every profession has taken the occasion of purifying itself to also restrict its numbers, up its rates, grant tenure to its members, and close itself to what Flexner used to call the “poor boys”—all this on the assumption that there is a perfect fit between strictly enforced educational requirements and professional competence. The flaw in Flexner’s perfect vision is that the fit is loose; it’s possible to do the job without having had the schooling, and to have had the schooling and not be able to do the job. Seventy years ago, when the professions were new, that was a minor point. Now that they control most of our services and a good portion of our money, it isn’t minor any more.

Nicholas Lemann

Nicholas Lemann is a professor at Columbia Journalism School and a staff writer for The New Yorker. His most recent book is Transaction Man.