I have long been a fan of David Brooks, both in the pages of The New York Times and in his commentary every Friday on the PBS NewsHour. He is thoughtful and tries to consider alternative viewpoints. For almost 20 years, he partnered with the clearly liberal columnist Mark Shields, often clashing with him on key philosophic issues. Now he is alongside Jonathan Capehart of The Washington Post. Capehart is less acerbic than Brooks but holds his own in some of their heady disagreements, although Brooks, who began as an acolyte of the legendary conservative William F. Buckley, has become a less strident version of his former self. Brooks is still usually described as a conservative, but I don’t think that is accurate anymore. He was an enthusiastic supporter of Barack Obama and then Hillary Clinton in their campaigns (https://en.wikipedia.org/wiki/David_Brooks_(commentator)). Brooks has long been vehemently opposed to Trump (he once labeled him a “sociopath”). In many matters, he is decidedly moderate and even liberal. Brooks writes about politics, culture and society. Although he has the mild manner of a professor, when he is passionate about a topic you can see that he is a repository of almost-ready-to-erupt volcanoes.

Almost ten years ago, on August 16, 2014, I ‘published’ the first of my blog pages. It was entitled “Heartless hospitals – part 1” and ended with “to be continued” (https://stephenageller.com/2014/08/16/heartless-hospitals-part-1/). Although there have been more than 50 posts since that time, I have never written a “part 2,” despite often thinking about it. Part 2 is now prompted by David Brooks’ article, “Death by a Thousand Paper Cuts,” in the January 19, 2024 issue of The New York Times (https://www.nytimes.com/2024/01/18/opinion/american-life-bureaucracy.html).

The Times article deals with the increasing and, in Brooks’s view, increasingly negative role played by administrators in medicine, education and society. This thoughtful and perceptive column, with which I mostly agree (with some caveats), prompts me first to offer my version of “the history of medicine,” one devoted to the rise of hospital administration to a point of almost complete power. In my view, hospital-based health care in today’s America has lost its focus on patients and currently provides the greatest benefit to an increasingly small part of the population.

My long-ago blog reviewed the history of hospitals, and you might enjoy reading it to fill in the gaps of this very brief historical summary. Suffice it to say, my premise is that hospitals’ evolution into a business benefiting fewer and fewer people has occurred over the last two millennia. In the beginning, the purpose was mostly to contain the ‘incurables,’ those sorry souls afflicted by some of the awful infectious diseases that once plagued mankind, such as smallpox, tuberculosis and cholera, to name just a few. By sequestering the sick within hospital structures, the rest of humanity was protected. There was a period, the 19th and the first half of the 20th centuries, during which physicians were the principal beneficiaries. Not in the financial sense, which would come in the last half of the 20th century, but rather in their freedom to learn and experiment and advance in so many exciting endeavors that led to the extraordinary developments of the last 50 years.

After World War II, at least in the United States, Western Europe and a few countries of the Far East, doctors were so enthralled by their clinical and research activities that they increasingly resented the growing management responsibilities of the hospital. There was a time, particularly before World War II, when the administrative head of most American hospitals, big and small, was a physician. The opposite is true now; most hospitals are managed by lay administrators. Indeed, it often seems, as David Brooks suggests, that hospitals now exist for the benefit of administrators. Some hospital administrators earn more than five or six million dollars a year. This financial benefit is not my concern, however. More importantly, administrators can have total or near-total control of many thousands of people, including physicians, without ever having been exposed to the moral and professional ethics that have been integral to the traditions of medicine since Hippocrates (c. 460–c. 370 B.C.E.).

In the first half of the 20th century most American hospitals were led by doctors, often for limited terms. They generally maintained their medical practices. Other administrators answered to them. Gradually, as healthcare costs rose and financial management became increasingly complicated, the lay administrator again became dominant.

The profession of medical administrator has grown almost exponentially since the 1950s. Brooks, in his column, notes that administrators “redistribute power from workers to rule makers, and in so doing sap initiative, discretion, creativity and drive.” This is true, but not completely so, and the question that Brooks does not ask is: does it matter?

Over a third of all health care costs go to administration. As with all data, further details are necessary. This category covers most hospital employees other than doctors and nurses: not only those with M.B.A. degrees, but also clerks, engineers, maintenance people, food-service workers, porters, et cetera. In 2021, according to the U.S. Bureau of Labor Statistics (BLS), more than 480,000 people were engaged in healthcare administration. That is staggering, particularly when we consider that there are only about 930,000 physicians. The numbers for physicians are likely accurate, but the numbers for administrators may be faulty; some studies indicate there may be as many as 10 people engaged in administrative matters for every physician practicing medicine. The data for expenditures are more precise, with multiple studies showing that in the 50 years from 1970 to 2019, healthcare costs in the United States, on a per-person basis, rose from $353 to $11,582.

Supporters of the dramatic increase in the number of healthcare administrators argue that it is driven by changes in the healthcare industry, the growing number of agencies and insurers providing financial mechanisms to support health care expenditures, advances in technology, both medical and informational, and new regulatory requirements coupled with unprecedented scrutiny by regulatory agencies.

Florence Nightingale (1820-1910) is often given credit for being the first hospital administrator. She certainly was one of the earliest, but there were others before her, usually doctors but often someone from the community who sat on a board advising the hospital. As one example, Benjamin Harrison (1771-1856) was the Treasurer of Guy’s Hospital in London and unequivocally was the all-powerful decision-maker responsible for all things, financial and non-financial, that happened at Guy’s, including selections to the medical staff as well as determining who was to be the principal physician. When the great Richard Bright (1789-1858) retired as Physician, the brilliant but slightly eccentric Thomas Hodgkin (1798-1866) wanted to succeed him. Harrison, who held a senior position in the Hudson’s Bay Company, resented Hodgkin’s humanitarian efforts, particularly his opposition to American slavery and to the mistreatment of Native Americans. Harrison blocked Hodgkin’s ascension and chose, instead, Benjamin Babington, a capable but far from brilliant physician. Hodgkin’s qualities as a physician, scientist and humanitarian, as well as his highly original ideas about health care in general and caring for the poor in particular, would have made him a marvelous leader.

Oscar Wilde (1854-1900) wrote: “The bureaucracy is expanding to meet the needs of the expanding bureaucracy.”

I once had a copy of, and read, the 1993 health care plan offered by Hillary Clinton and Ira Magaziner. One of its key components was a drastic reduction in the number of insurers; had it passed, there would be fewer administrators in health care today. Would medical care be better? I think so, mostly because of the plan’s strong commitment to education and research, both of which suffered during the administration of the second Bush and, of course, during the awful Trump years. In the major teaching centers these things are, of course, supported, but they are far from priorities in most settings. One can’t be sure, however, since so many important advancements have occurred in the era of expanding administration, particularly in technology (e.g., CAT scans, MRI), therapeutics (e.g., vaccines and specific medications for AIDS, malignancies, and more) and molecular biology (e.g., the Human Genome Project, molecular diagnostics, molecular treatments for sickle cell disease and other disorders, and more).

Many of us have experienced the frustration of disputing an insurer’s rejection of a health care expenditure deemed necessary by a physician. Challenges from physicians, and even from patients, can reverse these rejections. What is the ‘cost’ to the health system of this seemingly trivial exchange? What is the cost to the doctor and his or her staff, who must devote time and effort to dealing with it? What is the cost to the patient in terms of their well-being?

Administrators are often pejoratively labeled “bean counters,” concerned with the bottom line of the budget sheet and barely cognizant of the needs of the individual patient. It is no secret that many large healthcare systems dictate the amount of time doctors should spend with patients and make purchasing decisions based on cost-benefit determinations rather than seeking the highest-quality items.

With limited patient contact time increasingly the norm, traditional methods of diagnosis, which require physical contact, are no longer practiced. COVID obviously contributed to this, but it was already a fact before the pandemic, in part because of technologic advances. Why spend time struggling to hear a heart murmur when you can easily (and relatively inexpensively) look at the heart structures directly with ultrasound? Increasingly, stethoscopes are becoming historical relics. Why inflate a rubber cuff to determine blood pressure when you can automate the process and get the heart rate as well? Even in surgery, robots, directed by technicians, are supplanting doctors.

During my medical school years (1960-64) we learned about advances made by physicians studying, and sometimes experimenting on, themselves. These were presented as examples of courage for the benefit of mankind. Walter Reed, Jesse Lazear and their co-workers in Cuba allowed mosquitoes to bite them to prove the way in which yellow fever is transmitted. As another example, cardiac catheterization was developed by Werner Forssmann in 1929. He inserted a catheter into a vein in his own arm and then visualized it in his right atrium with x-ray. Lazear unfortunately died from the experiment, and Forssmann was twice fired for his research before winning the Nobel Prize. Barry Marshall swallowed a broth containing Helicobacter pylori organisms and had himself biopsied after he developed symptoms of gastritis, proving the association. He also won a Nobel Prize. Some renowned researchers felt that the only valid informed consent was when the experimenter, who understood the complexities of medical research better than they could possibly be explained, experimented on himself. The Wikipedia page “Self-experimentation in medicine” lists many dozens of examples. But these things aren’t taught anymore, because of several logical, but uninspiring, objections.

The ultimate issue is to determine whether health care is better or worse than it ever was. Most would point to the overwhelmingly beneficial advances of the past half-century (newer and better medications, less-invasive surgical techniques, highly reliable imaging methods, newer laboratory testing methods including molecular testing, multi-organ transplantation involving kidney, heart, liver, lung, pancreas, intestine, bone marrow and more). The major health failures (famines, epidemics, infant mortality, maternal mortality, and more) are politically and economically based, and not driven by the medical establishment (although physicians are complicit for not, in general, doing enough to confront the situation).

When we moved from Los Angeles to New York eight months ago I needed physicians for myself and my family. Specialists were easy to find; primary care physicians (PCPs), not so easy. I found that many capable physicians are either not taking new patients or aren’t covered by all insurance plans. After weeks of research I finally found an excellent, caring and available physician. Recently she informed us that she was changing from the medical-school-based group practice of which she was a member to a “concierge” practice, where we could contract with her for a yearly fee (a remuneration scheme first suggested by Thomas Hodgkin almost two hundred years ago). The webpage of her new group says: “… we have redesigned the user experience to focus on the patient-provider relationship and provide convenient access to specialists.” And also: “Our physicians are able to dedicate more time to understanding your medical history and partner on a personalized health plan tailored to your goals. Members also have a direct phone line to a designated Health Navigator, a trusted resource to coordinate any care-related needs from appointments, to VIP specialist referrals, insurance support, billing, and more.” Sounds great! I would have stayed with her as our physician, but she is no longer geographically desirable. Do I have any bad feelings about a physician improving her way of life? I really don’t, but I can’t completely escape the environment in which I grew up.

My residency chairman, the legendary Hans Popper, M.D., Ph.D., started his work day between 8 and 9 AM (we, the residents, started between 7:00 and 7:30 AM) and finished about 9 PM (I usually left about 7:00 to 7:30, except on the evenings I got to use the electron microscope for research, usually once a week, when I left at about 9:30 PM). He worked on Saturdays from 9 AM to 6 PM (many, but not all, of us were in from 7:30 AM until about 2 PM). On Sunday, he came in about 9 AM and left at noon. Even after residency I always worked on Saturdays and sometimes even on Sundays. My internship was far more time-demanding. For 9 of the 12 months I worked 36 hours on and 12 hours off, and was on duty every other weekend. One month, when I was assigned to the emergency room, it was 24 hours on and 24 hours off. The other two months, on Pediatrics, meant working every day plus every third night.

My internship year was one of the best years of my life. I revered the idea of the hospital from the time I first worked in one as a 15-year-old volunteer. My internship year made me love and revere it even more. In my later years, one of my colleagues, then in his late 70s, was asked by a resident if he went to a synagogue or a church. He waved his hand, vaguely pointing to the walls of the hospital and said, “This is my temple.” So it was for me in every hospital in which I have been. I still recall walking back toward my room in the hours long after midnight during that internship after having, in some way, taken care of a patient. I was tired and needed sleep. The corridors were empty and the lights were dimmed. There were no sounds other than my footsteps intermittently punctuated by the sound of a door somewhere out of sight or a patient calling out for help or the elevator sliding open to take me for a few hours, or maybe just minutes, to get some rest. I was so happy!

In 1984, Libby Zion, an 18-year-old woman, died at The New York Hospital, one of the finest health care institutions in the world. Her death is now believed to have been due to an interaction between a medication she was taking and one with which she was treated. At the time, however, another highly publicized explanation was that the residents taking care of her were overworked and made errors because of fatigue. Subsequently, New York passed the Libby Zion Law, which limits residents’ work weeks to 80 hours (a limit later adopted throughout America). Although I continue to be grateful for the long hours of my internship, when I know I learned an almost infinite amount about taking care of patients, I know that some of my fellow interns were not able to handle the long, sometimes grueling, hours as well as I did.

The physician who had been taking care of me, before she moved, recommended a young internist in her group as a new primary care physician. I reviewed his background and saw that he finished his residency two years ago. I don’t know. Is it egomania for me to think, after those many hours of my internship and then more than 50 years as a pathologist, that I might know more than he does? I want my doctor to be smarter than I am. I’m no different from any other patient.
Is it better for young physicians today not to learn the work habits I learned? Many of them accomplish what they need to in whatever time they have. I’m just not sure I fully understand it. I suppose young physicians today can also have that experience, the feeling I had after a long, exhausting day of patient care, but I’m not sure.

What does this digression have to do with the rest of this essay? It illustrates another way in which hospitals and health care have changed. As a pathology resident I did not spend hours in the hospital as long as those of my internship, so the Libby Zion Law would not have affected me.

Physicians and medical students increasingly seek M.B.A. degrees. How will this change medicine? Will hospitals again be managed by physicians? Which component of their education, the medical or the administrative, will dominate? And is Brooks slightly short-sighted? Free-standing hospitals are rapidly being swallowed by hospital “systems” in which 20 or more institutions may be under the control of a single management entity. Venerable institutions lose their identity. Great hospitals serve as keystones in a complex bridge of multiple hospitals traversing cities, states and the nation. When you get an M.B.A., do you experience sitting with a critically ill patient? Do you comfort a family after their loved one dies? Of course not. So what?

Hospital administration is increasingly specialized with financial experts, information experts, logistics experts, infrastructure experts, et cetera, et cetera, et cetera. But it is probably the leaders of the insurance industry who will rise to the top. I imagine very few physicians will fill the role.

Sooner than we can imagine, diagnosis will be the domain of data managers, especially imagers and laboratorians. Newer techniques will allow more meaningful visualization of body structures and their abnormalities. Newer blood, urine, sweat and spinal fluid tests, especially those based on molecular methods, will allow infectious conditions, inherited disorders, toxic and environmental illnesses and cancer to be diagnosed in a matter of minutes. But the primarily administrative people will likely be the power players.

Not quite that soon, but eventually, the physician, whose education will be unlike anything we currently know but will be based on artificial intelligence (A.I.), incredible imaging and the most sophisticated ability to evaluate and modulate the human genome, normal and abnormal, will resemble ‘Bones,’ the chief medical officer of the U.S.S. Enterprise on Star Trek. He or she will use a simple hand-held device that will not only establish the unequivocally correct diagnosis but also render appropriate therapy. But he or she will just be another crew member on Captain Kirk’s ship.

David Brooks may have missed a key point: changes in healthcare aren’t necessarily bad; they’re often just different.