For all the glory and awe surrounding hospitals and physicians on popular television, the real history of American hospitals and their physicians began far from that reputation. Benjamin Rush, a founding father of the United States and a prominent physician, was so appalled by American hospitals that he called them “the sinks of human life.”
For most of the 19th century, hospitals were known as places where the poor and “insane” went to die.
At the same time, many physicians in the early 19th century were quacks or snake oil salesmen. Ironically, their model was none other than Benjamin Rush who, as a leading American physician, promoted “Heroic Medicine”: bloodletting and purging to shock the body back to health after an illness.
The American public saw poverty as the cause of many diseases, rather than seeing disease as the result of poor living conditions. Political leaders believed that low morals led to ill health; therefore, the poor were responsible for their own illnesses.
American hospitals in the early 19th century were mainly funded by wealthy citizens who donated money as part of their civic duties. These hospitals primarily treated the poor but offered very little actual medical therapy. Physicians were not paid by these hospitals; instead, they volunteered to treat the poor as a way to gain practice and prestige. Hospitals also housed the chronically ill. Surgery in these hospitals was unsafe and rampant with infections, so dying was common. The affluent stayed away from hospitals, since physicians treated them in their own homes.
After American independence, standards of medical education in this country declined dramatically as fewer physicians went to Europe for training. Poorly trained doctors opened their own medical schools, run more as diploma mills, so they could make money. To attract students, they eliminated many traditional academic requirements, such as laboratory experience, and some even dropped literacy as a condition of admission. They saw no need to teach anatomy. To compete, even the colleges with medical schools reduced their requirements.
These diploma mills were encouraged by a public that hated government regulation or any interference with an individual’s right to do as he wished. There were no licensing requirements for physicians and no professional oversight.
During this time, European physicians were starting to apply scientific discoveries and methods to the medical profession. However, many American physicians laughed at the idea that science could have any practical value in medicine. Many American physicians argued they were superior and did not have to follow the practices of their “weaker European forebears.”
By the late 19th century, American physicians finally came around to the idea that science had practical value in medicine. Between 1890 and 1920, hospitals in the U.S. gained respectability, and more were built as advances in medical science became widespread. Surgery could now be performed painlessly with the aid of anesthesia.
The germ theory of disease, advanced by Robert Koch in Germany and Louis Pasteur in France, identified bacteria as the cause of infection. In 1867, Joseph Lister in England published his work on antiseptic techniques using disinfectants, and sterile surgery was born. As a result, hospital infections dropped dramatically, and hospitals became safer places for patients.
In 1895, the German physicist Wilhelm Roentgen took the first medical X-ray, an image of his wife’s hand. Soon after, the X-ray was being used as a diagnostic tool in hospitals.
Social changes in America also led to increasing use of hospitals. Because of urbanization, people no longer had the support of family to provide nursing and care, so hospitals became their only option when severely ill.
By the turn of the 20th century, hospitals were providing medical care for entire communities. Patients of every social class who needed acute care and could pay were treated in the hospital.
By 1910, there were over 4,000 acute care hospitals in America. In the East, wealthy citizens funded hospitals such as Massachusetts General Hospital and Johns Hopkins. In the Midwest, religious groups such as Catholics, Jesuits, Methodists, and Baptists opened hospitals to treat immigrants. In the South, which lacked both wealthy donors and immigrants, many hospitals were physician-owned and operated for profit.
As hospitals in the U.S. multiplied, there was no oversight or centralized coordination. Each hospital hired its own staff, regulated itself (including its expenditures), set its own fees, and determined how patients would pay and how bills would be collected.
In the early 20th century, as hospitals became more professionalized, tensions grew between hospital administrators and physicians (who were still voluntary staff). Administrators needed physicians to refer patients to their hospital and to perform surgeries and diagnostic tests. Physicians needed these hospitals to admit their patients for acute illnesses, diagnostic tests, and surgeries. Physicians resented that the hospitals did not pay them and that they lacked autonomy when treating their patients. And as hospitals grew in prestige, physicians became unhappy that their wealthier paying patients were being “stolen” by hospitals.
In 1946, the federal government aided hospital proliferation for the first time by passing the Hill-Burton Act. After World War II, President Truman urged Congress to pass a bill providing funds for constructing hospitals and clinics to serve a growing population. As a result, many poorer states were able to build new hospitals for their communities.
After World War II, advances in medical technology made hospitals even more vital to health care. In the 1950s, intensive care units opened, with patients able to be sustained on ventilators. In the late 1950s, chemotherapies were being used against cancer, and by the 1970s they were curing several cancers, such as leukemia and Hodgkin’s lymphoma.
In the 1960s, hospitals were performing heart catheterizations to visualize blocked arteries, and by the 1970s they were using the procedure to open clogged arteries.
Another major hospital transformation came in 1965 with the passage of the Social Security Amendments of 1965, which created Medicare and Medicaid. Hospitals now saw a substantial gain in capital, which they could use to upgrade their facilities and technology, since they were no longer required to provide free or subsidized care for the poor and the elderly poor.
At that time, Medicare payments were generous, and virtually all hospitals profited from them. Medicare paid hospitals their costs plus a percentage to compensate for capital expenditures related to expansion. This incentivized hospitals both to raise costs, which raised profits, and to expand, since expansion would be paid for. This inflationary payment system lasted until diagnosis-related groups (DRGs) and prospective payment were introduced in the 1980s.
Through the 1970s and 1980s, advanced medical technologies became the norm, and many hospitals could treat many of mankind’s diseases.
But as hospitals grew substantially in size and complexity to treat many common conditions, things started to sour in the 1990s in the United States. The continued rise in healthcare costs, greater awareness of performance issues, especially with regard to quality and safety, and patient dissatisfaction with hospital care became significant concerns for regulators and the public.
In 1999, the Institute of Medicine released its “To Err Is Human” report, which estimated that 44,000 to 98,000 people die each year as a result of preventable medical errors.
Hospital-acquired infections also became a significant problem. According to a 2011 CDC study, over 720,000 patients were diagnosed with hospital-acquired infections that year, and 75,000 of them died as a result.
A 2012 survey by the Robert Wood Johnson Foundation and the Harvard School of Public Health found that one in six hospitalized patients was dissatisfied with their hospital care, with poor communication among nurses and doctors a major concern. Part of this growing dissatisfaction may be related to a significant healthcare transformation of the 1990s and 2000s, when many patients’ personal physicians stopped rounding in hospitals, either because their care was handed off to a hospitalist or because their physician joined a large practice that rotated rounding duties. As a result, physicians and patients were often unfamiliar with each other and lacked the bond of the traditional physician-patient relationship. Patients felt depersonalized as many different teams of doctors, nurses, and support staff came and went during their stay.
Nonetheless, today’s American hospitals are the dominant players in the healthcare system, and they have a significant impact on the American economy. In 2012, $970 billion of healthcare expenditures went to the nation’s 4,895 acute care hospitals, roughly $200 to $300 billion more than was spent on Social Security or national defense.
For the first time in history, the healthcare industry is now the largest employer in the United States, surpassing the retail and manufacturing industries that dominated employment for most of the 20th century. Metropolitan hospital systems are now leading employers in many cities. In Houston, for example, Memorial Hermann Health System and MD Anderson Cancer Center together employ as many people as United Airlines, Chevron, Shell, and Exxon combined.
As you can see, in just 100 years, hospitals evolved from what Benjamin Rush called “the sinks of human life” into places of prestige and pride, where scientific and technological advances have made miracles happen for many common ailments. But in the last 25 years, the rising costs of healthcare and the sprawling expansion of hospitals in size and complexity have had detrimental effects. This is why many healthcare experts see this as a time for a new transformation, with innovations and reforms that keep the miracles of modern medicine a sustainable venture in the 21st century.
Sources and Additional Reading:
Reinventing American Health Care (Ezekiel Emanuel)
Lotions, Potions, Pills, and Magic: Health Care in Early America (Elaine G. Breslaw)