Matthew Coats
Doctors of the modern world enjoy what doctors of the past would have called luxuries: offices in which to see patients, ready access to the tools they need to do their job, and the knowledge of modern medicine, all of which make their practice far easier to carry out. Doctors of prior centuries could only have dreamed of such “luxuries.”
Doctors of the 19th century mostly carried out their practice in private homes, or occasionally in an office. They had little status and received very little training. Hospitals existed during the time of the Industrial Revolution, but they were considered dirty, and indeed they were. Many who went to hospitals to be healed left with a new illness or died there; patients contracted diseases because doctors did not yet understand how diseases spread. Those who could afford it had the doctor come to the privacy of their homes. Doctors worked everywhere, not just within the confines of an office or department. A doctor was expected to treat any illness, to treat animals as well, and to travel over any terrain. This was highly inefficient: because all doctors were expected to treat the same range of problems, none of them specialized in anything. The idea of specialized doctors emerged only after about a century of this inadequate system. Although the system was not very effective, doctors continued to learn; specialized tools and procedures were developed, and doctors eventually came to specialize in broad areas of medicine.
During the 19th century, doctors traveled by foot or on horseback. Because of this, they were limited to the tools and drugs they could carry in a hand-held bag or saddlebag. This combination of limited supplies and the expectation that a doctor treat a wide variety of ailments meant the quality of care was poor. Examinations and treatments were done in the patient’s home. Examinations were general observations of the patient’s body, with a stethoscope used to monitor the condition of the heart, lungs, and digestive tract. The most common treatment was bleeding the patient. There were many ways to bleed a patient, and it was done repeatedly over a short period of time. Other principal treatments included specific diet instructions, rest, baths, massage, blistering specific areas of the body, sweating, enemas, purging through diuretics and emetics such as ipecac, and prescriptions such as anti-inflammatory creams or herbal pills.
Surgery was another procedure carried out in the patient’s home. It was limited to the surface of the body and to the patient’s tolerance for pain. Anesthesia was not commonly used, and what was used was usually chloroform, which carried a risk of asphyxiation. The risk of infection was also high because of the setting: in the United States, antiseptic practice was not common until the turn of the century, so any surgery carried a serious danger of infection.
Nineteenth-century doctors, like doctors today, charged patients per procedure. They might charge more for an evening emergency or for a childbirth. A big difference between 19th-century doctors and modern doctors is that 19th-century doctors were usually paid not in cash but “in kind,” that is, in whatever produce or goods the patient had available.
The doctors of the 19th century were not as well off as modern doctors. Doctors who served the poor barely made a living, and those who served the rich made an average one. There were three groups of doctors: the “orthodox,” the homeopaths, and the eclectics. Orthodox doctors based their practice on natural philosophy and experimental science. Most of their remedies were harsh, such as bleeding or high doses of mercury, which today are recognized as dangerous or even lethal. Homeopaths believed in administering drugs in small doses; they held that diluting a drug made it more effective, and they strongly pursued medical research and scientific testing. The last group, the eclectics, combined herbal medicine with the traditional practices of orthodox doctors. Although they used some of the same treatments as orthodox doctors, they strongly opposed the orthodox methods.
As the 19th century progressed, medicine began to organize into a unified profession. The American Medical Association was founded for this purpose in 1847. Its founders made it their main priority to establish minimum requirements for medical training: four years of high school, four years of medical school, and the passing of a licensing examination. Along with this, physicians began to take their work more seriously and took on more responsibility. The American Medical Association also adopted a code of ethics, the underlying principle of which is stated in chapter 3, section 4:
It is the duty of physicians to recognize, and by legitimate patronage to promote, the profession of pharmacy, on the skill and proficiency of which depends the reliability of remedies; but any pharmacist who, although educated in his own profession, is not a qualified physician and who assumes to prescribe for the sick, ought not to receive such countenance and support. Any druggist or pharmacist who dispenses deteriorated or sophisticated drugs, or who substitutes one remedy for another designated in a prescription, ought thereby to forfeit the recognition and influence of physicians.
With the founding of the American Medical Association, medicine began to take the shape we know today. Medical knowledge progressed more quickly, doctors were valued more highly, and the quality of research improved. Although 19th-century doctors did not have the luxuries of modern medicine, they paved the road that made modern medicine possible.
Bibliography
“19th Century Doctors in the U.S.” Melnick Medical Museum. 2009. Web. 02 Mar. 2016.
“Browse History.” Judy Duchan’s History of Speech. Web. 02 Mar. 2016.
