
Western Medicine: Then and Now


When England began colonizing North America, there was little regulation of medical care. Shiploads of settlers arrived, but most were laborers with no training in biological science, medicine, or any medical specialty.

Scientific Discoveries

Colonial practitioners had no knowledge of germs, bacteria, or the need for proper sterilization, so disease ran rampant and many deaths were caused by infection. The practitioners who advanced medicine were, in effect, scientists: medical treatment grew out of their asking why things happened (symptoms, deaths, infections), and the real breakthroughs came when they began asking how those things happened. Unfortunately, that knowledge rarely reached the people actually providing day-to-day care: the mothers, wives, and midwives.

Practicing Medicine

As families began to populate the new American colonies, it was the women who took on the care of families and communities[1]. Their only resources were usually recipe books. Healthful diets could help prevent disease, tonics could treat symptoms, and this knowledge was passed down through families, woman to woman. These indispensable caregivers served during the Revolutionary War and were paid for their work as nurses. The battlefield was also the first place surgeons were used on a large scale: professional surgeons with degrees, amateurs posing as professionals, and army medics.

Research and Development

It wasn’t until 1845 that a standardized test was introduced to assess medical students[2]. Nearly two million recruits were tested during World War I, and clinical research grew in much the same way over the same period. The introduction of the placebo in 1800 paved the way for controlled clinical trials, and in 1863 the American physician Austin Flint completed the first trial comparing a placebo with an active treatment[3]. Clinical research continued to evolve until the 1940s, when scientists began randomly assigning participants to treatment groups.

Regulations and Patient Care

As medical research progressed and more patients were enrolled in clinical trials, questions were raised about human experimentation and its abuses. This prompted the development of the regulations and codes we live by today. The Food and Drug Administration (FDA) traces its origins to 1862 and has operated as a law enforcement agency since 1906. The Nuremberg Code was formulated in 1947, followed by the Declaration of Helsinki in 1964, which is still reevaluated and revised every few years.

Today, the FDA[4] regulates food products (except most meat and poultry, which fall under the USDA), human drugs, biologics, medical devices, electronic products that give off radiation, cosmetics, veterinary products, and tobacco products. The FDA is the agency that makes sure drugs are properly tested, used, and labeled. It ensures that generic drugs are equivalent to their brand-name counterparts, and that medical treatment products are as tightly regulated as the practitioners themselves.

The government is deeply involved in Western medicine because it must be in order to maintain a standard of care. This is the type of healthcare that treats symptoms with research-based treatments, and the results are often curative. Medicine of the past was trial and error; today cure rates are higher, life expectancy is longer, and care is far more precise and controlled.

[1] https://www.historyisfun.org/pdfbooks/colonial_medicine.pdf

[2] http://www.imerg.org/wp-content/uploads/2016/08/Violato-2016-History-of-medical-practice.pdf

[3] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3149409/

[4] https://www.fda.gov/AboutFDA/Transparency/Basics/ucm194879.htm