Discussion about this post

Dallas E Weaver

Most MDs, even today, don't conduct any research. Medical doctors are to science as mechanics are to engineering: they apply knowledge and experience to fix/heal things.

Midwives accumulated and passed on a lot of real knowledge and were probably involved in treating children for the many diseases that we didn't understand at the time. It really wasn't until the late 19th century that the microbiological basis of disease became established and Koch's postulates became widely known. It then took a few generations for that reality to sink in and for alternative hypotheses, from witchcraft and black magic to "bad air", to fade into history.

Medicine didn't become scientific until modern times.

Maria Comninou

I agree, but I cannot help but smile when you say: "Science is universal. It is not ‘male’, nor ‘white’, or ‘western’."

Maybe, but scientists are not (were not) "universal". Until recently, medicine (if you consider it a science) was "male". Females (female rats, actually) were considered too complicated to run experiments on. Many diseases affecting primarily females, or having different symptoms in females, were unknown, ignored, or misunderstood. Women had to march to protest the lack of research/funding for breast cancer.

Soon (already), women in the US will not be able to rely on medical science for abortion, even for health reasons. They are now seeking "alternative methods", relying on herbs and other concoctions. Unfortunately, the universality of science is entangled with its societal context.
