Thursday, October 11, 2018

Gender bias in health care may be harming women's health

From the way medicines are researched and tested to the way doctors diagnose and treat disease, a growing body of research points to a bias against women in the health care system.