#GAReads | Women Doctors Earn Less — and Not Because of the Jobs They Choose
"Women Doctors Earn Less — and Not Because of the Jobs They Choose":
America's women doctors aren't earning less than men because of the fields they've chosen: even women in some of the highest-paying medical specialties make much less than their male colleagues.