The nation’s first women’s colleges were established in an era when women who earned advanced degrees were considered unconventional, and those who used their education to cultivate their independence were viewed as downright controversial. In the United States, women did not have access to higher education until the mid-1800s, and even then their coursework was often limited to classes in “domestic science” or “home economics.”