There's a lot of talk these days about how women can't "get ahead," but what if, amid all the hand-wringing, women are quietly doing just that?
Journalist Hanna Rosin argues in her book The End of Men (and in her article of the same name) that we are seeing the end of male dominance in the workplace, and in society more generally. She shares some convincing statistics on women's advancement in the workplace at all economic levels:
- Women are more likely than men to earn college and advanced degrees.
- Women dominate 13 of the 15 job categories projected to grow the most over the next decade.
- Women hold 51.4 percent of managerial and professional jobs. They make up 54 percent of all accountants and hold about half of all banking and insurance jobs. About a third of America's physicians are women, as are 45 percent of associates in law firms—and both those percentages are rising fast.
- Female CEOs, while still relatively rare, out-earned their male counterparts by an average of 43 percent in 2009, and received bigger raises.
But the story doesn't stop with economic advancement. As women have made gains, Rosin argues, men are falling by the wayside, or even getting kicked to the curb. Rosin profiles a frustrated "support group" of working-class men struggling to understand their new role in a world where women have taken over as the primary wage earners. She also looks at college admissions, where even elite universities struggle to maintain an "appropriate gender balance" — needing to boost the ranks not of qualified women, but of qualified men.1