I think it's deeper than just that. Western women have been told for the last 80+ years that men are the enemy, the oppressors, the masters; they have been told men are unpredictable, violent, and hateful.
Simultaneously, they have been told they are superior, smarter, just plain better...