Patriarchy, as in a society in which men hold the majority of positions of power. I know most people flinch or roll their eyes when they read this word, so I'll be frank: I'm not using it as an attack on you; I am merely asking a question. I see many people either misuse this word or bring misandrist connotations into the discussion. Do men still hold the majority of positions of power in Western society, or is the patriarchy a feminist lie?
