What Women Really Want Is The Patriarchy

Women have blindly followed the feminist mantra and now find themselves lonely and confused. It’s time to welcome back the patriarchy.


(The Federalist) In today’s enlightened age, women think they know what kind of man they want, but in reality, most don’t. In fact, many women, unwittingly confused by the myriad feminist mantras bombarding them daily, seek a type of committed, romantic relationship with a man that will ultimately leave both partners inherently dissatisfied.

This is as much due to feminism’s flawed ideology as to the men who have, over the years, been subconsciously programmed to behave according to its dictates. It’s never too late to figure out that men need to own their patriarchal prowess. If they did so, they’d soon discover this is what women really want.

What Women Are Told

Women are now told from basically grade school through early adulthood that they can do anything, be anything, have anything—with a woman or a man, with anyone or no one—as long as they work hard, lean on girl power, and berate or at least eschew any notions of patriarchal reverence. See the famous line popularized by Gloria Steinem: “A woman needs a man like a fish needs a bicycle.”

