Patriarchy didn’t teach men that they’re disgusting, wtf
That’s like the opposite of the patriarchy
This is the shit feminists do that pisses me off. Literally anything bad? Patriarchy. Literally anything good? Feminism. Even if it makes no goddamn sense.
I consider the patriarchy to be the social power structure that enforces gender norms. What exactly do you consider the patriarchy to be?
Is there literally anything I could do to convince you to re-evaluate your position? Because I bet not, and I really don’t have the time or energy to beat my head against a wall right now.
Bullshit. That’s one tiny aspect of it - men being unable to support other men. The vast majority of the problem of men seeing themselves as disgusting is feminism. “All men are potential rapists” is a disgusting and extremely prevalent ideology. All men are trash, all men are pigs - you see women say these things all the time and it’s just accepted.
I’m trying, but without external validation, all that remains is the status quo.
Go to a gay bar. Not even joking. Gay men are the single best thing for any man’s confidence and sense of male self-worth.
You don’t have to be gay to uplift and compliment the other men you interact with. That’s just a side effect of toxic masculinity. Men can compliment each other, uplift each other, and say “Hey man, you’re looking good today. You make that shirt look good!” without being gay. Not that I disagree with anything you said, just your comment got me thinking about the weird effects of the patriarchy on men supporting other men.