Just about everyone looks better when they smile. It’s true regardless of gender. I don’t see where sexism enters the equation.
I feel pretty oblivious. What am I missing?
It’s a weird thing to say in general. Would you say that to a man?
I’ve never said it to anyone (well, except when taking a group photo). You’re right that it’s weird; I’m just looking for perspectives on why.
It’s weird because it sets you up as the objective authority on what makes them attractive.
Said to another man, it just comes across as neutrally weird. Said to a woman, it carries a load of historical and cultural baggage tied to how a large number of men treat women, and it automatically associates you with that group of misogynists.
That is hysterically arrogant and appallingly nauseating at the same time.
Exactly.