wannabeXL
Well-known member
- Joined: May 31, 2009
- Messages: 60
- Reaction score: 0
This is a rant. This is only a rant.
Why does physical attractiveness still matter after all this time? They told us in elementary school, middle school, and sometimes high school that "beauty is skin deep" and "it's the inside that counts." I've always believed that, and it's one of the reasons I never bother to wear make-up or dress up in pretty clothes (the other reasons include practicality, laziness, and lack of money). Yet I find that once you're past high school, most people treat these sayings as outdated and begin caring about looks more than they ever did before. The boys routinely go to the gym hoping to be the next Arnold Schwarzenegger, while the girls get breast implants and buy themselves more beauty products than they need. Obese people are told to lose weight "so they can live longer," but I think the real reason is more like "so they can look more attractive." I mean, I'm all for promoting health, but when the focus is on losing weight (which only changes your outward appearance) rather than on actual lifestyle changes (which have far more effect on your actual health), shows like The Biggest Loser just drive me crazy!
As a female, I find this obsession with physical attractiveness rather bothersome and mildly offensive. I don't understand why people tell me I should get contacts and wear my hair differently so I can look "pretty" when I'm still the same freaking person underneath it all. I don't understand why, when I'm dressed up for a party, I should take "you look really beautiful tonight" as a compliment. Doesn't that statement suggest I don't look beautiful the rest of the week? Do I want to look beautiful for someone who thinks make-up and a fitting dress have the magical power to turn an ugly duckling into a beautiful swan? And what's up with that story anyway? Why would an ugly duckling want to become a beautiful swan just to impress other people?
I think I'm slowly turning into a bitter feminist. Or maybe I'm just bitter.