If we live in a post-sexism society, why does sexism keep making the news?
Take, for instance, the recent announcement that women can officially serve in combat roles in the military. I don’t pretend to understand all the complexities and implications of the military’s rules, but I don’t understand why this is being met with any negative reaction at all. If you think women are fragile and unfit for war, well, you can rest easy that nobody’s suddenly forcing them to fight on the front lines (and you should also get out a bit more and meet a few more women).
Though this is the most current “gender issue” on everyone’s lips, it isn’t specifically what’s on my mind. Apparently, the majority now believes not only that our society has come so far that equal-treatment advocacy is no longer needed, but that such advocacy is actually a negative thing.
In the past few weeks, two prominent female singers (Taylor Swift and Katy Perry, if anyone cares) have made waves by proudly declaring that they don’t consider themselves feminists. In Swift’s words, she “wasn’t raised to see things as girls versus boys.” While some of us might be inclined to sneer at her utterly missing the point, the truth is that this is what our society in general considers feminism to mean: dour, man-hating shrews who think women are superior to men in every way and look for misogyny in everything. This is no longer considered a stereotype, as far as I can see; many people seem to be under the impression that this is what feminism irrefutably, unabashedly is.
Lest you think I’m preaching here, I’ll be honest about something: I have to fight not to believe that myself. “Feminism” has such a negative connotation in today’s society that even the people it’s trying to defend are conditioned to turn their noses up at it. In fact, at least on the Internet, if anyone suggests that something might be a little demeaning to women or pandering to the male gaze, they are automatically hit with the term “femi-Nazi” and told no man will ever want them (as if this were every woman’s ultimate goal in life, to attract the mouth-breathing basement-dwellers of e-culture).
Anyway, for years, when someone described themselves as a “feminist,” I too would have a negative, or at least wary, internal reaction as a reflex. Then, one day, I met a young woman wearing a T-shirt emblazoned with a now-popular saying: “Feminism is the radical belief that women are people.” This gave me a different perspective on the matter, one that has continued to grow as I’ve broadened my horizons.
Basically, no matter what your perceptions of the word are, if you believe women are essentially equal to men (not superior to, not exactly the same as, but equal to), then you are a feminist — yes, even if you’re a man. And if you more or less believe that, but are still uncomfortable publicly identifying as one because of the negative connotations surrounding the term and your own internalized definition of it, then at least, perhaps, we could all stop perpetuating the misconception that being a feminist makes someone “butch” (for a woman), “girly” or “whipped” (for a man), anti-man or anti-Christian (not to single out one religious community; this just seems to be the most common complaint aimed at the movement). Being a feminist just makes someone pro-human.