We see women being objectified in society every day, not only by men but by other women as well. As a woman myself, I know first-hand what it's like to be objectified and treated unfairly in our patriarchal society. But can we really place the blame solely on men? Are boys born with some unconditioned instinct that causes them to view women as sexual objects, or as lesser beings? Or is something, or someone, else to blame?
In our post-9/11 world, a hyper-awareness of skin tone and religion has taken hold in American culture. Islamophobia is one result of this hyper-awareness, and it has become entrenched in our everyday practices. Non-profits, blogs, and publications projecting anti-Muslim sentiment have sprung up, and it's clear that this opposition is rooted in a fear of the Other.