We see women being objectified in society every day, not only by men but by other women as well. As a woman myself, I know first-hand what it's like to be objectified and treated unfairly in our patriarchal society. But can we really place the blame solely on men? Are boys born with some unconditioned instinct that causes them to view women as sexual objects, or as lesser beings? Or is something, or someone, else to blame?