Where are all the feminist women in real life?
Apart from one girl in college, I have never met a single woman in my life who understood what feminism even means, let alone called herself a feminist. Most women are still very conservative in nature, with maybe a few progressive views here and there. And these are Gen Z women I am talking about. I don't expect much from women of previous generations, considering that their patriarchal conditioning was very deep-rooted and they didn't have constant access to social media and other people's points of view to break away from that conditioning and relearn things. But it's very disappointing to see well-educated urban women in their 20s being so regressive in this day and age. It makes me extremely sad and frustrated.