What Does Feminism Mean?

Aziza El Aazizi 2018-11-02 00:00:00 Social

 “Feminism taught me that in order to become a better version of yourself, you don't bring other women down—you empower them. Empowerment is a contagious thing. Feminism also made me realize that beauty should never be confined to one specific type. Beauty is diverse. Feminism taught me to appreciate beauty in all forms and colors, not just the ones I grew up accustomed to.”

“Feminism is making your own decisions for yourself and not allowing preconceived notions about what a woman ‘should’ be to affect that. It means viewing myself as worthy of love and success, so that my self-esteem and my ability to reach my goals are not affected by the world's view of women. It means both actively and mentally supporting other women instead of comparing myself to them.”

“Feminism has not just taught us to fight for our rights; it has also taught us compassion—compassion for the people who aren't as privileged as we are and compassion for those men who are chained by patriarchal norms.”

“Too often, feminism is defined by a series of stereotypes. Feminists hate men. Feminists don't wear makeup. Feminists only wear pants. However, these are just misconstrued stereotypes. Feminism is the idea that women are equal in every way and capable of achieving anything they want. It's not about giving up anything you believe in; it's about believing in yourself, in other women, and in your ability to change the world.”

“Feminism is equity. It's equity of gender, sexuality, race, religion, color, and creed. It's creating a better and more equitable world where we can all live without fear of judgment or persecution.”