In 2014, Beyoncé performed at the MTV Video Music Awards with the word "feminist" boldly shining behind her. People went wild over this, one reason being that they considered Beyoncé courageous for proudly declaring herself a feminist at a time when saying you support women's rights and gender equality has become a controversial point of view.
For some, being a feminist just comes naturally, especially if you've always been able to do the things you want and have been encouraged to pursue your dreams. For others, feminism is the dirty new F-word, and anyone who calls themselves a feminist should be shunned. It can be frustrating when you're faced with the latter kind of people, but the important thing is to help them understand what feminism is really about, because there are a lot of misconceptions floating around.
A common idea is that feminism is about women's dominance over men. The truth is that feminism aims to achieve equality for women in political, social, and economic aspects, among others.
The goal of feminism is not to make men subservient, but to uplift women.
After all, bringing someone up doesn't mean you're putting someone else down. One group of people gaining rights and influence does not require taking away the rights and influence of another.
Some people also assume that feminists are just angry lesbians. Well yes, some of them are, and there's really nothing wrong with that, but they're not the only kind of feminists. Feminists come in all shapes and sizes, with different orientations, and from different backgrounds. Some of them are even men, which would surprise anyone who thinks that only women can be feminists. But they all agree on one thing: empowering women is a good thing and gender equality needs to be achieved.
Have you ever heard some women say, "Oh, I'm not a feminist. I like men"? That comes from the mistaken idea that feminists are man haters.
Many feminists are in happy, heterosexual relationships. What feminists actually hate is the oppressive, patriarchal society that we live in, and the rules it imposes to keep women down.
People who scoff at feminism also say that it's no longer needed because men and women are equal now. This statement comes from a place of privilege and is said by people who are blissfully unaware of the continuing struggles of women around the world. In many countries, young girls are denied an education and are forced into marriage. In Saudi Arabia and Afghanistan, it's well known that women are prohibited from doing something as simple as driving. In the US, which is widely acknowledged to be a developed country, women are being denied the right to make decisions about their own bodies, wage inequality persists, and women continue to make up a small proportion of elected officials. And everywhere, women are subjected to sexual harassment on a near-daily basis, and some languish in abusive relationships with no way out.
In a world where all that is still happening, feminism remains very much relevant and necessary, and perhaps a better understanding of what it really is might help make gender equality a reality even sooner.
What other misconceptions about feminism do you often hear? Sound off in the comments below!