There’s a new “F” word in town—feminism. Ever since a Vogue UK writer asked Beyoncé if she was a feminist, it seems to have become a staple in the magazine reporter’s handbook of questions for celebrity interviews.
While some of Hollywood’s hottest are quick to jump on the feminism train, plenty of women distance themselves from it. Being a feminist is certainly a personal decision, but most of the women quick to run from the label are totally misguided about what feminism actually is.
Contrary to what some may think, feminism is not the belief that women should rule over men with an iron fist. Merriam-Webster defines feminism as “the belief that men and women should have equal rights and opportunities.”
Whether you notice it or not, your favorite actress or singer can have a huge impact on how you think about certain topics. So, we’re setting the record straight on the specious answers of five starlets, along with some advice from women who know what feminism actually means.