I think it's pretty safe to say I've been a feminist all my life.
Growing up in a progressive household, I always felt like I could do whatever I wanted and be whatever I wanted. It also helped that early on, my feminist beliefs were validated by the Spice Girls, so I proudly displayed various items emblazoned with "Girl Power," such as a t-shirt and a sparkly keychain on my mini-backpack.
Unfortunately, not everyone gets to grow up with that encouragement, and this, along with our culture's obsession with gendering everything, is why people today think "feminism" is a bad word.
I've explained to numerous people, mostly guys, that "feminist" is simply a term for someone who believes in gender equality. Plain and simple. We aren't man haters, and we're not trying to take over the world and make men obsolete (although ask me on a day when I'm arguing with the boyfriend or witnessing the macho bravado of all the tools that live in Roseville - then I might be singing a different tune).
Feminism to me is just common sense. Men and women are equal and not limited to certain roles strictly based on gender. You want to be a stay-at-home mom or dad? Great! You want to go work full time and be a go-getter? Lovely. It's not all about women running around braless and lashing out at women who decide to stay at home. It's about choice and getting to be whatever we choose.
I feel like with my busy job, I've fallen out of touch with what's going on. Not to mention, I've stopped volunteering for my domestic violence/sexual assault causes. Ugh, it makes me feel awful because I thoroughly enjoyed it, but this new schedule leaves me little time to even go to the bathroom, let alone lend a hand to others (I know, I know. Excuses!).
Fear not, because I am still as much a feminist as I ever was - I get my feminist magazine, read my feminist blogs, buy feminist books, spout my feminist rhetoric to anyone who will listen (my cat). "Feminism" is in no way a bad word - I'm a feminist and proud of it!