I’m frustrated. I’ve just seen my tenth, or maybe eleventh, headline today on how I should tell men that they’re wrong. That they should stop mistreating women. But is there really anything we can do?
Truth be told, I’m a man, and these headlines make me feel bad. Most of my best friends and coworkers are women, and I hate people who abuse their power as much as anyone. Yet on a daily basis I see headlines and articles claiming that men are the evil ones. I don’t like it.
Reading these headlines feels bad because I’m not a bad person, regardless of my gender.