I watch all kinds of TV shows. Some are entertaining, others are over the top (yet still entertaining), but one thing that rubs me the wrong way is how friendship is sometimes portrayed. This is especially true of friendships between women.
Now, men and women behave differently in their friendships, and I get that. But it seems like the female friendships we see most often on TV are toxic and childish. I think sometimes we can learn a thing or two from what we watch, and sometimes... it's just like watching a car wreck.
I'm sensitive to the way women are portrayed because, obviously, I write about friendship, I have female friends, I am female, and I want women to treat each other well. To me, that's what being a strong woman is all about. I'd like to see more of that on TV.
I mentioned to you recently that I wanted to read the book What Would Michelle Do? by Allison Samuels. Well, I finally did, and I enjoyed it. One thing I liked was that the book was aimed at young women and was very positive and encouraging. It offered examples of how to behave in a more positive way, how to be a friend, and even how to tell whether you need to move on from a friendship. I appreciated that it gave details on these things, because, let's face it, we need more positive and concrete examples. You can read my full review of the book, and if you've read it, please feel free to share your own review by clicking the "write a review" link at the top or bottom of the review page.
As for TV, I'd like to see more positive friendships between strong women. Is there a show that does this well? What are your thoughts?