A very smart friend of mine posted a link to a website that has two videos about the male-centric viewpoint of Hollywood, the first specifically pointing out that of the past 50 movies that have won Oscars, only 4 have centered on women’s lives. That, to me, is pretty telling about what the dominant viewpoint in the United States is. I am very sure that arguments could be made to the contrary, and since I’m not good at arguing I shall just hand over the link and suggest that anyone who comes across this joins the lively conversation in the comments.
Please go check this out, and then maybe try the little thought experiments I like to do.
1. When you are watching a cop show, or really any show, switch the races of the actors and try to really picture someone of a different race in that role.
2. Do the same with the genders.
3. (Now, I think this one is very interesting.) Imagine, or rather remember, that the Latino and African-American actors who play thugs are people who tried out for that part so that they could be actors. Remember that they are only playing these roles because the roles are written that way by the writers. Imagine all the “bad guys” as actors who got a casting call and came in because they fit the description.