In Hollywood, Money Always Trumps Politics

This post was originally published at

For anyone obsessed with politics, whether they lean left, right, or straight down the middle wondering what's wrong with those people on the left and the right, there is an underlying belief that every movie, every television show, pretty much anything anyone says or does, must be rooted in some political goal or bias. Nowhere is that more apparent than with the media, or as the right likes to call it with a voice of disdain, Hollywood. We've seen it most vividly in the recent discussions regarding ABC's reboot of Roseanne and how the network is finally paying attention to conservatives. The truth, however,
