We are hard on Hollywood and its influence on society.
Here's the question:
Does Hollywood influence society...or...is our society influencing Hollywood?
In other words, is our society creating experiences because of Hollywood...or...is Hollywood telling stories about society?
4 comments:
Some of both. However, I think Hollywood affects society more than society affects Hollywood. Actors hardly reflect the mainstream of our country in the way they live. And Hollywood definitely has a vision of what America ought to be -- and pushes its agenda relentlessly. For instance, by watching nearly any movie, you would think homosexuals were half the population, sexual promiscuity is where it's at, and marriage / monogamy equals monotony. Further, every Republican is a bumbling doofus -- as are Christians.
I am not a big fan of Hollywood.
So, how do you really feel, Rick/Dad? :)
I think it's a both/and. At one time TV overwhelmingly won this war; it had a huge sway over people's actions, especially our youth. However, in the past few years Hollywood has slowly, but very surely, been giving people what they have come to want. What a cycle of degradation!
I just pray that God takes over the hearts of men so that He can be glorified.
I agree with Rick and Keith: some of both. There are some wonderful Christian actors, but most of what Hollywood puts out there for the world to see is so lopsided against what we believe. All you have to do is watch the Britney Spears circus!
However, Hollywood can also make us think about the world around us and just how imperfect and sometimes insane it is.
The biggest influence on some people is not Hollywood but the internet.