Whollywood

02/18/2010 08:16

What has the term "Hollywood" come to mean to us? I hear people saying all the time, "Look at what Hollywood is doing to our society," or, "That's not really how it is, it's just what Hollywood wants us to think." Is that really where we're at? Do we really treat a legitimate entertainment business as a potential brainwasher?


The term has become so generalized that sometimes I don't even know what I mean when I say it. Or rather, I don't know what other people are thinking when I bring it up in conversation. To some it's the root of all evil in the modern world; to others it's a gigantic hub of ideas and creative expression.


The other thing is how anyone involved with "Hollywood," celebrities for the most part, gets instant personal criticism, every character flaw pointed out, just because of their visibility. There's an inherently negative attitude many of us non-"Hollywood" people take on when we talk about movies, music, and actors. I'm willing to bet that only half of the people we criticize really deserve it. We forget that they're part of humanity as well, subject to everything we are and vice versa. Yes, a life changes drastically the more famous one becomes, and personalities can be affected and altered, but take a look at yourself, take off your shoes, and walk around in someone else's for a minute.


These "Hollywood" humans aren't terrible people. There's nothing inherently wrong with the location that is Hollywood; it's just another piece of real estate, another physical place. Stop sectioning off and disregarding a whole segment of society because you're not famous, because you don't understand why anyone would want to work in that kind of business, and because you've already decided how shallow and meaningless their kind of life is.