Hollywood Problems

The problem with Hollywood having any impact on the world outside of entertainment is its lack of real-world knowledge. If these people do not live in the real world and face real-world problems such as racism, gender discrimination, and poverty, the issues that normal, everyday working-class citizens deal with, why would they try to solve those problems except for money, endorsements, ratings, or some other boost to their wealth or reputation?

Another problem Hollywood has is that its general opinion differs widely from that of the average person. American citizens recognize this, and it creates deep mistrust between the two groups. When a celebrity stands at a podium and repeats something he or she heard someone else say, and that opinion does not match the views of the public, it looks as though the celebrity is trying to persuade people to think the same way. It comes across poorly.

Furthermore, many of the views held by celebrities are extremely negative. Global warming, problems with the government, discrimination, the war on terror, abortion, the economy, and rising gas prices are all issues important to the country. All Americans think about these problems, but most try to remain optimistic that things will change for the better. Many Americans are still quite conservative, however, and would prefer a solution that fits with tradition. Celebrities are overwhelmingly liberal, and this unsettles many Americans who are not used to so much pessimism.

Negativity is highly contagious. Actors, singers, and other entertainers constantly get married and then divorced, abuse drugs or alcohol and go to rehab, drive drunk and go to jail, or speak negatively about the President of the United States simply to boost ratings. These problems spread to other parts of the country and create a ripple effect throughout the parts of the world where America has influence.

Most of the time, everyday people watch television and laugh. We barely pay attention unless what is said is incredibly bold or shocking. Lately, though, these people seem to be gaining more attention. Is it the lack of alternative entertainment, or are people starting to actually care what celebrities say? Who knows, but the more attention we give these people, the less respect our country will earn from others.