Before I started traveling internationally on a regular basis, I honestly thought the rest of the world loved America. I never really questioned the media hype about American Exceptionalism. Having never been anywhere else, I took it all as truth. We are the best. The saviors of the world, right… Of course I never said that publicly, and I was always respectful of other cultures, but these feelings just came naturally. I didn’t realize there was any other way to think.
Looking back with new information, it’s not surprising that that “love” for America is fading. You can only be the arrogant asshole for so long before you start losing friends. The more I travel, the more I realize that what people expect of me as an American is not who I am.
The Caricature of America
I’ve had conversations with people from all over the world, and never has anyone come right out and said “We don’t like America,” but what they have said is “You aren’t what I expected” or “I like you more than I thought I would.” It seems there is a stark difference between how people feel about our leaders, how Americans are portrayed in the media, and who we actually are as individuals.
I remember having a conversation with one young woman in Japan about what she expected me to be like and what she thought America was like. She described Jersey Shore and the movie American Pie. A caricature of American pop culture, war mongering, and fist pumping. Great. Of course she realized those stereotypes are unreasonable, but part of our image is what is on TV.
As Americans, I think it is easy to forget how truly isolated we are over here on this side of the world. Our distance has led to a subtle arrogance and a lack of the cultural understanding that comes naturally to the rest of the world. Perhaps this has led to a slight feeling of superiority and elitism – we believe the crap we are selling ourselves. In turn, American pop culture (sometimes the worst parts of it) is exported to the rest of the world. This leaves a gap between who Americans really are and what non-Americans think we are.
The research shows we aren’t popular, either. Study after study finds that America is viewed as uncaring and generally disliked. Here are a few of the stats:
So what is the solution?
Get out there, America. Read a book, gather some culture, talk to the people, and TRAVEL! Get out of your comfort zone and realize there is a whole other world out there filled with people who have different thoughts, ideas, and experiences. It’s not scary. Show the rest of the world we care. Show them that we know a thing or two about them, that we want to learn, and that we aren’t a bunch of drunk frat boys as seen on TV.
If we continue to let the politicians and the media tell the rest of the world who WE are, then the rest of the world will start to believe them.