The reports of American decline are greatly exaggerated

By Jason Fields, The Week

The narrative of American decline gets louder every day. Our parents had it better. Our influence overseas has waned. Our country is poorer and more divided than ever.

We say these things, and we mean them in the moment, and our knowledge of history — or lack thereof — allows us to truly believe them. “American decline is the idea that the United States of America is diminishing in power geopolitically, militarily, financially, economically, socially, culturally, in matters of healthcare, and/or on environmental issues,” observes a Wikipedia entry on the idea. “There has been debate over the extent of the decline, and whether it is relative or absolute,” the article adds, but the decline itself is presented almost as established fact.

Our decline is in every journal you can think of, on the left and on the right. It’s on our fine website here at The Week. And the idea has a long and illustrious history: Decline is a sad story many peoples have told themselves.

But what if the story is wrong? And what if constantly calling back to the way things never actually were makes it impossible for our own age to live up to our expectations — let alone hopes?

When you’re looking for signs of decline, it’s easy to see them everywhere. In our culture wars, one side sees a horrible moral fall from the good old ways, while the other thinks we’re losing the high ground and slipping back into the dark ages. We see decline on the battlefield, insisting we don’t win wars anymore. We say people can’t find good jobs anymore. Our culture is bankrupt! Look, the only movies we can make anymore are sequels!
