The West Lives On in the Taliban’s Afghanistan

The Taliban has succeeded in reconquering Afghanistan. But while the U.S. may be gone, the new regime faces increasing Westernization among its subjects—and its own fighters.

Throughout the 20th century, any country that fought a war against the U.S. articulated a vision of the world that meant rejecting America in its most essential form. Japan, Germany, and Soviet-sponsored states like Vietnam advanced civilizational visions totally at odds with bourgeois capitalism and liberal democracy, in both theory and practice. But look at any of these countries today and one finds a total transformation of that way of life. In victory and defeat alike, the power of U.S. culture to transform a receptive population seems to entail a "cultural victory" regardless of the political-military situation on the ground. And after a recent visit to the hermit kingdom of Afghanistan, Palladium correspondent David Oks has reported that something similar is under way even in this isolated corner of the world.

First, David reports that our perception of Afghanistan as a backward, violent holdout of theocratic reaction does not hold up to reality. The streets of Kabul, once crowded with methamphetamine addicts, have mostly been cleaned up. One feels safe walking around at night, and although women and non-Taliban are shut out of power, in day-to-day life there is little attempt by the authorities to enforce Sharia law. One can walk around clean-shaven without harassment, and if anything the patrolling mujahideen enforcers are curious to hear what life abroad is like.

While this might be a relief for the visitor, it also indicates that the Taliban have not done much with their mandate to transform the country: