Chatting with English friends over the past few days, one theme keeps coming up. America is now, for the first time ever, a real empire. And the Brits know an empire when they see one. Yes, the U.S. has been a dominant global power before now; and, yes, it had an enormous sphere of influence in the past century. But it was always checked by a serious rival; and it was also hobbled by a profound ambivalence toward foreign entanglement. Both qualifications have now disappeared. My friend Niall Ferguson kept haranguing me for years about the disparity between American power and American responsibility in the post-Cold War world. In return I kept telling him that Americans simply didn’t want to be the heirs of the British Empire of the nineteenth century. It wasn’t in their DNA.