Before the war it was always the United States *are*; after the war it was the United States *is*... it made us an *is*.
As a Southerner I would have to say that one of the main legacies of the War is that Southerners have a sense of defeat which none of the rest of the country has.