The U.S. wound up invading Japan and Germany in World War II. Does that make the U.S. the aggressor? Or were Germany and Japan the aggressors, regardless of invasion, because they initiated the war? The South resorted to war when it fired on Fort Sumter. Having initiated hostilities, you can hardly complain when war comes to you, now can you?
Well, I suppose it depends on the viewpoint now, doesn't it?!
To Southerners, LINCOLN was the aggressor. He could have withdrawn the troops from Southern soil, as was requested. He chose not to do so, and instead attempted to resupply those troops, KNOWING FULL WELL that the Confederacy would never tolerate such actions. Therefore he started the ball rolling. And to compare this to Germany and Japan is totally asinine! Both of those countries committed acts of aggression FIRST. The South never invaded the North. So once again, the term "War of NORTHERN AGGRESSION" is correct.