“The U.S. invaded Germany in World War II. Was that the War of U.S. Aggression?”
She said, “Hitler.”