“Liberalism has destroyed America.”
I’d say leftism has destroyed America. Liberalism is just useful idiocy that the left uses on occasion to advance its agenda of tyranny. Once the left has gained the power it seeks, liberals and the childish fools who embrace liberalism will be eliminated.