The Urban Culture is actually an anti-culture promoted by the left (you said Marxists - yes) to counter, dilute, and destroy the predominant Christian/Western culture of America.
17 posted on 06/18/2015 11:31:03 AM PDT by MrB
(The difference between a Humanist and a Satanist - the latter admits whom he's working for)