To: Peach
I would love to see a freeper fight over the meaning of "nations of the West".
What makes one a WESTERN nation?
Is it democracy, Christianity, equal rights, abortion rights, anti-slavery laws...
It would make an interesting debate.
Australia and Britain are on opposite sides of the globe... but both are clearly WESTERN. Does it just mean anything not Eastern? Does Japan fit the profile? Malaysia, Turkey? Israel?
fwiw...
To: Robert_Paulson2
Oh, boy - you asked some good questions but too late at night for intelligence from this Freeper. And in the morning, I need my two cups of tea. By afternoon, ...
6 posted on 03/13/2003 7:29:03 PM PST by Peach
To: Robert_Paulson2
The term "western" is NOT a geographic term. Some of the things you mentioned (democracy, Christianity, etc.) are elements of a western state. The term used to imply the Cold War-era distinction between East and West (communist vs. democratic). Since the collapse of communism, the meaning of the term has changed: any country with a solid tradition of democratic institutions is a western country.
11 posted on 03/13/2003 7:53:15 PM PST by Mihalis
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson