Free Republic

To: Williams
There is a strong consensus among historians that the West did cause WWII. The post-WWI treaties penalized Germany into severe inflation and misery, which proved fertile ground for Hitler's rise. WWII was an extension of WWI, owing largely to an unreasonable peace.

General Marshall knew this. That is why reconstruction followed WWII rather than continued punishment.

57 posted on 10/17/2022 5:25:18 AM PDT by GingisK


To: GingisK

Germany is part of the West, and Russia was a major cause of WW2 from the East.

More pap.

WW2 was caused directly by the German Nazi Party and Hitler.
And by Imperial Japan and the Japanese generals.

Both were based on racist ideologies. Both committed atrocities wildly out of proportion to any post-WWI bitterness.

And Germany, before it launched the World War by attacking Poland, had achieved an economic turnaround.

Germany’s post-WWI economic woes are often cited as a cause of the rise of Nazism. Regardless, they did not cause the rise of Imperial Japan or WW2.

And Russia, a major protagonist in WW2, had abandoned WWI and even allied with Germany to kick off WW2. Therefore, Stalin’s immense role in WW2 and Hitler’s hatred of communist Russia also do not fit the simplistic narrative.

AND most historians believe Neville Chamberlain’s appeasement policy helped cause WW2, another disconnect from your narrative.


59 posted on 10/17/2022 5:40:04 AM PDT by Williams (Stop Tolerating The Intolerant)


