I thought the West (Britain & France) went to war against the Nazis because of the invasion of Poland. The U.S. went to war against Germany because they declared war on us on Dec. 11, 1941, in solidarity with their Japanese allies. I know there is controversy about who in the West knew what and when about Nazi atrocities, but I don't think they were a cause for going to war. The full horrors were discovered only in the last months of the war as the camps were overrun.
I wasn't trying to attest to the core reasoning for war. The fact is that many of the atrocities weren't uncovered until well into the war. My point is that the atrocities are out in the open with ISIS, yet everyone continues to go about their day as if nothing is wrong.
I believe the most likely case is that Roosevelt knew, but he felt the best and fastest way to alleviate the suffering of the Jews, gypsies, and others was to win the war.
Prior to the US entry into WWII, there was plenty of press about Nazi savagery. The American people simply did not want to become involved in another Euro war.
Similarly, the slaughter of millions in Cambodia was no secret in the 1970s. The American public generally took a "not my problem" view of things.