Why do people persist in thinking that WWII was only in Europe?
Japan had been rising for some sixty years at that point. WWII would have come even if WWI had not been fought, if Hitler had never come to power, if Europe had just continued to have its petty little wars. War with Japan was always coming. And without Germany to ally with, they would have allied with the USSR.
It would have arrived a bit later, probably in the 1950s, but it would have come.
And considering their advances in germ warfare (Unit 731 was already running a biological weapons program in the 1930s), it would have been a very nasty war.
War with Japan does not equal a world war. Japan initially had no allies, and any war with them would probably have been Pacific-centric.
But for Hitler, the WWII that came to pass would not have happened.