I find it hard to understand that the radiation output and contamination from this is as great as, much less greater than, that from a pair of atomic bomb blasts, one uranium-based and the other plutonium-based, detonated upwind of Tokyo in August of 1945. Or from the multiple atmospheric tests conducted in the American southwest deserts within a couple hundred miles of Santa Fe and Las Vegas. Japan and the states of Nevada and New Mexico seem to have survived those just fine.
One of the bombs dropped on Japan was a plutonium bomb. By design it was intended to convert a maximum amount of the plutonium into destructive energy; any plutonium remaining after the blast would have been wasted.
From years back I recall reading that the U.S. had only a 30-year supply of uranium for powering reactors. The proposed solution was to build "breeder" reactors, which would not only supply energy but also convert some of the uranium into plutonium, which could then be separated out and used to fuel plutonium-burning reactors. Such a process would give us something like a 300-year supply of nuclear energy.
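The 30-to-300-year arithmetic above can be sketched with toy numbers. To be clear, the supply figure and the roughly tenfold breeding gain are the post's recollection plus an assumed round multiplier, not authoritative data:

```python
# Toy back-of-envelope for the breeder argument above.
# Natural uranium is only ~0.7% fissile U-235; the remaining ~99.3% is
# U-238, which a breeder reactor can convert into fissile Pu-239.

once_through_supply_years = 30   # figure recalled in the post, not verified

# If breeders let us tap the U-238 as well, the usable fraction of mined
# uranium grows dramatically; allowing for reprocessing losses, a ~10x
# gain is the assumed round multiplier here.
breeding_gain = 10

breeder_supply_years = once_through_supply_years * breeding_gain
print(breeder_supply_years)      # 300
```

The point of the sketch is just that the factor-of-ten claim comes from being able to burn the dominant U-238 fraction instead of only the rare U-235.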
So we are comparing a nuclear weapon explosion, which was intended to consume all of its plutonium, with spent reactor fuel, which may have been intended to create plutonium.
I don't know the extent to which my explanation applies specifically to Fukushima. I think I read that at least one of the reactors was fueled with plutonium. At some point the fuel becomes "spent" to the point that the reactor is no longer efficient, but that might still leave half or more of the original plutonium in the spent rods.
Should the cooling pool fail, 460 tons of spent fuel falls to the ground and reacts with whatever it hits. Very, very bad.
Also, if the cooling pools develop leaks, the rods may heat up enough to melt, and the potential outcome of that is unpredictable.
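Why the rods keep heating after shutdown can be illustrated with the Way-Wigner decay-heat correlation, a standard rough approximation. The power rating and operating time below are illustrative assumptions, not data about any Fukushima unit:

```python
# Rough Way-Wigner estimate of decay heat in spent fuel, showing why
# rods still need active cooling long after shutdown. All numbers are
# illustrative assumptions.

def decay_heat_fraction(t_after_shutdown_s, t_operating_s):
    """Decay power as a fraction of full operating power
    (Way-Wigner correlation, a rough fit valid from roughly
    10 seconds to about 100 days after shutdown)."""
    t = t_after_shutdown_s
    T = t_operating_s
    return 0.0622 * (t ** -0.2 - (t + T) ** -0.2)

full_power_mw = 1000.0               # assumed thermal rating
operating_time = 2 * 365 * 86400     # assume two years at full power

for days in (1, 30, 365):
    t = days * 86400
    frac = decay_heat_fraction(t, operating_time)
    print(f"{days:4d} days after shutdown: ~{full_power_mw * frac:.2f} MW")
```

Even a fraction of a percent of full power is megawatts of heat, which is why an unwatered pool of spent rods can climb toward melting.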