The real problem with radiometric dating is that it assumes the rate of radioactive decay has been constant throughout time. Meanwhile, physicists are always talking about how constants changed during the Big Bang, or how some constants are changing because of an "expanding universe," and so on. Yet to stick to their philosophy, they must insist that the rate of radioactive decay is, and always has been, constant.
One rather silly defense of their approach is to compare decay rates measured in the 1950s with those measured today, as though a linear rate of change were the only kind of change.
Then can you show how the rate could have changed, reproduce that change in a lab, and demonstrate that the variation accounts for 6k years versus 65 million years? Can you use that variable decay rate to explain the discrepancy between 6k and 65 million?
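For scale, here is a back-of-the-envelope sketch of what that challenge demands (my own illustration, not from either poster; potassium-40 is just a hypothetical example isotope). The measured daughter-to-parent ratio fixes how much decay actually happened, so compressing an apparent 65-million-year age into 6,000 real years forces the historical decay rate to have averaged roughly 10,800 times today's value:

```python
import math

# Hypothetical example isotope: potassium-40, half-life ~1.25 billion
# years under present-day physics.
HALF_LIFE_K40_YR = 1.25e9
lam_now = math.log(2) / HALF_LIFE_K40_YR  # present decay constant, per year

apparent_age_yr = 65e6  # age implied by the measured daughter/parent ratio
claimed_age_yr = 6e3    # proposed young age for the same sample

# The measured ratio fixes the total decay that occurred:
#   ln(1 + D/P) = integral of lambda(t) dt over the sample's true age,
# and under today's constant lambda that integral is lam_now * apparent_age.
total_decay = lam_now * apparent_age_yr

# To squeeze the same amount of decay into only 6,000 years, the
# time-averaged decay constant would have had to be:
lam_past_avg = total_decay / claimed_age_yr

print(f"required average rate: {lam_past_avg / lam_now:,.0f}x today's")
# -> required average rate: 10,833x today's
```

Note the multiplier is just the ratio of the two ages, independent of the isotope chosen; any proposed mechanism would also have to explain why decay roughly 10,000 times faster left no other trace, such as the correspondingly larger radiogenic heat in the rock.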
The design and licensing of nuclear power plants are dependent on a constant decay rate. Ergo, you are calling for the revocation of the licenses of those nuclear power plants, shutting down about 20 percent of our country's electrical supply. Thanks!
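To make that dependence concrete, here is a minimal sketch (my own, not from the thread) of one calculation licensing rests on: residual decay heat after reactor shutdown. One common textbook fit is the Way-Wigner approximation, whose coefficients were derived assuming today's decay constants hold:

```python
# Way-Wigner approximation for decay heat after shutdown (one common
# textbook fit; coefficients vary slightly between references):
#   P(t)/P0 ~= 0.066 * (t**-0.2 - (t + T)**-0.2)
# where t is seconds since shutdown and T is seconds of prior operation.

def decay_heat_fraction(t_s: float, T_s: float) -> float:
    """Fraction of full power still emitted t_s seconds after shutdown,
    following T_s seconds of operation."""
    return 0.066 * (t_s ** -0.2 - (t_s + T_s) ** -0.2)

one_year = 3.156e7  # seconds of prior full-power operation
for t in (10.0, 3600.0, 86400.0):  # 10 s, 1 h, 1 day after shutdown
    frac = decay_heat_fraction(t, one_year)
    print(f"{t:>8.0f} s after shutdown: {frac:.2%} of full power")
```

Shutdown cooling systems are sized from fits like this one; if decay constants drifted, the fits, and the safety margins built on them, would be wrong.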