That was Laplace's idea about 200 years ago, but it has since been shown to be wrong.
For one thing, a deterministic universe is its own 'simplest computer'. Calculating the trajectories in the manner you suggest would require an amount of computing power that grows exponentially with time; any finite computer would be overwhelmed in a finite time.
For another thing, you can't number the photons. Fundamental charged-particle interactions suffer what's called an "infrared divergence", which means that the number of photons radiated in the course of the interaction depends on the energy cutoff at which you stop counting. If you take the cutoff all the way down to zero, the number of real photons emitted goes to infinity. No matter where you place the cutoff, you'll be missing something, so your calculation will necessarily be imperfect.
[Geek alert: if you think that this infinite sum of photons will lead to infinite quantities, you're right...but that's only half of the story. The interaction also gets a contribution from the virtual photons, and this contribution to the calculation is also infinite. However, it has an opposite sign from the real contribution, and the two infinite sums almost exactly cancel. The remaining residue, as it turns out, is independent of where you place the cutoff. That said, the real contribution really is real, so you really can't count--or account for--all of the photons.]
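To see the cutoff dependence concretely, here's a hedged little sketch: the soft-photon spectrum goes roughly like dN/dω ∝ 1/ω, so the expected number of real photons above an infrared cutoff grows like the log of the cutoff and diverges as the cutoff goes to zero. The prefactor `A` below is purely illustrative (in a real calculation it depends on the charges and kinematics).

```python
import math

# Soft-photon spectrum: dN/domega ~ A / omega, so the expected number
# of real photons with energy above an infrared cutoff `lam` is
#   N(lam) = A * ln(E_max / lam),
# which grows without bound as lam -> 0.
A = 0.01        # illustrative prefactor (NOT a real coupling constant)
E_MAX = 1.0     # upper energy scale, arbitrary units

def photon_count(lam):
    """Expected number of real photons above the cutoff `lam`."""
    return A * math.log(E_MAX / lam)

for lam in (1e-3, 1e-6, 1e-12, 1e-24):
    print(f"cutoff = {lam:g}  ->  N = {photon_count(lam):.4f}")
```

However low you set the cutoff, lowering it further picks up more photons; there is no place to stop where the count is complete.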
Finally, and worst of all, many quantum events (such as subatomic decays) are uncaused, and cannot be accounted for by any mechanism involving particles in motion. If such an accounting were possible, then all ensembles of such events would necessarily obey an esoteric relation called Bell's Inequality. It is a fact of nature, however, that many real-world interactions violate Bell's Inequality. If you're interested, I wrote a brief sketch of Bell's Inequality (in the form of a Platonic dialogue), which may be found here.
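For the numerically inclined, here's a hedged sketch of the CHSH form of Bell's Inequality (the version usually tested in the lab). Any local hidden-variable account of the correlations must satisfy |S| ≤ 2, while quantum mechanics predicts |S| = 2√2 for a spin-singlet pair at the analyzer angles below. The correlation function `E` is the standard quantum-mechanical prediction for the singlet state, not a measured value.

```python
import math

def E(a, b):
    # Quantum correlation for a spin-singlet pair measured along
    # analyzer angles a and b (standard QM prediction).
    return -math.cos(a - b)

# The canonical CHSH settings that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2            # Alice's two analyzer settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two analyzer settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ~ 2.828, exceeding the classical bound of 2
```

Any mechanism of "particles in motion" carrying predetermined answers falls on the |S| ≤ 2 side of the line; experiments land on the quantum side.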