
To: Phaedrus
At the microscopic level, before an event or "observation" or measurement takes place, only a probabilistic description can be given of the momentum and position of the particles involved. This description is mathematically quite rigorous and deterministic, and is the basis for the "electronic revolution"; i.e., it has been very, very economically productive, so there is no doubt as to the quality of the math. The microscopic underlies all of the macroscopic, i.e., the microscopic becomes macroscopic, and the "hitch" is that probabilities routinely become hard reality, yet nobody has a clue as to how or why this happens, because there is nothing in the math or in reason that requires resolution in any particular way.

This is rather specious.

Let's take a specific example: the crossed Stern-Gerlach apparatus you'll find in Feynman's lectures, and many other places. (It's a thought experiment, and for various reasons I'm convinced that if you tried to do it you would have difficulty getting the same result, but my objections are unrelated to the basic point, so let's ignore them.)

OK, a Stern-Gerlach experiment is designed to measure the component of angular momentum of a particle along one axis. The easiest case is a so-called 'spin-1/2' particle; its angular momentum can have only two measured values along any axis, +1/2 and -1/2. So you take a thermalized beam of spin-1/2 particles, put them through the apparatus, and separate the particles according to their angular momentum along the z axis (L_z). Half the particles (approximately) are in the +1/2 beam, and half in the -1/2 beam. You then take the +1/2 particles and put them through an SG apparatus along x. What you find for any single particle is that it has a 50% probability of having L_x = +1/2 and a 50% probability of having L_x = -1/2. QM says this is because L_x and L_z don't commute - that is, they obey an uncertainty relationship - and that therefore a measurement of L_z makes the value of L_x completely indefinite.

Your point is presumably that something else could influence the result of the L_x measurement - a hidden variable that would be a window for free will or whatever to get in. Trouble is, the L_x/L_z uncertainty principle maps 1:1 onto the x/p uncertainty principle, and the latter is not merely a theorem of measurement; it accounts for the very structure of matter. The second problem is that no hidden variable has ever been found, and not for want of looking. Finally, entanglement experiments, where one measures L_z of one particle and it scrambles the L_x of a second, entangled one, place some very severe restrictions on your hidden variable. They show it's not merely that we haven't figured out how to measure the L_x of the second particle; the first measurement puts the L_x of the second particle in a completely undefined state.
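If it helps to see where those 50% numbers come from, here is a minimal numerical sketch in Python of the crossed Stern-Gerlach setup (assuming NumPy; the variable names are mine, chosen for this illustration, not anything from the original discussion):

```python
import numpy as np

# Pauli matrices; spin operators in units where hbar = 1, S_i = sigma_i / 2.
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
S_z, S_x = sigma_z / 2, sigma_x / 2

# The two observables do not commute: [S_z, S_x] = i*S_y != 0.
# This nonzero commutator is the uncertainty relation discussed above.
commutator = S_z @ S_x - S_x @ S_z
print("[S_z, S_x] =\n", commutator)

# First SG apparatus: keep the L_z = +1/2 beam.
# The +1/2 eigenstate of S_z is |+z> = (1, 0).
plus_z = np.array([1, 0], dtype=complex)

# Second SG apparatus along x. The eigenstates of S_x are
# |+x> = (1, 1)/sqrt(2) and |-x> = (1, -1)/sqrt(2).
plus_x = np.array([1, 1], dtype=complex) / np.sqrt(2)
minus_x = np.array([1, -1], dtype=complex) / np.sqrt(2)

# Born rule: probability of an outcome is |<eigenstate|state>|^2.
p_plus = abs(np.vdot(plus_x, plus_z)) ** 2
p_minus = abs(np.vdot(minus_x, plus_z)) ** 2
print(f"P(L_x = +1/2) = {p_plus:.2f}")   # 0.50
print(f"P(L_x = -1/2) = {p_minus:.2f}")  # 0.50
```

Note that the 50/50 split falls straight out of the eigenstates and the Born rule; there is no extra variable anywhere in the state that could be tuned to bias the L_x outcome.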

Most physicists, as I understand it, simply reject the idea that such a variable exists. It's not glossing over the problem; it's recognizing that the uncertainty principle is a central component of the physics, and not merely an inadequacy of measurement.

Gerry, writing from Copenhagen.

316 posted on 07/09/2003 9:31:09 AM PDT by Right Wing Professor


To: Right Wing Professor
This is rather specious.

No it is not, RWP, appeals to authority notwithstanding.

317 posted on 07/09/2003 9:41:07 AM PDT by Phaedrus