To: ableChair
I'm not understanding what you mean by "heat energy dissipated by a laser".
A 50 watt laser takes in far more than 50 watts to generate its 50 watts of laser light output, so plenty of heat energy is dissipated there. I don't think that's what you're talking about.
We're talking about 50 watts of laser light energy. In order for it to be "heat energy dissipated" it would have to be converted to heat somewhere on its travels.
In space, for example, the beam would go on practically forever, virtually never being turned into heat energy unless it was absorbed by something. Its power density would slowly drop because the laser's beam isn't perfectly coherent and would slowly spread over distance. As it spread, the total power would stay the same, just distributed over a larger area.
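That spreading can be sketched with the standard Gaussian-beam formulas: the beam radius grows with distance while total power is conserved, so on-axis intensity falls. A minimal sketch; the 50 W power, 1 mm waist, and 1064 nm wavelength are illustrative numbers I've assumed, not from the thread:

```python
import math

def beam_radius(z, w0, wavelength):
    """Beam radius at distance z (m) for a diffraction-limited Gaussian beam."""
    z_r = math.pi * w0**2 / wavelength   # Rayleigh range: where spreading takes over
    return w0 * math.sqrt(1 + (z / z_r)**2)

def peak_intensity(power, w):
    """On-axis intensity (W/m^2) of a Gaussian beam of radius w carrying `power` watts."""
    return 2 * power / (math.pi * w**2)

# Assumed example: 50 W beam, 1 mm waist, 1064 nm (Nd:YAG-like)
P, w0, lam = 50.0, 1e-3, 1064e-9
for z in (0.0, 1e3, 10e3):   # 0, 1 km, 10 km
    w = beam_radius(z, w0, lam)
    print(f"z = {z/1e3:5.1f} km: radius = {w*100:8.2f} cm, "
          f"peak intensity = {peak_intensity(P, w):12.2f} W/m^2")
```

Running it shows exactly DB's point: the power never disappears, but the same 50 W gets smeared over an ever-larger spot.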
In an atmosphere the laser light does a number of things. Some of it is absorbed and turned into heat. Roughly 19% of the sun's energy is absorbed on its way down through many miles of atmosphere. The sun's output is broad-spectrum: some frequencies are readily absorbed and others aren't, and that 19% is an average across the whole solar spectrum. Laser frequencies can be chosen for much less absorption over the same distance.
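Absorption loss of this kind follows the Beer-Lambert law: surviving power falls off exponentially with distance, with a coefficient that varies enormously by wavelength. A sketch; the two coefficients below are made-up illustrations of a "window" wavelength versus a strongly absorbed one, not measured values:

```python
import math

def transmitted_power(p0, alpha_per_km, km):
    """Beer-Lambert law: beam power surviving after `km` of absorbing air.

    alpha_per_km is the absorption coefficient; choosing a laser line in an
    atmospheric 'window' means choosing a wavelength where it is tiny.
    """
    return p0 * math.exp(-alpha_per_km * km)

# Hypothetical coefficients over a 10 km path:
window   = transmitted_power(50.0, 0.05, 10.0)  # ~30 W survives
absorbed = transmitted_power(50.0, 1.0, 10.0)   # a few milliwatts survive
```

The exponential is why the choice of frequency matters so much more than the raw distance.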
Scintillation is another problem in an atmosphere. It's why the stars twinkle. It is particularly hard on lasers because it disturbs the narrow beam over distance, causing it to shimmy around. Scintillation does not convert the light energy to heat; it just bends the beam slightly and disturbs the wave front, making parts momentarily brighter and other parts dimmer.
The biggest loss factor for laser light through the atmosphere is scattering, caused by water vapor and small particles in the air. Fog is an extreme example: the fog doesn't absorb the light so much as it scatters it around. That scattering is what makes a laser beam visible as it passes through the air. Scattering doesn't convert the laser light energy into heat until the scattered light is absorbed by something.
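Scattering adds to the same exponential loss law: to a first approximation the total extinction coefficient is the sum of an absorption term and a scattering term, and in fog the scattering term dominates by orders of magnitude. A sketch with illustrative, made-up coefficients:

```python
import math

def surviving_fraction(absorption, scattering, km):
    # Extinction combines absorption (light -> heat immediately) and
    # scattering (light redirected, absorbed somewhere else later).
    return math.exp(-(absorption + scattering) * km)

clear_air = surviving_fraction(0.01, 0.1, 10.0)   # 10 km of clear air
dense_fog = surviving_fraction(0.01, 30.0, 0.1)   # only 100 m of fog
```

With these assumed numbers, 100 meters of fog removes more of the beam than 10 kilometers of clear air, which is the "extreme example" DB describes.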
Energy density is what determines whether the retina is damaged. As long as the laser beam isn't substantially spread, absorbed, or scattered by the atmosphere, it remains dangerous over long distances.
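That point can be made concrete: what matters for the eye is how much of the beam's power fits through the pupil. A rough sketch; the ~7 mm dark-adapted pupil is a standard figure, treating the beam intensity as uniform is a simplification, and the ~1 mW ceiling is the familiar Class 2 eye-safety limit for visible lasers:

```python
PUPIL_RADIUS = 3.5e-3   # m: dark-adapted pupil, ~7 mm diameter

def power_into_eye(beam_power, beam_radius):
    """Watts entering the pupil, assuming uniform intensity across the beam."""
    if beam_radius <= PUPIL_RADIUS:
        return beam_power                              # whole beam enters the eye
    return beam_power * (PUPIL_RADIUS / beam_radius) ** 2

# A 50 W beam that has spread to a 35 cm radius still puts ~5 mW into the
# eye -- above the ~1 mW ceiling used for eye-safe (Class 2) visible lasers.
spread = power_into_eye(50.0, 0.35)
```

So even a badly spread beam can stay hazardous, which is the sense in which the beam "remains dangerous for long distances."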
328 posted on 09/29/2004 12:32:00 AM PDT by DB (©)
To: DB
I'm not understanding what you mean by "heat energy dissipated by a laser".
All energy-releasing devices produce entropy (2nd Law of Thermodynamics). Entropy often takes the form of heat. In the case of a laser, we're talking about heat generated within the unit itself (you're right, I'm not concerned about that) and heat energy generated as the laser excites air molecules or atoms of other elements in the air. That would be your "heat loss on its travels" comment. Yes, you can play with wavelengths to avoid excitation, but some HAS to occur. That energy has to come from somewhere: it is taken OUT of the laser beam itself. That's conservation of energy; you can't get something for nothing.

Now, maybe lasers just ARE that efficient and the loss is too small to matter, but I suspected that they are not (namely because of the incredible hurdles faced by SDI). In other words, it would SEEM that they could easily lose 50 watts × t of energy to the atmosphere in the form of heat over a 5-10 mile distance, possibly even an amount of energy equivalent to what a space heater releases over a sufficiently brief time interval.

The purpose of the space heater in the thought experiment was to intuitively illustrate how much energy we're talking about. A space heater just doesn't dump a lot of energy, even if run for a long time. Unfortunately, I think the thought experiment was too abstract for some of the posters here, as they kept focusing on the mechanical differences between space heaters and lasers, which was not the point. The point was merely to give the reader an intuitive sense of the energies involved, then use conservation of energy to compare the two ENERGIES, NOT the devices.

It's amazing what a little common sense and a background in basic physics can tell you, regardless of your 'credentials'. These hyped claims about lasers just don't pass the sniff test. If it were as easy as some posters are claiming, we could protect the U.S. from ICBMs far, far more cheaply than the way we're doing it now, and all these posters would be billionaires for their suggestion that we just go to Radio Shack and get the hardware. It can't be that simple. Yes, I realize that the energy required to burn out an eye is far less than the energy required to burn a missile, but if you can fire a 50 watt laser 10 miles and burn out an eye, why not a 10,000 watt laser that can burn metal? It's not a big leap in power or energy, and it's FAR less energetic than what SDI contemplated. Maybe the tech has changed in dramatic ways, and maybe lasers ARE that efficient, but the fact that no one has developed a workable system yet suggests that energy loss in the atmosphere is hugely significant, even for destroying a metallic missile.
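The energy comparison underlying the thought experiment is just power × time. A back-of-envelope sketch, assuming a typical 1500 W space heater (my number, not one given in the thread):

```python
HEATER_W = 1500.0   # assumed: typical household space heater
LASER_W  = 50.0     # beam power discussed in the thread
HOUR_S   = 3600.0   # seconds in an hour

heater_joules = HEATER_W * HOUR_S        # 5.4 MJ per hour of heater runtime
laser_joules  = LASER_W * HOUR_S         # 180 kJ per hour of beam-on time
ratio = heater_joules / laser_joules     # heater dumps 30x the energy
```

On these assumed numbers, the heater delivers thirty times the laser's energy over any equal interval, which is the intuition the thought experiment was reaching for.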
To: DB
Oh, BTW, the Earth's atmosphere absorbs 95% of incident radiative energy, which is what we're concerned about here, not the roughly 19% you suggested. This is a common error; people confuse radiative and convective energy.
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson