Free Republic
General/Chat

To: MayflowerMadam

There is an astronomy buff who takes photos through his telescope. He had some unedited photos that still looked pretty cool, with colors, etc.

The Webb images are from electromagnetic(?) devices that are “seeing” wavelengths that are not all visible to the human eye, so the various wavelengths are represented by colors that we can see.

I imagine that they have the raw data (wavelengths mapped to colors) and then enhanced images, where they run the data through the computer to smooth it, enhance it, etc.
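Something like the sketch below is what I have in mind (purely illustrative; the function name, sigma, and percentile cut-offs are my own guesses, not NASA’s actual pipeline): a slight blur to knock down pixel noise, then a contrast stretch for display.

import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_and_stretch(raw_counts, sigma=1.0, low_pct=1, high_pct=99):
    # Blur slightly to suppress pixel-to-pixel noise, then stretch
    # the remaining range to 0..1 for display. Illustrative only.
    smoothed = gaussian_filter(raw_counts.astype(float), sigma=sigma)
    lo, hi = np.percentile(smoothed, [low_pct, high_pct])
    return np.clip((smoothed - lo) / (hi - lo), 0.0, 1.0)

# Fake 100x100 "raw" frame standing in for real detector counts
frame = np.random.poisson(50, size=(100, 100)).astype(float)
display = smooth_and_stretch(frame)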


28 posted on 07/22/2022 12:05:40 PM PDT by 21twelve (Ever Vigilant. Never Fearful.)


To: 21twelve

JWST is sensitive to wavelengths between 0.6 micron and 28.5 microns. “Visible light” refers to wavelengths between approximately 0.4 and 0.78 micron. For reference, HST is sensitive to wavelengths between 0.1 micron and 1.8 microns. The two observatories are complementary. Both telescopes measure mostly light that is not detectable by the human eye.
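To put numbers on that last point, here is a quick back-of-the-envelope check using the ranges quoted above (the simple linear-overlap calculation is mine, just for illustration):

# Wavelength ranges in microns, as quoted above
VISIBLE = (0.4, 0.78)
JWST    = (0.6, 28.5)
HST     = (0.1, 1.8)

def visible_fraction(band, visible=VISIBLE):
    # Fraction of a telescope's wavelength coverage that overlaps the visible band
    overlap = max(0.0, min(band[1], visible[1]) - max(band[0], visible[0]))
    return overlap / (band[1] - band[0])

print(f"JWST: {visible_fraction(JWST):.1%} of its range is visible")  # about 0.6%
print(f"HST:  {visible_fraction(HST):.1%} of its range is visible")   # about 22%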

“Light” (whether ultraviolet, visible, or infrared) is focused by the telescope onto an array of electronic detectors to record an image. The array of sensors is known as a “focal plane array” (FPA). Filters are used to select a relatively narrow range of wavelengths for each image. Unmodified, with the data from those detectors scaled to “brightness” on your computer monitor, the result would be a grayscale (black & white) image. To render the data scientifically useful, they are processed to compensate for known (as in measured) irregularities in the FPA. The color images NASA releases are produced by assigning three selected non-visible-wavelength images to red, green, and blue.
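A minimal sketch of that last step, assuming three calibrated grayscale frames are already in hand as NumPy arrays (the filter reference in the comment and the percentile scaling are my assumptions, not the actual NASA processing):

import numpy as np

def to_display(img, low_pct=1, high_pct=99):
    # Scale one calibrated grayscale frame to the 0..1 range for display
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

def false_color(long_wave, mid_wave, short_wave):
    # Assign the longest-wavelength frame to red and the shortest to blue,
    # producing one RGB image from three non-visible (infrared) exposures
    return np.dstack([to_display(long_wave),
                      to_display(mid_wave),
                      to_display(short_wave)])

# Fake 200x200 frames standing in for, e.g., three NIRCam filter exposures
frames = [np.random.rand(200, 200) for _ in range(3)]
rgb = false_color(*frames)   # shape (200, 200, 3), ready for imshow/imsave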

Some people describe these images as “computer generated”. They’re wrong. CGI refers to visualization of “data” artificially generated within a computer, with no reference to the real world. JWST imagery (and similar, including imagery from the camera on your cellphone) is a visualization of data generated and measured naturally in the real world. A film camera does exactly the same thing, using chemical rather than electronic processes.

Your cellphone camera produces color images by putting a pattern of red, green, and blue microscopic filters in front of an FPA sensitive to visible light. It’s called a “Bayer filter”, and the Wikipedia page does a good job of explaining how it works.
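The idea, roughly, is that each pixel records only one of the three colors, and the missing two are interpolated from neighboring pixels. A crude bilinear demosaic of an RGGB mosaic might look like this (a sketch of the general technique, not any phone’s actual pipeline):

import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw):
    # Very crude bilinear demosaic of an RGGB Bayer mosaic.
    # raw: 2-D array of sensor values; returns an H x W x 3 RGB array.
    h, w = raw.shape
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    k_rb = np.array([[0.25, 0.5, 0.25],
                     [0.5,  1.0, 0.5 ],
                     [0.25, 0.5, 0.25]])
    k_g  = np.array([[0.0,  0.25, 0.0 ],
                     [0.25, 1.0,  0.25],
                     [0.0,  0.25, 0.0 ]])

    def interpolate(mask, kernel):
        # Weighted average of the nearest pixels that actually saw this color
        values = np.where(mask, raw, 0.0)
        num = convolve(values, kernel, mode="mirror")
        den = convolve(mask.astype(float), kernel, mode="mirror")
        return num / den

    return np.dstack([interpolate(r_mask, k_rb),
                      interpolate(g_mask, k_g),
                      interpolate(b_mask, k_rb)])

# Fake 4x4 RGGB mosaic, just to show the shapes involved
mosaic = np.arange(16, dtype=float).reshape(4, 4)
rgb = demosaic_rggb(mosaic)   # shape (4, 4, 3)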


36 posted on 07/22/2022 12:57:31 PM PDT by NorthMountain (... the right of the peopIe to keep and bear arms shall not be infringed)

To: 21twelve

That’s my hubby’s avocation, i.e., astrophotography. The purists have great contempt for the manipulation of images to make them too “purdy”.

Smoothing raw TIFF files is one thing. NASA takes their little crayons to the pics. They are gorgeous; just not real. Kind of like airbrushing models’ bodies.


39 posted on 07/22/2022 1:01:31 PM PDT by MayflowerMadam


