Charon in true color
Image credit: NASA / Johns Hopkins University Applied Physics Laboratory / Southwest Research Institute / Alex Parker.
We hear it all the time. Well, maybe you don't, but I get this thrown at me a lot. We see beautiful images released by NASA and other space agencies: ghostly nebulas giving tantalizing hints of their inner structures, leftover ruins of long-dead stellar systems, furious supernovae caught in the act of exploding, and newborn stars peeking out from their dusty wombs.
Instead of just sitting back, relaxing and enjoying the light show the universe is putting on, some people feel compelled to object: "But those colors are fake! You wouldn't see that nebula with your eyes! Binoculars and telescopes wouldn't reveal that supernova structure! Nothing in the universe is that shade of purple!" And so on.
I think it's first important to describe what a telescope is doing, especially a telescope with a digital camera attached. The telescope itself is an arrangement of tubes, mirrors and/or lenses that enables the instrument to capture as much light as possible. Obviously, it pulls in much more light than the human eye does, or it wouldn't be very good at what it was built to do. So, naturally, telescopes will see really faint things, things you'd never see with your eyes unless you hitched a ride on a wandering rogue exoplanet and settled in for a million-year cruise.
A telescope's second job is to shove all those astronomical photons into a tiny spot that can fit into your iris; otherwise, it would just dump the light on your whole face, which wouldn't be very interesting or useful. That act of focusing also magnifies images, making them appear much larger than in real life.
So, already, a telescope is giving you an artificial view of the heavens.
Your retinas have special sensors (a.k.a. rods and cones) that can pick out different colors. But digital sensors like the one you might use to take a selfie aren't sensitive to color at all. They can only measure the total amount of light slamming into them. So, to correct for this, they use filters, and either employ multiple sets of sensors or combine multiple readings from the same sensor.
Either way, the result is the same: an avalanche of data about the properties of the light that hit the device at the moment you were taking your picture. Fancy software algorithms reconstruct all this data into an image that kinda, sorta approximates what your eyes would've seen without the digital gear.
But as anyone who has had to fiddle with exposure and lighting settings knows, it's far from a one-to-one, human-computer match.
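The filter-and-combine idea above can be sketched in a few lines of code. This is a deliberately simplified toy, not any camera's actual pipeline: it assumes three separate monochrome exposures taken through red, green and blue filters, and stacks them into one color image.

```python
import numpy as np

def stack_filtered_exposures(red, green, blue):
    """Combine three filtered monochrome exposures into a single RGB image.

    Each input is a 2-D array of total-light readings taken through one
    color filter; stacking them along a new last axis yields color.
    """
    return np.stack([red, green, blue], axis=-1)

# Made-up 2x2 "exposures", with readings normalized to [0, 1]
red   = np.array([[1.0, 0.0], [0.5, 0.2]])
green = np.array([[0.0, 1.0], [0.5, 0.2]])
blue  = np.array([[0.0, 0.0], [0.5, 0.9]])

rgb = stack_filtered_exposures(red, green, blue)
print(rgb.shape)  # (2, 2, 3): height, width, and three color channels
```

Real sensors interleave the filters on a single chip (a Bayer mosaic) and interpolate the missing values, but the principle is the same: a colorless detector plus filters plus software yields a color picture.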
Doing science
If you've ever played with filters before posting a selfie, you're doing it for a reason: You want the picture to look better.
Scientists want pictures to look better, too, for the sake of science. Researchers take pictures of stuff in space to learn how it works, and some higher contrast here or a little brightening over there can help us understand complex structures and the relationships within and between them.
So don't blame NASA for a little photo-enhancement touch-up; they're doing it for science. [NASA's 10 Greatest Science Missions]
The colors of the universe
But what about adding colors? If one had to do a census, perhaps the most common colors in the universe are red and blue. So if you're looking at a gorgeous Hubble Space Telescope image and see lots of those two colors, it's probably close to what your unaided eye would see.
But a broad wash of green? A sprinkling of bright orange? Astrophysical mechanisms don't usually produce colors like that, so what's the deal?
The deal is, again, science. Researchers will often add artificial colors to pick out some element or feature that they're trying to study. Elements, when they're heated, will glow in very specific wavelengths of light. Sometimes that light is within human perception but gets washed out by other colors in the picture, and sometimes the light's wavelength is altogether beyond the visible.
But in either case, we want to map out where that element is in a particular nebula or disk. So scientists will highlight that feature to get clues to the origins and structure of something complex. "Wow, that oxygen-rich cloud is practically wrapped around the disk! How scientifically fascinating!" You get the idea.
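One real-world version of this element highlighting is the so-called "Hubble palette," which assigns narrowband exposures of sulfur, hydrogen and oxygen emission to the red, green and blue channels of the final image. A minimal sketch, with made-up stand-in data for the three exposures:

```python
import numpy as np

def hubble_palette(s2, h_alpha, o3):
    """Map narrowband exposures to color channels, Hubble-palette style.

    S II (sulfur) -> red, H-alpha (hydrogen) -> green, O III (oxygen) -> blue.
    These assignments are a display convention, not the elements' true hues.
    """
    rgb = np.stack([s2, h_alpha, o3], axis=-1)
    # Scale so the brightest pixel maps to 1.0 for display.
    return rgb / rgb.max()

# Made-up 2x2 narrowband intensity maps
s2      = np.array([[0.2, 0.8], [0.1, 0.4]])
h_alpha = np.array([[0.9, 0.3], [0.2, 0.6]])
o3      = np.array([[0.1, 0.1], [0.8, 0.5]])

image = hubble_palette(s2, h_alpha, o3)
```

In the result, a green-looking region is really a hydrogen-rich region; the color is a label, chosen so that chemically distinct gas clouds are easy to tell apart at a glance.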
Ever since William Herschel accidentally discovered infrared radiation, scientists have known that there's more to light than light. Redder than the deepest red gives you infrared, microwaves and radio. Violet-er than the deepest violet gives you ultraviolet, plus X-rays and gamma-rays.
Scientists have telescopes to detect every kind of electromagnetic radiation there is, from tiny bullet-like gamma-rays to radio waves that are meters across. The telescope technologies are pretty much always the same, too: collect light in a bucket, and focus it into a central spot.
So, of course, scientists would like to make a map. After all, we did spend quite a bit of money to build the telescope. But what color is a gamma-ray that comes from a distant supernova? What hue is a radio emission from an active galaxy? We need to map all this data onto something palatable to human senses, and we do that by assigning artificial colors to the images.
Without that, we wouldn't be able to actually do science.
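Assigning color to light that has no visible color at all is just a mapping from intensity to hue. A toy sketch of the idea, assuming nothing but a grid of raw detector counts (the numbers here are invented): normalize the counts, then place each pixel on a blue-to-red ramp.

```python
import numpy as np

def false_color(intensity):
    """Map raw intensities (gamma-ray counts, radio flux, ...) onto colors.

    The data carry no visible color, so we invent one: a simple ramp from
    blue (faintest) to red (brightest). Real pipelines use richer colormaps,
    but the principle is identical.
    """
    t = (intensity - intensity.min()) / (intensity.max() - intensity.min())
    red = t
    green = np.zeros_like(t)
    blue = 1.0 - t
    return np.stack([red, green, blue], axis=-1)

# Made-up 2x2 grid of detector counts per pixel
counts = np.array([[0.0, 50.0], [100.0, 25.0]])
img = false_color(counts)
```

The choice of ramp is purely for human eyes; the science lives in the underlying counts, and the colors just make their pattern visible.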
https://www.space.com/34146-fake-colors-nasa-photos-stop-complaining.html
This image was taken at 2:49 a.m. EDT (06:49 GMT) on July 14, 2015, five hours before Pluto's closest approach, with New Horizons' Ralph instrument. The picture was snapped at a distance of 150,000 miles (250,000 km). Image credit: NASA / Johns Hopkins University Applied Physics Laboratory / Southwest Research Institute.
The big surprise about Pluto is that it’s so active. The features show signs of relatively recent changes.