How scientists colorize photos of space


This is all the light in the universe that
we can see.
It’s just a fraction of what’s out there.
Most frequencies of light are actually invisible
to us.
The light we can see appears red at its lowest
frequencies and violet at its highest.
This is called the “visible spectrum,”
and we see it because cells in our eyes called
“cones” interpret light reflecting off
of objects.
We have three different types of cones that
are sensitive to long, medium, and short wavelengths
of light, which roughly correspond to red,
green, and blue on the visible spectrum.
These are the primary colors of light. Every
other color is some combination of these three.
And that combination is the guiding principle
in colorizing black and white images.
This portrait was taken in 1911.
I know. You came here for space photos. We’re
getting there, I promise.
It’s one of the first examples of color
photography, and it’s actually three black-and-white
photos composited together.
Russian chemist Sergei Prokudin-Gorskii took
three identical shots of this man, Alim Khan,
using filters for specific colors of light.
One allowed red light to pass through, one
allowed green, and one allowed blue.
You can really see how effective this filter
system is when you compare the red and blue
exposures.
Look how bright Khan's blue robe is in
the photo on the right, meaning more of that
color of light passed through the filter.
Dyeing and combining the three negatives gives
you this.
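The "dye and combine" step has a direct digital analogue: stack each filtered exposure into its matching color channel. A minimal sketch with NumPy, using made-up 2x2 brightness values in place of real exposures:

```python
import numpy as np

# Three hypothetical black-and-white exposures, one per filter,
# as 2-D arrays of brightness values in [0, 1].
# (Tiny 2x2 stand-ins; real exposures would be full images.)
red_exposure   = np.array([[0.9, 0.1], [0.5, 0.0]])
green_exposure = np.array([[0.2, 0.1], [0.5, 0.0]])
blue_exposure  = np.array([[0.1, 0.9], [0.5, 1.0]])

# Combining the negatives digitally: each exposure becomes the
# channel matching the filter it was shot through.
color_image = np.stack([red_exposure, green_exposure, blue_exposure], axis=-1)

print(color_image.shape)  # (2, 2, 3): height x width x RGB
```

A bright blue robe shows up as high values in the blue-filtered exposure, and so ends up blue in the combined image.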
Alright, you get the idea. So let’s take
it into space.
The Hubble Space Telescope has been orbiting
Earth since 1990, expanding human vision into
deep space and giving us images like this
one.
The thing is, every Hubble image you see started
out black-and-white.
That’s because Hubble’s main function
is to measure the brightness of light reflecting
off objects in space, which is clearest in
black-and-white.
The color is added later, just like the portrait
of Alim Khan.
Except today, scientists use computer programs
like Photoshop.
Let’s use this photo of Saturn as an example.
Filters separate light into long, medium,
and short wavelengths.
This is called “broadband filtering,”
since it targets general ranges of light.
Each of the three black-and-white images is
then assigned a color based on its position
on the visible spectrum.
The combined result is a “true color”
image, or what the object would look like
if your eyes were as powerful as a telescope
like Hubble.
Okay, now one with Jupiter.
See how combining the red and green brings
in yellow?
And then adding blue brings cyan and magenta
to fully represent the visible spectrum.
Watch this animation two more times and I think
you’ll see it.
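The yellow, cyan, and magenta in the animation fall out of simple additive mixing. A quick check in Python (the `add_light` helper is illustrative, not any real graphics API):

```python
# The light primaries as (R, G, B) triples of intensities in [0, 1].
red, green, blue = (1, 0, 0), (0, 1, 0), (0, 0, 1)

def add_light(*colors):
    # Light adds channel by channel, capped at full brightness.
    return tuple(min(1, sum(c[i] for c in colors)) for i in range(3))

print(add_light(red, green))        # (1, 1, 0) -> yellow
print(add_light(green, blue))       # (0, 1, 1) -> cyan
print(add_light(red, blue))         # (1, 0, 1) -> magenta
print(add_light(red, green, blue))  # (1, 1, 1) -> white
```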
Great, now let’s add another level of complexity.
Seeing an object as it would appear to our
eyes isn’t the only way to use color.
Scientists also use it to map out how different
gases interact in the universe to form galaxies
and nebulae.
Hubble can record very narrow bands of light
coming from individual elements, like oxygen
and carbon, and use color to track their presence
in an image.
This is called “narrowband filtering.”
The most common application of narrowband
filtering isolates light from hydrogen, sulfur,
and oxygen, three key building blocks of stars.
Hubble’s most famous example of this is
called the Pillars of Creation, which captured
huge towers of gas and dust forming new star
systems.
But this isn’t a “true color” image,
like the one of Saturn from before.
It’s more of a colorized map.
Hydrogen and sulfur are both naturally seen
in red light, while oxygen appears more blue.
Coloring these gases as we’d actually see
them would produce red, red, and cyan, and
the Pillars of Creation would look more like
this.
Not as useful for visual analysis.
In order to get a full color image and visually
separate the sulfur from the hydrogen, scientists
assign the elements to red, green and blue
according to their place in the “chromatic
order.”
Basically that means that since oxygen has
the highest frequency of the three, it’s
assigned blue.
And since hydrogen is red but a higher frequency
than sulfur, it gets green.
The result is a full color image mapping out
the process by which our own solar system
might have formed.
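The chromatic-order rule can be written out directly. The wavelengths below are the approximate values of the S II, H-alpha, and O III emission lines; sorting by wavelength and assigning red, green, and blue in order reproduces the palette described above:

```python
# Approximate emission-line wavelengths in nanometers.
lines = {"sulfur": 672, "hydrogen": 656, "oxygen": 501}

# Chromatic order: longest wavelength (lowest frequency) -> red,
# middle -> green, shortest (highest frequency) -> blue.
by_wavelength = sorted(lines, key=lines.get, reverse=True)
palette = dict(zip(by_wavelength, ["red", "green", "blue"]))

print(palette)  # {'sulfur': 'red', 'hydrogen': 'green', 'oxygen': 'blue'}
```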
The Hubble Space Telescope can record light
outside of the visible spectrum too – in
the ultraviolet and near-infrared bands.
An infrared image of the Pillars of Creation,
for example, looks very different.
The longer wavelengths penetrate the clouds
of dust and gas that block out visible light
frequencies, revealing clusters of stars within
the pillars and beyond.
These images showing invisible light are colored
the same way: multiple filtered exposures
are assigned a color based on their place
in chromatic order.
Lowest frequencies get red, middle get green,
highest get blue.
Which raises the question: are the colors
real?
Yes and no.
The color represents real data.
And it’s used to visualize the chemical
makeup of an object or an area in space, helping
scientists see how gases interact thousands
of lightyears away, giving us critical information
about how stars and galaxies form over time.
So even if it isn’t technically how our
eyes would perceive these objects, it’s
not made up, either.
The color creates beautiful images, but more
importantly — it shows us the invisible
parts of our universe.

100 thoughts on “How scientists colorize photos of space”

  1. Many thanks to Vox & Coleman Lowndes for doing this video on representative color. It's an issue I love to talk about, happy to contribute.

  2. I will say, however, that after seeing mostly images like these, finding true-color images is a bit rewarding, as they aren't what most people care about. I remember finding the true colors of Uranus and Neptune to be a challenge.

  3. Good presentation. I would have added that all digital color cameras are monochrome and work this way. A DSLR or your phone sensor has a Bayer matrix where every pixel has its own color filter, arranged in groups of four: one red, one blue, and two green. The photo is captured mono the same way as Hubble's CCD camera, then debayered and saved by the camera as a color image.

  4. "Begging the question" means to assume what one is trying to prove. It would be better to say "Raises the question."

  5. That's dishonest. Sounds like this is how NASA markets itself. 'Well space isn't all that interesting but we still want your tax dollars.'

  6. Another awesome video, thanks very much for this one. Really great delivery, tight and classy VFX, and an awesome audio mix.

  7. @Vox – so informative; thank you for your careful walk-through on the light and electromagnetic spectrum processing of astrophotography!

  8. Great explanation, even though I still didn't understand, but it's pretty much simplified for someone like me outside this sector 🙂

  9. But why are red, green, and blue the primary colors of light when red, yellow, and blue are the primary colors. Green is made from yellow and blue

  10. Finally, the answer to a question I asked myself once and forgot about.
    This is so amazing! It makes the universe that much more interesting.
    S C I E N C E

  11. This is a really well-explained video, even relatable to microscopic images as well! As molecular neuroscientists, we also use RGB principle in microscope image analysis.

  12. Hey Vox!
    Can we please have a schedule from you for the different series you upload (e.g. Darkroom every X day or Almanac every Y day)? Thank you!

    Btw, love your videos.

  13. With paint and stuff, the primary colors are cyan, magenta, and yellow, for people who don't know. The others are secondary colors. For example, if you mix magenta and yellow you get red. If you mix yellow and cyan you get blue, and so on. Magenta is also a color with no wavelength. That's because the blue and red receptors turn on in the eye, and between red and blue is green, but green has its own receptor. So the brain creates magenta, and magenta doesn't have a wavelength. Between red and green is yellow, between green and blue is cyan, but magenta? No. Also, the color wheel is a lie; all the colors are in a rectangle. So if you want to go out and get some acrylic colors or any paint and want the primary colors, get cyan, yellow, and magenta. Hope you understood this.

  14. I appreciate this video for being factual, but I don’t appreciate you referring to the last ruling remnant of the Mongol Empire, and descendant of Genghis Khan, as “this man, Alim Khan.”

  15. Technically, as an astrophotographer, I know that you don't need to "add" color, as you can take the photos in color to begin with. Some cameras do need this process, but not all.

  16. This is very good to have. We really need this level of explanation, not just to science pursuers and people passionate about knowledge, but for amateur astronomers and photographers as well, for really grasping the understanding about capturing light from deep space. The Universe gave us its "formula" and now we must do the best we can to reach out for it, for everything that is.

  17. I am an astrophotographer and this video is SPOT-ON. This is exactly how Hubble (and I) use a mono camera and filters to image the cosmos in color.

  18. I think I missed something. If making a black and white photo into a color photo is as easy as applying filters and a specific color, why are people spending hundreds of hours hand coloring historic photos?

  19. If we are going to believe that this colorization is 100% accurate, then let them recreate that Russian scientist experiment. They should take a picture of someone in black and white and also in color. Then give the black and white version to a color expert who has NOT seen the original color photo. If the person can recreate exactly the original color based on this technique, then we will know that these space photo colorization technique gives the right result.
