
How they do it: The processing of Hubble images from B&W into stunning full-color

By Mike Barrett

Everyone has seen the dramatic images produced by the Hubble Space Telescope, from the iconic Pillars of Creation to fields containing hundreds of galaxies in a single shot, but how are these images created? Hubble captures its images in monochrome, just like old black-and-white photographs, yet the pictures we see are stunning, vibrant color. How are they created, and what determines the color mapping?

To understand the elements that make up these images, we first need to understand what we see as an image. A typical image is built from a grid of pixels, or dots. Each pixel represents a color and intensity in the image. The pixels are packed so tightly that the eye cannot see them individually, but instead sees a smooth transition from one pixel to the next.

A color image is created from three separate components: red, green, and blue. By mixing these primary colors in varying proportions, any other color can be represented. Each pixel therefore has three components, red, green, and blue, each on a scale of 0 to 32,768 representing the intensity of that color, where 0 is no color and 32,768 is full color. So a pixel with an RGB value of 0:0:0 is black, while one with a value of 32,768:32,768:32,768 is white.

[Image: HST WFC3/UVIS images of the galaxy group Stephan's Quintet in three broad-band visible-light filters; left: F439W (B), center: F555W (V) and right: F814W (I). Credit: STScI, OPO, Zolt Levay]

Knowing how an image is constructed lets us begin to understand how a Hubble image is put together. The telescope carries a number of instruments, but the one we shall examine in this article is the Wide Field Camera 3, or WFC3. WFC3 can record a much wider range of the spectrum than the human eye can see, from ultraviolet through visible light to near-infrared.

WFC3 is a 16 megapixel monochrome camera that produces a greyscale image. An unfiltered exposure would include everything from ultraviolet to near-infrared, with each pixel holding a single intensity value between 0 and 32,768 representing the sum of all the light entering the camera. Because the image is greyscale, there is just one value per pixel recording the intensity of the light detected.

[Image: Screen image of the FITS Liberator GUI specially developed by ESA and NASA for processing Hubble's images. Credit: STScI, OPO, Zolt Levay]

Capturing the entire spectrum at once is not particularly useful, so the light recorded needs to be restricted to certain wavelengths. To do this, a series of filters can be placed in front of the camera's sensor, each passing only a particular range of wavelengths. To create a true color image the camera must take three exposures: one allowing only red light to pass through, one recording only the green light, and one only the blue. Each of these is a monochrome image, but together they can be reconstructed into a normal color picture.

Allowing a wide range of the spectrum to pass through a filter is known as broadband filtering. By contrast, narrowband filtering is a process that amateur astronomers use to eliminate the effects of light pollution. It allows astrophotography to be carried out even from city centers, because the glow of sodium lighting is blocked while other wavelengths pass through.
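To make the channel-combination step concrete, here is a minimal sketch in Python of how three filtered greyscale exposures could be stacked into a single color picture. The filenames, the use of the astropy and Pillow libraries, and the simple division by 32,768 are assumptions for illustration only; this is not Hubble's actual pipeline (which uses dedicated tools such as FITS Liberator), just the general idea.

```python
# Minimal sketch: combine three filtered greyscale exposures into one RGB image.
# Filenames are hypothetical; the 0-32,768 intensity scale follows the article.
import numpy as np
from astropy.io import fits          # FITS is the format Hubble data is distributed in
from PIL import Image

FULL_SCALE = 32768.0                 # intensity value treated as "full color"

def load_channel(path):
    """Read one monochrome exposure and scale it to the 0-1 range."""
    data = fits.getdata(path).astype(np.float64)
    return np.clip(data / FULL_SCALE, 0.0, 1.0)

# One greyscale frame per broadband filter (hypothetical filenames).
red   = load_channel("f814w_i_band.fits")   # I filter -> red channel
green = load_channel("f555w_v_band.fits")   # V filter -> green channel
blue  = load_channel("f439w_b_band.fits")   # B filter -> blue channel

# Stack the three frames so each pixel now holds an (R, G, B) triple.
rgb = np.dstack([red, green, blue])

# Convert to 8-bit values and save a conventional color image.
Image.fromarray((rgb * 255).astype(np.uint8), mode="RGB").save("stephans_quintet_rgb.png")
```

Assigning the longest-wavelength filter (I) to red and the shortest (B) to blue preserves the natural ordering of the spectrum, which is why the three Stephan's Quintet frames shown above map naturally onto the red, green, and blue channels.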
This is the fundamental principle behind narrowband imaging, and it is based on spectrometry. Spectrometry is the ability to determine which elements an object is composed of from the light it produces. This is of particular use to astronomy, as it allows the study of particular gases in the universe. Filters have been developed to isolate the light of the most useful elements and can be used for imaging. In particular, most emission nebulae consist mainly of hydrogen, and a hydrogen-alpha filter can be used to allow only light at the 656 nm wavelength to pass through. This principle of letting only certain wavelengths of the spectrum reach the camera also applies to non-visible light, such as ultraviolet beyond the blue end of the spectrum and infrared beyond the red end. The camera is able to see and record these wavelengths as differing intensities in the greyscale image it produces.
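As a rough illustration of how a single narrowband exposure might become part of a color picture, the sketch below (again in Python, with a hypothetical filename) follows one common amateur convention: because the hydrogen-alpha line at 656 nm falls in the red part of the visible spectrum, the frame is stretched and placed in the red channel of the output. This is only one possible mapping, not a description of the Hubble team's own choices.

```python
# Minimal sketch: present a single narrowband hydrogen-alpha frame as red,
# since its 656 nm emission line sits in the red part of the visible spectrum.
import numpy as np
from astropy.io import fits
from PIL import Image

# Hypothetical filename for a narrowband H-alpha exposure.
h_alpha = fits.getdata("nebula_f656n_halpha.fits").astype(np.float64)

# Stretch the raw counts to the 0-1 range; a percentile stretch brings out faint nebulosity.
lo, hi = np.percentile(h_alpha, [1.0, 99.5])
stretched = np.clip((h_alpha - lo) / (hi - lo), 0.0, 1.0)

# Place the narrowband data in the red channel, leaving green and blue empty.
rgb = np.zeros(stretched.shape + (3,))
rgb[..., 0] = stretched

Image.fromarray((rgb * 255).astype(np.uint8), mode="RGB").save("nebula_halpha_red.png")
```

The percentile stretch is used here simply because raw narrowband counts span a huge dynamic range; a logarithmic or asinh stretch could equally be substituted.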