Teasing life into planetary images

At first the raw data from your camera looks like a big noisy ghost of what you saw in the eyepiece. Then slowly you recover the detail that you saw and finally details that eluded your eye. Read on to learn about the life of planetary image data.

Acquisition of raw images

There are three primary goals when you take image data:

  • Get your images in focus. Don't make a difficult job impossible. Take the time to get the best focus that you can. The image will jump around on your screen and the best focus will drift. Use focusing aids: magnified views, a Bahtinov mask, and electronic focus metrics, to make sure that your focus is spot on.
  • Don't overexpose. You can recover from some underexposure when you stack your image, but overexposure is irrecoverable. It is as bad as leaving the lens cap on.
  • Record as many images as you can as quickly as possible. Each image is a lottery ticket for a lucky clear image. They need to be images of the same scene, captured before shadows have moved or the planet has rotated enough to be noticeable.
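The overexposure rule above can be checked numerically during capture. This is a minimal sketch, not taken from any particular capture program: the function name, threshold, and frame values are illustrative. It counts the fraction of pixels at or near full scale, which should stay essentially zero.

```python
import numpy as np

def saturation_fraction(frame, bit_depth=8, threshold=0.99):
    """Fraction of pixels at or above `threshold` of full scale.

    Hypothetical helper; any nonzero result means clipped,
    unrecoverable highlights.
    """
    full_scale = 2 ** bit_depth - 1
    return float(np.mean(frame >= threshold * full_scale))

# Simulated 8-bit frame: mostly mid-tones with a few blown-out pixels.
frame = np.full((100, 100), 120, dtype=np.uint8)
frame[:2, :5] = 255  # 10 saturated pixels out of 10,000
print(saturation_fraction(frame))
```

In practice you would lower gain or exposure until this fraction is zero on the planet's disc.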

Grading and stacking

Grading uses tools that calculate a sharpness index and your own eyes to pick the best images. By the luck of the atmospheric seeing lottery, some images will be much clearer than others. Sometimes image defects like a passing plane or a bit of dust on the sensor can fool computer tools; your eyes need to be part of the process. The spread of scores will help you determine how many images to keep. Good seeing conditions may leave you with half your images suitable for stacking. Only a few percent may be useful when the air is turbulent. Remember the garbage in, garbage out rule.
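One common sharpness index is the variance of the image's Laplacian: crisp edges produce large second derivatives, so sharper frames score higher. This sketch uses plain NumPy with a wrap-around Laplacian; real grading tools are more sophisticated, and the checkerboard test scene is just for illustration.

```python
import numpy as np

def sharpness_index(img):
    """Variance of a 4-neighbour Laplacian; higher means sharper edges."""
    img = img.astype(float)
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return float(lap.var())

# A crisp checkerboard grades higher than the same pattern blurred.
y, x = np.mgrid[0:64, 0:64]
crisp = ((x // 8 + y // 8) % 2).astype(float)
blurred = sum(np.roll(np.roll(crisp, dy, 0), dx, 1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
print(sharpness_index(crisp) > sharpness_index(blurred))  # the crisp frame wins
```

Sorting your frames by this score and keeping the top fraction is the automated half of grading; your eyes are the other half.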

Stacking your good images reduces noise in your final image. Stacking more images improves the signal to noise ratio in your stack, but it is a game of diminishing returns. The noise is reduced by the square root of the number of images that you stack. Stacking 9 images will leave 1/3 of the noise. Stacking 100 will leave 1/10. Stacking 10,000 will leave 1% of the noise. Most of the benefit happens quickly and even a few images make a big difference. This is why grading is so important. Adding bad images into the stack will reduce noise but also reduce contrast and blur your final image.
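The square-root rule is easy to verify with simulated frames. Here the "frames" are a constant signal plus Gaussian noise of standard deviation 10, and averaging n of them leaves roughly 10/sqrt(n) of the noise; the frame sizes and noise level are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = 100.0
sigma = 10.0
# 10,000 small noisy "frames" of the same flat scene.
frames = true_signal + rng.normal(0.0, sigma, size=(10000, 16, 16))

for n in (1, 9, 100, 10000):
    stack = frames[:n].mean(axis=0)       # average n frames
    print(n, round(float(stack.std()), 2))  # residual noise ~ sigma/sqrt(n)
```

The jump from 1 frame to 9 buys as much as the jump from 100 to 10,000, which is why a modest stack of well-graded frames already looks dramatically cleaner.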

Processing

This is where you get to use photoshopping for good: to reveal truth rather than obscure it. Photoshop is useful, but often isn't the best tool for a particular job. I often move my data between three or four tools to produce a final image. Whatever tools you choose, there are two main jobs that you will need to accomplish at this step:

  • Sharpening can pull details that you didn't realize were hidden in the fog of your image. Only some sharpening algorithms increase resolution and reveal real hidden detail. Other algorithms only fool your eye and brain, but in reality produce an inferior image.
  • Light intensity management, using tools like levels and curves, fits a universe of light onto a screen or paper print. The dynamic range of the human eye can be as much as 1,000,000:1 (20 photographic stops). A perfectly exposed camera image might capture a range of 10,000:1 (~13 stops). Your computer screen or print is probably closer to 200:1 (~8 stops).
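A levels-plus-curves adjustment can be sketched as a black point, a white point, and a gamma curve. The values here are illustrative, not from any real image: a gamma below 1 lifts faint detail toward the mid-tones, which is how a huge linear range gets squeezed onto an 8-stop display.

```python
import numpy as np

def apply_curve(linear, black, white, gamma=0.5):
    """Levels + gamma: map linear data in [black, white] onto [0, 1].

    Hypothetical parameter values; real tools expose these as sliders.
    """
    x = np.clip((linear - black) / (white - black), 0.0, 1.0)
    return x ** gamma  # gamma < 1 brightens shadows, compressing the range

# Five linear intensities spanning four orders of magnitude.
linear = np.array([0.0, 0.0001, 0.01, 0.25, 1.0])
print(apply_curve(linear, black=0.0, white=1.0))
```

Note how the faint 0.0001 and 0.01 values, invisible on a linear display, are pulled up to 0.01 and 0.1 of full brightness while the highlights stay put.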

The eye and brain are easy to fool. Good processing lets you see all the detail in the image, in spite of the limitations of the medium. Bad processing creates artifacts in the image or obscures some parts of it while revealing others.

Hints for processing

The easiest way to make an image appear sharper is to add noise to it. The human brain will perceive an image with a little bit of noise added as sharper than the image without the noise. Many image processing programs have an algorithm called unsharp mask. Unsharp mask has origins before digital photography: you take an image, blur it, subtract the blurred image from the original, and add a bit of that difference back into the original image. This creates artifacts along edges that trick your brain, sending it into sharpness happiness, but in reality the image has less detail.
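The unsharp mask recipe described above fits in a few lines. This is a bare-bones sketch (a crude wrap-around box blur stands in for the Gaussian blur real tools use); running it on a step edge shows the overshoot and undershoot ringing that fools the eye without adding real detail.

```python
import numpy as np

def box_blur(img, k=1):
    """Mean blur over a (2k+1)^2 neighbourhood; edges wrap for brevity."""
    out = np.zeros(img.shape, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += np.roll(np.roll(img, dy, 0), dx, 1)
    return out / (2 * k + 1) ** 2

def unsharp_mask(img, amount=1.0):
    """Classic recipe: add back (original - blurred) scaled by `amount`."""
    img = img.astype(float)
    return img + amount * (img - box_blur(img))

# A clean 0-to-1 step edge...
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0
row = unsharp_mask(edge)[0]
print(row)  # note the dip below 0 and spike above 1 flanking the step
```

The values dipping below zero and overshooting one on either side of the edge are exactly the artifacts that read as "sharpness" to the brain.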

Deconvolution and wavelet algorithms model blurring as the convolution of the image with a point spread function (the effects of diffraction, dispersion, and optical imperfections) and mathematically reverse it, revealing the true detail present in the image. These algorithms were used to recover the first usable images from the defective Hubble Space Telescope. The telescope was later fixed with corrective optics, but these algorithms remain a powerful tool. Using an approximate point spread function is often good enough to dramatically improve an image.
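One classic deconvolution method is Richardson-Lucy iteration, sketched below with plain NumPy. Assumptions for brevity: circular FFT convolution, a symmetric Gaussian guess for the point spread function (so the mirrored PSF in the update equals the PSF itself), and a synthetic square "planet" as the test scene. Real tools handle borders and noise far more carefully.

```python
import numpy as np

def fft_convolve(a, b):
    """Circular convolution of two same-sized arrays via the FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

def richardson_lucy(observed, psf, iterations=50):
    """Richardson-Lucy deconvolution with a known (or guessed) PSF."""
    observed = np.maximum(observed.astype(float), 0.0)
    est = np.full(observed.shape, observed.mean())  # flat starting guess
    for _ in range(iterations):
        ratio = observed / np.maximum(fft_convolve(est, psf), 1e-12)
        est = est * fft_convolve(ratio, psf)  # symmetric PSF: flip == psf
    return est

n = 64
y, x = np.mgrid[0:n, 0:n]
# Gaussian PSF centred at pixel (0, 0), wrapped so it is circularly symmetric.
dy, dx = np.minimum(y, n - y), np.minimum(x, n - x)
psf = np.exp(-(dx**2 + dy**2) / (2 * 1.5**2))
psf /= psf.sum()

truth = np.zeros((n, n))
truth[24:40, 24:40] = 1.0          # a sharp synthetic "planet"
blurred = fft_convolve(truth, psf)  # what the atmosphere hands you
restored = richardson_lucy(blurred, psf)
# `restored` recovers noticeably steeper edges than `blurred`
```

Even with only an approximate PSF, a few dozen iterations pull the edges back much closer to the truth, which is why the technique works on real planetary stacks where the PSF is never known exactly.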

Resolution recovery sharpening algorithms should be applied only to linear data, before any intensity manipulation is performed on the data.

Intensity masking directs sharpening or level adjustment tools to operate only on the regions of the image where they reveal detail and not where they obscure it. It is especially useful for avoiding sharpening artifacts like the light and dark rings (onion skin effect) that sharpening algorithms can create around the edge of a planet.
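In code, an intensity mask is just a brightness-driven blend between the original and the sharpened frame. The threshold values here are invented for the example; the point is the smooth ramp, which confines the sharpening to the bright disc and leaves the dark sky (where ringing would appear) untouched.

```python
import numpy as np

def masked_sharpen(img, sharpened, lo=0.1, hi=0.3):
    """Blend `sharpened` in only where the image is bright enough.

    `lo`/`hi` are illustrative thresholds; the smooth ramp between
    them avoids a visible seam at the mask boundary.
    """
    mask = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    return (1.0 - mask) * img + mask * sharpened

disc = np.zeros((8, 8))
disc[2:6, 2:6] = 1.0              # bright planetary "disc" on dark sky
harsh = disc * 1.2 - 0.1          # stand-in for an over-sharpened frame
out = masked_sharpen(disc, harsh)
# sky pixels keep their original value; disc pixels take the sharpened ones
```

The same blend works for curves adjustments: swap the sharpened frame for a stretched one and the mask protects whichever tonal range you want left alone.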

Using sharpening or intensity curve tools takes a light touch. There is a soft boundary between revealing the data hidden in the image and making the image look harsh or over-processed. It is an art that I struggle with for every image that I produce.

I reworked the finished image with a slightly lighter touch on the enhancement and tried out stacking with Autostakkert 2 virtualized on macOS. The stacking results were similar, but stacking is easier in Autostakkert.

Jupiter at opposition 2017 - Revisited
Jupiter with moons Io, Europa, and Ganymede (L to R) from Austin, Texas 2017-04-07 10:25 PM CDT. Questar 1350/89 mm telescope with a Dakin 2X Barlow lens and ZWO ASI120MC planetary video camera. Gain 46, exposed 77 msec, captured in oaCapture running on macOS. Best 52 of 2000 exposures stacked at 1.5x in the amazing Autostakkert 2 and deconvolved in Lynkeos. RGB aligned to reduce atmospheric dispersion. Final crop and exposure tweaks in Photoshop.

Content created: 2017-04-10 and last modified: 2017-04-12
