Mars Update from Mauri Rosenthal

I'm often amazed by the talent and skill of other astrophotographers. Active astrophotographers using small telescopes are an especially talented group. Mauri Rosenthal of the New York AAA is one of those I admire most and have learned much from. He recently copied me on this update on his planetary workflow for Mars and has allowed me to share it. The work below is his, so credit him if you use it.

Mars update -- Sunday night imaging marathon

Mauri Rosenthal, August 1, 2018

Since I spent the whole day doing this, I'm going to summarize the workflow I settled on. This is mostly so that I have a reference for my infrequent planetary imaging; I don't expect anyone to actually read all of this unless they're diving down this particular rabbit hole.

To summarize the data capture process: I used a ZWO ASI120MC planetary cam behind a 2X Televue Powermate in the eyepiece port of my Q 3.5, tracking Mars via the Questar's PowerGuide 2 built-in tracking (with a rough polar alignment).

I had the experience of two prior nights of both under- and overexposing the target, so I finally knew how I wanted the exposure to look, both visually and via the histogram during live capture – I needed the brightest (red, actually) portion of the histogram about 2/3 of the way to the right, and a clear spike for the rest just past the left edge. I kept the gain near the middle of the slider (50–75) so that I could use the highest frame rate to achieve that histogram; it shows up in the SharpCap data files as exposures of 0.012 to 0.025 seconds.
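As a quick sanity check on those numbers (my sketch, not part of Mauri's workflow): the exposure time puts a hard ceiling on the achievable frame rate, which is why shorter exposures at moderate gain let the camera run faster.

```python
# Upper bound on frame rate implied by an exposure time.
# Illustrative only: real SharpCap capture rates also depend on
# USB bandwidth, ROI size, and the camera's own limits.
def max_fps(exposure_s):
    """Maximum frames per second possible at a given exposure."""
    return 1.0 / exposure_s

for exp in (0.012, 0.025):
    print(f"{exp:.3f} s exposure -> at most {max_fps(exp):.0f} fps")
```

So the 0.012–0.025 s exposures reported in the capture logs correspond to a ceiling of roughly 40–80 frames per second.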

I collected 21 separate AVI files, starting with 500 frames per capture but expanding to 3000 frames as I gained confidence in my focus and exposure settings. That provided some 25,000 to 30,000 frames, captured during a 50-minute span, finally filling my laptop drive with a fresh 32 GB of data.
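That 32 GB figure is consistent with a back-of-envelope estimate (my sketch; the ASI120MC's 1280x960 sensor is a known spec, but 8-bit raw capture is an assumption here):

```python
# Rough data-volume check for the capture session.
# ASI120MC sensor is 1280x960; 8-bit raw frames are assumed.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1280, 960, 1
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # ~1.23 MB per frame

for n_frames in (25_000, 30_000):
    total_gb = n_frames * frame_bytes / 1e9
    print(f"{n_frames:,} frames -> ~{total_gb:.0f} GB")
```

At roughly 1.2 MB per frame, 25,000 to 30,000 frames lands right around the reported 32 GB.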

An important “aha” came from checking Mars's rotation in a couple of sources, including the simulator linked here and also CdC. I was surprised to see how swift it is, which meant that I had to account for rotation in processing – i.e., there's no point in stacking (without derotating a la WinJupos) more than 5 to 10 minutes tops. I decided to go with 10-minute blocks to keep from going completely nuts (did I mention that I spent the whole day on this?).
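The numbers bear this out (a quick sketch of mine: Mars's ~24.6-hour rotation period is a known value, and the ~24 arcsec disk diameter is an approximation for the 2018 opposition):

```python
import math

# How much does Mars rotate during a stacking window, and how far
# does a feature at disk center shift as a result?
ROTATION_PERIOD_H = 24.623   # Mars sidereal rotation period in hours
DISK_ARCSEC = 24.0           # approximate apparent diameter, 2018 opposition

def rotation_deg(minutes):
    """Degrees of rotation in the given number of minutes."""
    return 360.0 * minutes / (ROTATION_PERIOD_H * 60.0)

def center_smear_arcsec(minutes):
    """Approximate shift of a disk-center feature: R * sin(rotation)."""
    return (DISK_ARCSEC / 2.0) * math.sin(math.radians(rotation_deg(minutes)))

for m in (5, 10):
    print(f"{m} min: {rotation_deg(m):.2f} deg rotation, "
          f"~{center_smear_arcsec(m):.2f} arcsec smear at disk center")
```

Ten minutes means about 2.4 degrees of rotation, or roughly half an arcsecond of smear at disk center – comparable to the fine detail a small scope resolves, which is why stacking windows much longer than that wash out surface features without derotation.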

Anyway, the other aha regarding rotation was confirming the orientation of my scope; shooting through the Questar eyepiece port was giving me correct North/South but mirror-image East/West. My planet rotates from right to left, while the simulators etc. show left to right. Once I realized that I had a mirror image, I could confirm that the light and dark regions in my images match the reference images – my Martian splotches are real things, not artifacts. This made me think in terms of using a video to a) illustrate just how fast Mars is turning and b) enable a viewer to see that the splotchy surface is bona fide. I spent some time messing around with WinJupos and rediscovered what I learned a couple of years ago – it's kind of a hassle to use unless you've set up your workflow for it. I think I finally did everything right and generated a consolidated image, but it was no sharper than the best single image from the middle of my session, so I gave up on it.

So here’s what I did once I transferred my 32 GB onto my desktop:

  • Group the videos, each ranging from 500 to 3000 frames, into 10-minute tranches. Note that as I was shooting I started with 500 frames and increased to longer captures as I gained confidence in my focus and exposure settings. My tracking was so-so; it actually would have been great tracking for, say, an 800 mm focal length, but at 2800 mm a small deviation leads to a lot of drift.
  • Use PIPP to break the videos into individual frames and quality-score them in one directory, so that I could load the best 1000 frames from each 10-minute block into AS3. This means that thousands of frames never even made it into AS3.
  • Stack the top 300 frames (or 30%) with 1.5x Drizzle in AS3
  • Load the resulting TIFF file into Registax 6 and use one set of wavelet and gamma adjustments for all 5 of the stacked images. The wavelet settings were pretty aggressive with boosts to both sharpening and noise reduction, achieved via a lot of trial and error. The gamma adjustment made a big difference in getting the right contrast between the light and dark regions as well as the nice red color (I did not change the white/color balance – Mars is really red!)
  • Use PixInsight for inspecting and sharpening the 5 stacked images; I had to tweak the curves on one of them to get them all pretty similar.
  • Drop the 5 tweaked stacked images back into PIPP to generate a movie
  • Stick the movie into MS Movie Maker to add watermark and caption
  • Polish up the best of the 5 stacks as a still image
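The first step above – bucketing captures into 10-minute tranches – can be sketched in a few lines. This is my illustration, not Mauri's actual procedure (he grouped by hand), and the filenames and start times are hypothetical:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Bucket capture files into 10-minute tranches by start time.
# The (filename, start_time) pairs below are hypothetical examples.
captures = [
    ("mars_01.avi", datetime(2018, 7, 29, 23, 2)),
    ("mars_02.avi", datetime(2018, 7, 29, 23, 7)),
    ("mars_03.avi", datetime(2018, 7, 29, 23, 14)),
    ("mars_04.avi", datetime(2018, 7, 29, 23, 21)),
]

TRANCHE = timedelta(minutes=10)
t0 = min(t for _, t in captures)  # session start

tranches = defaultdict(list)
for name, t in captures:
    tranches[int((t - t0) / TRANCHE)].append(name)

for idx in sorted(tranches):
    print(f"tranche {idx}: {tranches[idx]}")
```

Each tranche then gets its own PIPP/AS3/Registax pass, so no single stack spans more than 10 minutes of Mars's rotation.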

I redid the video several times trying to get something that would look better on a phone. Instagram won't allow my 2-second video (apparently 3 seconds is the lower limit), so I also tried a few versions closer to 10 seconds, but ultimately I decided that life is too short and I didn't want to fool around with it anymore, so I just linked to the video on Flickr.

Finally just to illustrate the real magic of stacking and wavelet sharpening, here’s what a few “high quality” individual frames straight out of the captured video look like (using screenshots here to make this simple):

Here’s what the AS3 drizzle stack output looks like for the above frames and their 297 closest friends (very few are round without weird blemishes from dust etc.):

At least it’s round… and here’s that TIFF after Registax 6 wavelets

And then the gamma adjustment

And from here it’s small sharpening and curves tweaks for the finished image.

Content created: 2018-08-05



