Astrophotography using an Alt/Az mount

Just one of well over 100 articles in the author’s Astronomy Digest: http://www.ianmorison.com

[This article shows how, using a manual frame rotator and the BBC BASIC program provided (which can be run on Windows, Mac or Linux), one can rotate the camera at intervals to maximise the fully exposed image area when capturing a long total exposure.]

At home, I can use a permanently mounted equatorial mount for astroimaging, but this is not portable.  I do, however, have an excellent iOptron AZ Mount Pro Alt/Az mount to take to star parties and dark sky sites and, at the latter, use this to carry out some astrophotography.  It is sometimes said that one cannot do astrophotography using an Alt/Az mount, but this is not true provided action is taken to remove the effects of frame rotation.

Many of the world’s largest telescopes are Alt/Az.  Their cameras are mounted on computer controlled rotating platforms which keep the sensor aligned on the object.  At the 2022 NEAF exhibition, ZWO announced that a camera rotator was in preparation which will, I assume, be controlled by the ASI Air.  PrimaLuceLab provide their Arco rotator and Pegasus their Falcon rotator (£565), both of which can be controlled by a laptop.  These will continuously rotate the sensor, so long exposures could be made provided the telescope was autoguided.  In the manual method outlined below, the exposures need to be short (but the mount can be unguided).

It should be pointed out that Alt/Az mounts cannot track near the zenith but, on the other hand, do not suffer from the meridian flips required by many equatorial mounts.  A useful feature when imaging with an Alt/Az mount is that the frames are automatically ‘dithered’ as the image (apart from the very centre (*)) moves across the sensor, so eliminating what is called ‘colour mottling’, which is caused by the sensitivity of the pixels varying over distances of 20 pixels or so.

[(*) If the mount is tracking perfectly, the central region will not be significantly dithered, so it is actually better if there is a little movement of the image over the sensor during the total imaging period – this is very likely to be the case unless the mount is autoguided.]

Planetary and Lunar imaging are not really affected as, usually, very short exposures or a video sequence are taken over a short period of time.  It is when one wants to image a deep sky object for an hour or two that problems arise due to what is called frame rotation.  To illustrate frame rotation, consider the constellation of Orion from rising in the east to setting in the west.  In the east, Orion is ‘lying’ on its back; due south, it is vertical; and, in the west, it is lying on its front.  Its image will thus rotate over the sensor of a camera mounted on an Alt/Az mount.

Provided that a telescope, or a lens that shows little or no distortion (i.e. is rectilinear), is used, a stacking program such as Deep Sky Stacker or, my favourite, ASTAP will de-rotate the captured frames before stacking them and so an image can be made.  [Many short focal length lenses will exhibit some distortion and, when many rotated frames are stacked, the outer parts of the image become blurred.]

Exposure time

The individual frames must have short exposures, otherwise stars away from the frame centre will give elongated, curved images.  One can take some test images with a range of short exposures to test for star trailing (in this case short arcs).  The maximum exposure time will depend on the sensor size.  It is not really sensible to image above 60 degrees elevation as this requires very short exposures but, in this worst case for the UK (or a similar latitude) when imaging due south, the maximum exposure time for a full frame sensor is under 15 seconds, and somewhat longer for a Micro 4/3 sensor.  For the UK I now tend to take exposures of no more than ~30 seconds using any of my mounts.  Stacking a large number of frames also allows satellite or aircraft trails to be eliminated if a ‘Kappa-Sigma’ (sigma clipping) stacking mode is used.

As an example, suppose an APS-C sized sensor which, like the ASI 1600, has a pixel size of 3.8 microns is used.  The equation giving the number of pixels traversed at the extreme corners of the frame (with t in seconds) is:

Pixels Traversed = 0.271 x cos (latitude) x cos (azimuth) x t / cos (altitude)

For my example, imaging due south (so that cos (azimuth) has a magnitude of 1) from latitude 53 degrees at an elevation of 60 degrees, this gives 0.271 x cos (53) / cos (60) = ~0.325 pixels per second.  Assuming one can tolerate a blurring of 7 pixels in the extreme corners, the maximum exposure is then given by:

t = 7 / 0.325 = 21 seconds.

If a Micro 4/3 sensor is used:

Pixels Traversed = 0.219 x cos (latitude) x cos (azimuth) x t / cos (altitude)

This gives ~0.264 pixels per second for the same pointing, so:

t = 7 / 0.264 = ~27 seconds.

If a Full Frame sensor is used:

Pixels Traversed = 0.43 x cos (latitude) x cos (azimuth) x t / cos (altitude)

This gives ~0.52 pixels per second, so:

t = 7 / 0.52 = ~13 seconds.

So, using my Altair Astro 294 cooled camera, which has a Micro 4/3 sized sensor with somewhat larger pixels (and so fewer pixels out to the frame corners), I could use exposures of around 30 seconds.

Depending on how much the image has had to be cropped, one might find that a movement of 7 pixels in the corners is too much, in which case the exposure time must be reduced in proportion, i.e. multiplied by the factor (allowed pixel movement) / 7.
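These calculations can be wrapped up in a few lines of BBC BASIC (the same dialect as the frame rotation program given later in this article).  The minimal sketch below simply evaluates the formula above for a chosen sensor coefficient (0.271, 0.219 or 0.43 in the examples), pointing and allowed corner blur; as with the later program, the latitude in line 10 would need editing for your own site.

10 CLAT = COS(RAD(53)) : REM edit 53 to your own latitude in degrees
20 PRINT "Sensor coefficient (e.g. 0.271 APS-C, 0.219 Micro 4/3, 0.43 full frame)?"
30 INPUT K
40 PRINT "AZ?" : INPUT AZ
50 PRINT "EL?" : INPUT EL
60 PRINT "Allowed corner blur in pixels?" : INPUT B
70 RATE = K * CLAT * ABS(COS(RAD(AZ))) / COS(RAD(EL)) : REM pixels per second at the frame corners
80 PRINT "Maximum exposure = "; B / RATE; " seconds"
90 END

Entering a coefficient of 0.271, an azimuth of 180, an elevation of 60 and a blur of 7 pixels reproduces the ~21 second figure above.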

The Alt/Az exposure problem.

As the frames are captured, they will rotate with respect to the sky, so only the central part of the stacked image will have the full exposure.  Provided that the object to be imaged does not fill the frame, one can crop this region from the full frame.  How much of a problem this is depends on the rate of frame rotation during the course of the imaging.

Ideally, one would like to image a DSO when it is highest in elevation as it crosses the meridian due south.  Sadly, frame rotation is at its fastest due south and increases with greater elevation.  The latitude of the observer also plays a part.

Frame rotation is least as objects are rising in the east or setting in the west, but they will then tend to be at lower elevations, so the effects of the atmosphere will be worse.

Suppose we aim to image a DSO in the hours before and after transit, taking M27, the Dumbbell Nebula, as an example.  As it crosses the meridian at 01:18 in July, as seen from a latitude of +53 degrees (central England), it will have an elevation of just under 60 degrees – so this is pretty much a worst case scenario.  One hour before transit, at an azimuth of ~154 degrees, it has an elevation of ~57 degrees and, likewise, one hour after.  During this period the rotation rate will first rise to a maximum when due south and then fall back to the value at the start of the imaging run.  Below I provide a BBC BASIC program that calculates the rate of frame rotation for any azimuth and elevation; it can be run using the free BBC BASIC interpreters available for Windows, Mac or Linux.

In the case of our example, the initial frame rotation rate is ~2.5 degrees in 10 minutes, the maximum due south is ~2.9 degrees in 10 minutes and, not surprisingly, one hour after transit it has dropped back to ~2.5 degrees in 10 minutes.  The total frame rotation during this period will thus be ~32 degrees.  The image below shows simulated captured frames at the start and end of the imaging period; the frame rotation is very obvious.
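For readers who would like to check such figures, the rough BBC BASIC sketch below sums the 10 minute rotations over the two hour window.  It assumes only the observer’s latitude (53 degrees) and the object’s declination (taken here as ~+22.7 degrees for M27) and uses the standard conversion from hour angle and declination to elevation and azimuth; with these assumptions the total comes out in the low thirties of degrees, in reasonable agreement with the figure above.

10 REM Rough check: total field rotation from 1 hour before to 1 hour after transit
20 L = RAD(53) : D = RAD(22.7) : REM observer's latitude and object's declination (M27 assumed)
30 ROT = 0
40 FOR M = -55 TO 55 STEP 10 : REM mid-point of each 10 minute interval, in minutes of time
50 H = RAD(M * 0.25) : REM hour angle (1 minute of time = 0.25 degrees)
60 SALT = SIN(D) * SIN(L) + COS(D) * COS(L) * COS(H)
70 ALT = ASN(SALT) : REM elevation
80 CAZ = ABS((SIN(D) - SALT * SIN(L)) / (COS(ALT) * COS(L))) : REM magnitude of cos(azimuth)
90 ROT = ROT + 15.04 * COS(L) * CAZ / COS(ALT) / 6 : REM rotation in degrees during this 10 minutes
100 NEXT M
110 PRINT "Total rotation over 2 hours = "; ROT; " degrees"
120 END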

As the imaging continues, the outer parts of the stacked result will have received less total exposure.  The area which has full exposure can then be cropped out but will, of course, cover a smaller area.

What could be done manually?  The idea is very simple: at intervals during the 2 hour period, the camera is rotated to compensate for the frame rotation.  Some refractors allow the focuser orientation to be changed, so this can act as a rotator.  But, perhaps better, one could purchase (for example from Rother Valley Optics in the UK) an ‘RVO M42 CAA 360° Rotator’, which screws onto the T piece that enters the focuser barrel and onto which the astro camera or DSLR/mirrorless camera (with a suitable bayonet adapter) screws.  For my setup no additional step up or step down adapters are required.  One must remember that the rotator adds ~10 mm of thickness and this must be taken into account when using a field flattener or reducer.

In either case, one would interrupt the imaging at suitable times throughout the imaging period and rotate the camera clockwise by a suitable amount.  How often one needs to do this depends on how much one is prepared to crop the stacked image.  In the image below, for the example given above, I have ‘rotated’ the camera by ~5 to 6 degrees after each 20 minute period of imaging.  As can be seen, the amount of cropping required is then not too great.  Remember, this is essentially a worst case scenario – an object at high elevation imaged in the south.
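As a rough guide, the angle to rotate by after each interval is just the 10 minute rotation figure given by the program below, scaled by the length of the interval.  A minimal sketch:

10 PRINT "Rotation in 10 minutes (degrees)?" : INPUT R10
20 PRINT "Minutes between manual rotations?" : INPUT T
30 PRINT "Rotate the camera by about "; R10 * T / 10; " degrees"
40 END

For the M27 example (~2.7 degrees in 10 minutes on average) and 20 minute intervals this gives the ~5 to 6 degree rotations used here.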

This image compares the full frame, the final crop when a ~6 degree rotation has been made every 20 minutes, and the final crop when no correction for frame rotation has been made.

The BBC Basic Program to calculate the frame rotation rate.

Downloading BBC BASIC for Windows, Mac or Linux

Go to https://www.bbcbasic.co.uk/index.html

Select the operating system.

Choose which emulation screen to use.

The program can be copied and pasted from the article.

As the latitude is likely to remain constant, it is included in the program as a constant and will need to be edited in line 10 for the user’s own latitude.  One then simply inputs the azimuth and elevation and the program gives the total frame rotation in a 10 minute period.

 The Program

10 CLAT = COS(RAD(53)) : REM edit 53 to your own latitude in degrees
20 PRINT "AZ?"
30 INPUT AZ
40 CAZ = ABS(COS(RAD(AZ)))
50 PRINT "EL?"
60 INPUT EL
70 CEL = COS(RAD(EL))
80 R = 15.04 * CLAT * CAZ / CEL : REM rotation rate in degrees per hour (15.04 is the sidereal rate)
90 R10 = R * 600 / 3600 : REM rotation in degrees over a 10 minute period
100 PRINT "Rotation Rate in 10 minutes = " R10
110 END
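Only line 10 needs changing for a different site; for example, an observer at a latitude of 40 degrees north (chosen purely as an illustration) would use:

10 CLAT = COS(RAD(40))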

If we run this program for the initial state of our demonstration, we get:

AZ?
?154
EL?
?57
Rotation Rate in 10 minutes = 2.489…