Triangulum Galaxy
This photo of the Triangulum Galaxy was taken at the Northern Skies Observatory in Peacham, Vermont. The telescope is a PlaneWave f/6.8 17-inch CDK. The camera mounted on the telescope is an Apogee Alta F16M Monochrome CCD with a Kodak 52 mm full frame sensor.
Also known as Messier 33 (M33) or NGC 598, Triangulum is a spiral galaxy about 3 million light years away, has a diameter about half that of our Milky Way, and contains 40 billion stars. It lies within the constellation Triangulum in the northern sky, relatively close to the Andromeda galaxy.
Most observatory cameras are set up to take monochrome images only. A red filter, a green filter, and a blue filter are usually used to bring color to the final image. In addition to those 3 filters, an H-alpha filter was used for this photo. The H-alpha filter isolates a narrow band of red light in the visible spectrum that is emitted by ionized hydrogen. The irregularly shaped red objects in the photograph, some with white in them, are ionized hydrogen gas clouds. We are able to see them clearly in the photo because of the H-alpha filter. These ionized hydrogen gas clouds, also called H II regions, reach temperatures of around 10,000 kelvin and are massive areas of star birth.
Taking the Photo:
Taking the photo itself is pretty easy because everything is computerized. It is not necessary to look through the telescope, find the object in the sky and focus on it, or even be at the observatory to request a photo. The observatory is one of about 20 in the Skynet Robotic Telescope Network. To take the photo, we logged into the Skynet site and submitted a "plan". In the plan, we specified the object to be photographed (M33), the filters to be used (the 4 filters mentioned above), and the exposure time (we used 5 minutes per image). A monochrome image is created for each filter used.
The plans are placed in a queue. The images are taken automatically when conditions are right, e.g. the skies are clear and dark and the object to be photographed is visible to the telescope. That could potentially take a week or more, especially if there are lots of plans queued up or lots of cloudy nights. When the time comes, the door on the dome opens (if not already open), and both the telescope and door robotically position themselves to take the images. They also must continue to track the object precisely for the duration, in this case about 20 minutes (5-minute exposures for each filter, plus time to load the filters). Each of the 4 filters is loaded automatically at the point it is needed.
Processing the Photo:
The images created are in FITS format (Flexible Image Transport System), most commonly used in scientific applications, especially astronomy and microscopy. This format is not supported by typical photo editing software such as Photoshop or Lightroom. In order to process the monochrome images, I had to learn ImageJ, a free, open-source, Java-based application developed by the National Institutes of Health.
When I first opened the monochrome images in ImageJ, they were almost entirely black, with just a few white dots scattered around. The histogram hugs the left boundary very closely: there is data there, but it is all very dark. One of the first steps is to do a "logarithmic stretch" on each image, which effectively spreads the histogram out toward the center. This stretching also happens behind the scenes with traditional DSLRs and other cameras. If you shoot in JPG format, the camera's firmware does the stretching for you; if you shoot in RAW, the stretching occurs at the point the RAW data is opened in image processing software such as Lightroom.
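The stretch itself is just arithmetic on the pixel values. Here is a minimal sketch in Python with NumPy of what a logarithmic stretch does (ImageJ provides this as a built-in operation; the function name and the synthetic test frame below are mine for illustration, not part of the actual workflow):

```python
import numpy as np

def log_stretch(img):
    """Map pixel values through a logarithmic curve so dark data spreads out."""
    img = img.astype(np.float64)
    img = img - img.min()                          # shift so the minimum is 0
    scaled = np.log1p(img) / np.log1p(img.max())   # log curve, normalized to 0..1
    return (scaled * 65535).astype(np.uint16)      # back to the 16-bit range

# Synthetic stand-in for a raw monochrome frame: mostly dark, one bright "star".
rng = np.random.default_rng(0)
frame = rng.integers(0, 200, size=(100, 100), dtype=np.uint16)
frame[10, 10] = 60000                              # the "star"

stretched = log_stretch(frame)
```

After the stretch, the faint background counts move up toward the middle of the histogram while the bright star stays pinned near the top, which is exactly the "spreading out" effect seen in ImageJ.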
After the stretching in ImageJ, you basically do brightness adjustments on each of the images, then create a color composite from which you can do final color balancing. I also did manual image alignment. The stars in the composite should be white, so for example, if you zoom in on the stars and see red at the top of most of them, then the image associated with the red filter needs to be moved down one or more pixels in the composite. It was actually quite simple to do the alignment.
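For anyone curious what the compositing and pixel-shift alignment look like numerically, here is a toy sketch in Python with NumPy. The function names and single-star test frames are illustrative only; in ImageJ these steps are done through its channel-merge and image-translate tools rather than by hand:

```python
import numpy as np

def shift(img, dy=0, dx=0):
    """Shift an image by whole pixels (positive dy moves it down)."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def compose_rgb(r, g, b, r_shift=(0, 0)):
    """Stack three monochrome frames into an H x W x 3 color composite,
    optionally shifting the red channel to align it with the others."""
    r = shift(r, *r_shift)
    return np.dstack([r, g, b])

# Toy frames: one white "star", misaligned one pixel high in the red channel.
g = np.zeros((50, 50), np.uint16); g[25, 25] = 65535
b = g.copy()
r = np.zeros((50, 50), np.uint16); r[24, 25] = 65535   # red fringe above the star

rgb = compose_rgb(r, g, b, r_shift=(1, 0))             # move red down one pixel
```

After the one-pixel shift, all three channels agree at the star's position, so it renders white instead of showing a red fringe on top, which mirrors the zoom-in check described above.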
Additional options in ImageJ allow for noise reduction, sharpening, and more. There are also many available plugins, including one for building mosaics (analogous to panoramas). If you wanted to photograph the Andromeda galaxy with this telescope, for example, a mosaic would be required because Andromeda appears too large in the sky to fit in a single frame!
When processing in ImageJ is complete, the composited image can be saved in a variety of formats. I saved to TIFF, brought it into my usual software to set EXIF info, and did the noise reduction, sharpening, and other processing there as well.
For additional info and a photo of the Northern Skies Observatory, see: www.flickr.com/photos/davetrono/42239486970
Northern Skies Observatory website: www.nkaf.org