
The Horsehead Nebula, Barnard 33, IC 434; the Flame Nebula, NGC 2024, Sharpless 2-277; NGC 2023, LBN 954; and the double star Alnitak, ζ Orionis

The Horsehead nebula in Hydrogen-alpha

 

Astrophotography is a complex process combining optics, precise mechanical pointing and tracking, real-time guiding corrections, field derotation, environmental monitoring, and sequencing across multiple pieces of equipment that all need to “talk” to each other to automate the 66 exposure frames obtained here. Small corrections in focus are made as temperatures change; small errors in tracking are corrected using a second telescope and camera, countering the mechanical limitations of the mount to keep the system at the level of detail the atmospheric seeing allows; periodic small offsets in pointing, called dithering, average out sensor issues that could later be mistaken for signal; and finally, calibration frames are taken using a special light panel of perfectly uniform illumination…all of it done remotely and robotically, controlled by a computer mounted with the scope communicating with my computer at home. My telescope could be in my backyard or halfway across the world.
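To make the automation concrete, here is a toy sketch of where refocusing and dithering fit into a capture loop like this one. It is illustrative only: the camera, mount, focuser, and sensor objects and their methods are hypothetical stand-ins, not the actual N.I.N.A. or ASCOM interfaces.

```python
# Toy capture-loop sketch. All equipment objects/methods here are
# hypothetical stand-ins, not a real N.I.N.A./ASCOM API.
import random
import time

TOTAL_FRAMES = 66        # matches the 66 x 60 s light frames in this project
EXPOSURE_S = 60
DITHER_EVERY = 5         # nudge the pointing every few frames
REFOCUS_DELTA_C = 1.0    # refocus if ambient temperature drifts this much

def run_session(camera, mount, focuser, sensors):
    last_focus_temp = sensors.temperature()   # hypothetical sensor read
    for frame in range(1, TOTAL_FRAMES + 1):
        # Refocus when the tube has cooled/warmed enough to shift focus.
        if abs(sensors.temperature() - last_focus_temp) >= REFOCUS_DELTA_C:
            focuser.autofocus()
            last_focus_temp = sensors.temperature()

        camera.expose(seconds=EXPOSURE_S)     # guiding runs in parallel

        # Dither: a random few-pixel offset so fixed-pattern sensor noise
        # lands on different sky pixels and averages out in the stack.
        if frame % DITHER_EVERY == 0:
            mount.nudge(ra_px=random.uniform(-3, 3),
                        dec_px=random.uniform(-3, 3))
            time.sleep(5)                     # let guiding settle again
```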

 

After collecting the data frames, called “light frames,” those 66 44MB images are combined in a process called stacking. Stacking registers, normalizes, calibrates, and aligns the frames, then adds the signal while keeping the noise as low as possible; it also evens out gradients and artifacts from the optics and from light pollution. …the final preprocessed result is a single, essentially uniform blank gray image, roughly 250MB in size, with no stars or nebula visible!
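For the curious, here is a minimal sketch of the core calibrate-and-stack math, assuming the light frames are already registered and aligned (real tools like AstroPixelProcessor also handle star alignment, gradient normalization, and much more); file handling is left out, and it only needs numpy:

```python
import numpy as np

def calibrate(light, master_bias, master_flat):
    """Remove the fixed sensor offset (bias) and vignetting/dust shadows
    (flat) from one light frame."""
    flat = master_flat - master_bias
    return (light - master_bias) / (flat / flat.mean())

def stack(lights, sigma=3.0):
    """Sigma-clipped mean: per pixel, reject outliers (satellite trails,
    hot pixels, cosmic-ray hits), then average. Averaging N frames grows
    signal-to-noise roughly as sqrt(N)."""
    cube = np.stack(lights).astype(np.float64)
    med = np.median(cube, axis=0)
    std = cube.std(axis=0)
    clipped = np.where(np.abs(cube - med) <= sigma * std, cube, np.nan)
    return np.nanmean(clipped, axis=0)
```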

Post-processing then takes this image, stretches the signal up out of the background noise (dark sky plus sensor dark current), and splits it into two images: one with the stars removed and one with only the stars. The two are processed separately, using different techniques to lift the extremely dim nebula signal out of the noise, knock down what noise remains, and sharpen the fine details. Finally, the star field and nebula images are recombined into the single image you see.
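The “stretch” step is easy to illustrate. Below is a minimal sketch using a simple asinh curve; PixInsight’s actual stretching tools are far more sophisticated, but the idea is the same: a strongly nonlinear mapping that lifts the faint end while compressing the bright end.

```python
import numpy as np

def asinh_stretch(img, softening=0.02):
    """Map linear data in [0, 1] through an asinh curve. A smaller
    softening value stretches the faint end harder, pulling dim nebula
    signal up out of the near-black background."""
    img = np.clip(img, 0.0, 1.0)
    return np.arcsinh(img / softening) / np.arcsinh(1.0 / softening)
```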

Just three decades ago, equipment and processing of this level were available only in professional research-grade observatories costing millions of dollars. Since then, advances in signal-processing software, consumer-grade astronomy camera sensors, and computing power have allowed amateur astronomers to rival the best science images of only a decade before. And over just the last couple of years, the cost and ease of entry to excellent astrophotography have improved immensely, allowing even novice astronomers with zero experience to press a couple of power buttons and let the scope, mount, and software automagically produce images that are really quite good…all controlled from your phone!!!

 

My setup is fully robotic, fully remote-controlled, high-end equipment designed for wide-field targets: large, extremely dim deep-space objects like nebulae, star clusters, and nearby galaxy clusters. As you may be able to tell, it is a complex system of different equipment and software that must all talk to each other perfectly throughout an evening (or evenings) of imaging. So while the equipment does its thing outside in the cold or the buggy heat, I can be cozy, warm, and bug-free at home, or asleep. I even get warning alarms if something goes wrong (weather, clouds, dawn light, or numerous other equipment issues), so if it is not corrected automatically I can fix it remotely or, if needed, physically check on the equipment. Failures, however, are rare. While this project involved less than a terabyte of data, some projects collect 4+ TB, which must be offloaded remotely from the scope-side SSD drives to a RAID server.

 

Equipment

Telescope: WO FLT91 @540mm & f/5.9

Flattener: WO 68III (no reduction)

Camera: ZWO ASI2600MM Pro (monochrome cooled APS-C CMOS sensor @ -10°C, gain 100, 1×1 binning)

Filters: Chroma Hα, 36mm, 5nm bandpass

Mount: TTS160 Panther Alt-Az

Field derotation: TTS rOTAtor

Guide scope: WO RC51

Guide camera: ZWO ASI290MM

Control Computer: PLL Eagle4

Focus: PLL SS2 robotic focuser

Environment: PLL ECCO-2

Dew control: Kendrick heater straps

Flat panel: Elumiglow custom made

Image processing Computer: MacBook Pro M1

Data storage: 2x Samsung 2TB SSDs scope-side; local OWC Mercury 4x6TB RAID

Remote power: custom-built, with LiTime 100Ah LiFePO4 battery, Victron Blue remote shunt, inverter, and power supply/charger

 

 

Sequencing and Processing

Remote client communications: Parallels Client

Planetarium: Cartes du Ciel & SSPro 6

Sequencing, framing & imaging: N.I.N.A.

Guiding: PHD2

Plate-solving: ASTAP

Alignment & pointing & power up: TTS custom

Communications platform: ASCOM

Focus control: NINA

Environment control: NINA

Calibration frames: NINA

Stacking & Preprocessing: APP - AstroPixelProcessor

Post processing: PixInsight & Affinity Photo 2

 

 

Image Frames

Integration time: 1hr 6min (66x60sec) Hα

Calibrations: flats and bias only

Palette: Hα only

 

This was a fairly simple project: a single Hα filter over a single evening of imaging, with relatively short 60-second exposures. I can collect more data with different filters and add it to the original data at a later time…

 

Color images, however, can take multiple evenings, weeks, or even years to collect: hundreds of 3-5 minute narrowband HOS exposures (Hydrogen-alpha, Oxygen-III, and Sulfur-II) plus shorter broadband LRGB exposures (luminance, red, green, and blue wide-bandpass filters), six different filters in all. Total integration time (total exposure time) can easily exceed 40 hours, involving thousands of light and calibration frames and multiple terabytes of data, and processing the final image can take hours to days. So why not just use a color camera (called an “OSC,” one-shot color) instead of a monochrome sensor? Easy: monochrome sensors are superior and much more sensitive. An OSC must use a Bayer filter, so in any one color band it collects only about a quarter of the photons a monochrome sensor gathers over the same exposure time. Color cameras also introduce more noise into the signal and compound tracking/guiding errors because they need longer exposures. Plus, color cameras just plain suck at color calibration and at rendering stars as the bright jewels they are, from blue giants to deep-red carbon stars and every color in between. And false-color images are far easier to compose with a monochrome sensor!
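Here is a back-of-envelope example of that Bayer penalty. It ignores differences in quantum efficiency and filter bandpass, so the numbers are illustrative only, but it shows why OSC exposures run longer for the same per-band photon count:

```python
# In an RGGB Bayer mosaic, each pixel sits behind one color filter, so for
# any single band only a fraction of the sensor's pixels collect photons;
# a mono sensor behind the same filter uses every pixel.
bayer_fraction = {"red": 1 / 4, "green": 1 / 2, "blue": 1 / 4}

mono_exposure_s = 60
for band, frac in bayer_fraction.items():
    # Exposure an OSC needs to gather roughly the same per-band photon
    # count as the mono sensor (QE and bandpass differences ignored).
    print(f"{band}: ~{mono_exposure_s / frac:.0f} s vs {mono_exposure_s} s mono")
```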

 

Hope you enjoy the beauty of the invisible evening sky as much as I enjoy revealing it!!

 
