04 November 2010
Posted in Crosby Observatory
Measuring and evaluating the brightness of stars can be traced back to the Greek astronomer and mathematician Hipparchus (c. 190–120 BC). He produced a catalogue of the comparative brightness and positions of over 850 stars, and in doing so created the apparent magnitude scale, which describes the brightness of a star as seen by an observer on Earth.
How does this scale work? The brighter a celestial object appears, the lower the value of its magnitude. For instance, the faintest objects visible to the naked eye under dark skies are around magnitude 6, while the Sun sits at –26.74 on the apparent magnitude scale. However, most of the stars we can pick out with our eyes in an urban neighborhood are around magnitude 3 to 4, and binoculars extend the limit to about magnitude 10. More recently, using the powerful Hubble Space Telescope, astronomers have detected objects fainter than magnitude 30. It is this basic classification from over 2,000 years ago that led to the magnitude scale we still use today!
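Hipparchus's scale was later formalized (by Norman Pogson in 1856) so that a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100. That detail isn't part of the story above, but it lets you turn any magnitude difference into a real brightness comparison. A minimal sketch in Python (the function name is just for illustration):

```python
def brightness_ratio(m_bright: float, m_faint: float) -> float:
    """How many times brighter an object of magnitude m_bright appears
    compared with one of magnitude m_faint (lower magnitude = brighter),
    using Pogson's relation: 5 magnitudes = a factor of exactly 100."""
    return 100 ** ((m_faint - m_bright) / 5)

# A magnitude-1 star is exactly 100 times brighter than a magnitude-6 star.
print(brightness_ratio(1, 6))  # 100.0

# The Sun (-26.74) compared with the faintest naked-eye stars (magnitude 6):
print(f"{brightness_ratio(-26.74, 6):.3g}")  # roughly 1.2e13
```

So the Sun appears over ten trillion times brighter than the faintest stars our eyes can detect, which gives a feel for just how much range this compact scale covers.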