Tuesday, November 22, 2011

A Star's Magnitude

Magnitude is the degree of brightness of a star. In 1856, British astronomer Norman Pogson proposed a quantitative scale of stellar magnitudes, which was adopted by the astronomical community. He noted that we receive 100 times more light from a first magnitude star than from a sixth; thus a difference of five magnitudes corresponds to a 100:1 ratio in incoming light energy, known as luminous flux.
Because of the nature of human perception, equal intervals of brightness are actually equal ratios of luminous flux. Pogson's proposal was that one increment in magnitude correspond to a flux ratio equal to the fifth root of 100. This means that each step of one magnitude corresponds to a factor of approximately 2.512 in flux. A fifth magnitude star is 2.512 times as bright as a sixth, a fourth magnitude star is 6.310 times as bright as a sixth, and so on. The naked eye, under optimum conditions, can see down to around the sixth magnitude, that is, +6. Under Pogson's system, a few of the brighter stars have negative magnitudes. For example, Sirius is about –1.5. The lower the magnitude number, the brighter the object. The full Moon has a magnitude of about –12.5, and the Sun is a brilliant –26.7!
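Pogson's rule can be written as a one-line formula: the flux ratio between two objects is 100 raised to the power of their magnitude difference divided by 5. A minimal Python sketch of that calculation (the function name and the example magnitudes are illustrative, not part of the original post):

```python
import math


def flux_ratio(m_faint, m_bright):
    """Ratio of received flux (brighter object / fainter object), using
    Pogson's relation: a 5-magnitude difference equals a 100:1 flux ratio."""
    return 100 ** ((m_faint - m_bright) / 5)  # same as 10 ** (0.4 * delta_m)


# One magnitude step: about 2.512
print(flux_ratio(6, 5))      # ~2.512
# Two magnitude steps: about 6.310
print(flux_ratio(6, 4))      # ~6.310
# Sirius (about -1.5) versus a barely visible +6 star
print(flux_ratio(6, -1.5))   # ~1000, roughly a thousand times brighter
```

The last line shows why negative magnitudes arise naturally: Sirius, at roughly –1.5, delivers about a thousand times more flux than a star at the naked-eye limit of +6.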