The apparent magnitude of a star is a measure of its brightness in logarithmic units. By conventional definition, the apparent magnitude $m$ is related to the observed flux density $F$ of light from the star (which is its brightness) by the formula
"m = - 5 \\log_{100} \\frac{F}{F_0} = - 2.5 \\log_{10} \\frac{F}{F_0} \\, ,"where "F_0" is the reference flux density corresponding to zero apparent magnitude. Since the flux density (brightness) for a star can be measured in different spectral bands (ultraviolet, visible, infrared etc.), one can speak of the apparent magnitude in the corresponding spectral band. For two objects A and B with apparent magnitudes "m_A" and "m_B", respectively, we have
"m_A - m_B = - 5 \\log_{100} \\frac{F_A}{F_0} + 5 \\log_{100} \\frac{F_B}{F_0} = 5 \\log_{100} \\frac{F_B}{F_A} \\, ."Thus, for "m_A = - 5" and "m_B = - 10", we obtain "5 \\log_{100} \\left( F_B \/ F_A \\right) = 5", or "\\log_{100} \\left( F_B \/ F_A \\right) = 1". Hence, the ratio of their brightness is "F_B \/ F_A = 100".