# The Astronomical Magnitude Scale

The brightness of an object is a basic observable quantity. It is easy to observe two stars and say that star A is brighter than star B, but it would be handy if we had a way of quantifying this brightness so we can say that star A is x times as bright as star B. To this end the Magnitude Scale was introduced.

## History of the Magnitude Scale

The Greek astronomer Hipparchus is widely credited with originating the magnitude scale, but it was Ptolemy who popularised it and brought it into the mainstream.

In the original scale, only naked-eye objects were categorised (excluding the Sun): the brightest stars were classified as magnitude 1, and the faintest objects were magnitude 6, the limit of the human eye. Each magnitude class was considered to be roughly twice the brightness of the next; magnitude 2 objects are therefore about twice as bright as magnitude 3 objects. Because the eye's response to brightness is roughly logarithmic, this is a logarithmic magnitude scale.

With the invention of the telescope and other observational aids, the number of known objects soared, and the system needed modification to categorise so many new objects accurately. In 1856 Norman Robert Pogson formalised the magnitude scale by defining a first magnitude object to be exactly 100 times brighter than a sixth magnitude object. Each magnitude step therefore corresponds to a brightness ratio of the fifth root of 100, so a first magnitude star is 2.512 times brighter than a second magnitude star.
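Pogson's definition is easy to check numerically. This minimal Python sketch (the function name is ours) converts a magnitude difference into a brightness ratio:

```python
def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference (Pogson's scale)."""
    return 100 ** (delta_mag / 5)

# One magnitude step is the fifth root of 100:
print(round(brightness_ratio(1), 3))  # 2.512

# Five steps recover the defining factor of 100:
print(round(brightness_ratio(5)))  # 100
```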

Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they first switched to Vega as the standard reference star, and later again switched to using tabulated zero points for the measured fluxes. This is the system used today.

## Two Magnitude Scales

Going back to star A and star B, let's say that star A is magnitude 2 and star B is magnitude 3. According to the magnitude scale, star A would appear 2.512 times as bright as star B. Here we are referring to the star's *Apparent Magnitude*, that is, its brightness as seen from Earth. This is how most magnitudes are presented on TV, in planetarium software and in magazines.

But how do we know that star A is actually brighter than star B? It is entirely possible for star A and star B to have the same luminosity, with star B simply lying further away than star A and thus appearing dimmer from Earth.

We need another scale which compares the actual brightness of stars as if they were at a fixed distance from the Earth. This scale is called *Absolute Magnitude*, and the fixed distance is set at an internationally agreed 10 parsecs. A parsec is the distance at which an astronomical object has a parallax angle of one arcsecond (1/3,600 of a degree). We will cover parallax in another article, but for now 1 parsec is equal to 3.26 light-years, or 1.92 x 10^{13} miles.

**Absolute Magnitude is given the symbol M, while Apparent Magnitude is given lower case m**.

Our Sun has an apparent magnitude of -26.74, which easily makes it the brightest object in the sky. However, the Sun would not look nearly as bright if it were 10 parsecs away. At that distance it would shine at a mere apparent magnitude of about 4.8, quite faint in the night sky. The Sun's magnitude as seen from 10 parsecs is called its Absolute Magnitude.

Sirius, the next brightest star in the sky, has an apparent magnitude of -1.47, but it lies only 2.64 parsecs away, so it is relatively close. Moved to the standard 10 parsecs, it would have an absolute magnitude of 1.4, roughly 23 times brighter than our Sun at the same distance (a difference of about 3.4 magnitudes).
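The figure for Sirius can be reproduced with the absolute magnitude relation M = m - 5((log_{10}d) - 1), covered later in this article. A minimal Python sketch (the function name is ours):

```python
import math

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude from apparent magnitude and distance in parsecs."""
    return m - 5 * (math.log10(d_parsecs) - 1)

# Sirius: apparent magnitude -1.47 at a distance of 2.64 parsecs
print(round(absolute_magnitude(-1.47, 2.64), 1))  # 1.4
```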

Here's a quick way of remembering the difference between absolute and apparent magnitude:

**Apparent magnitude** is how bright the star *appears* to be; **absolute magnitude** is how bright the star *absolutely* is.

## Example Stars on the Magnitude Scale

In this table, we can see examples of various points on the magnitude scale.

| Celestial Object | Magnitude |
|---|---|
| Sun | -26.74 |
| Full Moon | -12.74 |
| Venus | -4.6 |
| Sirius (brightest star) | -1.44 |
| Naked Eye Limit (urban) | +3 |
| Naked Eye Limit (dark skies) | +6 |
| Binocular Limit | +9.5 |
| 12" Telescope Limit | +14 |
| 200" Telescope Limit | +20 |
| Hubble Telescope Limit | +30 |

## Apparent Magnitude, Absolute Magnitude and Distance

There are two main types of magnitude commonly used in astronomy. The first of these, **apparent magnitude**, is the brightness of the object as seen by an observer on the Earth. The apparent magnitude of a star is dependent on two factors:

- The luminosity of the star (total energy per second radiated)
- The distance of the star from Earth

The second, **absolute magnitude**, is dependent solely on the star's luminosity and can be regarded as an intrinsic property of the star. Absolute magnitude is defined as the apparent magnitude of an object if it were a standard distance from the Earth. The standard distance is 10 parsecs. Since distance is always equal when comparing absolute magnitudes, it can be removed as a factor in the star's brightness which is why it can be regarded as an intrinsic property.

## Absolute magnitude and Luminosity

A star's luminosity, *L*, is the total amount of energy radiated per unit time. The absolute magnitude of a star is related to its luminosity in the same way as apparent magnitude is related to flux. If we compare the ratio of the brightness of two stars, expressed in terms of their luminosities, then we obtain a relation for the difference in their absolute magnitudes.

Equation 23 - Absolute Magnitude Relation:

M_{1} - M_{2} = -2.5 log_{10}(L_{1} / L_{2})

Capital letters are used to indicate absolute magnitudes and lower case letters are used to identify apparent magnitudes.

As we have previously stated, absolute magnitude is the apparent magnitude of an object if it was a distance of 10 parsecs from the Earth.

It is clear from this definition that a star located at 10 parsecs from the Earth will have the *same* apparent **and** absolute magnitude. A star that is further away than 10 parsecs will have a fainter apparent magnitude than absolute magnitude and a star that is closer than 10 parsecs will have a brighter apparent magnitude than absolute magnitude.
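This behaviour can be verified by turning the definition around and asking what apparent magnitude a star of fixed absolute magnitude would have at various distances. A minimal Python sketch (the function name is ours):

```python
import math

def apparent_from_absolute(M, d_parsecs):
    """Apparent magnitude of a star of absolute magnitude M seen from d parsecs."""
    return M + 5 * (math.log10(d_parsecs) - 1)

M = 1.0
for d in (5, 10, 20):
    print(d, round(apparent_from_absolute(M, d), 2))
# At 10 pc, m equals M; closer stars get a brighter (smaller) m,
# more distant stars a fainter (larger) m.
```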

How do we know a star's absolute magnitude? We could travel to every star and measure its apparent brightness from a distance of 10 parsecs, but at the moment that really isn't a practical solution. Luckily for us, the apparent and absolute magnitudes are related by a very important formula.

## Distance Modulus

**Distance Modulus** is the difference between the apparent and absolute magnitudes. This can be obtained by combining the definition of absolute magnitude with an expression for the inverse square law and Pogson's relation. Using the distance modulus it is possible to establish a relationship between the absolute magnitude, *M*, of a star, its apparent magnitude, *m*, and its distance, *d*.

The inverse square law tells us that for a star at distance *d* (parsecs) with observed flux *F_{m}*, its flux *F_{M}* at 10 parsecs would be given by:

Equation 24 - Inverse Square Law for Flux:

F_{M} = F_{m}(d / 10)^{2}

We can combine this with equation 23 above to give the distance modulus equation.

Equation 25 - Distance Modulus:

m - M = 5 log_{10}(d) - 5

If we measure a star's apparent magnitude and its distance in parsecs is known, we can determine its absolute magnitude and hence its luminosity. If we know the star's absolute and apparent magnitudes, we can use the distance modulus to calculate its distance. This equation is very powerful and will be used many times in upcoming tutorials.

The formula for calculating Absolute Magnitude within our galaxy is:

Equation 31 - Absolute Magnitude:

M = m - 5((log_{10}D) - 1)

Where *D* is the distance to the star in parsecs.

## Example

Barnard's Star lies 1.82 parsecs away and has an observed (apparent) magnitude of 9.54.

m - M = 5((log_{10}D) - 1)
M = 9.54 - 5((log_{10}1.82) - 1)
M = 9.54 - (-3.70)
M = 13.24

If Barnard's Star were to be moved to a distance of 10 parsecs from the Earth it would then have a magnitude of 13.24.

If we already know both Apparent and Absolute magnitudes, it is possible to calculate the distance to the star:

d = 10^{0.2(m - M + 5)}

Using Barnard's Star again,

d = 10^{0.2(9.54 - 13.24 + 5)}
d = 10^{0.26}
d = 1.82 parsecs
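Both directions of the Barnard's Star calculation can be checked in a few lines of Python (function names are ours):

```python
import math

def absolute_magnitude(m, d):
    """Absolute magnitude from apparent magnitude m and distance d in parsecs."""
    return m - 5 * (math.log10(d) - 1)

def distance_parsecs(m, M):
    """Distance in parsecs from the distance modulus m - M."""
    return 10 ** (0.2 * (m - M + 5))

# Barnard's Star: m = 9.54, d = 1.82 pc
M = absolute_magnitude(9.54, 1.82)
print(round(M, 2))  # 13.24

# Recover the distance from the two magnitudes:
print(round(distance_parsecs(9.54, M), 2))  # 1.82
```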

## Bolometric Magnitude

Another type of magnitude of interest to astronomers is the **bolometric magnitude**. So far, absolute and apparent magnitudes have been based only on the visible energy radiated from the star, and not all of that energy even reaches the ground, since some is filtered out by our atmosphere.

Bolometric magnitude is based on the flux across the entire electromagnetic spectrum. Absolute bolometric magnitude is based specifically on the luminosity (the total rate of energy output) of the star.

The bolometric magnitude, *M_{bol}*, takes into account electromagnetic radiation at all wavelengths, including those unobserved due to the instrumental pass-band, the Earth's atmospheric absorption, and extinction by interstellar dust. It is defined in terms of the star's luminosity. For stars with few observations, it must be computed by assuming an effective temperature.

Equation 39 - Bolometric Magnitude:

M_{bol} - M_{bol,Sun} = -2.5 log_{10}(L / L_{Sun})

This can then be reworked to find the ratio of luminosity.

Equation 40 - Luminosity ratio of magnitudes:

L / L_{Sun} = 10^{0.4(M_{bol,Sun} - M_{bol})}
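Using the IAU zero point for the Sun's absolute bolometric magnitude (M_{bol,Sun} = 4.74), the luminosity ratio can be computed directly. A minimal Python sketch (function name ours):

```python
M_BOL_SUN = 4.74  # IAU nominal solar absolute bolometric magnitude

def luminosity_ratio(M_bol):
    """Luminosity in solar units from absolute bolometric magnitude."""
    return 10 ** (0.4 * (M_BOL_SUN - M_bol))

print(round(luminosity_ratio(4.74), 1))   # 1.0 (the Sun itself)
print(round(luminosity_ratio(-0.26)))     # 100 (5 magnitudes brighter)
```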

Is absolute magnitude or apparent magnitude relative to the distance from Earth?

Why does the size of a star correlate with bolometric luminosity? Why can't a star be relatively small but bolometrically bright, or large but bolometrically dim?

Does anybody know how to "unaccount" for the atmosphere when measuring absolute magnitude?

Okay, so what is the difference between absolute and apparent magnitude?

Apparent Magnitude - is how bright something appears to you from where you are.

Absolute Magnitude - is how bright something looks from a set distance of 10 parsecs (32.6 lightyears).

A dim star will still be very bright if you are really close to it. Conversely, a really bright star can be seen from very far away.

I am very interested in astronomy, but I suck at mathematic calculations (go figure...). I am taking a class currently, and a question posed is as follows:

Polaris is a second-magnitude star. Phi Pegasi is about sixteen times fainter than Polaris. What is the approximate magnitude of Phi Pegasi?

I have been given the choices of 18, -14, 3, -3, and 5. I do not know the calculation/equation that I need to use to reach an answer. I am not asking that you answer this for me, but I would like to know / to understand the appropriate equation. Please respond via e-mail, as soon as possible. Thank you!

The answer is 5th magnitude (apparent).

Each magnitude is 2.512 times brighter or fainter than the next increment. So a 1st magnitude star is 2.512 times brighter than a 2nd magnitude star. Therefore 2.512 * 2.512 * 2.512 = 15.85 times brighter or fainter (notice that 15.85 is close to 16). That means there are 3 magnitudes of difference between Polaris and Phi Pegasi, so 2 (mag of Polaris) + 3 (mag difference) = 5th magnitude for Phi Pegasi.
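The same answer drops out of Pogson's relation directly: a brightness ratio of 16 corresponds to a magnitude difference of 2.5 log_{10}(16). In Python:

```python
import math

# Magnitude difference for a factor-16 brightness ratio (Pogson's relation):
delta_m = 2.5 * math.log10(16)
print(round(delta_m, 2))  # 3.01

# Polaris is magnitude 2; 16x fainter means a *larger* magnitude:
print(round(2 + delta_m))  # 5
```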

Does anybody know the equation relating all-spectrum flux with bolometric apparent magnitude?

If we know that a certain star has a bolometric apparent magnitude of EXACTLY zero, then how many watts per meter squared do you get from it when you integrate over the entire spectrum?

omg this calculation is so difficult

(m-M) = 5logd-5. Great but how do I get d = ?????

I cannot see how to rearrange can you help

Many thanks

d = 10^(1 + (m-M)/5)

10 raised to the power of 1 plus the quantity of apparent mag minus absolute mag divided by 5.

A quick sanity check: if m = M, then it's 10 raised to the 1st power, which is 10 parsecs.