10.2 Luminosity and Apparent Brightness

Luminosity is an intrinsic property of a star—it does not depend in any way on the location or motion of the observer. It is sometimes referred to as the star’s absolute brightness. However, when we look at a star, we see not its luminosity, but rather its apparent brightness—the amount of energy striking unit area of some light-sensitive surface or device (such as a human eye or a CCD chip) per unit time. In this section, we discuss how these important quantities are related to one another.

Another Inverse-Square Law

Figure 10.4 shows light leaving a star and traveling through space. Moving outward, the radiation passes through imaginary spheres of increasing radius surrounding the source. The amount of radiation leaving the star per unit time—the star’s luminosity—is constant, so the farther the light travels from the source, the less energy passes through each unit of area. Think of the energy as being spread out over an ever-larger area, and therefore spread more thinly, or "diluted," as it expands into space. Because the area of a sphere grows as the square of the radius, the energy per unit area—the star’s apparent brightness—is inversely proportional to the square of the distance from the star. Doubling the distance from a star makes it appear 2², or four, times dimmer. Tripling the distance reduces the apparent brightness by a factor of 3², or nine, and so on.
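As a quick numerical check of this scaling, the short Python sketch below (the function name and the sample distance factors are purely illustrative choices, not part of the text) shows how apparent brightness falls off as the distance grows:

    # Inverse-square dilution of starlight: brightness relative to the
    # brightness at the original distance, for a given multiple of it.
    def relative_brightness(distance_factor):
        return 1.0 / distance_factor ** 2

    for factor in (2, 3, 10):
        print(factor, relative_brightness(factor))
    # Prints 0.25 at twice the distance, about 0.111 at three times,
    # and 0.01 at ten times the original distance.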

Figure 10.4 Inverse-square Law As it moves away from a source such as a star, radiation is steadily diluted while spreading over progressively larger surface areas (depicted here as sections of spherical shells). Thus, the amount of radiation received by a detector (the source’s apparent brightness) varies inversely as the square of its distance from the source.

Of course, the star’s luminosity also affects its apparent brightness. Doubling the luminosity doubles the energy crossing any spherical shell surrounding the star and hence doubles the apparent brightness. The apparent brightness of a star is therefore directly proportional to the star’s luminosity and inversely proportional to the square of its distance:

apparent brightness ∝ luminosity / distance²
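Written as an equality rather than a proportionality, the luminosity is spread over the surface area 4πd² of a sphere of radius d, so the apparent brightness (flux) is L/4πd². The sketch below is a minimal illustration of this standard relation; the function name and the rounded solar numbers are assumptions chosen for the example.

    import math

    def apparent_brightness(luminosity_watts, distance_m):
        # Flux (W/m^2) at distance d, with the luminosity spread
        # uniformly over a sphere of area 4*pi*d^2.
        return luminosity_watts / (4.0 * math.pi * distance_m ** 2)

    # Rounded solar values: L ~ 3.9e26 W seen from 1 AU (~1.5e11 m)
    # give roughly 1400 W/m^2 -- the solar constant of Section 9.1.
    print(apparent_brightness(3.9e26, 1.5e11))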

Figure 10.5 Luminosity Two stars A and B of different luminosities can appear equally bright to an observer on Earth if the brighter star B is more distant than the fainter star A.

Thus, two identical stars can have the same apparent brightness if (and only if) they lie at the same distance from Earth. However, as illustrated in Figure 10.5, two non-identical stars can also have the same apparent brightness if the more luminous one lies farther away. A bright star (that is, one having large apparent brightness) is a powerful emitter of radiation (high luminosity), is near Earth, or both. A faint star (small apparent brightness) is a weak emitter (low luminosity), is far from Earth, or both.

Determining a star’s luminosity is a twofold task. First, the astronomer must determine the star’s apparent brightness by measuring the amount of energy detected through a telescope in a given amount of time. Second, the star’s distance must be measured—by parallax for nearby stars and by other means (to be discussed later) for more distant stars. The luminosity can then be found using the inverse-square law. Note that this is basically the same reasoning we used earlier in our discussion of how astronomers measure the solar luminosity (in our new terminology, the solar constant is just the apparent brightness of the Sun). (Sec. 9.1)
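A minimal sketch of this two-step procedure follows, assuming the standard parsec-to-meter conversion and the usual inverse-square relation L = 4πd² × (apparent brightness); the star's measured numbers here are hypothetical.

    import math

    PARSEC_IN_METERS = 3.086e16  # standard conversion

    def luminosity_from_observations(brightness_w_m2, parallax_arcsec):
        # Step 1: distance from parallax (d in parsecs = 1 / parallax in arcsec).
        distance_m = (1.0 / parallax_arcsec) * PARSEC_IN_METERS
        # Step 2: invert the inverse-square law to recover the luminosity.
        return 4.0 * math.pi * distance_m ** 2 * brightness_w_m2

    # A hypothetical star with measured flux 2.0e-9 W/m^2 and parallax 0.1"
    # (i.e., 10 pc away) comes out near 2.4e27 W, about six solar luminosities.
    print(luminosity_from_observations(2.0e-9, 0.1))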

The Magnitude Scale

Instead of measuring apparent brightness in SI units (for example, watts per square meter, W/m², the unit in which we expressed the solar constant in Section 9.1), optical astronomers find it more convenient to work in terms of a construct called the magnitude scale. This scale dates back to the second century B.C., when the Greek astronomer Hipparchus ranked the naked-eye stars into six groups. The brightest stars were categorized as first magnitude. The next brightest stars were labeled second magnitude, and so on, down to the faintest stars visible to the naked eye, which were classified as sixth magnitude. The range one (brightest) through six (faintest) spanned all the stars known to the ancients. Notice that a large magnitude means a faint star.

When astronomers began using telescopes with sophisticated detectors to measure the light received from stars, they quickly discovered two important facts about the magnitude scale. First, the one through six magnitude range defined by Hipparchus spans about a factor of 100 in apparent brightness—a first-magnitude star is approximately 100 times brighter than a sixth-magnitude star. Second, the characteristics of the human eye are such that a change of one magnitude corresponds to a factor of about 2.5 in apparent brightness. In other words, to the human eye a first-magnitude star is roughly 2.5 times brighter than a second-magnitude star, which is roughly 2.5 times brighter than a third-magnitude star, and so on. (By combining factors of 2.5, we confirm that a first-magnitude star is indeed (2.5)⁵ ≈ 100 times brighter than a sixth-magnitude star.)
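The parenthetical check at the end of the paragraph is simple arithmetic, which a couple of lines of Python make explicit:

    # Five steps of one magnitude, each a factor of about 2.5 in brightness:
    print(2.5 ** 5)   # about 97.7, i.e. roughly the factor of 100 between
                      # a first-magnitude and a sixth-magnitude star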

Figure 10.6 Apparent Magnitude This graph illustrates the apparent magnitudes of some astronomical objects. The original magnitude scale was defined so that the brightest stars in the night sky had magnitude one while the faintest stars visible to the naked eye had magnitude six. It has since been extended to cover much brighter and much fainter objects. An increase of one in apparent magnitude corresponds to a decrease in apparent brightness by a factor of approximately 2.5.

In the modern version of the magnitude scale, astronomers define a change of five in the magnitude of an object to correspond to exactly a factor of 100 in apparent brightness. Because we are really talking about apparent (rather than absolute) brightnesses, the numbers in Hipparchus’s ranking system are now called apparent magnitudes. In addition, the scale is no longer limited to whole numbers, and magnitudes outside the original range of one to six are allowed—very bright objects can have apparent magnitudes much less than one, and very faint objects can have apparent magnitudes far greater than six. Figure 10.6 illustrates the apparent magnitudes of some astronomical objects, ranging from the Sun, at -26.8, to the faintest object detectable by the Hubble or Keck telescopes, at an apparent magnitude of +30—about as faint as a firefly seen from a distance equal to Earth’s diameter.
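Under this modern definition, a magnitude difference of Δm corresponds to a brightness ratio of exactly 100^(Δm/5), so one magnitude is a factor of 100^(1/5) ≈ 2.512. The helper below is an illustrative sketch of that conversion (the function name is an assumption for the example):

    def brightness_ratio(mag_faint, mag_bright):
        # Factor by which the brighter object outshines the fainter one;
        # by definition, 5 magnitudes correspond to exactly a factor of 100.
        return 100.0 ** ((mag_faint - mag_bright) / 5.0)

    # The Sun (apparent magnitude -26.8) versus a +30 object at the
    # Hubble/Keck limit: a brightness ratio of about 5e22.
    print(brightness_ratio(30.0, -26.8))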

Apparent magnitude measures a star’s apparent brightness when seen at the star’s actual distance from the Sun. To compare intrinsic, or absolute, properties of stars, however, astronomers imagine looking at all stars from a standard distance of 10 pc. (There is no particular reason to use 10 pc—it is simply convenient.) A star’s absolute magnitude is its apparent magnitude when viewed from a distance of 10 pc. Because distance is fixed in this definition, absolute magnitude is a measure of a star’s absolute brightness, or luminosity. The Sun’s absolute magnitude is 4.8. In other words, if the Sun were moved to a distance of 10 pc from Earth, it would be only a little brighter than the faintest stars visible in the night sky. As discussed further in More Precisely 10-1, the numerical difference between a star’s absolute and apparent magnitudes is a measure of the distance to the star.
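The exact relation between apparent magnitude, absolute magnitude, and distance is developed in More Precisely 10-1; the sketch below simply applies the standard distance-modulus form of that relation, M = m − 5 log₁₀(d / 10 pc), to the Sun as a check on the numbers quoted above. The function name and the rounded Earth–Sun distance in parsecs are assumptions made for the example.

    import math

    def absolute_magnitude(apparent_mag, distance_pc):
        # Apparent magnitude the object would have if moved to the
        # standard distance of 10 pc.
        return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

    # The Sun: apparent magnitude -26.8 at 1 AU (about 4.85e-6 pc)
    # gives roughly 4.8, the absolute magnitude quoted in the text.
    print(absolute_magnitude(-26.8, 4.85e-6))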

Concept Check

Two stars are observed to have the same apparent magnitude. Based on this information, what, if anything, can be said about their luminosities?