Words Ya Gotta Know
Beginners often have trouble describing distances on the sky. You might get tangled up in a conversation like this:
"Do you see those two stars? The ones that look about eight inches apart?"
"Yeah, but they look more like six feet apart to me. . . ."
The problem here is that distances on the sky can't be expressed in linear measures like feet or inches. The way to do it is by angular measure.
Astronomers might say the two stars are 10 degrees (10°) apart. That means if lines were drawn from your eye to each star, the two lines would form a 10° angle at your eye. Simple!
Hold your fist at arm's length and sight past it with one eye. Your fist from side to side covers about 10° of sky. A fingertip at arm's length covers about 1°. The Sun and Moon are each ½° wide. The Big Dipper is 25° long. From the horizon to the point straight overhead (the zenith) is 90°.
There are finer divisions of angular measure. A degree is made up of 60 arcminutes, and each arcminute is made up of 60 arcseconds.
If two objects appear a quarter degree apart, astronomers might note that as 15 arcminutes (abbreviated 15'). The brightest planets usually appear just a few dozen arcseconds across as seen from Earth. A 5-inch telescope can resolve details as fine as 1 arcsecond (1") across. This is the width of a penny seen at a distance of 4 kilometers (2½ miles).
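The unit arithmetic here is easy to check for yourself. Here's a minimal Python sketch; the coin diameter of 19 mm is an assumed value for a U.S. penny, used only to verify the 1-arcsecond claim.

```python
import math

# Angular measure on the sky: 1 degree = 60 arcminutes, 1 arcminute = 60 arcseconds.
ARCMIN_PER_DEG = 60
ARCSEC_PER_ARCMIN = 60
ARCSEC_PER_DEG = ARCMIN_PER_DEG * ARCSEC_PER_ARCMIN  # 3,600

# A quarter degree expressed in arcminutes:
quarter_deg_in_arcmin = 0.25 * ARCMIN_PER_DEG
print(quarter_deg_in_arcmin)  # 15.0 -- the 15' in the text

# The penny check: a coin about 19 mm wide (assumed diameter) seen from 4 km away.
ARCSEC_PER_RADIAN = math.degrees(1) * ARCSEC_PER_DEG  # about 206,265
angle_arcsec = (0.019 / 4000) * ARCSEC_PER_RADIAN
print(round(angle_arcsec, 2))  # 0.98 -- essentially 1 arcsecond
```

The small-angle trick in the last step (angle in radians ≈ width ÷ distance) works for anything tiny on the sky.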
Seen from Earth, the night sky looks like a huge dome with stars stuck on its inside surface. If the Earth beneath us vanished, we'd see stars all around us and we'd have the breathtaking sensation of hanging at the center of an immense, star-speckled sphere.
Astronomers designate the positions of stars by where they are on this celestial sphere.
Picture the Earth hanging at the center of the celestial sphere. Imagine the Earth's latitude and longitude lines ballooning outward and printing themselves on the celestial sphere's inside. They now provide a coordinate grid on the sky that tells the position of any star, just as latitude and longitude tell the position of any point on Earth. In the sky, "latitude" is called declination and "longitude" is called right ascension. These are the standard celestial coordinates.
Declination is expressed in degrees, arcminutes, and arcseconds north (+) or south (-) of the celestial equator.
Right ascension is expressed not in degrees, but in hours (h), minutes (m), and seconds (s) of time, from 0 to 24 hours. Astronomers set up this arrangement long ago because the Earth completes one turn in about 24 hours. So the celestial sphere, with its coordinate grid permanently printed on it, appears to take about 24 hours to complete one turn around Earth.
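Because 24 hours of right ascension cover one full 360° turn, each hour of RA spans 15° of sky. A quick sketch of the conversion (the 6h 45m figure is a rough, illustrative value for Sirius's right ascension):

```python
def ra_to_degrees(hours, minutes=0.0, seconds=0.0):
    """Convert right ascension from hours/minutes/seconds of time to degrees.
    24 hours of RA equal 360 degrees, so each hour of RA spans 15 degrees."""
    return (hours + minutes / 60 + seconds / 3600) * 15

print(ra_to_degrees(24))     # 360.0 -- one full turn of the sky
print(ra_to_degrees(6, 45))  # 101.25 -- e.g. Sirius, at roughly 6h 45m
```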
There's a slight complication. A star's celestial coordinates gradually change over the years, due to a slow shift of the Earth’s orientation in space called precession. When right ascension and declination are given in books and atlases, you'll often see them accompanied by a year date such as 2000.0. (The ".0" means the beginning of the year: midnight January 1st.) This is the moment for which the coordinates are strictly correct. For most amateur purposes this refinement is too small to matter.
The brightness of a star (or anything else in the sky) is called its magnitude. You’ll encounter this term often.
The magnitude system began about 2,100 years ago when the Greek astronomer Hipparchus divided stars into brightness classes. He called the brightest ones "1st magnitude," meaning simply "the biggest." Those a little fainter he called "2nd magnitude," meaning second biggest, and so on down to the faintest ones he could see: "6th magnitude".
With the invention of the telescope, observers could see even fainter stars. Thus 7th, 8th, and 9th magnitudes were added. Today binoculars will show stars as faint as 8th or 9th magnitude, and an amateur's 6-inch telescope will go to 12th or 13th. The Hubble Space Telescope has seen to about 30th magnitude, which is roughly 4 billion times fainter than the faintest stars visible to the unaided eye.
On the other end of the scale, it turns out that some of Hipparchus's "1st-magnitude" stars are a lot brighter than others. To accommodate them, the scale now extends into negative numbers. Vega is zero (0) magnitude, and Sirius, the brightest star in the sky, is magnitude -1.4. Venus is even brighter, usually magnitude -4. The full Moon shines at magnitude -13, and the Sun, magnitude -27.
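Hipparchus's classes were eventually made mathematically precise: a difference of 5 magnitudes is defined as a brightness factor of exactly 100, so each magnitude step is a factor of about 2.5. A minimal sketch using figures from the text:

```python
def brightness_ratio(mag_faint, mag_bright):
    """Factor by which the brighter object outshines the fainter one.
    By definition, a 5-magnitude difference is a factor of exactly 100."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# Sirius (magnitude -1.4) vs. a 6th-magnitude star at the naked-eye limit:
print(round(brightness_ratio(6, -1.4)))     # 912 -- nearly a thousand times brighter

# The Sun (-27) vs. the full Moon (-13):
print(round(brightness_ratio(-13, -27)))    # 398107 -- about 400,000 times brighter
```

Note that smaller (or more negative) numbers mean brighter objects, which trips up nearly every beginner at first.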
The Earth orbits (circles around) the Sun once a year at a distance from the Sun averaging 150 million kilometers, or 93 million miles. That distance is called one astronomical unit (a.u.). It's a handy unit for measuring things in the solar system.
The distance that light travels in a year (9.5 trillion km, or 5.9 trillion miles, or 63,000 a.u.) is called a light-year. Note that the light-year is a measure of distance, not time . . . just like kilometers or miles.
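Those big numbers follow directly from the speed of light and the length of a year; here's a quick check:

```python
C_KM_PER_S = 299_792.458          # speed of light in km per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

light_year_km = C_KM_PER_S * SECONDS_PER_YEAR
print(f"{light_year_km:.2e}")     # 9.46e+12 -- the text's 9.5 trillion km

AU_KM = 149_600_000               # one astronomical unit, rounded
print(round(light_year_km / AU_KM))  # 63240 -- the text's 63,000 a.u.
```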
Most of the brightest stars in the sky lie a few dozen to a couple thousand light-years away. The nearest star, Alpha Centauri, is only 4.3 light-years away. The Andromeda Galaxy, the nearest large galaxy beyond our own Milky Way, is 2.5 million light-years distant.
Professional astronomers often use another unit for big distances: the parsec. One parsec equals 3.26 light-years. (In case you're really wondering, a parsec is the distance at which a star would show a parallax of one arcsecond against the background sky when viewed from two points 1 a.u. apart, such as the Sun and the Earth.)
A kiloparsec is 1,000 parsecs, and a megaparsec is a million parsecs.
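The parsec's definition is pure geometry: it's the distance at which 1 a.u. subtends an angle of 1 arcsecond. A small sketch of the geometry, using a slightly more precise value (63,241) for the text's 63,000 a.u. per light-year:

```python
import math

# One parsec: the distance at which a 1 a.u. baseline subtends 1 arcsecond.
parsec_in_au = 1 / math.tan(math.radians(1 / 3600))
print(round(parsec_in_au))                         # 206265 a.u.

AU_PER_LIGHT_YEAR = 63_241                         # a.u. per light-year, more precisely
print(round(parsec_in_au / AU_PER_LIGHT_YEAR, 2))  # 3.26 -- the figure in the text

# Handy consequence: distance in parsecs is the reciprocal of parallax in arcseconds.
def distance_pc(parallax_arcsec):
    return 1 / parallax_arcsec

print(distance_pc(0.5))                            # 2.0 parsecs
```

That reciprocal relationship is why astronomers like the parsec: measure a star's parallax in arcseconds, flip it over, and you have its distance.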
Now, was that so hard?