Should you use light-years or parsecs for astronomical distance?

You give astronomical distances beyond the solar system in light-years, but professional astronomy papers use parsecs. Which is preferable?

Trigonometric parallax determines the distance to a star by measuring the slight shift in its apparent position as seen from opposite ends of Earth's orbit.
Bill Saxton, NRAO / AUI / NSF

Light-years, no question! Here’s how I see it. The parsec (which equals 3.26 light-years) is defined as the distance at which a star shows an annual parallax of one arcsecond. This means it is based on two arbitrary quantities: the radius of Earth’s orbit, which was a random accident of how the solar system fell together, and the definition of the arcsecond — an even more arbitrary unit that stems from the ancient Babylonians’ base-60 style of arithmetic, along with their notion that a circle should be divided into 360 degrees because there “ought to be” 360 days in a year. The light-year, by contrast, is based on only one arbitrary value (Earth’s orbital period) and on a fundamental constant of nature: c, the speed of light.
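You can check that 3.26 figure yourself from the definitions. The sketch below (a back-of-the-envelope calculation in Python, using the standard IAU value for the astronomical unit and the exact defined speed of light) computes one parsec as the distance at which 1 a.u. subtends one arcsecond, then divides by the length of a light-year:

```python
import math

AU = 1.495978707e11            # astronomical unit in meters (IAU definition)
C = 299_792_458                # speed of light in m/s (exact, by definition)
JULIAN_YEAR = 365.25 * 86_400  # seconds in a Julian year

# A parsec is the distance at which 1 a.u. subtends an angle of one arcsecond.
one_arcsec = math.radians(1 / 3600)
parsec = AU / math.tan(one_arcsec)   # ≈ 3.086e16 meters

light_year = C * JULIAN_YEAR         # ≈ 9.461e15 meters

print(parsec / light_year)           # ≈ 3.26
```

The same geometry gives the working astronomer's rule of thumb: a star's distance in parsecs is simply 1 divided by its parallax in arcseconds, so a parallax of 0.1 arcsecond means 10 parsecs, or about 32.6 light-years.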

Moreover, as a practical matter, few distances in modern astronomy are measured directly by the parallax method anymore. On the other hand, it’s often important in modern astrophysics to know the light-crossing time of a given distance — and if the distance is given in light-years, there you are.

— Alan MacRobert