A new method of measuring star formation in the earliest galaxies finds that they’re producing more massive stars than expected — a result that could affect our understanding of how galaxies grow their stars.
A few months ago, Sky & Telescope reported on a study of a nearby star-forming region (30 Doradus) forming an unexpected number of massive stars. The region might even contain stars with up to 300 times the mass of the Sun — but that wasn’t the real surprise.
Astronomers had thought that the same basic processes ought to shape star formation no matter where it happens, resulting in the same relative numbers of stars everywhere. If that turned out not to be true — and 30 Doradus seemed to be proving the exception — then astronomers would have to rethink everything, from how they classify galaxies to how quickly the universe formed its stars.
But it’s difficult to reach such conclusions on the basis of a single star-forming region, no matter how well it has been studied. Now, new work published in Nature appears to confirm that star formation depends not on fundamental processes but on environment. Astronomers may have to do some rethinking after all.
Four Distant Starbursts
The focus in this study turns to four galaxies whose light took more than 10 billion years to travel to Earth. These galaxies are bursting with stars, but they’re also dusty, which hides them from methods that rely on ultraviolet, visible, or infrared light. Instead, Zhi-Yu Zhang (University of Edinburgh, UK, and European Southern Observatory, Germany) and colleagues trained the Atacama Large Millimeter/submillimeter Array (ALMA) on these galaxies, searching for carbon monoxide emission, a signal tied to a galaxy’s history of star formation.
ALMA’s incredible resolving power received a helping hand in the form of gravitational lenses: foreground galaxies aligned just so. Their gravity acted as a cosmic lens to magnify the light from these distant starbursts.
The scientists measured two isotopologs of carbon monoxide: 13CO and C18O. 13C (which contains one more neutron than ordinary carbon atoms) is released by stars of all masses, whereas 18O (which contains two extra neutrons compared to ordinary oxygen atoms) is released only by more massive stars. Since more massive stars live brief lives, measuring the abundance of 13CO relative to C18O serves as a fossil record of how many massive stars formed relative to low-mass stars.
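The logic of this fossil record can be sketched numerically. The toy model below integrates a power-law initial mass function (IMF) and asks what fraction of stellar mass ends up in massive stars, the population that produces 18O, under a standard Salpeter slope versus a flatter, "top-heavy" slope. The mass cutoffs and the use of an 8-solar-mass threshold are illustrative assumptions for this sketch, not values taken from the Nature study.

```python
# Toy illustration of the isotopologue-ratio idea: under a power-law IMF
# (dN/dm proportional to m**slope), compute the fraction of stellar mass
# formed in massive stars, a rough stand-in for the 18O-producing
# population relative to the 13C-producing population (all stars).
# The 0.1-100 solar-mass range and 8-solar-mass cutoff are placeholders.

def imf_weighted_mass(slope, m_lo, m_hi, steps=100_000):
    """Total stellar mass formed between m_lo and m_hi (solar masses),
    integrated numerically with the midpoint rule."""
    dm = (m_hi - m_lo) / steps
    total = 0.0
    for i in range(steps):
        m = m_lo + (i + 0.5) * dm
        total += m * m**slope * dm   # mass times number density dN/dm
    return total

def massive_mass_fraction(slope):
    """Fraction of all stellar mass locked in stars above 8 solar masses."""
    all_stars = imf_weighted_mass(slope, 0.1, 100.0)
    massive = imf_weighted_mass(slope, 8.0, 100.0)
    return massive / all_stars

salpeter = massive_mass_fraction(-2.35)   # classic Salpeter IMF slope
top_heavy = massive_mass_fraction(-2.1)   # flatter slope: more massive stars
print(f"Salpeter IMF:  {salpeter:.2f} of mass in stars above 8 Msun")
print(f"Top-heavy IMF: {top_heavy:.2f} of mass in stars above 8 Msun")
```

Flattening the slope even slightly roughly doubles the massive-star mass fraction, which is why a shifted 13CO/C18O ratio is such a sensitive probe of how top-heavy the IMF is.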
The signature is immune to what the study authors describe as “pernicious” effects of dust. But the authors also acknowledge that the measurement is a roundabout way of getting at the problem.
That’s because they’re observing carbon monoxide molecules that are floating in the gas between the stars, rather than measuring radiation from the stars themselves. So they’re in effect probing the galaxy’s entire history of star formation. Granted, galaxies in the early universe have a shorter history and thus a shorter amount of time for confounding effects to complicate the measurements, but as Kevin Covey (Western Washington University) points out, such contamination is still possible.
Redefining Cosmic Noon
If the measurements hold up, then what we dub “starburst” galaxies in the early universe aren’t actually making as many stars as we thought. Most stars have less than the Sun’s mass, but these young galaxies appear to be pouring more of their energy into making more massive stars. That means that other methods of estimating star formation rates might be wrong. In fact, our entire picture of cosmic star formation — which astronomers currently think peaked when the universe was roughly 4 billion years old — might be wrong.
Profound implications aside, Zhang’s team has a ways to go when it comes to convincing all of their colleagues. But they’re only getting started: Zhang says they’re already preparing more systematic surveys that will include nearby galaxies and multiple tracers of star formation.