Gamma rays produced in blazar jets travel across billions of light-years to Earth. During their journey, the gamma rays pass through an increasing fog of visible and ultraviolet light emitted by stars that formed throughout the history of the universe.
Occasionally, a gamma ray collides with starlight and transforms into a pair of particles -- an electron and its antimatter counterpart, a positron. Once this occurs, the gamma ray is lost. In effect, the process dampens the gamma-ray signal in much the same way as fog dims a distant lighthouse.
From studies of nearby blazars, scientists have determined how many gamma rays should be emitted at different energies. More distant blazars show fewer gamma rays at higher energies -- especially above 25 GeV -- thanks to absorption by the cosmic fog.
The farthest blazars are missing most of their higher-energy gamma rays.

The researchers then determined the average gamma-ray attenuation across three distance ranges spanning from 9.6 billion years ago to today. From this measurement, the scientists were able to estimate the fog's thickness. To account for the observations, the average stellar density in the cosmos must be about 1.4 stars per 100 billion cubic light-years, which means the average distance between stars in the universe is about 4,150 light-years.

A paper describing the findings was published Thursday in Science Express.

"The Fermi result opens up the exciting possibility of constraining the earliest period of cosmic star formation, thus setting the stage for NASA's James Webb Space Telescope," said Volker Bromm, an astronomer at the University of Texas at Austin, who commented on the findings. "In simple terms, Fermi is providing us with a shadow image of the first stars, whereas Webb will directly detect them."

Measuring the extragalactic background light was one of the primary mission goals for Fermi.
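The step from the quoted stellar density to the quoted average separation is a simple cube-root calculation: if each star occupies, on average, one volume's worth of space, the typical spacing is the cube root of that volume. A minimal sketch of this arithmetic (illustrative only, not part of the study's analysis):

```python
# Recover the ~4,150 light-year mean separation from the quoted
# density of about 1.4 stars per 100 billion cubic light-years.

density = 1.4 / 100e9           # stars per cubic light-year
volume_per_star = 1.0 / density # cubic light-years of space per star
mean_separation = volume_per_star ** (1.0 / 3.0)  # light-years

print(f"{mean_separation:.0f} light-years")  # close to the quoted ~4,150
```

Running this reproduces the article's figure to within rounding.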