Is this what's known in scientific circles as the fudge factor?
It's what's known as the extinction function, and it has plagued astrophysics for a hundred years. For example, one of the most important measurements in astrophysics is the measurement of Cepheid variable stars, which show a strong correlation between the period of their variation and their intrinsic brightness. This makes them a "standard candle": if you know how long a Cepheid's period is, and you know how bright it appears to your telescope, you should be able to tell how far away it is.
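(As a rough illustration of how the standard-candle logic works, here is a minimal Python sketch: the period-luminosity relation gives an absolute magnitude, and the distance modulus converts the gap between absolute and apparent magnitude into a distance. The coefficients are placeholders for illustration, not fitted values.)

```python
import math

def cepheid_distance_pc(period_days, apparent_mag):
    """Estimate the distance to a Cepheid from its period and apparent magnitude.

    The period-luminosity (Leavitt) relation gives the absolute magnitude M
    from the period; the distance modulus m - M then gives the distance.
    The coefficients below are illustrative placeholders, not real fitted values.
    """
    a, b = -2.8, -1.4                                   # assumed Leavitt-law slope and zero point
    absolute_mag = a * math.log10(period_days) + b      # intrinsic brightness from the period
    # Distance modulus: m - M = 5 * log10(d) - 5, with d in parsecs
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)
```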
The only problem is that between here and the distant star, the universe is not exactly transparent. There are molecular clouds that block some of the light from the star, making it appear dimmer--and therefore farther away--than it actually is. This wreaked havoc with galactic distance measurements in the first half of the 20th century, until people learned how to measure the extinction function and correct for it.
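(Continuing the sketch above: extinction adds some number of magnitudes A of dimming to the apparent brightness, and if you don't subtract it out, the inferred distance is inflated by a factor of 10^(A/5), roughly 58% for one magnitude of extinction. This is only an illustration of the correction, with A assumed to be known.)

```python
def extinction_corrected_distance_pc(period_days, apparent_mag, extinction_mag):
    """Same distance estimate, but remove the extinction A (in magnitudes) first.

    Without the correction, A magnitudes of dust dimming inflate the inferred
    distance by a factor of 10 ** (A / 5).
    """
    corrected_mag = apparent_mag - extinction_mag   # undo the dimming from dust
    return cepheid_distance_pc(period_days, corrected_mag)
```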
In the case of the very earliest stars, the extinction function doesn't affect our distance measurement, because we get that from the redshift instead. What it does affect is our ability to count the earliest stars and measure their brightness. The universe was a much thicker, murkier place back then, so it's not surprising if we've been underestimating the amount of activity.