Apparent magnitude
Apparent magnitude is a measure of the brightness of a celestial object as seen from Earth. The scale is logarithmic, so a small difference in magnitude corresponds to a large difference in brightness, and it is inverted: brighter objects have lower (or more negative) magnitudes, while dimmer objects have higher (or more positive) magnitudes.

The brightest objects in the sky have large negative apparent magnitudes. The Sun has an apparent magnitude of about -26.74 and the full Moon about -12.74. Sirius, the brightest star in the night sky, has an apparent magnitude of about -1.46, while the faintest objects visible to the naked eye are around +6.5. Telescopes can detect objects far fainter than this.

The scale is defined by a brightness ratio: a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100, so an object with an apparent magnitude of 1 is 100 times brighter than an object with an apparent magnitude of 6, and each step of one magnitude corresponds to a factor of about 2.512. The zero point of the scale is arbitrary; by convention it is anchored so that Vega has an apparent magnitude of approximately 0, which allows a standardized comparison of brightness.

It is important to distinguish apparent magnitude from absolute magnitude, which measures an object's intrinsic brightness independent of its distance from Earth.
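The ratio-based definition can be checked numerically: a magnitude difference of Δm corresponds to a brightness (flux) ratio of 100^(Δm/5) ≈ 2.512^Δm. The following is a minimal sketch of that relation; the function name and the printed examples are illustrative, not taken from the source.

```python
def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """Return how many times brighter the object of magnitude m_bright
    appears than the object of magnitude m_faint.

    A difference of 5 magnitudes is defined as a factor of exactly 100,
    so one magnitude corresponds to a factor of 100 ** (1/5) ~= 2.512.
    """
    delta_m = m_faint - m_bright
    return 100 ** (delta_m / 5)


# Magnitude 1 vs. magnitude 6: exactly 100 times brighter.
print(brightness_ratio(6.0, 1.0))        # 100.0

# Sun (-26.74) vs. full Moon (-12.74): roughly 400,000 times brighter.
print(brightness_ratio(-12.74, -26.74))  # ~398,000
```

The Sun/Moon example illustrates how quickly the ratios grow: a 14-magnitude gap already corresponds to a factor of about 400,000 in brightness.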