While much of the cosmos remains mysterious and unexplored, we know this much is true: space is big. Very big. To quantify the vastness of the cosmos, astronomers often refer to things being a certain number of light-years away. What does that really mean?

Despite how it sounds, a "light-year" is a measure of distance, not time. A light-year is the distance light travels in one year. Specifically, the International Astronomical Union defines a light-year as the distance light travels in a Julian year of 365.25 days.

In a similar vein, you could describe 60 miles as a car-hour (the distance a car travels in an hour on a highway). In fact, we often tell people distances in terms of time – "I'm 10 minutes away," for instance. The term "light-year" was invented because, simply put, the equivalent distances in miles, meters or kilometers were huge. [The Biggest Unsolved Mysteries in Physics]

How huge? Light moves at 186,282 miles per second, or 299,792.5 kilometers per second. That's 670.6 million miles an hour. The distance to the nearest star is 4.3 light-years, or 25.3 trillion miles (40.7 trillion km).
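The figures above can be checked with a few lines of arithmetic. This sketch (not part of the original article) multiplies the speed of light by the seconds in a 365.25-day Julian year to get one light-year, then scales to the roughly 4.3 light-years quoted for the nearest star:

```python
# Sketch: deriving the distances quoted above from the speed of light.
C_KM_S = 299_792.458                   # speed of light in km/s (defined value)
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, per the IAU definition

light_year_km = C_KM_S * SECONDS_PER_YEAR
print(f"One light-year is about {light_year_km:.3e} km")  # ~9.461e12 km

# Distance to the nearest star, at roughly 4.3 light-years:
nearest_km = 4.3 * light_year_km
print(f"Nearest star: about {nearest_km:.1e} km")  # ~4.1e13 km, i.e. ~40.7 trillion km
```

Running this reproduces the article's numbers: one light-year is about 9.46 trillion km, and 4.3 of them come to roughly 40.7 trillion km.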

The first mention of light-years dates back to 1838 and the German scientist Friedrich Bessel. He measured the distance to a star called 61 Cygni and arrived at a figure of 660,000 times the radius of Earth's orbit. He noted that light would take about 10 years to cross that distance, but he didn't like the term "light-year." (One reason was that, at the time, it wasn't clear that light's speed was a fundamental constant of nature.) In 1851, the term made its first appearance in a German astronomical publication, as the word "Lichtjahre." Astronomers later adopted it, and the light-year is now a popular unit of measurement, even in the scientific literature.

The light-year competes with the parsec, short for "parallax second," which equals about 3.26 light-years. A parsec is the distance to a star whose apparent position shifts by one arcsecond (1/3600th of a degree) as Earth moves by one astronomical unit, the radius of its orbit. That makes it the natural unit for the parallax method of measuring stellar distances. British astrophysicist Arthur Eddington, a prominent scientist in the early 20th century, preferred the parsec, calling the light-year "inconvenient." His was a losing battle, however.
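The parsec's definition can be verified with basic trigonometry. This sketch (an illustration, not from the article) divides one astronomical unit by the tangent of one arcsecond and expresses the result in light-years:

```python
import math

# Sketch of the parsec definition: the distance at which 1 AU
# subtends an angle of one arcsecond.
AU_KM = 149_597_870.7                          # astronomical unit in km
LIGHT_YEAR_KM = 299_792.458 * 365.25 * 86_400  # one Julian light-year in km

arcsec_rad = math.radians(1 / 3600)            # one arcsecond in radians
parsec_km = AU_KM / math.tan(arcsec_rad)

print(f"1 parsec is about {parsec_km / LIGHT_YEAR_KM:.2f} light-years")  # ~3.26
```

The result, about 3.26 light-years, matches the conversion factor quoted above.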

Light-years can be divided into light-days, light-hours or even light-seconds, though those units are used less often. The sun is 8 light-minutes away, which means it takes light from the sun 8 minutes to reach Earth. [Quiz: How Well Do You Know Our Solar System?]
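The "8 light-minutes" figure is another quick division. This sketch (added for illustration) divides the mean Earth-sun distance by the speed of light:

```python
# Checking the "8 light-minutes" figure: Earth-sun distance over light speed.
AU_KM = 149_597_870.7   # mean Earth-sun distance (1 AU) in km
C_KM_S = 299_792.458    # speed of light in km/s

travel_seconds = AU_KM / C_KM_S
print(f"Sunlight reaches Earth in about {travel_seconds / 60:.1f} minutes")  # ~8.3
```

The exact result is about 499 seconds, or 8.3 minutes, which the article rounds to 8.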

All this depends on knowing the speed of light, which turns out to be hard to measure because light is so fast. Galileo attempted it in 1638, describing an experiment in which one person uncovers a lantern while another, on a tower some distance away, tries to time when the light arrives. The experiment failed, and Galileo could conclude only that however fast light was, neither human reflexes nor the clocks of the time were quick enough to catch it. (He did estimate that light travels at least 10 times the speed of sound, but that was very much a guess.)

Danish astronomer Ole Rømer made the first quantitative estimate in 1676, using the timing of eclipses of Jupiter's moon Io. Later, in 1729, James Bradley used a phenomenon called stellar aberration, in which the apparent positions of stars in the sky shift slightly with the motion of the Earth, to get a closer estimate of light's speed. Scientists kept refining these estimates, and by the 1860s, Scottish physicist James Clerk Maxwell had shown that electromagnetic waves travel at a fixed speed in a vacuum, one that matched measurements of the speed of light. That speed is a constant, and at the time most physicists thought of light as a pure wave. (We now know that it isn't; it can behave as a particle, too.)

Finally, in 1905, Albert Einstein's theory of special relativity posited that light always travels at the same speed no matter where it is observed from. This was a big step because suddenly, the speed of light became one of the constants of the universe – and thus, more useful for measuring distances.

Original article on Live Science.