Wednesday, 28 May 2014

How Long, How Far and Stuff Like That: A Brief History of Some Measures of Length

A few days ago John D. Cook posted a link from his Units twitter account, @UnitFact (if you have any interest in measures and their history, a must-follow), that said "Units of length in LOTR," which more clever people (not me) realized meant Lord of the Rings. It contained a link to a web site called Tolkien Gateway.
In essence, it discussed a section from one of the "Rings" books in which a magic rope was used to descend a cliff. The post computes the length of the rope in ells, a very old measure of distance, to conclude the elven rope must have been 112.5 feet long.
I have been instructed by those who know these things that the LOTR took place in Middle Earth, and the period is "before the age of man."

It did get me thinking again about the history of units of distance, and I thought I might begin with the ell. Now the word ell comes from the Latin ulna, the name of a bone in the forearm. It is the red one in the picture. It is also called the "elbow bone," and it gives a clear image of the original length indicated: a measure from the elbow to the end of the middle finger. The English term actually came through the Old Germanic alinâ, of the same meaning. The same measure was common in the Roman standards under a different name, cubitum, from which the English "cubit" is derived, and which was defined to be 1 1/2 pedes (feet).
(If cubitum sounds suspiciously like a good source for "cube," you're close, but that came from the Greek κύβος, Latinized into cubus.)

Of course many more people know cubit than ell, most probably because of the famous Bible story of Noah and the flood, in which he was instructed to build an ark 300 cubits by 50 cubits by 30 cubits. Noah didn't speak Latin, but the same measure in Hebrew is אמה, which I am told is pronounced something like ammah. This is essentially the primary unit of measure in all the Old Testament. So how long was that? Well, it seems there are two versions of the Hebrew cubit: the standard or short cubit (about 17.5 inches), and a long cubit, which is given in Ezekiel 43:13, "These are the measurements of the altar in cubits (the cubit is one cubit and a hand breadth): the base one cubit high and one cubit wide, with a rim all around its edge of one span." A hand breadth is the distance across the four fingers when they are pressed together, about four inches today, but the Roman standard was 1/4 of a pes (foot), or three inches, so the long cubit would be in the neighborhood of 20.5 inches.
The rim of one span refers to a different unit of hand measure: the distance between the tip of the thumb and the little finger when the hand is spread apart. If you try this on your own arm, you should find that it takes about two spans to reach from the tip of the middle finger to the bend of the elbow, and hence a span was 1/2 cubit.
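For readers who like to see the arithmetic, here is a quick sketch of the cubit conversions above. The inch values are the approximations from the text, not exact ancient standards.

```python
# Rough cubit arithmetic using the approximate values quoted above.
SHORT_CUBIT_IN = 17.5        # standard (short) Hebrew cubit, in inches
HANDBREADTH_IN = 3.0         # Roman standard: 1/4 of a 12-inch pes
LONG_CUBIT_IN = SHORT_CUBIT_IN + HANDBREADTH_IN  # "a cubit and a hand breadth"
SPAN_IN = SHORT_CUBIT_IN / 2 # a span is half a cubit

def cubits_to_feet(cubits, cubit_in=SHORT_CUBIT_IN):
    """Convert a length in cubits to feet."""
    return cubits * cubit_in / 12

# Noah's ark, 300 x 50 x 30 cubits, in feet under each version of the cubit:
for name, c in [("length", 300), ("width", 50), ("height", 30)]:
    print(f"{name}: {cubits_to_feet(c):.1f} ft (short) / "
          f"{cubits_to_feet(c, LONG_CUBIT_IN):.1f} ft (long)")
```

So the ark's 300-cubit length works out to 437.5 feet with the short cubit, or 512.5 feet with the long one.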

After the collapse of the Roman Empire, the measures they had established so carefully began to be altered and twisted in every little dukedom and Middle Earth kingdom. By the 16th century or so, many of them had adopted a standard measure of cloth, the double ell, which was very nearly a yard. It was so common a measure that the "double" sort of slipped away and it just became the Scottish ell (not to be confused with Scottish ale). It was common in my youth for clerks to approximate a yard of material by taking the cloth and holding it between the thumb and forefinger of each hand at the center of the chest (or often at the nose), then letting the cloth slip through one hand as the other drew it out to the tip of the arm. If you fold the arm at the elbow and bring it to touch the chest (or nose), you can see this was about two ells, or a double ell. In Scotland the statutes set it at 37 inches, with a special longer Barony ell for land measure. Cloth was such an important item of trade in the markets of England and Scotland that the cross in the market square would often have an official (double) ell inlaid in metal, or etched in the stone, to measure off cloth and other items sold by length.

Sometimes units in one culture are merged with units from another. After 1066 the Norman conquerors merged their residue of the Latin inch (from a word for the 12th part of something; ounce is from the same source) with the Anglo-Saxon unit of the barleycorn. Three barleycorns were established as one inch. In 1324 Edward II declared that the inch would be "three grains of barley, dry and round, placed end to end". Barleycorns are still the standard measure for UK shoe sizes, with each increase of one size corresponding to one barleycorn. The adult size zero is a shoe 25 barleycorns long. (In shoe sizes, as in building floors, the UK starts at 0 while the US begins at 1.)
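The barleycorn rule can be written out as a one-line formula. This is just a sketch of the arithmetic described above; real sizing charts vary by maker and by whether the shoe form (last) or the foot is measured.

```python
# UK shoe sizes count barleycorns (1/3 inch) past a size-zero
# baseline of 25 barleycorns, as described in the text.
BARLEYCORN_IN = 1 / 3       # three barleycorns to the inch
SIZE_ZERO_BARLEYCORNS = 25  # adult UK size 0

def uk_shoe_size(last_length_in):
    """Nominal UK adult size for a last of the given length in inches."""
    return last_length_in / BARLEYCORN_IN - SIZE_ZERO_BARLEYCORNS

print(uk_shoe_size(10))  # a 10-inch last: 30 barleycorns, five past size zero
```

A 10-inch last is 30 barleycorns long, so it comes out a nominal UK size 5.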

Greater distances extended the common smaller standards. A Roman legion used a standard pace (one left, one right) of 5 feet. Our current word for a mile comes from the Roman mille passus, which means one thousand paces. 1500 of these paces would cover a Gallic league, and Mr. Cook's Units tweet recently reminded us all that "A league is an hour's walk. (There are many slightly different ways to make this more precise, but an hour's walk is the idea behind them.)" So those Roman centurions (yep, about 100 men in each unit) marched along at a clip of about 25 paces per minute, and a little calculation makes this about 1.4 miles per hour. I remember learning way back in my youth in the military that the US standard march is 100 strides of 2.5 feet per minute, and that these were essentially the standards for the Roman legions, so I expect that the league was an hour's walk for the normal folk, not for trained soldiers. (I'm sure my drill sgt in boot camp would have loved to hear my view on how we should march at a pace to cover 1.4 miles per hour.)
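The marching arithmetic above can be checked in a few lines; all the figures come from the text.

```python
# Roman marching arithmetic: 5-foot paces, a 1000-pace mile,
# and a 1500-pace Gallic league covered in an hour's walk.
PACE_FT = 5             # one Roman pace: a left step plus a right step
LEAGUE_PACES = 1500     # paces in a Gallic league
STATUTE_MILE_FT = 5280  # modern statute mile, for comparison

league_ft = LEAGUE_PACES * PACE_FT   # 7500 feet in a league
paces_per_min = LEAGUE_PACES / 60    # an hour's walk -> 25 paces per minute
mph = league_ft / STATUTE_MILE_FT    # one league per hour, in statute mph

print(f"{paces_per_min:.0f} paces per minute, about {mph:.2f} mph")
```

That is 25 paces a minute and about 1.42 statute miles per hour, a stroll rather than a march.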

On the extreme end there is the light year, the distance a beam of light can travel in a vacuum in a Julian year (how many things are still defined by the Julian year?) of 365.25 days. The term was introduced before it was a well defined distance. In 1838 Friedrich Bessel made the first published successful measurement of the distance to a star other than the sun, the star 61 Cygni (actually he may not have been the first to measure one; see below). At that time the longest standardized unit of measure was the Astronomical Unit, the distance from the Earth to the sun. Bessel used the parallax of 61 Cygni to determine that it was about 660,000 astronomical units away from the Earth, then noted that this meant light took 10.3 years to cover that distance (an estimate, as the precise speed of light was not known at that time). He never used the term "light year," but others soon did; Otto Ule used it in an article in 1851, for example.

The true astronomers seem to dislike the term; Arthur Eddington called it an inconvenient and irrelevant unit. To overcome this issue they created the parsec. I love this unit of measure because it is based on two imaginary similar triangles with a common vertex. As the Earth moves through its orbit around the sun, we imagine a right triangle with the right angle at the sun: one leg of 1 AU from the sun to the Earth, and another leg, perpendicular to that baseline, reaching out to (wait for it) an imaginary star 1 parsec away. The parallax of this "imaginary near star" against the distant stars would be 1 second of arc. Of course, there are no stars as close as 1 parsec, so when we measure a star that is farther out, the parallax angle is smaller; a star that is 2 parsecs away will have 1/2 arc second of parallax. The Earth's nearest star neighbor outside our solar system is Proxima Centauri, which is 1.3 parsecs away (or about 4 light years).
That means, if I understand all this imaginary astronomy, that it should have a parallax of about 0.77 arc seconds, the reciprocal of 1.3. (If that is a gross error, one of my astrophysics readers can give me a quick correction.)
To give you an idea how accurately they can now measure cosmological angles, 1 arc second is about the angle subtended by the diameter of a US dime seen from about 4 kilometers (2 1/2 miles) away.
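The whole parsec story boils down to one reciprocal: a star d parsecs away shows a parallax of 1/d arc seconds. A short sketch, using the modern IAU value for the astronomical unit:

```python
import math

# A parsec is the distance at which a baseline of 1 AU subtends
# one arc second; parallax in arcsec is the reciprocal of distance in pc.
AU_KM = 149_597_870.7                     # astronomical unit, km (IAU 2012)
ARCSEC_RAD = math.radians(1 / 3600)       # one arc second in radians
PARSEC_KM = AU_KM / math.tan(ARCSEC_RAD)  # definition of the parsec
LIGHT_YEAR_KM = 299_792.458 * 86_400 * 365.25  # light speed (km/s) x Julian year (s)

def parallax_arcsec(distance_pc):
    """Parallax in arc seconds of a star at the given distance in parsecs."""
    return 1 / distance_pc

print(f"1 parsec = {PARSEC_KM / LIGHT_YEAR_KM:.2f} light years")
print(f"Proxima at 1.3 pc: parallax = {parallax_arcsec(1.3):.2f} arcsec")
```

This gives a parsec of about 3.26 light years and a parallax for Proxima of about 0.77 arc seconds, matching the figures in the text.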

The parsec unit was likely first suggested in 1913 by British astronomer Herbert Hall Turner; the name is a contraction of "the parallax of one arc second."

The Centauri star system is actually three stars, cleverly labeled alpha, beta, and Proxima. Proxima is about .05 parsecs closer to us than the other two, hence its name (Latin for "nearest").

The Centauri alpha/beta binary was actually probably the first star to have its parallax accurately measured. Scottish astronomer Thomas Henderson studied the trigonometric parallaxes of the AB system from April of 1832 until May of the following year. The trouble was, the parallax was so much greater than stars normally show that he was afraid he had made a mistake, so he published only in 1839, after seeing Bessel's results.

There are, of course, new measurements for the very short units, but it is late, and this old teacher needs a break. Good night, all.
