The Standard of Length
The first international standard of length was a bar of a platinum-iridium alloy called the standard meter, which was kept at the International Bureau of Weights and Measures near Paris. The distance between two fine lines engraved near the ends of the bar, when the bar was held at a temperature of 0°C and supported mechanically in a prescribed way, was defined to be one meter.
Historically, the meter was intended to be one ten-millionth of the distance
from the north pole to the equator along the meridian line through Paris.
However, accurate measurements showed that the standard meter bar differs slightly (by about 0.023%) from this value.
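As a quick sanity check on that figure, one can back out the meridian-arc length the 0.023% discrepancy implies. The sketch below is just arithmetic on the numbers quoted above; the ~10,002,300 m result is derived, not a value taken from the text.

# Back out the pole-to-equator arc length implied by the 0.023% figure.
# Only the numbers quoted in the text are used; the result is derived.
discrepancy = 0.00023                # 0.023%, meter bar vs. "ideal" ten-millionth
intended_meter = 1.0 + discrepancy   # one ten-millionth of the arc, in meters
quadrant_m = intended_meter * 1e7    # full pole-to-equator arc
print(f"{quadrant_m:,.0f} m")        # -> ~10,002,300 m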
Figure 1
Because the standard meter is not very accessible, accurate master copies of it were made and sent to standardizing laboratories throughout the world. These secondary standards were used to calibrate other, still more accessible, measuring rods. Thus, until recently, every measuring rod or device derived its authority from the standard meter through a complicated chain of comparisons using microscopes and dividing engines. Since 1959 this statement was also true for the yard, whose legal definition in the United States was adopted in that year to be
1 yard = 0.9144 meter (exactly)
which is equivalent to
1 inch = 2.54 centimeters (exactly)
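Because both definitions are exact, the inch value follows by pure arithmetic. A small sketch using exact rational numbers (Python's fractions module, our choice of illustration) confirms it:

from fractions import Fraction

# 1959 definition: 1 yard = 0.9144 m exactly, and 1 yard = 36 inches.
yard_in_meters = Fraction(9144, 10000)   # exact rational arithmetic
inch_in_cm = yard_in_meters / 36 * 100   # one inch, expressed in centimeters
print(inch_in_cm)                        # -> 127/50, i.e. 2.54 exactly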
The accuracy with which the necessary intercomparisons of
length can be made by the technique of comparing fine scratches using a microscope
is no longer satisfactory for modern science and technology. A more precise and
reproducible standard of length was obtained when the American physicist Albert
A. Michelson in 1893 compared the length of the standard meter with the
wavelength of the red light emitted by atoms of cadmium. Michelson carefully
measured the length of the meter bar and found that the standard meter was
equal to 1,553,163.5 of those wavelengths. Identical cadmium lamps could easily
be obtained in any laboratory and thus Michelson found a way for scientists
around the world to have a precise standard of length without relying on the
standard meter bar.
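Since the numbers are right there, a two-line check (a sketch, nothing more) recovers the wavelength Michelson's count implies:

# Wavelength implied by Michelson's count of 1,553,163.5 waves per meter.
n_wavelengths = 1_553_163.5
wavelength_nm = 1.0 / n_wavelengths * 1e9   # meters -> nanometers
print(f"{wavelength_nm:.2f} nm")            # -> 643.85 nm, red cadmium light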
Despite this technological advance, the metal bar remained
the official standard until 1960, when the 11th General Conference
on Weights and Measures adopted an atomic standard for the meter. The
wavelength in vacuum of a certain orange-red light emitted by atoms of a particular isotope of krypton, ⁸⁶Kr, in electrical discharge was chosen (see Fig. 2). Specifically, one meter was defined to be 1,650,763.73 wavelengths of this light. With the ability to make length measurements to a fraction of a wavelength, scientists could use this new standard to make comparisons of lengths to a precision below 1 part in 10⁹.
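The same check applied to the 1960 definition gives the krypton wavelength:

# Wavelength implied by the 1960 definition: 1,650,763.73 waves per meter.
n_wavelengths = 1_650_763.73
wavelength_nm = 1.0 / n_wavelengths * 1e9   # meters -> nanometers
print(f"{wavelength_nm:.2f} nm")            # -> 605.78 nm, orange-red Kr-86 light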
The choice of an atomic standard offers advantages other
than increased precision in length measurements. The ⁸⁶Kr atoms are available everywhere, are identical, and emit light of the same wavelength. The particular wavelength chosen is uniquely characteristic of ⁸⁶Kr and is sharply defined. The isotope can readily be obtained in pure form.
By 1983, the demands for higher precision had reached such a point that even the ⁸⁶Kr standard could not meet them, and in that year a bold step was taken. The meter was redefined as the distance traveled by a light wave in a specified time interval. In the words of the 17th General Conference on Weights and Measures:
The meter is the length of the path traveled by light in vacuum during a time interval of 1/299,792,458 of a second.
This is equivalent to saying that the speed of light c is
now defined as
c = 299,792,458 m/s (exactly)
Figure 2
This new definition of the meter was necessary because
measurements of the speed of light had become so precise that the
reproducibility of the 86Kr meter itself became the limiting factor.
In view of this, it made sense to adopt the speed of light as a defined quantity and to use it, along with the precisely defined standard of time (the second), to redefine the meter.
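To make the logic of the 1983 definition concrete, here is a minimal sketch turning a measured time of flight into a length; the helper name is illustrative only, not from any standard library:

# Under the 1983 definition, a length is the exact constant c times a
# measured time of flight in vacuum.
C = 299_792_458  # speed of light in m/s, exact by definition

def length_from_flight_time(t_seconds: float) -> float:
    """Length in meters traveled by light in vacuum in t_seconds."""
    return C * t_seconds

# One meter corresponds to a flight time of 1/299,792,458 s, about 3.34 ns.
print(length_from_flight_time(1 / C))   # -> 1.0
print(f"{1e9 / C:.4f} ns")              # -> 3.3356 ns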
The table below shows the range of measured lengths that can be compared with the standard.
Table: Some Measured Lengths