Measure for Measure: Putting Numbers to the World Around Us
July 31, 1998 - November 1, 1998
Where am I? What do I weigh? How fast am I going? What time is it? How much has it rained? How far is it to the store? We ask these questions, and many more, on a daily basis. Most of the time we take the answers for granted. But what did people do before we had exact standards for weights and measurements? And what difference does standardization make to us?
As far back as 3000 BC, people devised early measurements based on practical activities. The stadion (the distance run without getting out of breath, about 200 m), the furlong ("a furrow long," or the distance a horse can pull a plough without stopping for a rest) and the acre (the amount of land two yoked oxen can plough in one day) could easily vary from one village to another, but they had the advantage that they were easy to understand and easy to reproduce approximately.
It became evident, though, that weights and measurements needed to be standardized, not only for scientific and business reasons, but to prevent fraud. Depending on the "skill" of the person doing the measuring, the actual quantity could differ considerably. For this reason, governments stepped in to standardize measuring procedures.
Today, a person can get his or her blood pressure taken, bake a cake, drive a car, fly on a plane, monitor rainfall, mail a letter, or check the time - all with the assurance that the weight or measurement is accurate and uniform.
Welcome to Measure for Measure, where you will see many ordinary - and not-so-ordinary - devices that calibrate our world. We hope the exhibit will give you a better appreciation for the importance of these devices, and of measurement itself, in our daily lives.
Way back before there were clocks... before there were yardsticks... before there were weight scales... people needed to measure things. But not very much! Back then, there were only two times: night and day. And very few numbers were required; as the simple-living South Sea Islanders reputedly counted: one... two... three... many! Why bother keeping track when the head of the household or the tribal chief told everybody what to do, when to do it, and how much to do before quitting? With the advent of structured society, however, we suddenly took a greater personal interest in such questions as: When do we quit work for the day? How many vegetables or fruit do I get? How much of this land is mine? Our Measure for Measure exhibit shows some of the many devices that we have invented to satisfy this need for ever more accurate measuring methods.
Body measurements probably provided the most convenient bases for early linear measurements; early weight units may have derived casually from the use of certain containers or from calculations of what a person or animal could lift or haul.
The Egyptian cubit is generally recognized to have been the most widespread unit of linear measurement in the ancient world. It came into use around 3000 BC and was based on the length of the arm from the elbow to the extended finger tips. The accuracy of the cubit is confirmed by the dimensions of the Great Pyramid of Giza.
The earliest known weight is possibly the Babylonian mina, which in one surviving form weighed about 23 ounces.
In the first millennium BC, commercial domination of the Mediterranean passed into the hands of the Greeks and then the Romans. A basic Greek unit of length was the finger; 16 fingers equaled one foot. The Romans subdivided the foot into 12 inches.
Completely separate from Mediterranean-European history is that of ancient China; yet the Chinese system exhibits all the principal characteristics of Western systems.
A noteworthy characteristic of the Chinese system, and one that represented a substantial advantage over other systems, was its predilection for decimal notation, as demonstrated by foot rulers dating back as far as the sixth century BC.
Probably some of the first measuring standards were portable, convenient to use, and always at hand. The Old Testament measures Noah's ark in cubits: the length of a man's forearm, or the distance from the tip of the elbow to the end of the middle finger. Many of our current standards originated from other body measurements. Our foot-rule started out as the length of a man's foot. Of course, the foot varies in length, so for more accuracy a 16th-century Teutonic surveyor decreed that "16 men will be selected from the group leaving church. The length of the left foot of each shall be taken and..." (averaged); an early use of the number sixteen, which is easily divisible by halving. Once the ancients started using arms and feet for measuring distance, it was only natural that they also thought of using fingers, hands and legs. An inch originally was the width of a man's thumb; it is probably no accident that the French word for thumb, "pouce," is also the word for inch. Twelve times that distance made a foot. Three times the length of the foot was the distance from the tip of a man's nose to the end of his outstretched arm, which very closely approximates what we now call the yard. The Roman legions measured the length of their march by the "pace," a distance equal to two strides, and established the mile from the Latin "mille passus," meaning 1000 paces. Very short lengths were measured using grains of barley (barleycorns) placed on end. However, none of these were positively fixed dimensions or true standards.
With the invention of the balance, the Babylonians made a basic contribution. Instead of comparing the weights of two unknowns on the balance, they compared the weight of the unknown to a set of stones specially prepared for weighing. Archaeologists have found some of these stones finely shaped and polished: probably the world's first weight standards. Medieval English practice kept the same concept, with the "stone" legally accepted as 14 pounds. The Egyptians and the Greeks used a wheat seed as the smallest unit of weight, a standard that was very uniform and accurate for the times. The grain is still in limited use as a standard weight. However, wheat seeds are no longer actually put in the pan of the balance scale. Instead, a weight practically the same as that of an average grain of wheat is assigned to the grain, nominally 1/7000 of a pound (avoirdupois), or about 65 milligrams. The Arabs established a small weight standard for gold, silver and precious stones, which often were part of trade or barter deals. To weigh these small, valuable quantities, they used as a weight standard a small bean called a "karob." This was the origin of the word "carat," which jewelers still use to express the weight of gems and precious metals.
The Romans set the popular standard for heavier weights. The Roman "libra" (which gives us the abbreviation "lb.") was defined as the weight of 7680 grains of wheat, fixing the grain as the basic measure. King Henry VIII went on to define the avoirdupois pound as 7000 grains. The French defined the troy pound (from Troyes, France, an important trading center in the Middle Ages) as 5760 grains for use in weighing jewels and precious metals.

The Dark Ages for Standard Measurements

The Roman Empire had codified both length and weight measurements, established standards, and maintained a reference set of these standards in a temple, to be used by the entire civilized world. Tragically, after the fall of the Roman Empire, these standards were lost and there was no longer any uniformity.
In the Middle Ages, almost every town had its own standards of weights and measures, and there were variations between those of one trade and another. As late as the 18th century in Italy there were more than 200 units of length called the "foot." By the late 1700s, the measurements problem was heading for conflict. There were two outstanding systems in the Western world: the French system, pushed by the science-minded Revolutionary government, and the English system, a follow-on of the Roman system. This period marks the dawn of the Metric system.
The Metric System
As early as 1585 Simon Stevin had already proposed a decimal system of units and money in his book De Thiende. However, it was not until the French Revolution that the climate was conducive to creating a completely new system of units. In 1790 the French Academy of Sciences was commissioned by the National Assembly to design a new system of units for use throughout the world. They decided that this system should have the following attributes:
1. The system should consist of measuring units based on invariable quantities in nature.
2. All units other than the base units should be derived from these base units.
3. Multiples and submultiples of the units should be decimal.
These principles still underpin the modern metric system. France created worldwide interest with this development, and in 1875 seventeen countries subscribed to the Metre Convention. Through this the Bureau International des Poids et Mesures (BIPM) came into being. The BIPM now functions under the guidance of the Conférence Générale des Poids et Mesures (CGPM), which has delegates from all the countries that have subscribed to the convention. Over the years the metric system evolved, and in 1960 at the 11th CGPM the system was officially named the Système International d'Unités, or SI for short. The SI is the logical evolution of the metric system and replaces all previous metric systems. It is a living, dynamic system which is continually being improved to keep pace with developments in science and technology.
Timeline for the Development of the Metric System
ca 1670 A French clergyman, Gabriel Mouton, proposes a standard linear measurement based upon the length of the arc of one minute of longitude on the Earth's surface and divided decimally.
ca 1760 The English develop a unit of length based upon that of a pendulum with a one-second period.
1795 The French government adopts a new system of length: the one ten-millionth part of the distance from the North Pole to the Equator when measured on a straight line running along the surface of the Earth through Paris. The metre, from the Greek word metron, is chosen as the unit name for this length.
1798 Survey completed, setting the length of the meter.
1799 Meter and kilogram reference standards created and deposited in the National Archives.
1812 The French resist converting to the metric system; Napoleon is forced to restore the old units of measure (yards, inches, pounds, quarts).
1840 France mandates the use of the Metric System, hoping to make it universal.
1866 US legalizes use of the Metric System.
1875 Seventeen countries, including the US, sign the Treaty of the Metre.
1889 CGPM (General Conference of Weights and Measures) creates new reference standards and distributes copies to member nations.
1900 Metric System already adopted by 35 countries including all industrialized nations except the British Commonwealth and the US.
1960 The SI (International System) units are adopted. These are built from seven independent base units: meter, kilogram, second, ampere, kelvin, mole, and candela.
1965 Great Britain adopts the Metric System.
1968 South Africa adopts the Metric System.
1969 New Zealand adopts the Metric System.
1970 Canada and Australia adopt the Metric System.
1975 US passes the Metric Conversion Act and establishes the US Metric Board to coordinate the voluntary conversion.
1982 The Office of Metric Programs replaces the Metric Board.
1988 The Omnibus Trade Bill becomes law and requires all federal agencies to use metric units in their procurements, grants, and business activities by 1992.
The History of Map Making
So many ancient peoples used maps that both the ability and the need to make maps appear to be universal. The earliest existing maps were made by the Babylonians about 2300 BC. Cut on clay tiles, they consisted largely of land surveys made for the purposes of taxation.
More extensive regional maps, drawn on silk and dating from the second century BC, have been found in China. One of the most interesting types of primitive map is the cane chart constructed by the Marshall Islanders in the South Pacific Ocean. This chart is made of a gridwork of cane fibers arranged to show the location of islands. The art of map making was advanced in both the Mayan and Inca civilizations, and the Inca, as early as the 12th century AD, made maps of the lands they conquered.
The first map to represent the known world is believed to have been made in the sixth century BC by the Greek philosopher Anaximander. It was circular in form and showed the known lands of the world grouped around the Aegean Sea at the center and surrounded by the ocean.
One of the most famous maps of classical times was drawn by the Greek geographer Eratosthenes about 200 BC. It represented the known world from England on the northwest to the mouth of the Ganges River on the east and to Libya on the south. The map was the first to show transverse parallel lines for equal latitudes. The map also had some meridians of longitude but they were irregularly spaced.
About AD 150 the Egyptian scholar Ptolemy published his geography containing maps of the world. These were the earliest maps to use a mathematically accurate form of conic projection. Because of the curvature of the earth, it is impossible to represent any portion of it on a flat surface without the adoption of a projection.
Following the fall of the Roman Empire, European map making all but ceased. Arabian seamen, however, made and used highly accurate charts during this same time period. Beginning in the 13th century, Mediterranean navigators prepared accurate charts of that sea, usually without meridians or parallels but provided with lines to show the bearings between important ports. In the 15th century, editions of Ptolemy's maps were printed in Europe.
A map produced in 1507 by Martin Waldseemuller, a German cartographer, probably was the first to apply the name America to the newly discovered transatlantic lands. The map, printed in 12 separate sheets, was also the first to clearly separate North and South America from Asia. Waldseemuller used the name America to honor the Italian navigator, Amerigo Vespucci (1454-1512), who first accepted South America as a new continent.
In 1570 the first modern atlas was published by the Flemish map maker, Abraham Ortelius. Gerardus Mercator (1512-94) of Belgium was considered one of the greatest cartographers of all times; the projections he devised for his world map proved invaluable to all future navigators.
The first maps to show compass variation were produced early in the 17th century; the first charts to show ocean currents were made about 1665. A complete topographic survey of France was issued in 1793, soon followed by Great Britain, Spain, Austria, and Switzerland. In the United States the Geological Survey was organized in 1879 for the purpose of making large-scale topographic maps of the entire country.
During the 20th century, map making underwent a series of major technical innovations, such as aerial photography and the use of satellites, which have permitted more accurate mapping of the outline of the United States.
The compass, the principal instrument of navigation, is a device that indicates direction on the Earth's surface. Before the advent of the compass, seamen estimated directions largely with reference to the wind. Two basic types of compasses exist: a magnetic compass and a gyroscopic compass.
The magnetic compass utilizes the force known as magnetism. The earth itself has a magnetized core, two magnetic poles, and lines of force that form a magnetic field. The ancient Chinese were aware that the earth's magnetism, through the medium of the lodestone, could be used to indicate horizontal directions. Mediterranean seamen of the 12th century were probably the first to use a magnetic compass at sea.
There are two types of magnetic compasses: the dry card and the liquid. The dry card compass consists of a magnetized needle mounted on a pivot at the center of a fixed graduated card so as to permit the needle to swing freely in the horizontal plane.
In the liquid compass, the most stable mariner's type, the card is mounted in a sealed bowl filled with a liquid of low freezing point. The buoyancy of the card is adjusted so that it floats, thus ensuring the minimum possible friction between the cap and the pivot. Frictional force between the cap and the pivot reduces the sensitivity of the compass. The liquid compass is used more often than the dry compass.
The needle oscillation encountered in ship and aircraft compasses led to the development of the gyrocompass in the early twentieth century. Because it does not rely on magnetism, it is unaffected by magnetic disturbances. The gyrocompass consists of a rapidly spinning wheel set in a framework that permits it to tilt freely in any direction; since the spinning wheel maintains its orientation, the instrument serves as a reliable direction indicator. It is valuable in airplanes for holding the course in rough air.
Credit for inventing the first practical gyrocompass in 1907 belongs to the German inventor Hermann Anschutz-Kaempfe. The first American gyrocompass, designed by Elmer A. Sperry, was produced in 1911.
What is a Sextant?
A sextant is a hand-held navigational instrument used to determine the altitudes of celestial bodies. It does this by measuring the angle of a celestial body from the horizon. The altitude of the sun at high noon can then be used to determine latitude.
The name comes from the Latin sextans, or "one-sixth." The sextant's arc spans 60 degrees, or one-sixth of a circle.
The sextant was invented by the English Captain John Campbell in 1757. It met the needs of the more scientific navigators, including the great explorer Captain Cook. Before its invention, sailors used the octant, or reflecting quadrant, which could measure angles only up to 90 degrees. The sextant reads angles up to 120 degrees because the double reflection in the instrument doubles the span of its arc.
In the 1770s, Jesse Ramsden of London introduced new sextants that revolutionized the graduation of both circular arcs and linear scales. So accurate were the scales produced by Ramsden's dividing engines that it became possible to make sextants much smaller without losing accuracy.
Surveying is a means of making relatively large-scale, accurate measurements of the Earth's surface. Surveys were undertaken for practical reasons: road building, estate evaluation, and boundary control.
Surveying theory is based on the principles of geometry and trigonometry, and the methods used are based on the principles of physics.
It is probable that surveying had its origin in ancient Egypt. The Great Pyramid of Khufu at Giza, built about 2700 BC, is 755 feet long and 480 feet high. Its nearly perfect squareness and north-south orientation affirm the ancient Egyptians' command of surveying.
Egyptians apparently used rope with knots or marks at uniform intervals for land measurements, wooden rods for distance measurements, levels, and the wooden groma to establish right angles. These simple instruments helped the Egyptians reestablish boundary markers obliterated by the Nile's flood waters.
The Greeks used a form of log line for recording the distances run from point to point along the coast while making their slow voyages from the Indus to the Persian Gulf about 325 BC.
Plane tables were used in Europe in the 16th century, and the principle of graphic triangulation and intersection was practiced by surveyors. In 1620 the English mathematician Edmund Gunter developed a surveying chain, which was superseded only by the steel tape in the beginning of the 20th century.
By the late 18th century modern surveying can be said to have begun. One of the most notable early feats of surveying was the measurement in the 1790s of the meridian from Barcelona, Spain, to Dunkirk, France to establish the basic unit for the metric system of measurement.
In the 20th century, two revolutionary surveying changes have been introduced: photogrammetry, or mapping from aerial photographs (about 1920), and electronic distance measurement, including the adoption of the laser for this purpose as well as for alignment (in the 1960s). Important technological developments benefiting surveying in the 1970s included the use of satellites.
Is a Transit the Same as a Theodolite?
Well, almost. A transit, used for the fundamental determination of star positions, is one of the most important astronomical instruments. Invented by the Danish astronomer Ole Roemer (1644-1710) in 1690, it is a small refracting visual telescope that rotates only on a horizontal axis accurately set in the east-west direction. Unlike other telescopes, the transit is not clock-driven to follow the stars. It can point to any altitude, but can be trained on a star only when that star is due north or south of the observer; that is, on the celestial meridian. The transit is mounted on a tripod to ensure stability for long periods of time, and is provided with levels to determine its tilt and with fixed reference points to determine its orientation. A modern instrument may have an electrical drive, controlled by the observer.
A theodolite is an optical instrument used in surveying to measure horizontal and vertical angles. It is designed on the same principles as the transit; generally, the traditional or American-style instrument, with exposed controls and verniers, is called a transit, and the more modern European model a theodolite.
The theodolite is smaller than a transit and has a telescope that is less than six inches long, compared to about ten inches for the transit. The theodolite telescope uses prisms in order to enlarge the image and has three leveling screws; the transit, on the other hand, uses lenses and has four leveling screws.
Because adjusting and measuring devices are enclosed in a theodolite, its appearance is smooth and streamlined compared to the transit. It has optical micrometers that allow angles to be read more accurately with a single measurement than can be read by repetition with a transit.
Surveying and George Washington
One of our nation's most famous surveyors was our first president, George Washington (1732-99). Born in Virginia, he had little or no formal schooling, but taught himself mathematics and surveying. At the age of 17 he was invited to join a party to survey lands west of the Blue Ridge Mountains. His journey led him to take a lifelong interest in the development of western lands.
In July of 1749 he got his surveying license and that same summer he was appointed official surveyor for Culpeper County. During the next two years he made many surveys for landowners on the Virginia frontier. In 1770 he took part in the survey of 15,000 acres of land in the Ohio Valley.
The Chronometer and John Harrison (1693-1776)
John Harrison, a British carpenter turned clock maker, built the first chronometer (an extremely accurate clock) capable of keeping precise time at sea, making possible the determination of longitude. In locating the longitude of a ship at sea, an error of four seconds will cause an error in position of approximately a mile. Harrison's chronometer determined longitude to within a maximum error of 30 miles, an unheard-of feat at the time.
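The arithmetic behind that four-seconds-per-mile figure can be sketched as follows (the function name and the one-nautical-mile-per-arcminute approximation at the equator are illustrative assumptions, not part of the exhibit):

```python
# The Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour,
# so each second of clock error shifts the computed longitude by
# 15/3600 of a degree, which is a quarter of a minute of arc.

def longitude_error_arcmin(clock_error_seconds: float) -> float:
    """Longitude error, in minutes of arc, for a given clock error."""
    arcmin_per_second = 15.0 * 60.0 / 3600.0  # = 0.25
    return clock_error_seconds * arcmin_per_second

# At the equator one minute of arc spans roughly one nautical mile,
# so a 4-second clock error puts the ship's fix off by about a mile.
error = longitude_error_arcmin(4.0)  # -> 1.0 arcminute
```

A full minute of clock error (60 seconds) would already put the fix off by a quarter of a degree, some 15 miles at the equator.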
Lines of latitude and longitude (the two co-ordinates that specify the location of a place on the earth's surface) began to appear on maps as early as the third century BC. The equator marked the zero-degree parallel of latitude; hence, latitude is fixed by the laws of nature. Finding longitude is a different story, though, as any line drawn from pole to pole may serve as well as any other for a starting line of reference. Early maps had zero-degree longitude running through such diverse places as Africa, Rome, Copenhagen, Jerusalem, St. Petersburg, Paris, Philadelphia, and finally London, its current location.
Gauging latitude was easy for a sailor. All he had to know was the length of the day, or the height of the sun or certain stars above the horizon. To know longitude, however, the sailor needed to know the time aboard ship and also the time at the home port, or another place of known longitude, at that very same moment. This precise knowledge was unattainable: the only accurate timepiece, the pendulum clock, would slow down, speed up, or stop running on a rolling ship.
In 1707, 2000 lives were lost when an English fleet unexpectedly struck rocks off the southwestern tip of England, over 100 miles off course. To prevent any more major tragedies, the British government set up its famed Longitude Act of 1714 and put up a prize of 20,000 pounds (several million dollars in today's currency) for a "Practicable and Useful" means of determining longitude.
Harrison set about winning the prize. He first made several remarkable wooden clocks that were the most accurate timekeepers of their day. Then, in 1737, he devised a "sea clock," which used a pair of dumb-bells linked by springs in place of a swinging pendulum. H1, as it was known, was tested by the Admiralty and proved a great success. Something of a perfectionist, Harrison then proceeded to improve his design, making a number of innovations along the way, including a bimetallic strip to compensate for temperature variations. Twenty-four years later he produced his masterpiece, H4. The clock was an oversized pocket watch, being five inches in diameter and weighing only three pounds.
Unfortunately for Harrison, now almost 70, the government proceeded to add further requirements to win the award. Harrison spent another 10 years working on H5 and in 1773, with the intervention of King George III, he was awarded his money. Shortly before Harrison's death in 1776, Captain James Cook proved the true value of the chronometer on his second voyage to the Pacific, where he used it accurately to map Australia and New Zealand.
Temperatures and the Thermometer
Temperature is the measurement of the relative "hotness" or "coldness" of an object. Temperature is related to heat, but the two are not synonymous terms. Heat is an energy state, and it is measured in work units. Temperature is a measure of the relative heat intensity of an object.
The substance used as a basis for measuring temperature is water. The comparison is made by means of a scale, e.g., centigrade, Fahrenheit, or Kelvin. On each of these scales there are two fixed points, the melting point of ice and the boiling point of water.
The German physicist Gabriel Daniel Fahrenheit (1686-1736) devised one of the earliest scales, with the freezing point at 32 degrees and the boiling point at 212 degrees. The centigrade, or Celsius, scale was invented by the Swedish astronomer Anders Celsius (1701-44). Used throughout most of the world, it assigns a value of 0 degrees to the freezing point and 100 degrees to the boiling point. The Kelvin scale is used in scientific work and was invented by the British physicist William Thomson, 1st Baron Kelvin (1824-1907). Absolute zero is at -273.15 degrees Celsius, and the degree intervals are identical to those of the centigrade scale.
The formula to convert Celsius to Fahrenheit is F = (9/5)C + 32, or in decimal form, F = 1.8C + 32. The formula to convert Fahrenheit to Celsius is C = (5/9)(F - 32).
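As a quick sketch of these two formulas (the function names are just illustrative):

```python
def c_to_f(celsius: float) -> float:
    """Celsius to Fahrenheit: F = (9/5)C + 32."""
    return celsius * 9.0 / 5.0 + 32.0

def f_to_c(fahrenheit: float) -> float:
    """Fahrenheit to Celsius: C = (5/9)(F - 32)."""
    return (fahrenheit - 32.0) * 5.0 / 9.0

# The two fixed points of the scales:
# freezing: c_to_f(0) -> 32.0; boiling: c_to_f(100) -> 212.0
```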
The first instruments which we would recognize as thermometers were made in Florence in the 1650s. Before that there had been instruments which would give a rough idea of temperature changes, using the expansion of air as it was heated to drive a column of water up a tube. But these instruments, known as thermoscopes, were open to the air and were therefore affected by changes in barometric pressure as well as changes in temperature. This made them hard to interpret.
The sealed thermometer, in which a column of liquid expands as it is heated and indicates temperature by the movement of its leading edge up a graduated scale, was invented by Ferdinand II, Grand Duke of Tuscany, in about 1654. The liquid used was alcohol, and the instrument consisted of a glass bulb to which a fine tube was attached.
Most contemporary thermometers have used mercury to indicate temperature because its coefficient of expansion is nearly constant. However, mercury is a pollutant, and the half-gram of mercury in a thermometer is enough to pollute 5 million gallons of water. Hence, people are now encouraged to buy non-mercury thermometers.
Automobile Instruments of Measurement
Several important instruments of measurement are used in the operation of the automobile:
1. The speedometer, to measure the speed of the car.
2. The odometer, to measure the distance driven.
3. The temperature gauge. Watch this one very carefully, for an overheated engine can be quite costly. Note: take a look at the old motometer on display. A motometer is an old type of thermometer mounted on top of the radiator.
4. Oil pressure gauge. No oil pressure will usually result in your needing a new engine.
5. Ammeter, to measure amps. If the generator or alternator is not working, you should have some jumper cables handy.
6. Tire pressure gauge. A necessity in the days of self-serve gas stations.
7. Gas gauge, to measure gallons of gasoline. Pay attention to this measurement, even though many cars today will display on the dashboard how many miles to go before a refill is needed.
All in all, measurements play a very important role in both the manufacture and operation of the automobile of today.
The Standardization of Automobiles
There were no standardized measurements of automobile parts in the late nineteenth and early twentieth centuries. The first autos were largely hand-built, and interchangeability of parts was difficult. This changed with a historic achievement in 1908. C. S. Bennett, the London importer for Cadillac, brought three single-cylinder Cadillacs to the Brooklands racetrack near London. The cars were completely dismantled and the parts scrambled; the cars were then reassembled and run on the Brooklands track, all under the supervision of the Royal Automobile Club. Bennett thus introduced the concept of precision manufacturing and interchangeability of parts to the automobile industry.
Production lines used by Ford and later manufacturers would have been impossible without this concept. Most engine parts, for example, are made with a tolerance of one thousandth of an inch or less.
Even now a tailor called me to his shop
And therewithal took measure of my body.
Comedy of Errors, Shakespeare
How would this measure have been taken? In Shakespeare's time wealthier households possessed their own carefully marked measuring stick, or meteyard. It was often handed down through successive generations. Housewives buying ribbons, laces or fabrics from itinerant salesmen used the measuring stick to make sure they were not being cheated.
For her needlework, the housewife used the stick to make marks on a length of ribbon or tape. When the ribbon became worn or frayed, a new length was marked off, not from the old ribbon, but from the wooden stick. In this way the modern use of the word yardstick is derived, denoting a gauge by which value may be assessed.
On early ribbon measures, figures were seldom used. Long and short lines represented feet and inches and were marked with ink or embroidery. Some measures were marked N, HQ, Q, H, and Y, standing for nail, half quarter, quarter, half, and yard. A nail was 2 1/4 inches and remained in use until the mid-19th century, mainly for measuring cloth. The metric system came into use in Europe in 1799, but the yard had been a standard measurement of length ever since King Henry I of England (1100-1135) decided that it should be the length of his arm!
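Those ribbon-measure marks translate into inches as follows (a small sketch; the constant and dictionary names are arbitrary):

```python
# The ribbon-measure marks described above, expressed in inches.
# A yard is 36 inches; each mark is the labeled fraction of a yard,
# so a nail (1/16 yard) is the 2 1/4 inches mentioned in the text.
YARD_INCHES = 36.0
MARKS = {
    "N": YARD_INCHES / 16,  # nail = 2.25 in
    "HQ": YARD_INCHES / 8,  # half quarter = 4.5 in
    "Q": YARD_INCHES / 4,   # quarter = 9.0 in
    "H": YARD_INCHES / 2,   # half = 18.0 in
    "Y": YARD_INCHES,       # yard = 36.0 in
}
```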
As early as the 17th century, measures were housed in decorative cases of wood, ivory, brass and bone, and could be wound in with a handle. Later models used a spring mechanism for the re-winding. In the early 20th century measures were often uncased, yellow strips of tape, the first foot of which was stretched over stiff whalebone or steel, combining the functions of flexible tape and rigid ruler.
With the spread of numeracy and calculating abilities among working- and middle-class women, accurately marked measuring tapes, yardsticks, hem markers, stitch gauges, pattern boards and the like became common products of the late 19th century.
The width of a fabric controls how a dress pattern is laid out and affects what LENGTH must be purchased.
The thread count is the sum of two measurements: the number of yarns per inch horizontally plus the number of yarns per inch vertically. Bed linen descriptions often give the thread count: 180 for everyday sheets and up to 300 for luxury sheets.
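As a minimal arithmetic sketch of the rule above (the function name is ours, not a textile-industry term):

```python
def thread_count(warp_per_inch: int, weft_per_inch: int) -> int:
    """Thread count: yarns per inch horizontally plus yarns per inch vertically."""
    return warp_per_inch + weft_per_inch

print(thread_count(90, 90))    # 180 -- an everyday sheet
print(thread_count(150, 150))  # 300 -- a luxury sheet
```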
The weight of a fabric determines, to a considerable extent, how it feels, its appearance and its comfort. Many clothing catalogs give some information on weight:
Men's seersucker pants: "the cotton fabric is a breezy 3.5 oz. weight."
"This skirt features 6.7 oz. combed cotton twill."
Jeans made with "sturdy 13.5 oz. denim."
. . .All useful information for a savvy consumer.
Yarn size information is given by at least one clothing catalog:
A man's cotton shirt "made with 50s single ply yarns."
Yarn sizing is complicated. For cotton, the count is the number of standard 850-yard skeins that together weigh one pound. Numbers range from 1s (very heavy) through 30s (medium) to 160s (very fine).
The denier is used for measuring silk and man-made fibers. It is equal to the weight in grams of 9000 meters of yarn and so increases with the coarseness of the yarn. Some hosiery has the denier number printed on the label. Warm tights, for example, might be 125 denier, while sheer stockings might be 15 denier.
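The two sizing rules can be expressed as simple formulas. The sketch below, with hypothetical function names and sample figures, follows the definitions given above (cotton count as the number of standard 850-yard skeins per pound; denier as the weight in grams of 9000 meters of yarn):

```python
def cotton_count(skein_weight_lb: float) -> float:
    """Cotton count: how many standard 850-yard skeins weigh one pound."""
    return 1.0 / skein_weight_lb

def denier(sample_weight_g: float, sample_length_m: float) -> float:
    """Denier: the weight in grams of 9000 meters, scaled up from a sample."""
    return sample_weight_g * 9000.0 / sample_length_m

print(round(cotton_count(1 / 30)))  # 30 -- a medium "30s" yarn
print(denier(0.75, 450))            # 15.0 -- sheer-stocking weight
```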
Measurements in knitting instructions in the first pattern books, published during the last century, were vague, to say the least.
Gauge and tension are the two measurements used in knitting. Unfortunately they were often used interchangeably in knitting instructions.
Gauge is the number of stitches and the number of rows to the inch. It varies with the thickness of the yarn, the size of the needles and the tension used by the knitter.
"You should always check your gauge exactly by knitting up a square of the pattern, before starting to knit any garment. If you get fewer stitches than quoted, try needles a size larger." Anonymous
The sizing of knitting needles has traditionally varied from country to country. In Britain, sizing was based on the Steel Wire Gauge, with large needles having a small size number. The reverse is true in the United States, where large needles have large size numbers. Today many needles are given two sizes, the old number and the actual diameter in millimeters. For example, the long blue needles made by Boye are sizes 10 1/2 and 6.5 mm.
Early needle gauges, such as Walker's Bell Gauge and Chamber's Gauge, were similar to modern ones with graduated holes for sizing the needles. Despite the availability of these measuring devices, typical instructions suggested "use fine knitting needles," "knit with heavy pins," "work with fine, sharp, ivory needles," or "take large bone or wood pins." No wonder the size of the finished garment was quite haphazard. Instructions were given "for large man" or "for a medium boy." Even expert knitters were at a loss to predict the dimensions of the finished product.
Tension is a force, the stress that the fabric structure puts on the yarn. It is controlled by the type of needle and yarn and the "pull" put on the yarn by the knitter, either by knitting "tightly" or by the way in which the yarn is fed to the working finger. Two unusual ways of keeping the yarn in tension are to pass it behind the neck or to draw it through a hook pinned onto the front of the clothing. Tension is theoretically measurable but in fact it is intuitive and expressed through the fingers. This is where experience counts.
It is interesting that any piece of hand knitting improves with time, and only wildly irregular stitches will fail to become even. This may be one reason knitters of the past are praised for their even stitches.
Dress Patterns in the Nineteenth Century
The home seamstress knew that an attractive garment of good fit required a well-drafted clothing pattern. Those who had money to spare could have a custom pattern drafted, but many more labored over patterns of their own devising.
In 1864, Demorest's Magazine, a fashion periodical, offered paper patterns for jackets and waists "custom fit" to a specific figure if the client mailed in 25 cents and her measurements. Demorest's also mass-produced a wide range of one-size-only patterns which a buyer would alter to fit her figure. Ebenezer Butterick was the first American to successfully mass-produce tissue-paper patterns that were sized according to a system of proportional grading.
By 1868, E. Butterick & Co. offered 15 standard pattern sizes in a variety of styles, and in 1870 James McCall was advertising mass-produced patterns in his magazine The Queen. These graded patterns were tremendously successful: Butterick alone was reported to have sold six million in 1871, and by 1900 Butterick and McCall had many competitors.
Measurements And Quilts
One might think of a quilt as a cloth sandwich: fabric on top, often made of small patches, a filling for warmth, and a fabric backing, all held together with rows of quilting stitches.
Accurate measuring is needed in constructing the geometric shapes of the pieced designs and in marking out the quilting patterns. Commercial patterns have been available since the mid-1800s and by the 1920s designs were pictured in many newspapers. They could be ordered for 10 cents but the thrifty women of the depression often clipped the illustrations and measured out their own templates. Thin cards or sandpaper were easy to mark and cut.
Accurately machined metal and plastic templates are now sold, but precise measuring and cutting of patterns at home is made easy with quilters' rulers, plastic graph paper and rotary cutters. When the three layers of the quilt have been prepared, the design for the quilting stitches is marked on the top.
In 1929 Ruth Finley wrote, "Straight quilting, such as cross bars and diamonds, was marked with a yardstick and pencil, or by means of a chalked string, stretched tightly across the quilt top and then snapped against the top."
Also from the depression era when few homes would possess a pair of compasses (which were used for making circular designs), came this quote: "My mother did a lot of quilting. She'd take a string and a piece of chalk and draw arcs on the quilt top." Other circular designs were created with cups, saucers, and plates. A wineglass was a popular marker, the stem and foot making a convenient handle.
The electrocardiograph is also called the ECG or EKG machine (kardia means "heart"; graphein, "to write").
The electrocardiograph is an instrument that measures electrical activity spreading over the heart during each contraction and relaxation. The first electrocardiograph was devised by Willem Einthoven (1860-1927) of the University of Leyden in the Netherlands.
This is how Einthoven described his invention in 1903:
"This instrument the string galvanometer is essentially composed of a thin silver-coated quartz filament, which is stretched like a string, in a strong magnetic field. When an electric current is conducted through this quartz filament, the filament reveals a movement which can be observed and photographed by means of considerable magnification . . ."
Einthoven's original machine weighed 600 pounds and required five people to operate. Over the next decades this instrument was greatly simplified and now fits into a small box the size of a laptop computer. The tracings obtained are referred to as electrocardiograms, ECGs for short, or EKGs.
ECG tracings include several waves, arbitrarily called P Q R S T by Einthoven.
P waves record the spreading of the electrical impulse through the right and left atria (the heart's upper chambers).
Q waves are a recording of the same impulse going down the right and left bundle.
R and S waves result from the depolarization of the ventricles (the heart's lower chambers).
T waves reflect the repolarization of the ventricles.
By analyzing electrocardiograms, physicians are able to diagnose cardiac disorders, in particular arrhythmia (rhythm disorders) and myocardial infarctions, also known as heart attacks.
Myocardial infarctions typically begin with peaked T waves, followed by ST segment elevations, then Q wave development and T wave inversion. These changes may occur over periods of several hours or of several days.
A sphygmomanometer is an instrument designed to measure arterial blood pressure. The word combines the Greek sphygmos, "pulse," with manometer, a term of Greek origin for a pressure-measuring device.
The arterial blood pressure was first measured in 1733 by the Rev. Stephen Hales, an English pastor interested in the physical sciences. In 1896 a mercury manometer, essentially the prototype of today's model, was introduced by the Italian Scipione Riva-Rocci.
It was not until 1905 that the Russian Nikolai Korotkoff distinguished between:
1. the first sound, heard over an artery, when the constriction blocking an artery is gradually diminished, allowing blood to go through. This corresponds to the period of maximum ejection called systole.
2. the second sound, or in fact, the absence of sound which occurs during diastole, that fraction of time when the heart is at rest between two heartbeats.
The more important of these two measurements is the diastolic blood pressure. A normal blood pressure is 120/80 mm. of mercury (120 systolic and 80 diastolic). A diastolic pressure between 90 and 100 mm. of mercury is considered borderline and one above 100 is considered high blood pressure. High blood pressure may result in vascular accidents in the smaller and more fragile arteries in the body, as in cerebrovascular accidents in the brain, also called strokes.
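The thresholds quoted above can be restated as a small classification rule. This sketch uses the exhibit's cutoffs as written, not current medical guidance, and the function name is ours:

```python
def classify_diastolic(mm_hg: float) -> str:
    """Classify a diastolic reading per the thresholds in the text:
    90-100 mm Hg is borderline, above 100 is high blood pressure."""
    if mm_hg > 100:
        return "high"
    if mm_hg >= 90:
        return "borderline"
    return "normal"

print(classify_diastolic(80))   # normal
print(classify_diastolic(95))   # borderline
print(classify_diastolic(105))  # high
```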
The Druggist's Hand-Book
The Druggist's Hand-Book of Private Formulas by John H. Nelson was printed in 1882 in order to "place before every druggist reliable formulas for preparing well-known medicines not given in the U.S. Dispensatory or Pharmacopoeia. These preparations...have become so universal that there is scarcely a druggist within the length and breadth of the land who does not have more or less sale for them; and to be enabled to prepare them is not only a source of satisfaction, but is also very desirable on account of the increase of profits."
The following are some of the formulas printed in the book. Most of them are not familiar to us today, and not only do we wonder as to their use, but also as to their safety.
Elixir of Bismuth, Pepsin, and Strychnia
Elixir of Bismuth and Pepsin, 15 1/2 ounces;
Solution of Strychnia, 1/2 ounce.
Mix them thoroughly.
(Bismuth and Pepsin are to aid digestion, and Strychnia is used to induce vomiting.)
Wine of Iron (Vinum Ferri)
Steel Filings, 2 ounces;
Sherry Wine, 32 ounces.
Macerate for thirty days, agitating occasionally, and filter through paper.
(This was used for the treatment of anemia.)
Jackson's Cough Syrup
Syrup of Rhubarb, 4 ounces;
Syrup of Ipecac, 4 ounces;
Syrup of Senega, 4 ounces;
Syrup of Morphia, 12 ounces.
Essence of Anise, 1/2 ounce;
Syrup of Balsam of Tolu, U.S., 7 1/2 ounces.
(Although commonly prescribed for infants, this mixture, because of the morphine, was occasionally fatal.)
The Mullen Paper Tester
Although this paper tester is no longer manufactured, the company itself is still in business. The following quote is taken from the Perkins Catalog, p. 133, ca. 1902:
"This machine is simply a small hydraulic press, with a High Class Standard Pressure Gauge attached' for registering the pressure exerted to break the paper. The paper is clamped over one end of a cylinder filled with liquid, with a flexible rubber diaphragm between the paper and liquid, put in such a way as not to require pressure to actuate it, and serving only to keep the liquid from moistening the paper. It will be readily seen that by this method the liquid, which is the breaking force, must conform perfectly to any irregularity which may happen to exist in the surface of the paper itself, or by reason of its not being properly clamped, and therefore the pressure will be exerted uniformly over the entire surface of the paper exposed.
By turning a hand-wheel at the other end of the cylinder, pressure is brought to bear on the paper and gauge at the same time, and through the same medium, viz., the liquid.
"As everyone familiar with hydraulics knows, there is no chance for variation caused by friction or lost motion, as in a tester where the force is transferred from the point where the paper is broken to an indicator by a complicated system of levers and springs.
When the paper breaks, the hand on the gauge remains stationary, until reset by a special attachment similar to that of a stop watch, registering the exact number of pounds per square inch required to break the paper.
This is the very latest improved machine, made with the best possible workmanship, has nickel trimmings, and is very handsome and highly ornamental.
"We claim that the results obtained from the Mullen Tester are no arbitrary scale, set up by us and meaning nothing in particular, except when spoken of in connection with this particular machine, but are the adopted standard the world over of all pressure registering gauges, viz., pounds per square inch. In Maine or California the result will be the same.
"We claim that two samples of paper, uniformly made of the same stock, of equal weight, thickness and finish, tested on this machine, will show exact uniform tests, w hether all the tests are made on the same machine or each on a separate one. Also, that if the two samples are not uniformly made, etc., the irregularity of the tests will show it.
The Mullen Paper Tester is the recognized standard among the largest mills, paper dealers, lithographers, printers, publishers, etc., throughout this country. Send for special catalog of the Mullen Paper Tester, giving further particulars regarding same, together with a large list of the well known houses using this machine exclusively."
"Price $150.00, less discount. Code Word, Mullen.
B. F. Perkins & Son, Holyoke, Mass., U.S.A."
The plumb line is mentioned quite frequently in this exhibit, as it is used with many instruments. Egyptian builders as far back as 3000 BC were the first to use a plumb line, a weighted cord. They knew that their constructions stood exactly upright when they were parallel with the line.
The heaviest metal, and the heaviest element, is OSMIUM. A 12-inch cube of osmium would weigh about 1,345 pounds, or as much as 10 adults, each weighing 134 pounds. A 12-inch cube of LITHIUM, the lightest metal, would tip the scales at only 32 pounds - not much heavier than the average two-year-old child.
The heaviest living thing on Earth is the General Sherman giant SEQUOIA tree, in Sequoia National Park, California. It weighs an estimated 2,765 tons.
Made of iron, the Eiffel Tower in Paris, completed in 1889, weighs 9,653 tons. Every seven years the tower is repainted with 5.5 tons of paint. The Empire State Building, built between 1929 and 1931 in New York City, weighs 365,000 tons and is made of concrete and steel. The Great Pyramid of Egypt, made from more than 2,000,000 blocks of limestone, weighs 5,750,000 tons!
The average pig weighs about 423 pounds. An adult bull African elephant, the largest animal on land, weighs about 5.5 tons, about the same as 26 pigs. A fully grown blue whale, the largest animal on Earth, weighs about 143 tons, about as much as 26 bull elephants.
The Statue of Liberty, excluding the pedestal, is surprisingly light, weighing 225 tons. Dedicated in 1886, the statue is a thin layer of copper over an iron framework. The world's largest bell, the Tsar Kolokol bell at the Kremlin in Moscow, Russia, weighs 223 tons, almost the same as the Statue of Liberty. Cast in bronze in 1735, it cracked, and has never rung.
Gold is one of the heaviest of all metals. It is estimated that all the gold ever mined would weigh a total of about 165,347 tons. That sounds like a lot, but in fact it would only make a solid block about the size of a tennis court.
HYDROGEN is the lightest gas, RADON the heaviest. Graf Zeppelin II , one of the two largest ever airships, held almost 7,062,940 cubic feet of hydrogen. Filled with radon, it could not have flown.
The OSTRICH is the biggest bird on Earth, but, although it can run fast (small chicks can reach 30 mph) it is unable to fly. The heaviest bird that can fly is the BUSTARD, which can weigh as much as the average six-year-old child (about 46 pounds) and more than 13,000 times as much as a BEE HUMMINGBIRD.
From 20 inches and 7.5 pounds when born, the average human increases in height by slightly more than three times, and increases in weight about 18 times. Although there are slight fluctuations, the growth rates of boys and girls remain similar until early adulthood, when men's rate overtakes women's.
The newborn kangaroo, or joey, is about the size of a paper clip and weighs only 0.03 ounces. By the time it is fully grown (about 44-46 pounds) it is some 30,000 times heavier than when it was born.
From "Compton's Pictured Encyclopedia," 1928 edition
This book was written in 1688 by John Love as a guide to surveying in the American colonies. It is probably one of the books that George Washington used to teach himself surveying.
About 15 BC the Roman architect and engineer Vitruvius mounted a large wheel of known circumference in a small frame, in much the same fashion as the wheel is mounted on a wheelbarrow; when it was pushed along the ground by hand it automatically dropped a pebble into a container at each revolution, giving a measure of the distance traveled. It was, in effect, the first odometer.
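Vitruvius's device reduces to one formula: distance equals the number of pebbles dropped times the wheel's circumference. A minimal sketch of that arithmetic (the wheel size and pebble count here are hypothetical):

```python
import math

def distance_traveled_ft(wheel_diameter_ft: float, pebbles: int) -> float:
    """One pebble drops per revolution, so distance = pebbles x circumference."""
    return pebbles * math.pi * wheel_diameter_ft

# A 4-foot-diameter wheel that has dropped 100 pebbles:
print(round(distance_traveled_ft(4.0, 100)))  # 1257 feet
```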
Nini Kurhiraman Gen Leininger
Dr. Fernando Vescia
MOAH thanks the following organizations and individuals for their support of this exhibit:
Civic Bank of Commerce
Davis Instrument
Etak
Hewlett-Packard
Stanford Linear Accelerator
Trimble Navigation
U.S. Geological Survey
Clint Steele, Geologist
Mara Tongue, Cartographer
All trademarks, tradenames and proprietary images are the property of their owners.
This page last updated May 3, 2001. Original content Copyright © 2000, 2001; Museum of American Heritage.