it flawless. The Oxford English Dictionary illustrates it nicely with a quotation from the James Smith classic Panorama of Science and Art, first published in 1815, that “to grind one surface perfectly flat, it is … necessary to grind three at the same time.” While it has to be assumed that this basic principle had been known for centuries, it is commonly believed that Henry Maudslay was the first to put it into practice, and create thereby an engineering standard that exists to this day.
So accurate was Henry Maudslay’s bench micrometer that it was nicknamed “the Lord Chancellor,” as no one would dare have argued with it.
Photograph courtesy of the Science Museum Group Collection.
Three is the crucial number. You can take two steel plates and grind them and smooth them to what is believed to be perfect flatness—and then, by smearing each with a colored paste and rubbing the two surfaces together and seeing where the color rubs off and where it doesn’t, as at a dentist’s, an engineer can compare the flatness of one plate with that of the other. Yet this is a less than wholly useful comparison—there is no guarantee that both will be perfectly flat, because the errors in one plate can be accommodated by errors in the other. Let us say that one plate is slightly convex, that it bulges out by a millimeter or so in its middle. It may well be that the other plate is concave in just the same place, and that the two plates then fit together neatly—giving the false impression that each is as flat as the other. Only by testing both these plates against a third, and by performing more grinding and planing and smoothing to remove all the high spots, can absolute flatness (with the kind of near-magical properties displayed by my father’s gauge blocks) be assured.
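The logic of the third plate can be sketched as a toy calculation. Each plate is reduced to a single signed deviation at its center (positive for a convex bulge, negative for a concave dish); the one-millimeter figures echo the example above, and the single-number model of a plate is a deliberate simplification.

```python
# Toy model: each plate is one signed deviation-from-flat at its center,
# in millimeters (positive = convex bulge, negative = concave dish).
def pairwise_gap(dev_a, dev_b):
    """Mismatch when two plates are laid face to face.

    A convex bulge on one face nests into an equal concave dish on the
    other, so the mismatch is the sum of the two signed deviations."""
    return dev_a + dev_b

a, b, c = 1.0, -1.0, 1.0   # A bulges 1 mm, B is dished 1 mm, C bulges 1 mm

gap_ab = pairwise_gap(a, b)  # 0.0: A and B mate neatly, seeming "flat"
gap_ac = pairwise_gap(a, c)  # 2.0: the third plate exposes the hidden error
```

Two plates can thus pass the colored-paste test against each other while neither is flat; only when every pair drawn from the three mates with zero mismatch must all three deviations be zero.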
AND THEN THERE was the measuring machine, the micrometer. Henry Maudslay is generally also credited with making the first of this kind of instrument, most particularly one that had the look and feel of a modern device. In fairness, it must be said that a seventeenth-century astronomer, William Gascoigne, had already built a very different-looking instrument that did much the same thing. He had embedded a pair of calipers in the eyepiece of a telescope. With a fine-threaded screw, the user was able to close the needles around each side of the image of the celestial body (the moon, most often) as it appeared there. A quick calculation, involving the pitch of the screw in inches, the number of turns needed for the caliper to fully enclose the object, and the exact focal length of the telescope lens, would enable the viewer to work out the “size” of the moon in seconds of arc.
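Gascoigne’s arithmetic can be sketched in a few lines; the screw pitch, turn count, and focal length below are hypothetical round numbers chosen for illustration, not his actual figures.

```python
import math

def angular_size_arcsec(pitch_inches, turns, focal_length_inches):
    """Angular size of an object measured with Gascoigne-style screw calipers.

    The screw advances the caliper points by (pitch * turns) inches;
    dividing that separation by the telescope's focal length gives the
    subtended angle in radians (small-angle approximation), converted
    here to seconds of arc."""
    separation = pitch_inches * turns
    angle_rad = separation / focal_length_inches
    return math.degrees(angle_rad) * 3600

# Hypothetical figures: a 1/50-inch-pitch screw turned 25 times,
# in a telescope of 60-inch focal length:
theta = angular_size_arcsec(1 / 50, 25, 60)  # about 1,719 arcseconds
```

That works out to roughly twenty-nine minutes of arc, close to the moon’s apparent diameter, which is why such round figures are plausible for a lunar measurement.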
A bench micrometer, on the other hand, would measure the actual dimension of a physical object—which was exactly what Maudslay and his colleagues would need to do, time and again. They needed to be sure the components of the machines they were constructing would all fit together, would be made with exact tolerances, would be precise for each machine and accurate to the design standard.
As with Gascoigne’s invention of a century and a half before, the bench micrometer’s measurement was based on the use of a long and skillfully made screw. It employed the basic principle of a lathe, except that instead of having a slide rest with cutting or boring tools mounted upon it, there were two perfectly flat blocks, one attached to the headstock, the other to the tailstock, with the gap between them opened or closed by a turn of the leadscrew.
And the width of that gap, and of any object that fitted snugly between the two flat blocks, could be measured—the more precisely if the leadscrew was itself made with consistency along its length, and the more accurately if the leadscrew was very finely cut and could advance the blocks toward one another slowly, in the tiniest increments of measurable movement.
Maudslay tested his own five-foot brass screw with his new micrometer and found it wanting: in some places, it had fifty threads to the inch; in others, fifty-one; elsewhere, forty-nine. Overall, the variations canceled one another out, and so it was useful as a leadscrew, but because Maudslay was so obsessive a perfectionist, he cut and recut it scores of times until, finally, it was deemed to be wholly without error, good and consistent all along its massive length.
The micrometer that performed all these measurements turned out to be so accurate and consistent that someone—Maudslay himself, perhaps, or one of his small army of employees—gave it a name: the Lord Chancellor. It was pure nineteenth-century drollery: no one would ever dare argue with or challenge the Lord Chancellor. It was a drily amusing way to suggest that Maudslay’s was the last word in precision: this invention of his could measure down to one one-thousandth of an inch and, according to some, maybe even one ten-thousandth of an inch: to a tolerance of 0.0001 inch.
In fact, with the device’s newly consistent leadscrew sporting one hundred threads per inch, numbers hitherto undreamed of could be achieved. Indeed, according to the ever-enthusiastic colleague and engineer-writer James Nasmyth, who so worshipped Maudslay that he eventually wrote a rather too admiring biography, the fabled micrometer could probably measure with accuracy down to one one-millionth of an inch. This was a bit of a stretch. A more dispassionate analysis performed much later by the Science Museum in London goes no further than the claim of one ten-thousandth.
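The arithmetic behind these claims is simple to sketch. With one hundred threads to the inch, one full turn of the leadscrew advances the measuring faces by a hundredth of an inch; how much finer the instrument could resolve depends on how small a fraction of a turn could be read, and the one-part-in-a-hundred figure below is an assumption chosen to match the Science Museum’s ten-thousandth.

```python
def micrometer_resolution(threads_per_inch, readable_fractions_per_turn):
    """Smallest advance readable on a screw micrometer, in inches.

    One full turn advances the faces by 1/threads_per_inch; if a turn
    can be subdivided into `readable_fractions_per_turn` steps, each
    step moves the faces by that fraction of the thread pitch."""
    advance_per_turn = 1 / threads_per_inch
    return advance_per_turn / readable_fractions_per_turn

# 100 threads per inch, a turn read to one part in 100 (an assumption):
res = micrometer_resolution(100, 100)  # 0.0001 inch

# Nasmyth's millionth-of-an-inch claim would require reading each turn
# to one part in 10,000 -- which is why it looks like a stretch.
```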
And this was only 1805. Things made and measured were only going to become more precise in the years ahead, and they would do so to a degree that Maudslay (for whom an abstraction, the ideal of precision, was perhaps the greatest of his inventions) and his colleagues could never have imagined. Yet there was some hesitancy. A short-lived hostility to machines—which is at least a part of what the Luddite movement represented, a mood of suspicion, of skepticism—briefly gave pause to some engineers and their customers.
And then there was that other familiar human failing, greed. It was greed that in the early part of the nineteenth century played some havoc with precision’s halting beginnings across the water, in America, to where this story now moves.
(TOLERANCE: 0.000 01)
A Gun in Every Home, a Clock in Every Cabin
To-day we have naming of parts. Yesterday,
We had daily cleaning. And to-morrow morning,
We shall have what to do after firing. But to-day,
To-day we have naming of parts.
—HENRY REED, “NAMING OF PARTS” (1942)
He was a soldier, his name unknown or long forgotten, a lowly young volunteer in Joseph Sterrett’s Fifth Baltimore Regiment. It was August 24, 1814, and I imagine the youngster was probably sweating heavily, his secondhand wool uniform patched and ill fitting and hardly suitable for the blazing late-summer sun.
He was waiting for the fighting to begin, for battle to be joined. He was hiding behind a tumbled stone wall outside a cornfield, not entirely certain where he was, though his sergeant had suggested he was in a small port city named Bladensburg, connected to the sea by a branch of the Potomac that led into the Chesapeake Bay. British forces, the word went, had landed there from ships and were now rapidly advancing from the east. Washington, the capital of his country, a country now not even forty years old as an independent nation, was eight miles to the west behind him, and he was part of a force of six thousand that had been deployed to protect it. Whispers along the line held that President James Madison himself was on the Bladensburg battlefield, determined to make sure the Britons were made to run back to their vessels and flee for their lives.
The young man doubted he would be of much use in the coming battle, for he had no gun—not a gun that worked, anyway. His musket, a new-enough Springfield 1795 model, had a broken trigger. He had fractured it, cracked the guard, and so ruined the trigger during a previous battle, an earlier skirmish of what they were starting to call the War of 1812.
In all other ways