History: Antimony Metal
Unlike many minor metals, antimony has been used by humans for millennia.
History of Antimony
Early Egyptians used forms of antimony in cosmetics and medicines around 5,000 years ago. Ancient Greek doctors prescribed antimony powders for the treatment of skin disorders, and during the Middle Ages antimony was of interest to alchemists, who gave the element its own symbol. It has even been suggested that Mozart's death in 1791 was a result of excessive consumption of antimony-based medicines.
According to some of the first metallurgy books published in Europe, crude methods for isolating antimony metal were likely known by Italian chemists over 600 years ago.
One of antimony's earliest metallic uses came in the mid-15th century when it was added as a hardening agent in cast metal printing type used by Johannes Gutenberg's first printing presses.
By the 1500s, antimony was reportedly being added to alloys used to produce church bells because it resulted in a pleasant tone when struck.
In the mid-17th century, antimony was first added as a hardening agent to pewter (an alloy of lead and tin). Britannia metal, an alloy similar to pewter, which is made up of tin, antimony, and copper, was developed shortly thereafter, first being produced around 1770 in Sheffield, England.
More malleable than pewter, which had to be cast into form, Britannia metal was preferred because it could be rolled into sheets, cut, and even turned on a lathe.
Britannia metal, which is still used to this day, was initially used to make teapots, mugs, candlesticks, and urns.
Around 1824, a metallurgist named Isaac Babbitt became the first US producer of table utensils made from Britannia metal. But his biggest contribution to the development of antimony alloys did not come until 15 years later when he began experimenting with alloys to reduce friction in steam engines.
In 1839, Babbitt created an alloy composed of 4 parts copper, 8 parts antimony, and 24 parts tin, which would later come to be known simply as Babbitt (or Babbitt metal).
In 1784, British General Henry Shrapnel developed a lead alloy containing 10-13 percent antimony that could be formed into spherical bullets and used in artillery shells. As a result of the British military's adoption of Shrapnel's technology in the 19th century, antimony became a strategic war metal. 'Shrapnel' (the ammunition) was widely used during World War I, and global production of antimony more than doubled as a result, reaching a peak of 82,000 tons in 1916.
Following the war, the automobile industry in the US stimulated new demand for antimony through lead-acid batteries, in which antimony is alloyed with lead to harden the grid plate material. Lead-acid batteries remain the largest end use for metallic antimony.
Other Historical Antimony Uses
In the early 1930s, the local government in Guizhou province, being short of gold, silver, or any other precious metal, issued coins made from an antimony-lead alloy. Half a million coins were reportedly cast, but being soft and prone to deterioration (not to mention toxic), the antimony coins did not catch on.