The Wedgwood scale (°W) is an obsolete temperature scale that was used to measure temperatures above the boiling point of mercury, 356 °C (673 °F). The scale and the associated measurement technique were proposed by the English potter Josiah Wedgwood in the 18th century. The measurement was based on the shrinkage of clay when heated above red heat; the shrinkage was evaluated by comparing heated and unheated clay cylinders. It was the first standardised pyrometric device. The scale set 0 °W at 1,077.5 °F (580.8 °C) and had 240 steps of 130 °F (72 °C) each. Both the origin and the step size were later found to be inaccurate.
The boiling point of mercury limits the mercury-in-glass thermometer to temperatures below 356 °C, which is too low for many industrial applications such as pottery, glass making and metallurgy.
To solve this problem, in 1782 Wedgwood created an accurately scaled pyrometric device, with details published that year in the Philosophical Transactions of the Royal Society of London (Vol. LXXII, part 2). This work led to his election as a fellow of the Royal Society.[1][2][3][4][5][6][7]
A 0.5-inch-diameter cylinder made from pipe clay was first dried at the temperature of boiling water to prepare it for the oven whose temperature was to be measured. During the annealing, sintering (merging) of the fine clay particles caused the cylinder to contract. After cooling, the temperature was estimated from the difference in diameter before and after heating, assuming that the contraction was linear with temperature.[8]
To simplify the temperature calculation, Wedgwood built a gauge from which the temperature could be read directly. Two metal bars with scales on them were fixed to a metal plate and inclined at a small angle to each other, so that the gap between them tapered from 0.5 inches at the upper end to 0.3 inches at the lower end. The gap was divided into 240 equidistant parts. An unheated piece of clay fitted the 0.5-inch gap, giving the zero reading. After annealing, the shrunken clay cylinder would slide further down the tapering gap, and the temperature could be read off the scales at the point where it stopped.[9][10]
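The reading described above amounts to a linear interpolation along the tapering gap. A minimal sketch in Python of that relationship (the function name and the error check are illustrative assumptions; Wedgwood simply read the value off the engraved scales):

```python
def wedgwood_reading(diameter_in):
    """Estimate the °W reading from a fired cylinder's diameter (inches).

    The gauge's gap tapers linearly from 0.5 in (reading 0 °W) to
    0.3 in (reading 240 °W), so the reading is a linear interpolation
    between those two endpoints.
    """
    if not (0.3 <= diameter_in <= 0.5):
        raise ValueError("diameter outside the gauge's 0.3-0.5 in range")
    return (0.5 - diameter_in) / (0.5 - 0.3) * 240
```

On this model, a cylinder shrunk to 0.4 inches, halfway along the gap, would read 120 °W.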
The origin on the Wedgwood scale (0 °W) was set at the onset temperature of red heat, 1,077.5 °F (580.8 °C). The scale had 240 steps of 130 °F (72 °C) and extended up to 32,277 °F (17,914 °C).[8][11] Wedgwood tried to compare his scale with other scales by measuring the expansion of silver as a function of temperature. He also determined the melting points of three metals, namely copper (27 °W or 4,587.5 °F (2,530.8 °C)), silver (28 °W or 4,717.5 °F (2,603.1 °C)) and gold (32 °W or 5,237.5 °F (2,891.9 °C)). All these values are at least 2,500 °F (1,400 °C) too high.[12]
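Because 0 °W was fixed at 1,077.5 °F and each Wedgwood degree spanned 130 °F, converting a reading to Fahrenheit (and on to Celsius) is a simple linear map. A sketch using only those constants from the text (function names are illustrative):

```python
ZERO_W_IN_F = 1077.5  # 0 °W: onset of red heat, in °F (per Wedgwood)
STEP_IN_F = 130.0     # one Wedgwood degree, in °F

def wedgwood_to_fahrenheit(w):
    """Convert °W to °F on Wedgwood's original (uncorrected) scale."""
    return ZERO_W_IN_F + STEP_IN_F * w

def fahrenheit_to_celsius(f):
    """Standard Fahrenheit-to-Celsius conversion."""
    return (f - 32.0) * 5.0 / 9.0
```

This reproduces the melting-point figures quoted above: `wedgwood_to_fahrenheit(27)` gives 4587.5 °F for copper, and `fahrenheit_to_celsius(4587.5)` gives about 2530.8 °C; silver (28 °W) and gold (32 °W) come out at 4717.5 °F and 5237.5 °F respectively.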
Louis-Bernard Guyton de Morveau used his own pyrometer to evaluate Wedgwood's scale and concluded that the origin should be set significantly lower, at 517 °F (269 °C) instead of 1,077.5 °F (580.8 °C), and that the steps should be roughly halved, from 130 °F (72 °C) to no more than 62.5 °F (34.7 °C). Even after this revision, however, the Wedgwood measurements overestimated the melting points of elements.[10]
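The residual overestimate can be checked numerically: even with Guyton de Morveau's lower origin and smaller step, Wedgwood's 27 °W reading for copper still converts to a value above copper's melting point. A hedged sketch (the revised constants are from the text; the comparison figure of about 1,984 °F / 1,085 °C is the modern accepted melting point of copper, not from the source):

```python
# Guyton de Morveau's revised scale parameters
REVISED_ORIGIN_F = 517.0  # revised 0 °W, in °F
REVISED_STEP_F = 62.5     # revised Wedgwood degree, in °F

def revised_wedgwood_to_fahrenheit(w):
    """Convert °W to °F using Guyton de Morveau's revised parameters."""
    return REVISED_ORIGIN_F + REVISED_STEP_F * w

# Modern accepted melting point of copper, for comparison (~1085 °C)
COPPER_MELTING_POINT_F = 1984.3
```

Here `revised_wedgwood_to_fahrenheit(27)` gives 2204.5 °F, still well above 1,984.3 °F, consistent with the statement that the revised scale continued to overestimate melting points.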