The Newton scale is a temperature scale devised by Isaac Newton in 1701. He called his device a "thermometer", but he did not use the term "temperature", speaking of "degrees of heat" (gradus caloris) instead. Newton's publication represents the first attempt to introduce an objective way of measuring (what would come to be called) temperature, alongside the Rømer scale published at nearly the same time. By using the melting points of alloys of various metals such as bismuth, lead and tin, Newton was the first to employ the melting or freezing points of metals as reference points for a temperature scale. He also contemplated the idea of absolute zero. Newton likely developed his scale for practical use rather than out of a theoretical interest in thermodynamics; he had been appointed Warden of the Mint in 1695 and Master of the Mint in 1699, and his interest in the melting points of metals was likely inspired by his duties in connection with the Royal Mint.
Newton used linseed oil as the thermometric material and measured its change of volume against his reference points. He set 0 on his scale at "the heat of air in winter at which water begins to freeze" (Calor aeris hyberni ubi aqua incipit gelu rigescere), reminiscent of the zero point of the modern Celsius scale (i.e. 0 °N = 0 °C). However, he did not fix a single second reference point; he did give the "heat at which water begins to boil" as 33, but this was not a defining reference. The values for body temperature and for the freezing and boiling points of water suggest a conversion factor between the Newton and the Celsius scale of between about 3.08 (12 °N = 37 °C) and 3.03 (33 °N = 100 °C), but since the objectively verifiable reference points given result in irreconcilable data (especially for high temperatures), no unambiguous conversion between the scales is possible.
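As a rough illustration only, assuming a purely linear relation through the shared zero point, the two data points quoted above would each imply a different scaling between Newton degrees ($t_\mathrm{N}$) and Celsius degrees ($t_\mathrm{C}$):

\[
t_\mathrm{C} \approx \frac{100}{33}\, t_\mathrm{N} \approx 3.03\, t_\mathrm{N} \quad \text{(from } 33\ ^{\circ}\mathrm{N} = 100\ ^{\circ}\mathrm{C}\text{)},
\qquad
t_\mathrm{C} \approx \frac{37}{12}\, t_\mathrm{N} \approx 3.08\, t_\mathrm{N} \quad \text{(from } 12\ ^{\circ}\mathrm{N} = 37\ ^{\circ}\mathrm{C}\text{)}.
\]

The spread between these two factors is exactly why no single unambiguous conversion can be defined from Newton's published reference points.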