Temperature scale in the context of "Newton scale"


⭐ Core Definition: Temperature scale

A temperature scale is a methodology for calibrating the physical quantity temperature in metrology. Empirical scales measure temperature in relation to convenient and stable parameters or reference points, such as the freezing and boiling points of water. Absolute temperature is based on thermodynamic principles: the lowest possible temperature serves as the zero point, and a convenient incremental unit is selected.

Celsius, Kelvin, and Fahrenheit are common temperature scales. Other scales used throughout history include Rankine, Rømer, Newton, Delisle, Réaumur, Gas mark, Leiden, and Wedgwood.
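The relationships among the three common scales are fixed and exact, so they can be sketched with simple arithmetic. A minimal illustration in Python (the function names are ours, chosen for clarity):

```python
def celsius_to_fahrenheit(c):
    # Exact relation: °F = °C × 9/5 + 32
    return c * 9 / 5 + 32

def celsius_to_kelvin(c):
    # Exact relation: K = °C + 273.15
    return c + 273.15

print(celsius_to_fahrenheit(100))  # boiling point of water: 212.0 °F
print(celsius_to_kelvin(0))        # freezing point of water: 273.15 K
```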


👉 Temperature scale in the context of Newton scale

The Newton scale is a temperature scale devised by Isaac Newton in 1701. He called his device a "thermometer", but he did not use the term "temperature", speaking instead of "degrees of heat" (gradus caloris). Newton's publication represents the first attempt to introduce an objective way of measuring what would come to be called temperature (alongside the Rømer scale, published at nearly the same time). Using the melting points of alloys of various metals such as bismuth, lead, and tin, Newton was the first to employ the melting or freezing points of metals in a temperature scale. He also contemplated the idea of absolute zero. Newton likely developed his scale for practical use rather than out of a theoretical interest in thermodynamics; he had been appointed Warden of the Mint in 1695 and Master of the Mint in 1699, and his interest in the melting points of metals was likely inspired by his duties in connection with the Royal Mint.

Newton used linseed oil as his thermometric material and measured its change of volume against his reference points. He set 0 on his scale at "the heat of air in winter at which water begins to freeze" (Calor aeris hyberni ubi aqua incipit gelu rigescere), reminiscent of the zero of the modern Celsius scale (i.e. 0 °N = 0 °C), but he had no single second reference point. He did give the "heat at which water begins to boil" as 33, but this is not a defining reference. The values for body temperature and for the freezing and boiling points of water suggest a conversion factor between the Newton and Celsius scales of between about 3.08 (12 °N = 37 °C) and 3.03 (33 °N = 100 °C), but since the objectively verifiable reference points yield irreconcilable data (especially at high temperatures), no unambiguous conversion between the scales is possible.
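The ambiguity described above can be made concrete by computing both conversion factors from the reference points given in the text. A small sketch (function names are ours):

```python
def newton_to_celsius_via_body(n):
    # Anchored on body temperature: 12 °N = 37 °C, factor 37/12 ≈ 3.08
    return n * 37 / 12

def newton_to_celsius_via_boiling(n):
    # Anchored on boiling water: 33 °N = 100 °C, factor 100/33 ≈ 3.03
    return n * 100 / 33

# The two anchors disagree: 33 °N maps to 101.75 °C by one factor
# and exactly 100 °C by the other, so no single conversion exists.
print(newton_to_celsius_via_body(33))     # 101.75
print(newton_to_celsius_via_boiling(33))  # 100.0
```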


Temperature scale in the context of Temperature

Temperature quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.

Thermometers are calibrated in various temperature scales that historically have relied on various reference points and thermometric substances for definition. The most common scales are the Celsius scale with the unit symbol °C (formerly called centigrade), the Fahrenheit scale (°F), and the Kelvin scale (K), the last of which is used predominantly for scientific purposes. The kelvin is one of the seven base units in the International System of Units (SI).


Temperature scale in the context of Kelvin

The kelvin (symbol: K) is the base unit for temperature in the International System of Units (SI). The Kelvin scale is an absolute temperature scale that starts at the lowest possible temperature (absolute zero), taken to be 0 K. By definition, the Celsius scale (symbol °C) and the Kelvin scale have the exact same magnitude; that is, a rise of 1 K is equal to a rise of 1 °C and vice versa, and any temperature in degrees Celsius can be converted to kelvin by adding 273.15.

The 19th-century British scientist Lord Kelvin first developed and proposed the scale. It was often called the "absolute Celsius" scale in the early 20th century. The kelvin was formally added to the International System of Units in 1954, with 273.16 K defined to be the triple point of water. The Celsius, Fahrenheit, and Rankine scales were redefined in terms of the Kelvin scale using this definition. The 2019 revision of the SI now defines the kelvin in terms of energy by fixing the Boltzmann constant; every 1 K change of thermodynamic temperature corresponds to a change in the thermal energy, kBT, of exactly 1.380649×10⁻²³ joules.
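Since the 2019 SI revision fixes the Boltzmann constant exactly, the thermal energy kBT for any temperature is a direct multiplication. A minimal sketch in Python:

```python
K_B = 1.380649e-23  # Boltzmann constant in J/K, exact by the 2019 SI definition

def thermal_energy(t_kelvin):
    # Thermal energy kB*T in joules for a thermodynamic temperature in kelvin
    return K_B * t_kelvin

# A 1 K change corresponds to exactly 1.380649e-23 J of thermal energy.
print(thermal_energy(1))    # 1.380649e-23
print(thermal_energy(300))  # roughly room temperature
```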


Temperature scale in the context of Delisle scale

The Delisle scale is a temperature scale invented in 1732 by the French astronomer Joseph-Nicolas Delisle (1688–1768). The Delisle scale is notable as one of the few temperature scales that are inverted from the amount of thermal energy they measure; unlike most other temperature scales, higher measurements in degrees Delisle are colder, while lower measurements are warmer.
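The inversion is easy to demonstrate numerically. The conversion commonly cited for the modern form of the scale (not stated in the text above, so treat it as an assumption) sets boiling water at 0 °De and freezing water at 150 °De:

```python
def celsius_to_delisle(c):
    # Commonly cited relation: °De = (100 - °C) × 3/2
    # 100 °C (boiling) -> 0 °De; 0 °C (freezing) -> 150 °De
    return (100 - c) * 3 / 2

# Warmer Celsius temperatures give *lower* Delisle readings:
print(celsius_to_delisle(100))  # 0.0
print(celsius_to_delisle(0))    # 150.0
print(celsius_to_delisle(-10))  # 165.0 (colder -> higher)
```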


Temperature scale in the context of Wedgwood scale

The Wedgwood scale (°W) is an obsolete temperature scale, used to measure temperatures above the boiling point of mercury, 356 °C (673 °F). The scale and associated measurement technique were proposed by the English potter Josiah Wedgwood in the 18th century. The measurement was based on the shrinking of clay when heated above red heat, evaluated by comparing heated and unheated clay cylinders. It was the first standardised pyrometric device. The scale began with 0 °W being equivalent to 1,077.5 °F (580.8 °C) and had 240 steps of 130 °F (72 °C) each. Both the origin and the step size were later found to be inaccurate.
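Taking the nominal figures above at face value (0 °W = 1,077.5 °F, each degree spanning 130 °F), a Wedgwood reading converts to Fahrenheit linearly. A sketch of that nominal conversion (remembering that the origin and step size were later found inaccurate):

```python
def wedgwood_to_fahrenheit(w):
    # Nominal Wedgwood reading: 0 °W = 1,077.5 °F, 130 °F per degree
    # (historically inaccurate, per the text; useful only as an illustration)
    return 1077.5 + 130 * w

print(wedgwood_to_fahrenheit(0))    # 1077.5
print(wedgwood_to_fahrenheit(240))  # nominal top of the scale
```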
