Logarithmic scale in the context of "pH"


⭐ Core Definition: Logarithmic scale

A logarithmic scale (or log scale) is a method used to display numerical data that spans a broad range of values, especially when there are significant differences among the magnitudes of the numbers involved.

Unlike a linear scale, where each unit of distance corresponds to the same additive increment, on a logarithmic scale each unit of length corresponds to multiplying the previous value on the scale by the base. In common use, logarithmic scales are in base 10 unless otherwise specified.
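
A minimal sketch of this multiplicative spacing, assuming Python: taking the base-10 logarithm of 1, 10, 100, 1000 places them at evenly spaced positions, even though the raw values differ by factors of 10.

```python
import math

# On a base-10 log scale, equal distances correspond to equal *ratios*,
# not equal differences: 1 -> 10 -> 100 -> 1000 land at evenly
# spaced positions 0, 1, 2, 3.
values = [1, 10, 100, 1000]
positions = [math.log10(v) for v in values]
print(positions)  # evenly spaced: 0, 1, 2, 3
```

This is why data spanning many orders of magnitude stays readable on a log axis: each decade occupies the same amount of space.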


👉 Logarithmic scale in the context of pH

In chemistry, pH (/piːˈheɪtʃ/ or /piːˈeɪtʃ/; pee-HAYCH or pee-AYCH) is a logarithmic scale used to specify the acidity or basicity of aqueous solutions. Acidic solutions (those with higher concentrations of hydrogen (H⁺) cations) are measured to have lower pH values than basic or alkaline solutions. While the symbol 'pH' can be traced back to its original inventor, and the 'H' clearly refers to hydrogen, the exact original meaning of the letter 'p' is still disputed; 'p' has since acquired a more general technical meaning used in numerous other contexts.

The pH scale is logarithmic and inversely indicates the activity of hydrogen cations in the solution.
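
A small illustration of that inverse logarithmic relationship, assuming Python and approximating hydrogen-ion activity by molar concentration (valid for dilute solutions): pH = −log₁₀(a_H⁺), so a tenfold increase in H⁺ activity lowers the pH by exactly one unit.

```python
import math

def ph_from_activity(a_h: float) -> float:
    """pH = -log10(a_H+), where a_H+ is the hydrogen-ion activity
    (approximated here by molar concentration in dilute solution)."""
    return -math.log10(a_h)

print(ph_from_activity(1e-7))  # ~7, neutral water at 25 °C
print(ph_from_activity(1e-4))  # ~4, more acidic: higher H+ activity, lower pH
```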

In this Dossier

Logarithmic scale in the context of Large numbers

Large numbers are numbers far larger than those encountered in everyday life, such as simple counting or financial transactions. These quantities appear prominently in mathematics, cosmology, cryptography, and statistical mechanics. Googology studies the naming conventions and properties of these immense numbers.

Since the customary decimal format of large numbers can be lengthy, other systems have been devised that allow for shorter representation. For example, a billion is represented as 13 characters (1,000,000,000) in decimal format, but only 3 characters (10⁹) when expressed in exponential format. A trillion is 17 characters in decimal, but only 4 (10¹²) in exponential. Values that vary dramatically can be represented and compared graphically via logarithmic scale.
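
The character-count comparison above can be sketched in Python. Note that the caret notation used here ("10^9") takes one more character than a true superscript ("10⁹"):

```python
# Character counts for decimal vs. exponential notation of large numbers.
examples = {"billion": 10**9, "trillion": 10**12}
for name, n in examples.items():
    decimal = f"{n:,}"          # e.g. "1,000,000,000" for a billion
    exponent = len(str(n)) - 1  # 9 for a billion, 12 for a trillion
    print(f"{name}: {len(decimal)} chars in decimal, 10^{exponent} in exponential")
```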


Logarithmic scale in the context of Orders of magnitude (numbers)

This list contains selected positive numbers in increasing order, including counts of things, dimensionless quantities and probabilities. Each number is given a name in the short scale, which is used in English-speaking countries, as well as a name in the long scale, which is used in some of the countries that do not have English as their national language.


Logarithmic scale in the context of Magnitude (astronomy)

In astronomy, magnitude is a measure of the brightness of an object, usually in a defined passband. An imprecise but systematic determination of the magnitude of objects was introduced in ancient times by Hipparchus.

Magnitude values do not have a unit. The scale is logarithmic and defined such that a magnitude 1 star is exactly 100 times brighter than a magnitude 6 star. Thus each step of one magnitude is 100^(1/5) ≈ 2.512 times brighter than the magnitude 1 higher. The brighter an object appears, the lower the value of its magnitude, with the brightest objects reaching negative values.
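
The definition above pins down the brightness ratio between any two magnitudes; a quick Python sketch:

```python
def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """Ratio of apparent brightness between two magnitudes: each magnitude
    step is a factor of 100**(1/5) ~ 2.512, so 5 steps give exactly 100."""
    return 100 ** ((m_faint - m_bright) / 5)

print(brightness_ratio(6, 1))  # 100: magnitude 1 is 100x brighter than magnitude 6
print(brightness_ratio(2, 1))  # ~2.512: one magnitude step
```

Note the inverted convention: the *smaller* magnitude goes with the *brighter* object, which is why the faint magnitude comes first in the ratio.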


Logarithmic scale in the context of Moment magnitude scale

The moment magnitude scale (MMS; denoted explicitly with Mw or Mwg and generally implied with use of a single M for magnitude) is a measure of an earthquake's magnitude ("size" or strength) based on its seismic moment. Mw was defined in a 1979 paper by Thomas C. Hanks and Hiroo Kanamori. Similar to the local magnitude/Richter scale (ML) defined by Charles Francis Richter in 1935, it uses a logarithmic scale; small earthquakes have approximately the same magnitudes on both scales. Despite the difference, news media often use the term "Richter scale" when referring to the moment magnitude scale.

Moment magnitude (Mw) is considered the authoritative magnitude scale for ranking earthquakes by size. It is more directly related to the energy of an earthquake than other scales, and does not saturate – that is, it does not underestimate magnitudes as other scales do in certain conditions. It has become the standard scale used by seismological authorities like the United States Geological Survey for reporting large earthquakes (typically M > 4), replacing the local magnitude (ML) and surface-wave magnitude (Ms) scales. Subtypes of the moment magnitude scale (Mww, etc.) reflect different ways of estimating the seismic moment.
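
As a sketch of how the logarithm enters, the standard Hanks–Kanamori relation converts seismic moment M₀ (in newton-metres) to Mw; the example moment below is illustrative, not a specific real event:

```python
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    """Hanks & Kanamori (1979) moment magnitude from seismic moment M0 in N·m:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# Because of the 2/3 factor, each whole magnitude step corresponds to a
# 10**1.5 ~ 31.6x increase in seismic moment.
print(moment_magnitude(1e18))  # a moderate-to-large event on this scale
```

This logarithmic compression is what lets a single scale span everything from barely detectable tremors to the largest recorded earthquakes.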
