An intelligence quotient (IQ) is a total score derived from a set of standardized tests or subtests designed to assess human intelligence. Originally, IQ was calculated by dividing a person's mental age, as estimated by an intelligence test, by the person's chronological age; the resulting fraction (quotient) was multiplied by 100 to obtain the IQ score. For modern IQ tests, raw scores are transformed to fit a normal distribution with a mean of 100 and a standard deviation of 15. As a result, approximately two-thirds of the population score between IQ 85 and IQ 115, and about 2 percent each score above 130 and below 70.
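The arithmetic above can be illustrated with a short sketch (not part of the article itself): the historical ratio formula, and a check that a normal distribution with mean 100 and standard deviation 15 gives the quoted proportions. The example ages and the use of SciPy are assumptions made purely for illustration.

```python
# Illustrative sketch only; the ages below are hypothetical examples.
from scipy.stats import norm

def ratio_iq(mental_age, chronological_age):
    """Historical 'ratio IQ': (mental age / chronological age) * 100."""
    return 100 * mental_age / chronological_age

# A child with an estimated mental age of 12 and a chronological age of 10
# would have scored 120 under the original ratio definition.
print(ratio_iq(12, 10))  # 120.0

# Modern 'deviation IQ': scores are scaled to a normal distribution
# with mean 100 and standard deviation 15.
mean, sd = 100, 15
between_85_and_115 = norm.cdf(115, mean, sd) - norm.cdf(85, mean, sd)
above_130 = 1 - norm.cdf(130, mean, sd)
print(f"{between_85_and_115:.3f}")  # ~0.683, roughly two-thirds of the population
print(f"{above_130:.3f}")           # ~0.023, about 2 percent (and, by symmetry, below 70)
```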
Scores from intelligence tests are estimates of intelligence. Unlike physical quantities such as distance and mass, intelligence cannot be measured concretely, because the underlying concept is abstract. IQ scores have been shown to be associated with factors such as nutrition, parental socioeconomic status, morbidity and mortality, parental social status, and perinatal environment. While the heritability of IQ has been studied for nearly a century, there is still debate over the significance of heritability estimates and the mechanisms of inheritance. The best estimates place the heritability of IQ at about 40 to 60%, meaning that genetics accounts for roughly that share of the variance in IQ between individuals.