b9 wrote:
Without considering the structure of questions in IQ tests, I wonder how "average mental age" scores are defined.
IQ is (mental age / chronological age) x 100.
If you are ten and have the mental age of a 20-year-old, then you apparently have an IQ of 200.
If you have an IQ of 300, do you then have the mental age of a 30-year-old?
A 30-year-old cannot solve spontaneously presented problems any better than a 20-year-old, so how is the baseline "average ability" for an x-year-old (above 20) assessed? Surely not by assuming a continuous, incremental improvement in actual intelligence with age.
If a 10-year-old has an "IQ" of 950, then they have the mental age of a 95-year-old, which would not be anything to be too happy about.
I think the present medical notion of "IQ" is a limited way of assessing young children's capacities and is not really applicable to extreme intelligence or to people who are over 10 years old.
When the term "intelligence quotient" first came into use, it was defined just as you describe - a ratio of mental age to chronological age - and applied only to children, since, as you point out, the concept of "mental age" loses meaning past the developmental years.
Today, the term "IQ" refers to any of a number of standardised tests with several important statistical properties. Most significant is that the test is normed on a large random sample of the general population and, assuming normally distributed results, scores are scaled so that the mean is 100 and the standard deviation is 15 (16 on some older tests). Thus, an IQ of 115 no longer means performing at the level of someone 15% older; it means performing at a level higher than about 84% of the overall population. Similarly, an IQ of 160, four standard deviations above the mean, corresponds to a level found in roughly 1 person in 30,000, and an IQ of 175 to roughly 1 in 3.5 million. Since the assumption of normality likely breaks down at these extremes, very high or very low IQs would be difficult to measure accurately even if the instruments were infallible, which they are anything but.
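To make those rarity figures concrete, here is a minimal Python sketch (not from any standard psychometric package) that converts an IQ score into a z-score and an upper-tail rarity, assuming the usual 100/15 calibration and a perfectly normal distribution, which, as noted, breaks down at the extremes:

```python
# Rarity of an IQ score under the usual 100/15 calibration, assuming a
# perfectly normal distribution. Standard library only.
from math import erfc, sqrt

MEAN, SD = 100.0, 15.0

def rarity(iq: float) -> float:
    """Return the expected fraction of the population scoring above `iq`."""
    z = (iq - MEAN) / SD                # standard deviations above the mean
    return 0.5 * erfc(z / sqrt(2))      # upper-tail probability of the normal curve

for iq in (115, 130, 160, 175):
    p = rarity(iq)
    print(f"IQ {iq}: top {p:.2%}, roughly 1 in {1/p:,.0f}")
```

Running it reproduces the approximations above: IQ 115 is about the top 16%, IQ 160 about 1 in 31,600, and IQ 175 about 1 in 3.5 million.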
Because IQ is essentially a way of expressing a percentile, if human intelligence on average tended to increase over time - through natural selection, improving health and nutrition, or whatever else - an individual could expect to see her IQ steadily drift downward even though her actual intelligence has not changed, since IQ is measured relative to the current population average.
Conceivably, any type of performance test carefully calibrated to 100/15 could be presented as an "IQ", if one does not mind stretching the term a little bit.
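As a rough illustration of that calibration step - a sketch only, with invented raw scores rather than real norming data - standardising any raw test to a mean of 100 and a standard deviation of 15 is a one-line transformation, and re-norming against a later, more capable population shows exactly the drift described above:

```python
# Minimal sketch of calibrating an arbitrary test to the 100/15 scale.
# The raw scores are invented for illustration; a real norming study
# would use a large, representative sample.
from statistics import mean, stdev

def calibrate(raw_scores):
    """Return a function mapping a raw score onto the 100/15 'IQ' scale."""
    m, s = mean(raw_scores), stdev(raw_scores)
    return lambda raw: 100 + 15 * (raw - m) / s

# A norming sample from "today" and a hypothetical later sample whose
# average performance has drifted upward (Flynn-effect style).
sample_now = [38, 42, 45, 47, 50, 52, 55, 58, 63]
sample_later = [x + 5 for x in sample_now]

to_iq_now = calibrate(sample_now)
to_iq_later = calibrate(sample_later)

# The same unchanged raw performance earns a lower score once the test
# is re-normed against the more capable population.
print(f"{to_iq_now(58):.1f} vs {to_iq_later(58):.1f}")   # ~115.1 vs ~105.7
```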