Twins, IQ and real people
Time after time you see TV programmes about identical twins reared apart who turn out to be similar in innumerable ways, including similar IQs. Such cases so impressed scholars in the IQ field that they began to preach “post-twin pessimism”.
So-called twin pessimists were convinced of three things: that genes for intelligence were virtually all-powerful; that the human race could not increase its level of intelligence without upgrading the genes that dictate brain efficiency; and that in any generation there is a pecking order that ranks the intelligence of individuals according to their genetic quality. In my research and books on IQ and intelligence over the last nine years, I have shown that all three of these views are false.
The Flynn Effect
First, I documented the “Flynn Effect”: during the 20th century, in industrialised nations, every generation made massive IQ gains. Scored against today’s norms, the average person of 1900 would get an IQ of only 70, the cut-off line for mental retardation. Yet we know that our ancestors were not intellectually incapable of running the society of their day.
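The arithmetic behind that figure of 70 is straightforward, assuming the commonly cited rate of roughly three IQ points gained per decade:

$$100 - (3 \;\text{points/decade} \times 10 \;\text{decades}) = 100 - 30 = 70.$$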
The answer is that people’s cognitive abilities can meet (within reason – not everyone can be an Einstein) whatever cognitive demands their society makes on them. In America in 1900, fewer than 5 per cent of people were professionals and few had more than six years of schooling. Most were simply not asked to develop the intellectual skills of modern people; they were asked to develop their minds only insofar as subsistence farming or unskilled factory work required.
Yet the first IQ tests were designed by scholars to measure the conceptual skills needed for cognitively demanding jobs or for coping with extended formal schooling. These scholars were “modern” people who already played those roles – and they wanted to predict which people could follow in their footsteps. Little wonder that the average person scored lower before filling those roles than after.
The human brain is like a muscle. Our physical muscles develop according to the demands made on them: compare a weightlifter’s muscles with a swimmer’s. By 1940, most Americans were driving cars, and this made new demands on their map-reading skills. These would be reflected in a larger hippocampus, the part of the brain that is the seat of map reading (London taxi drivers, for example, have greatly enlarged hippocampi). Today we are getting automatic guidance systems, and those skills will decline. This has nothing to do with better or worse genes; it reflects whatever cognitive skills our society asks us to exercise.
Intellectual and moral progress
Second, I asked what new mental exercise schooling and cognitively demanding jobs have required of us over the 20th century, and whether the resulting gains count as “intelligence” gains. In one sense the question is trivial: they are not intelligence gains if that means our brains at conception are somehow enhanced. They are intelligence gains in the sense that we can deal with a more cognitively complex world.
Scholars mired in the dogma that “real” intelligence cannot increase over time dismiss them as mere skill gains acquired through better education. This is self-defeating. The genetic limitations of our brains were supposed to tell us who was capable of profiting from education. Nothing was more evident to the elite of 1900 than that the masses could never be trained to assume the demanding cognitive roles the elite monopolised at that time. Well, the entire modern world has proved them wrong.
Intellectual progress has brought moral progress. Among school-demanded skills are applying logic to generalised statements and taking the hypothetical seriously. People of the Victorian era saw moral maxims as concrete things, no more subject to logic than any other concrete thing. Unlike us – people of the late 20th and early 21st centuries, educated within an analytic scientific tradition – they would not treat hypotheticals as universal criteria to be generalised.
The individual and his/her skills
My new book Does Your Family Make You Smarter? addresses the third assumption of post-twin pessimism. Twin studies show that as children age, their family’s effect on their cognitive abilities fades as their genes attain a better match with their current environment (school, peers and so on). Does this mean that people, say by the age of 17, can claim no injustice because they were born into a family of low cognitive quality? Does it mean that there is no room for human autonomy – for example, choosing to try to enhance one’s cognitive abilities?
My new Age Table method demonstrates that family background still handicaps the vocabulary of young people who, at the age of 17, take the SAT (Scholastic Aptitude Test) to qualify for university admission. Someone from a seriously deficient home suffers a nine IQ-point disadvantage, the equivalent of a 66-point handicap on the SAT – enough to bar them from a distinguished university and confine them to an average one.
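The conversion from IQ points to SAT points follows from comparing standard deviations – a rough sketch, assuming the conventional IQ standard deviation of 15 and an SAT standard deviation of about 110 (the latter figure is an assumption, chosen to be consistent with the numbers quoted above):

$$\frac{9}{15} = 0.6 \;\text{SD}, \qquad 0.6 \times 110 = 66 \;\text{SAT points}.$$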
As to autonomy, analysis of the portion of environment that never correlates with genes shows that a student who comes back to university in adulthood can upgrade their vocabulary enough to leapfrog 80 per cent of those who were ahead of them on the vocabulary scale.
This is often dismissed as the influence of “chance” factors, and no doubt much of it is. At age 20, no matter what your genetic inheritance or the cognitive quality of your current environment (job, home life, interaction with peers), fate may put you in a less demanding environment.
In sum, at any time in your adult life you can make your own luck: you can go back to university, join a book club or, retired from law, write a book on jurisprudence rather than just watching TV. Twins aside, your brain will respond to the cognitive exercise that new demands impose.
James R Flynn’s new book Does Your Family Make You Smarter? Nature, Nurture and Human Autonomy is published by Cambridge University Press