Zach Musgrave writes at sleptlate.org:
“Scientific racism” is a slur in the academy, roughly analogous to calling something “pseudoscientific” in the mainstream scientific community. Largely because there are observed differences in the results of IQ tests of different races, it is politically correct in many academic circles to refer to general intelligence under the euphemism “whatever it is that IQ tests measure.”
And, in fact, it’s solid science that performance on such tests is strongly influenced by individuals’ own perceptions of their ability. Blacks taking a test that is presented as a “laboratory exercise” outperform those taking the same test presented as an exam. In Predictably Irrational, Dan Ariely relates an even more intriguing experimental result. Researchers seeking to understand the effect that stereotypes have on math test performance decided to see if they could study the interaction between two conflicting stereotypes: that Asians are good at math, and that women are bad at it. They tested a large sample of Asian women, subconsciously priming a third of them to think about their womanhood (by asking questions about childbirth, motherhood, etc.), another third to think about their Asian-ness (by asking questions about the language spoken at home, immigration, etc.), and leaving a final third as a control group. Perhaps not surprisingly, each group lived up to the stereotype it was primed to think about: the group primed on Asian-ness did better than the control group, and the group primed on womanhood did worse.
As a society, we find the very idea of cognitive differences between races so vile and reprehensible that anyone making such claims does so at the risk of their academic and scientific career. The Bell Curve, a book on intelligence distribution that includes a chapter on the black-white achievement gap and suggests it cannot be explained by social factors alone, has received more refutation (and its authors, more ostracism) than any other modern, mainstream scientific text.
I’m currently reading How the Mind Works, an aptly named treatise about how evolution designed the human brain to fill the “cognitive niche” that no other species does. The author, Steven Pinker, understands that any discussion about innate human behavior, no matter how polite, raises the hackles of many of his more critical readers, and so he spends the first couple of chapters of his book hammering home the point that we, as a society, need to separate the concept of what is right from what is true. He warns about the dangers of the twin logical fallacies applied to this area of research: the naturalistic fallacy (because something is natural, it must be good) and its opposite, the moralistic fallacy (because something is good, it must be natural). He notes that in the 1980s UNESCO proactively refuted any scientific study that claimed humans have an innate, evolved tendency toward violence and war, asserting that it is “scientifically inaccurate” to make such claims.
But with the genetic revolution, any ethnic differences that do exist are inevitably going to come to the forefront. Jonathan Haidt of the University of Virginia is concerned about our ability to keep this discussion civil:
The most offensive idea in all of science for the last 40 years is the possibility that behavioral differences between racial and ethnic groups have some genetic basis. Knowing nothing but the long-term offensiveness of this idea, a betting person would have to predict that as we decode the genomes of people around the world, we’re going to find deeper differences than most scientists now expect. Expectations, after all, are not based purely on current evidence; they are biased, even if only slightly, by the gut feelings of the researchers, and those gut feelings include disgust toward racism…
The protective “wall” is about to come crashing down, and all sorts of uncomfortable claims are going to pour in. Skin color has no moral significance, but traits that led to Darwinian success in one of the many new niches and occupations of Holocene life — traits such as collectivism, clannishness, aggressiveness, docility, or the ability to delay gratification — are often seen as virtues or vices. Virtues are acquired slowly, by practice within a cultural context, but the discovery that there might be ethnically-linked genetic variations in the ease with which people can acquire specific virtues is — and this is my prediction — going to be a “game changing” scientific event…
I believe that the “Bell Curve” wars of the 1990s, over race differences in intelligence, will seem genteel and short-lived compared to the coming arguments over ethnic differences in moralized traits. I predict that this “war” will break out between 2012 and 2017.
It’s getting harder every year to profess the standard social science model of the “blank slate” embraced by Piaget and Freud (and many others). The more we learn about genetics and the brain, the more we find that major aspects of our personalities and minds are determined at birth or earlier. For example, recent research suggests that executive function — one’s ability to control one’s own thoughts and behavior — is almost entirely heritable. As time passes, the number of cognitive and behavioral traits on the “almost entirely heritable” list is guaranteed to grow, seriously challenging long-cherished beliefs about justice, merit, and agency.
Read more here.