By Steve Sailer
12/19/2010
Every three years, the Organisation for Economic Co-operation and Development [OECD], the rich countries' club, announces the results of its Programme for International Student Assessment (PISA). These are tests of school achievement for 15-year-olds in the 34 OECD countries, plus 31 other countries or regions.
And, following the announcement, there is always wailing and gnashing of teeth about how the U.S. is doomed by the failures of the U.S. educational system relative to the rest of the world.
This time, for example, Secretary of Education Arne Duncan declared: "This is an absolute wake-up call for America. … we have to deal with the brutal truth. We have to get much more serious about investing in education." ['Wake-up call': U.S. students trail global leaders, MSNBC.com, December 7, 2010]
(Investing = spending more on teacher salaries).
Similarly, Chester Finn in the Wall Street Journal compared the release of the latest test scores to 1957, when the Soviet Union shocked the U.S. by putting Sputnik into orbit. [A Sputnik Moment for U.S. Education, December 8, 2010]
And The New York Times headlined Top Test Scores from Shanghai Stun Educators. [By Sam Dillon, December 7, 2010]
(I wasn’t stunned. But then, I’m not an educator.)
It took me two days of looking through the voluminous PISA results to create the simple graph below. It shows what the Great and Good don’t want you to know about the 2009 PISA results: When broken down by ethnicity, American students did reasonably well compared to the countries from which their ancestors came.
In this chart, I've depicted American ethnic groups in red to show where they fall relative to other countries, which are colored to reflect their dominant populations.
As my chart shows, American ethnic groups generally score at or above the averages of the countries from which their ancestors came.
PISA test scores are scaled in roughly the same fashion as SAT scores. The mean for the 34 OECD First World countries was originally set at 500, with a standard deviation of 100. The PISA test was also given in 31 other places, mostly poorer countries, most of which score well under 500. Thus, in a country that scores at about 400, such as Indonesia, the median student would land at around the 16th percentile of the OECD distribution.
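For readers who want to check that arithmetic, here is a minimal Python sketch. It assumes the OECD score distribution is roughly normal with the original mean of 500 and standard deviation of 100; the 400 is just the Indonesia-like example above, not a claim about PISA's actual scaling machinery.

```python
from statistics import NormalDist

# Assumed scale (as described above): OECD mean 500, standard deviation 100,
# treated here as roughly normal -- an approximation, not PISA's exact method.
oecd_scale = NormalDist(mu=500, sigma=100)

# A country averaging about 400 sits one standard deviation below the OECD
# mean, so its median student lands near the 16th percentile of the OECD.
country_average = 400
print(f"{oecd_scale.cdf(country_average):.1%}")   # about 15.9%
```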
Here’s my bar chart of American scores by ethnicity. Interestingly, American Hispanics did significantly better in reading in 2009 than they had done in science in 2006 and in math in 2003.
Why does my second graph have to compare reading scores from 2009 to science scores from 2006 and math scores from 2003?
Because PISA and the U.S. government apparently conspire to keep the ethnic breakdowns of American scores a secret, except for whichever subject is the main theme of that year’s PISA (reading in 2009, science in 2006, and math in 2003). Thus the only American scores broken down by ethnicity yet released for 2009 were for reading.
Yet all three subjects are tested in each cycle, and scores for all subjects are released in mind-numbing detail, cross-tabbed for every conceivable factor … except race.
Considering the hundreds of pages of data PISA releases on its website on all three tests for 2009, it’s ludicrous (yet unsurprising) that PISA won’t publish the ethnic breakdowns it has collected. The words "Hispanic" or "Latino" don’t appear anywhere in the otherwise endless PISA 2009 data.
Instead, PISA conveyed the ethnic data confidentially to the U.S. government — which then released the racial breakdowns for just the reading test on its National Center for Education Statistics website (PDF).
The goal of this byzantine process is evidently to make it more inconvenient for crass outsiders and possible critics to grasp the patterns underlying the scores.
Where’s WikiLeaks when you really need it?
A few caveats about the strong performance of the U.S.:
First: in 2009, the U.S. did slightly better in reading than in science, and moderately better in science than in math. So, my top graph, showing the 2009 reading results, puts America’s best foot forward in what ought to be a three-legged race. But of course I can’t show you PISA scores by ethnicity for 2009 in science and math because they are, apparently, federal secrets.
Second: my top chart does not offer a true apples-to-apples comparison of whites in America to whites in other traditionally white countries. For example, New Zealand whites scored 541 on reading in 2009, 16 points above American whites. But the Kiwis' national scores are dragged down somewhat by the indigenous Maori and by Pacific Islander immigrants, who do more for the current competitiveness of New Zealand’s national rugby team than for the future competitiveness of its 21st Century economy.
(By the way, unlike Arne Duncan’s Department of Education, New Zealand’s Ministry of Education has released its 2009 scores by ethnicity not just for reading but also for math, on which Kiwi whites averaged an impressive 537, and science, on which they averaged an excellent 555. This is how we can conclude that the stonewalling over two-thirds of the American scores by race is the choice of the U.S. government, rather than of PISA. It suggests that a citizen might be able to use the Freedom of Information Act to compel the Department of Education to release all the PISA data by ethnicity. Or maybe some newly elected Tea Party congressman can make an issue of it.)
Third: America pays royally for the results we do get. We spend more per student than any country in the world, other than Luxembourg, a small, rich tax haven. We spend about fifty percent more per student than Finland does.
What about China and India?
Rich, bustling Shanghai is not likely to be representative of China as a whole. Yet, Shanghai might not be the highest scoring part of China, either. Traditionally, the top scorers on the Imperial mandarin exams as well as on the current national college admission test tend to come from farther to the southeast, especially from Fujian province on the coast across from Taiwan. The Fujianese are the central component of the Overseas Chinese who dominate the economies of Southeast Asia.
The other gigacountry, India, is often lumped with China by American pundits like Tom Friedman. India certainly has a large number of highly intelligent people, but major questions remain about the Indian masses. India has never participated in PISA.
However, a version of TIMSS, a similar international math test, was given to a sample of Indian students in the states of Orissa and Rajasthan, as reported in a 2009 paper by Jishnu Das of the World Bank and Tristan Zajonc of Harvard, India Shining and Bharat Drowning: Comparing Two Indian States to the Worldwide Distribution in Mathematics Achievement. On average, the Indian students performed poorly: "These two states fall below 43 of the 51 countries for which data exist." Note, though, that India was the second-most internally unequal country on TIMSS, behind only South Africa.
As I've argued before in VDARE.com, India is immensely complicated, and thus it’s very hard to predict its long-term potential. But, as I wrote in 2006: "Whatever the cause of the big IQ difference between China and India, it will very likely still remain for the rest of our lifetimes."
An important question remains: How much can we trust the PISA results?
Reading through the documentation, it’s obvious that PISA is lavishly funded and professionally run.
But we need to keep in mind the central paradox of testing: In all forms of well-run testing, the most trustworthy results are the ones covering the largest scale … and thus of the least interest to the media. For example, from an overwhelming preponderance of data across all the PISA tests, across similar international tests such as TIMSS and PIRLS, and across IQ tests as tabulated by Richard Lynn and Tatu Vanhanen, we can be confident that Northeast Asians average higher scores than Latin Americans.
Indeed, Heiner Rindermann’s 2007 paper in the European Journal of Personality, The g-Factor of International Cognitive Ability Comparisons: The Homogeneity of Results in PISA, TIMSS, PIRLS and IQ-Tests Across Nations demonstrated the striking correlation in national average performance among highly different tests administered to different age groups over several decades.
And aggregating countries by continent adds still more to the reliability.
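To make that aggregation point concrete, here is a minimal, purely illustrative Python simulation (the continent counts and noise levels are made up, not PISA data): each hypothetical country has a stable "true" score plus independent measurement noise on two administrations, and averaging countries within continents tends to raise the correlation between the two administrations, that is, the reliability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative numbers only, not PISA data.
n_continents, countries_per_continent = 5, 10
n_countries = n_continents * countries_per_continent

# Stable "true" scores: continent-level differences plus country-level deviations.
true_continent = rng.normal(500, 40, n_continents)
true_country = (np.repeat(true_continent, countries_per_continent)
                + rng.normal(0, 20, n_countries))

def administer(noise_sd=15):
    """One noisy 'administration' of a test to every country."""
    return true_country + rng.normal(0, noise_sd, n_countries)

test_a, test_b = administer(), administer()

# Reliability at the country level: correlation between two administrations.
country_r = np.corrcoef(test_a, test_b)[0, 1]

# Aggregate to continent means and correlate again; the measurement noise
# partly averages out, so this correlation is typically higher.
cont_a = test_a.reshape(n_continents, countries_per_continent).mean(axis=1)
cont_b = test_b.reshape(n_continents, countries_per_continent).mean(axis=1)
continent_r = np.corrcoef(cont_a, cont_b)[0, 1]

print(f"country-level reliability:   {country_r:.2f}")
print(f"continent-level reliability: {continent_r:.2f}")
```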
But the fact that Northeast Asians tend to be smarter than Latin Americans is not the kind of thing you are supposed to discuss in the American MSM. It’s considered both too obvious and too irresponsible to mention.
Instead, the MSM gets most excited about trivial distinctions where the likelihood for noise in PISA results is worrisome: Did Shanghai outscore Singapore? How did the U.S. do in 2009 v. 2006?
Before reading too much into detailed results, here are some inherent problems with this kind of international testing that should be borne in mind.
First: although the PISA officials worked hard to come up with representative samples of 15-year-old students in representative samples of schools, not all 15-year-olds are students. So, in countries with a high dropout rate, scores are artificially boosted because the unstudious are not tested. For example, Mexico had a 61% "coverage rate" compared to 82% in the U.S. and 93% in Finland. (PDF, p. 176)
(Similarly, the overall ethnic gaps within the U.S. are larger than those shown in my second chart because of higher dropout rates among Hispanics and blacks.)
A partial exception to this problem appears to have been Argentina. Its dreadful 398 score seems to be the result of the Argentinean administrators diligently tracking down and testing representatives of the 39 percent of Argentinean 15-year-olds who are not in regular schools. Students in regular Argentine schools averaged 439 in reading (compared to 425 for students in Mexico), but the alternative schoolers averaged only 335. (PDF, p. 185).
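As a rough sanity check on those Argentine figures, here is a back-of-the-envelope Python sketch that simply weights the two groups by their share of 15-year-olds. (It assumes the national score is approximately that coverage-weighted average; PISA's actual student weighting is more elaborate.)

```python
# Figures cited above for Argentina's 2009 reading results.
regular_share, regular_score = 0.61, 439   # 15-year-olds in regular schools
other_share, other_score = 0.39, 335       # the rest, tracked down and tested

# Simple weighted average as an approximation of the national score.
overall = regular_share * regular_score + other_share * other_score
print(round(overall))   # 398 -- in line with Argentina's reported score
```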
Many other countries would likely see their overall scores drop, too, if they had been as assiduous as the Argentines at testing the full range of 15-year-olds.
Second: Might administrators cheat? It has happened. PISA noticed patterns of results suggesting large scale cheating in Azerbaijan. Yet Azerbaijan still finished ahead of only Kyrgyzstan. (By the way, wouldn’t you think that Kyrgyz students would at least be good at spelling?) More sophisticated countries than Azerbaijan might have figured out more sophisticated, and thus less detectable, ways to game the system. We can’t know for sure.
Third: how motivated students are to work hard on the test is a significant imponderable. Administrators and teachers are supposed to motivate the students, but not threaten them. So, at least in theory, PISA is a low-stakes test for students. Doing badly on it isn’t supposed to hurt them. Which means the temptation to slack off and "bubble in" some of the answers is always there.
PISA itself mentions the motivation quandary. The PISA report notes on p. 188 that Austria’s weak score of 470 was likely driven down by the teachers’ union’s call to boycott the PISA test as part of its labor dispute with Austria’s education minister: the "negative atmosphere … under which the assessment was administered … could have adversely affected student motivation to respond to the PISA tasks."
What about the flip side: positive motivation? Having watched the 2008 Olympics from China and seen how much China’s government cares about international victory, I wouldn’t be surprised if Shanghai school officials suspected that low scores coming out of their bailiwicks might get them reassigned to Inner Mongolia, pour l'encouragement des autres.
Finally, what about Finland, which always does very well on PISA? (Although Finland only scored above average when it participated in TIMSS in 1999.)
Finns tend to be both patriotic and public spirited, so it might not be hard to talk most Finnish teens into giving a solid effort on a two-hour PISA test by rationally explaining to them how that would be good for everybody.
Or, maybe Finns, who rank with Icelanders as the most northerly advanced culture on Earth, really are smarter (although IQ tests suggest they are about average for Western Europeans).
Of course, one obvious factor contributing to Finland’s high national scores: Finland benefits from not having its scores undermined by immigrants from low-scoring cultures. (Yet).
And maybe Finnish schools are better. But it’s noteworthy that American education reporters who have taken junkets to Finland since the PISA tests began a decade ago report back that Finnish schools are the laid-back opposite of the pressure-cooker cram schools popular in the Northeast Asian countries that score similarly to Finland.
It may be that, enjoying a non-diverse population, Finnish educators have been able to fine-tune their system to meet the particular needs of Finnish students.
That would be a major contrast to the United States. Here, the needs of low-scoring black and Hispanic students obsess educators and pundits and inspire a great deal of double-think and mendacity. But no one has any idea what to do about it, especially in the one-size-fits-all culture of the government school industry.
Bottom line: Keeping the U.S. globally competitive turns out to depend less upon our endlessly discussed need to "fix the schools" and more upon the need to "fix the demographic trends".
But this topic is currently unmentionable in public debate and, for many in public life, literally unthinkable.
It might even lead to Americans doing something about immigration policy.