It was bad news for the U.S. when PISA (the Programme for International Student Assessment) announced the results of its 2009 assessments of 5,000 randomly chosen 15-year-olds in 65 countries. The United States, the superpower still waiting for super-heroic education reform, didn’t fare well. We ranked as average in reading and science (similar to countries like France and Hungary) and below average in mathematics (along with Spain, Turkey, and Croatia).

The super-learners included the usual suspects: South Korea and Finland led the pack of sovereign nations, followed by Singapore, Canada, New Zealand, and Japan. A couple of standout Chinese cities also topped the list: Newcomer Shanghai pummeled the competition by ranking first in all categories, and Hong Kong, which has participated in the past and always performs well, ranked near the top too.

What do PISA results mean for U.S. parents?

On the most rudimentary level, the news is not comforting. Education Secretary Arne Duncan declared that the results are “a wake-up call,” revealing the “brutal truth” that our students are being “out-educated.” Especially in the face of top-performing Shanghai, the test offered more evidence that our supremacy is on the wane while other (not-so-long-ago desperately poor and uneducated) world powers are rising. Let little inconspicuous Finland be first in education, sure. But China? Them’s fightin’ words.

Even the blurb announcing the results on PISA’s website seemed carefully calibrated in anticipation of a panicked response to Shanghai’s performance: “Korea and Finland top OECD’s latest PISA survey of education performance. The next strongest performances were from Hong Kong-China, Singapore, Canada, New Zealand and Japan. The municipality of Shanghai also tops the rankings.”

“Also tops”? In fact, Shanghai outperformed the second-place countries by 17 points in reading, 38 points in math, and 21 points in science. The inclusion of the 20-million-strong modern metropolis marked the first time that PISA had tried to assess China’s mainland educational system. Of course, extrapolating much about a massive and diverse country from its most modern, well-educated city may not produce the most accurate picture of the nation as a whole. On the other hand, who cares? Could any metropolitan area of our country come close to Shanghai’s performance? Not likely. Even affluent, well-educated Massachusetts, the crown jewel of U.S. education, wouldn’t have made it into PISA’s top echelon.

Untangling significance from statistics

Yet parsing these results is easier said than done. Dive into the number-crunching debate about what these numbers really mean, and you’re likely to emerge psychically bloodied and thoroughly confused. To misquote w.shakes, that prototypical blogger: “Oh what a tangled web we weave when first we attempt to quantify high school academic performance across countries and languages!”

Ranking the rankings

On the one hand, PISA ranks and displays countries individually for each subject matter. For instance, the United States’ science score of 502 just edges out the Czech Republic’s 500 but falls short of Hungary’s 503. These numbers may provide fodder for interesting cocktail banter, but statistical uncertainties make such detailed rankings of little value. When GreatSchools worked with analysts to create the Education Nation Scorecard, our educational statisticians cautioned against displaying individual rankings at all.

Big generalities

PISA breaks up the results into broad categories: below average, average (always around 500), and above average. Acknowledging that differences across languages, curricula, and cultures can play a role in how students perform on tests, PISA asks each participating country to choose which questions would most accurately reflect the knowledge of its students, and calculates two kinds of results for each country — the overall result on all questions (the rankings it publishes) and the result based on the preferred questions. Presumably, if there were a huge distinction between the two, you could deduce that the general test was culturally biased against the students of a particular country. In the case of the United States, however, PISA reported that the preferred-questions results didn’t significantly differ from the general results.

Does PISA topple under the weight of scrutiny?

Critics of standardized testing question the whole endeavor: Such tests, by definition, always entail a reductionist approach to education. School systems like China’s and South Korea’s are always going to perform better than countries that don’t “teach to the test.” But PISA claims to be testing students’ ability to apply their knowledge to real-life problems that require ingenuity and creative problem solving. Sounds good, right? After sampling a few of the math, reading, and science questions, however, I marveled at the creative thinking of PISA’s marketing machine rather than the creative thinking required of test takers. It’s a smartly designed standardized test but no more. How it can claim to test creative problem solving, I have no idea.

Still, it’s not worth dismissing PISA out of hand. It’s not that we want our students to be test-taking machines. But we do want them to be able to read a short, easy passage about a worldly topic — like global warming or how running shoes are designed (to name two from the reading samples) — and understand it well enough to answer a few questions.

What gets less press but may be far more valuable than its test results is PISA’s analysis of countries’ achievement gaps, immigration rates, and socioeconomic status in relation to their educational performance. Not surprisingly, higher socioeconomic status was linked to higher test scores, but many school systems — like Shanghai’s and Singapore’s — don’t allow socioeconomic diversity to translate into low expectations.

Probably the most interesting document for U.S. parents is a 259-page report buried on the PISA website, “Lessons From PISA for the United States,” which analyzes data from Canada, Asia, and other regions through an American prism. Before we decide this is the crisis to trigger the next “Sputnik moment” (as President Barack Obama has suggested) or a test with little merit, it’s worth stepping back from the hype and turning this into a learning moment — then bracing for the results of the next PISA assessment, due out in December 2013.
