Sunday, June 17, 2012

More Things Wrong with International Assessments Like PISA

On February 17th, I posted to this blog some thoughts on the ubiquitous "international assessments" of achievement. From the access logs I can tell that such topics command wide attention. Unfortunately, the attention paid to criticisms of the international assessments is merely the other side of the coin: far too much is made of these virtually meaningless exercises. Why does anyone think that comparing how many kids in the U.S. vs. Slovenia or Singapore can solve a math problem will inform the movement toward better schools? Outsiders always think that what they are observing is simpler than it is in reality, whether they are watching someone play a trombone, or hit a tennis ball, or teach a child to read.

The point I was making last February about international assessments had to do with the gullibility of not just the general public but of professional educators themselves. People swallow without a second thought the comparison of a few dozen nations on a test of reading comprehension! Seemingly no one questions the claim that a test can be written in two different languages while controlling for its difficulty. But that claim is patently false. I wrote in February:

There is something far more wrong with these international assessments and comparisons than anyone seems interested in talking about. Think! ... The obvious question is “In what language is the test written?” And the obvious answer is “In the language of that nation.” But who is drawing the obvious conclusion? How in heaven’s name can you construct a reading test in dozens of different languages (English, Hungarian, Norwegian, and yes, Finnish) and be confident that the test is equally difficult in all of these languages? Well, the answer is that you can’t. It should be perfectly obvious to anyone who thinks about it for more than five minutes that it is impossible.

... A recent article in the Scandinavian Journal of Educational Research by Inga Arffman carries the title "Equivalence of translations in international reading literacy studies" (Vol. 54, No. 1, 37-59). The paper summarizes a study that examined the problems encountered in translating texts in international reading assessments. And in spite of the fact that Arffman is a faculty member of the University of Jyväskylä in Finland—which has every motive possible to believe that PISA Reading assessments are the most valid tests in the history of psychometrics—the conclusion of the research is that '...it will probably never be possible to attain full equivalence of difficulty in international reading literacy studies....' Amen.

And now, a few months later, comes a great reference from the work of Jerry Bracey — our now departed and forever respected colleague to whom we all owe a continuing debt of gratitude for his marvelous critiques of scientistic buffoonery in education. (I thank David Berliner for bringing this reference to my attention.)

Posted by Susan Williams on February 24, 2012, to a Teachers College Record blog.

Gerald Bracey's research in the 1990s ... exposed the hidden "competitive edge" of Finland's schools: the Finnish language is eentsy weentsy compared to the English language. We have far, far more vocabulary words than they do.

So when a standardized test is internationalized — translated into Finnish, in this case — it gets really easy for the Finnish kids. ... But they're really not outdoing us.

Bracey used the example of an international test question in which students were to tell whether these two words are synonyms or antonyms: pessimistic and sanguine. Only 50% of Americans got that right. But 98% of Finnish kids did. Oh! Are they that much smarter? Are their schools that much better? Noooooo. Finnish has no equivalent for the word "sanguine," so the word substituted for it was a dead giveaway — optimistic. Duhhhh!

Gene V Glass
University of Colorado Boulder
Arizona State University