The results of the 2023 International Computer and Information Literacy Study (ICILS) were just released. ICILS is given to a sample of eighth-grade students and is supposed to measure their ability to use information and communications technology. Including the United States, 35 education systems worldwide participated in the study of “computer and information literacy,” while a subset of 24 also participated in the optional computational thinking module.
Given how much time American students spend on digital devices and “communications technology,” you would think our students would excel. But just as on the far better-known international assessment of 15-year-olds, the Programme for International Student Assessment (PISA), our results on ICILS were mediocre at best. Our students performed at the international average—which includes countries such as Serbia, Oman, Kosovo, and Azerbaijan. Turning to individual country scores, 20 countries, including many OECD peers such as Germany, France, Korea, and Spain, outperformed us.
Twelve countries—including the US—took the previous administration of ICILS in 2018, so we can compare scores over time. ICILS scores worldwide declined from 2018 to 2023, but the US experienced the steepest decline, losing 37 points. This sharp drop is likely linked to learning loss during the COVID pandemic. Indeed, other large-scale assessments, like PISA and the US government’s National Assessment of Educational Progress, also showed steep declines in student performance from pre-pandemic levels.
Our students were also mediocre on the assessment of computational thinking, which is the “process of working out exactly how computers can help us solve problems.” In plain English, that means programming and manipulating datasets.
Students in the US did not even reach the top half of countries in computational thinking, lagging far behind countries such as Taiwan, South Korea, France, and Germany. Only nine countries participated in both the 2018 and 2023 administrations of this computational thinking module—and the US again experienced the greatest decline, matching the –37-point decline in the computer literacy module.
American students have performed poorly on international assessments for many years, yet our economy outperforms every other large, advanced economy on earth. In the real world outside of ICILS, computers are ubiquitous in our economy and the nation leads in the adoption of AI technology—but modern IT skills such as prompt engineering are not even touched on by ICILS.
The computer skills needed by today’s workers look very different from the key skills of 2013, when the first ICILS assessment was conducted. Yet sample questions from the most recent administration of ICILS asked students to “design a presentation about breathing” and to “create an information sheet” about a museum—both tasks that can now be done using AI in a fraction of the time a human would take. Before wringing our hands about Americans’ test results, we should ask whether assessments like ICILS are measuring the skills that matter in today’s world.
In the meantime, perhaps it’s also best for the US (and other nations) to pause participation in these assessments to figure out what is being tested and determine the relevance of these assessments for today’s world. And perhaps instead of testing for information skills defined over a decade ago, the organizations that sponsor these large-scale international assessments should harness modern IT to update their assessments with skills that will remain relevant to the workforce of tomorrow.