21st‑Century Readers
This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over
any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area.
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of
such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in
the West Bank under the terms of international law.
Note by Turkey
The information in this document with reference to “Cyprus” relates to the southern part of the Island. There is no single
authority representing both Turkish and Greek Cypriot people on the Island. Turkey recognises the Turkish Republic of
Northern Cyprus (TRNC). Until a lasting and equitable solution is found within the context of the United Nations, Turkey
shall preserve its position concerning the “Cyprus issue”.
Note by all the European Union Member States of the OECD and the European Union
The Republic of Cyprus is recognised by all members of the United Nations with the exception of Turkey. The
information in this document relates to the area under the effective control of the Government of the Republic of Cyprus.
PISA
ISSN 1990-8539 (print)
ISSN 1996-3777 (online)
This work is available under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 IGO (CC BY-NC-SA 3.0 IGO). For specific information regarding the scope and
terms of the licence as well as possible commercial use of this work or the use of PISA data please consult Terms and Conditions on www.oecd.org.
Editorial
21st-Century Readers
Globalisation and digitalisation have connected people, cities, countries and continents in ways that vastly increase our individual
and collective potential. But the same forces have also made the world more volatile, more complex, more uncertain and more
ambiguous. In this world, education is no longer just about teaching students something but about helping them develop a
reliable compass and the tools to navigate ambiguity.
Literacy in the 20th century was about extracting and processing pre-coded and – for school students – usually carefully curated
information; in the 21st century, it is about constructing and validating knowledge. In the past, teachers could tell students to look
up information in an encyclopaedia and to rely on that information as accurate and true. Nowadays, Google presents them with
millions of answers and nobody tells them what is right or wrong, and true or not true. The more knowledge technology allows us
to search and access, the more important it is to develop deep understanding and the capacity to navigate ambiguity, triangulate
viewpoints, and make sense out of content.
PISA 2018 results show that, when students were confronted with literacy tasks that required them to understand implicit cues
about the content or source of information, an average of just 9% of 15-year-old students in OECD countries were proficient
enough readers to successfully distinguish facts from opinions. True, this figure is up from 7% in 2000 but, in the meantime,
the demand for literacy skills has fundamentally changed.
The fact that improvements in literacy skills have fallen sharply behind the evolving nature of information has profound
consequences in a world where virality sometimes seems to be privileged over quality in the distribution of information. In the
“post-truth” climate in which we now find ourselves, assertions that “feel right” but have no basis in fact become accepted as fact.
Algorithms that sort us into groups of like-minded individuals create social media echo chambers that amplify our views and leave
us insulated from opposing arguments that may alter our beliefs. These virtual bubbles homogenise opinions and polarise our
societies, and they can have a significant – and adverse – impact on democratic processes. Those algorithms are not a design
flaw; they are how social media work. Attention is scarce but information is abundant. We are living in a digital
bazaar where anything that is not built for the network age is cracking apart under its pressure.
The question, then, is: How can we live successfully in this new world of information? Do we approach the issue from a
consumer-protection angle or a supply-side one? In modern societies, it seems impossible to treat knowledge the way we treat
physical products, that is, by making sure it meets consumer-protection regulations: requiring information to comply with
“protective” standards would be perceived as an immediate threat to democratic principles.
The result is that the market for information remains unregulated. Can and should we place certain constraints on the behaviour
and pronouncements of the influential and powerful? Can and should we introduce more robust standards for our gatekeepers,
the journalists, who play such an important role in holding power to account? Has the time come to extend consumer protection
to people as absorbers of information, who are – let us not forget – voters? And if we do so, how will this restrict freedom
of speech and creativity in knowledge creation? Transparency in political advertising in the social media sphere also merits
closer attention given its increasingly prevalent use. The degree and sophistication of targeting techniques being deployed are
astounding and they are poorly understood by the majority of social media users.
The latest PISA report, 21st-Century Readers: Developing literacy skills in a digital world, takes a different and equally important
perspective, which is to focus on the skills angle. It looks at ways to strengthen students’ capacity to navigate the new world of
information. It studies the ways in which students access digital technology, how skilled they are with complex digital reading
tasks – and how this varies by geography, social background or gender. It also explores what teachers do to help students
navigate ambiguity and manage complexity. The good news is that education can make a difference. The report shows that education
systems in which more students are taught digital skills have a higher percentage of students who can correctly distinguish facts
from opinions in the PISA tasks. On average across OECD countries, 54% of students said they were trained at school to recognise
whether information is biased or not. This proportion varies across countries. It is also interesting that the relationship between
students’ access to training on how to detect biased information and their capacity to actually distinguish fact from opinion
plays out quite differently across countries. There seems to be room for countries and schools to learn from each other how to
implement such training programmes most successfully.
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 3
The Covid-19 pandemic, which made digital technologies the lifeline for education, has increased the urgency with which this
needs to be addressed. It has also increased momentum among children, teachers and policy makers to support 21st-century
readers. To some schoolchildren and even teachers, disinformation in pre-pandemic times might have seemed a remote and
political concern with little relevance in the schoolyard and staff room. Today, the infodemic and the general unease and uncertainty
it sows about basic scientific and health-related facts have captured the focus of 15-year-old students – and their desire for tools
and solutions.
As international debate focuses on foreign trolls and conspiracy theorists, a moment is emerging to integrate a new digital literacy
into learning and teaching that guarantees its independence from partisan and commercial influence. 21st-century literacy means
stopping to look left and right before proceeding online. It means checking facts before basing opinions on them. It means asking
questions about sources of information: Who wrote this? Who made this video? Is it a credible source? Does it even make sense?
What are my biases? All this belongs in school and teacher-training curricula. It has applications far beyond detecting fake news and
disinformation: to secure the act of making informed decisions is to secure the basis for functioning democracies.
The growing complexity of modern living for individuals, communities and societies suggests that the solutions to our problems
will be ever more complex: in a structurally imbalanced world, the imperative of reconciling diverse perspectives and interests
in local settings with sometimes global implications will require young people to become adept in handling tensions, dilemmas
and trade-offs. Being able to strike a balance between competing demands – equity and freedom, autonomy and community,
innovation and continuity, efficiency and democratic processes – all hinges on 21st-century literacy skills.
Last but not least, the report highlights how countries need to redouble their efforts to combat emerging digital divides.
Disadvantaged students in OECD countries are increasingly losing the cultural capital of having books in their home-learning
environments. And many of the most disadvantaged students can only access computers linked to the Internet at school.
The good news is that the strategies and tools to address these challenges and develop 21st-century literacy skills are ready. They
are being tried and tested by teachers all around the world who have understood what it means to educate students for their
future, rather than for our past.
Andreas Schleicher
Director for Education and Skills
Special Advisor on Education Policy
to the Secretary-General
Foreword
Literacy in the 21st century is about constructing and validating knowledge. Digital technologies have enabled the spread
of all kinds of information, displacing traditional formats of usually more carefully curated information such as newspapers.
The massive information flow of the digital era demands that readers be able to distinguish between fact and opinion. Readers
must learn strategies to detect biased information and malicious content like fake news and phishing emails. The infodemic in
which events like the Covid-19 pandemic have immersed us makes it harder to discern the accuracy of information when reaction
time is crucial. It illustrates how essential it is to be a proficient reader in a digital world.
But the literacy landscape is not just about the quality and overload of information; it is also about how we spend an increasing
amount of time online – and, in the case of children, autonomously. With school closures, students have had to do their schooling
at home and on their own. Though teachers have certainly tried their best to provide remote support, students have had to learn
in a much less structured and guided home environment. And, for the most part, their learning has taken place almost exclusively
through a computer and an online connection. This sudden autonomy highlights the need for students’ basic digital literacy.
What the PISA 21st-Century Readers report reveals is that students’ access to digital technologies, and training on how to use them,
vary greatly across countries and students’ socio-economic profiles.
The Covid-19 pandemic made digital technologies the lifeline for not just education but work, information and leisure.
Our unprecedented reliance on the digital world during this crisis has created momentum to harness its power for better
learning opportunities. Digitalisation can respond to a greater variety of learning needs, scale and disseminate effective practice,
achieve efficiency gains, and integrate learning and assessment better. At the same time, education stakeholders are beginning
to understand that we must counter some of the disruptive effects of digitalisation in and for education: this starts with digital
reading skills.
This report explores how 15-year-old students are developing reading skills to navigate the technology-rich 21st century. It sheds
light on potential ways to strengthen students’ capacity to navigate the new world of information. It highlights how countries
need to redouble their efforts to combat emerging digital divides. It also explores what teachers can do to help students navigate
ambiguity and manage complexity.
This report is the product of a collaborative effort between the countries and economies participating in PISA and the OECD
Secretariat. The report was prepared by Javier Suarez-Alvarez with contributions from Giannina Rech. Qiwei He (Educational
Testing Service (ETS)) analysed and drafted Chapter 3. Analytical and statistical support was provided by Pierre Gouëdard,
Rodrigo Castaneda Valle and Filippo Besa. Clara Young edited the report. Andreas Schleicher, Yuri Belfali, Miyako Ikeda,
Alfonso Echazarra and Francesco Avvisati provided valuable feedback at various stages of the report. This report also benefitted
from the input and expertise of Dominique Lafontaine and Jean-François Rouet, members of the PISA 2018 reading expert group
that guided the preparation of the PISA 2018 reading assessment framework and instruments. For Chapter 3, Irwin Kirsch,
Claudia Tamassia, Eugenio Gonzalez, and Frederic Robin of the Center for Global Assessment at ETS provided valuable
input at various stages of the development. Michael Wagner, Mathew Kandathil, Lokesh Kapur and Shuwen Zhang at ETS
prepared the log files used in this report. Production was co-ordinated by Alison Burke and Della Shin laid out the publication.
Administrative support was provided by Thomas Marwood and Lesley O’Sullivan. The development of the report was steered
by the PISA Governing Board, chaired by Michele Bruniges (Australia), with Peggy Carr (United States), Sukit Limpijumnong
(Thailand) and Carmen Tovar Sánchez (Spain) as vice-chairs. ETS provided partial in-kind support for the preparation of
Chapter 3. This report was prepared with the support of the Vodafone Germany Foundation.
Table of contents
EDITORIAL. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
FOREWORD. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
TABLE OF CONTENTS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
EXECUTIVE SUMMARY. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
READER’S GUIDE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
CHAPTER 3 DYNAMIC NAVIGATION IN PISA 2018 READING ASSESSMENT: READ, EXPLORE AND INTERACT. . . . . . . . . . . 51
Dynamic Navigation in PISA 2018 Reading Assessment: Read, Explore and Interact. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
• Multiple-source reading items and dynamic navigation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
What constitutes good dynamic navigation?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Students’ dynamic navigation behaviour in different countries and economies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
• Overall navigation activity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
• Task-oriented navigation activity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
• Time spent on initial pages and interval between navigation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
The relationship between reading performance and navigation behaviour. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
• Association between reading performance and quantity and quality of navigation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
• Association between reading performance and time spent in navigation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
CHAPTER 4 THE INTERPLAY BETWEEN DIGITAL DEVICES, ENJOYMENT, AND READING PERFORMANCE. . . . . . . . . . . . . . . . . . 77
Do 15-year-olds spend more time reading for enjoyment than two decades ago?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Do 15-year-olds spend more time reading for enjoyment on paper or digital devices?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Are digital technologies helping improve students’ reading experience? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
BOXES
Box 1.1. How has Internet use changed between 2012 and 2018?....................................................................................................... 20
Box 1.2. Changes between 2009 and 2018 in the PISA assessment of reading literacy............................................................................. 22
Box 1.3. What the literature says about digital reading compared to print reading................................................................................. 27
FIGURES
Figure 1.1 Time spent on the Internet........................................................................................................................................... 21
Figure 1.2 Time spent on the Internet in 2012, 2015, 2018................................................................................................................ 21
Figure 1.3 PISA 2018 Reading framework processes........................................................................................................................ 24
Figure 2.1 Change between 2009 and 2018 in access to a computer that they can use for schoolwork and a link to the Internet at home........ 40
Figure 2.2 Access to a computer linked to the Internet at home for doing schoolwork, by school’s socio-economic status............................. 41
Figure 2.3 Relationship between access to digital resources at home and emergent aspects of reading.................................................... 42
Figure 2.4 Reading item of distinguishing facts from opinions and access to training on how to detect biased information in school.............. 44
Figure 2.5 Reading item of distinguishing facts from opinions and reading performance........................................................................ 45
Figure 2.6 Correlations between access to learning digital skills in school and the reading item of distinguishing facts from
opinions in OECD countries.......................................................................................................................................... 46
Figure 2.7 Correlations between access to learning digital skills in school and the reading item of distinguishing facts from
opinions in all participating countries............................................................................................................................. 47
Figure 3.1 Screenshot of the first item in the Rapa Nui reading unit (CR551Q01)................................................................................... 53
Figure 3.2 Screenshot of an item with multiple-source requirement in the Rapa Nui reading unit (CR551Q10)............................................ 54
Figure 3.3 Distribution of reading performance of students who responded to the Rapa Nui reading unit.................................................. 55
Figure 3.4 Screenshot of instruction page...................................................................................................................................... 57
Figure 3.5 Average number of pages visited in items with single- and multiple-source requirements in Rapa Nui unit.................................. 58
Figure 3.6 Overall navigation quantity in single- and multiple- source items ......................................................................................... 59
Figure 3.7 Task-oriented navigation activities.................................................................................................................................. 60
Figure 3.8 Correlations between navigation quantity and navigation behaviour groups ......................................................................... 62
Figure 3.9 Median time spent on initial reading pages in items with single- and/or multiple- source requirements by countries/economies...... 63
Figure 3.10 Average ratio of time spent on initial reading page with single- and/or multiple- source requirements by countries/economies....... 64
Figure 3.11 Association between reading performance and average ratio of time spent on initial reading page with single- and/or
multiple-source requirements by countries/economies...................................................................................................... 65
Figure 3.12 Average ratio of effective visits in dynamic navigation with single- and/or multiple- source requirements by countries/economies... 66
Figure 3.13 Distribution of navigation behaviours, by reading proficiency levels...................................................................................... 67
Figure 3.14 Association between reading performance and navigation behaviour................................................................................... 68
Figure 3.15 Association between reading performance and click actions............................................................................................... 69
Figure 3.16 Association between reading performance and time spent on the initial page........................................................................ 70
Figure 3.17 Association between reading performance and average ratio of time spent on the initial page during the reading process ............ 71
Figure 3.18 Cluster centroids of visiting page sequence in CR551Q11................................................................................................... 72
Figure 3.19 Cluster centroids of transition time sequence in CR551Q11................................................................................................ 73
Figure 3.20 Distribution of reading performance scores by clusters of page and time sequence................................................................ 74
Figure 5.1 Index of perception of difficulty of the PISA reading test, by student characteristics.............................................................. 102
Figure 5.2 Relationship between the perception of the difficulty of the PISA reading test and performance in ‘multiple’ source text .............. 103
Figure 5.3 Relationship between the perception of the difficulty of the PISA reading test and reading performance.................................. 104
Figure 5.4 Relationship between the perception of the difficulty of the PISA reading test and single- and multiple-source scores................. 105
Figure 5.5 Perceived difficulty of the PISA test across levels of performance....................................................................................... 106
Figure 5.6 Index of knowledge of reading strategies for assessing the credibility of sources, by student characteristics.............................. 109
Figure 5.7 Relationship between the reading item of distinguishing facts from opinions and the index of knowledge of reading
strategies for assessing the credibility of sources............................................................................................................ 110
Figure 5.8 Relationship between knowledge of reading strategies for assessing the credibility of sources and reading performance............. 111
Figure 5.9 Relationship between knowledge of reading strategies for assessing the credibility of sources, and single- and
multiple-source scores............................................................................................................................................... 112
Figure 5.10 Index of knowledge of reading strategies for assessing the credibility of sources, by navigation behaviours and gender............... 113
Figure 5.11 Student’s self-perception of reading competence as a mediator of the relationship between socio-economic background,
gender, and reading performance................................................................................................................................. 114
Figure 5.12 Student’s knowledge of reading strategies as a mediator of the relationship between socio-economic background, gender,
and reading performance........................................................................................................................................... 115
Figure 5.13 Perception of reading competence, knowledge of reading strategies, socio-economic status and gender
as predictors of reading performance........................................................................................................................... 116
Figure 6.1 Index of teacher’s stimulation of reading engagement perceived by student, by student characteristics .................................... 121
Figure 6.2 Change between 2009 and 2018 in teachers’ stimulation of reading engagement.................................................................. 122
Figure 6.3 Reading performance, by the type of text read for school.................................................................................................. 123
Figure 6.4 System-level relationship between reading fiction for school and reading fiction for pleasure.................................................. 124
Figure 6.5 Length of the longest piece of text that students had to read for school............................................................................... 125
Figure 6.6 Reading performance, by the length of text read for school............................................................................................... 126
Figure 6.7 The length of text read for school, by proficiency levels and gender..................................................................................... 126
Figure 6.8 Reading performance, by the length of the text read for school......................................................................................... 128
Figure 6.9 System-level relationship between reading performance and the average length of the longest piece of text read for school......... 129
Figure 6.10 Frequency of use of digital device for teaching and learning in test language lessons............................................................ 130
Figure 6.11 Reading performance and time spent using digital devices for school................................................................................ 130
Figure 6.12 Frequency of activities on digital devices in school........................................................................................................... 131
Figure 6.13 Relationship between reading performance and the type of school activities done on digital devices......................................... 132
Figure 6.14 Reading performance and browsing the Internet for schoolwork........................................................................................ 133
Figure 6.15 Reading performance and playing simulations at school................................................................................................... 133
Figure C1.1 Distribution of number of pages visited on sequential seven items through the Rapa Nui unit................................................ 208
Figure C1.2  Distribution of time spent on seven sequential items in the Rapa Nui unit .......... 209
TABLES
Table 1.1  Approximate distribution of tasks by targeted process and text source .......... 26
Table 2.1  Comparing countries’ and economies’ performance in reading .......... 34
Table 3.1  Item characteristics and difficulty in the Rapa Nui reading unit .......... 54
Table 3.2  Overall average of students’ reading proficiency levels in the Rapa Nui unit .......... 55
Table 3.3  A summary of navigation indicators developed in the Rapa Nui study .......... 56
Table 3.4  Overall average of time spent on the instruction page in the Rapa Nui unit, by students’ reading proficiency levels .......... 57
Table 3.5  Correlations between the percentage of students in navigation behaviour groups and performance score .......... 61
Table 3.6  Number of pages visited, by reading proficiency levels .......... 67
Table B.1.3  Time spent on the Internet in total in 2012, 2015 and 2018 .......... 156
Table B.2.2  Change between 2009 and 2018 in the percentage of students with access to the Internet and having a computer that they can use for schoolwork at home .......... 158
Table B.2.6  Frequency of opportunity to learn digital literacy skills at school .......... 164
Table B.3.9  Task-oriented navigation behaviours .......... 168
Table B.4.1  Enjoyment of reading .......... 172
Table B.4.16  Average time of reading for enjoyment, reading performance and enjoyment of reading, by format of reading .......... 178
Table B.5.1  Students’ perception of difficulty in taking the reading assessment .......... 184
Table B.5.11  Students’ knowledge of reading strategies for assessing the credibility of sources .......... 188
Table B.6.11a  Reading performance, by length of text read for school .......... 192
Table B.6.15  Frequency of use and time using digital devices for teaching and learning during and outside classroom lessons .......... 196
Table C3.1  Confusion matrix of the overall average percentage of students in four navigation categories in CR543 and CR544, given their behaviour in the Rapa Nui unit (CR551) .......... 212
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 11
Executive Summary
Digital technologies revolutionised the written word in the 21st century. In the past, mass production of printed books made
information widely available and incentivised people to develop reading skills. Still, the production of books remained in the
hands of the few, not the many. With digital technologies, all that has changed. Everyone can become a journalist or a publisher.
People now find millions of answers to their questions on the Internet at the click of a button. But what they have lost is the
certainty of what is right or wrong, true or not true. Literacy in the 21st century is about constructing and validating knowledge.
The more information there is, the more readers have to know how to navigate through ambiguity, and triangulate and validate
viewpoints.
Reading in a digital world is even more challenging given the increasing production and consumption of media content.
Sometimes, it seems that the speed of information dissemination comes before the quality of the information itself.
This contributes to “fake news”, misinformation and a “post-truth” climate. Social media algorithms are designed to channel
the flow of likeminded people towards each other. This creates “echo chambers”, which reinforce our thoughts and opinions
rather than challenge them, fuelling people’s confirmation bias. The digital divide exacerbates these challenges for the most
disadvantaged. Many students do not have access to the Internet at home and must rely on schools to learn and practise their digital skills. With the COVID-19 pandemic and school closures, students have had to do their schooling at home and on their own. This crisis makes plain the urgency of developing autonomous and advanced reading skills to prepare young people for an increasingly volatile, uncertain and ambiguous world.
Reading was the main subject assessed in PISA 2018, and the reading framework was devised to include essential reading skills
in a digital world. This report provides important insights into how 15-year-old students are developing reading skills to navigate
the technology-rich 21st century.
• Half of students or fewer had access to both an Internet connection at home and a computer they could use for schoolwork in the Dominican Republic, Indonesia, Malaysia, Mexico, Morocco, Peru, the Philippines, Thailand and Viet Nam. This percentage was below 20% in rural areas of Indonesia, Mexico, Morocco and the Philippines.
• Four in five disadvantaged students in Malaysia, Mexico, Morocco, Peru, the Philippines and Viet Nam have access to the Internet only at school, not at home.
Opportunity to learn
• On average across OECD countries, some 54% of students reported being trained at school on how to recognise whether
information is biased.
• Students were presented with a scenario typical of phishing e-mails: an e-mail from a well-known mobile operator asking them to click on a link and fill out a form with their data in order to win a smartphone. Approximately 40% of students on average across OECD countries responded that clicking on the link was somewhat appropriate or very appropriate.
• Education systems in which a higher proportion of students were taught at school how to detect biased information, and in which more students have digital access at home, were more likely to have students who could distinguish fact from opinion in the PISA reading assessment, even after accounting for countries’ per capita GDP.
• On average across OECD countries, the index of knowledge of effective reading strategies for assessing the credibility of sources is the index most strongly associated with reading performance after accounting for students’ and schools’ socio-economic status. The other two indices (student knowledge of reading strategies for understanding and memorising a text, and for summarising information) are also associated with reading performance.
• Boys reported finding the PISA reading test easier than girls did, even though boys scored 25 points lower than girls in reading after accounting for students’ socio-economic backgrounds.
• Almost two-thirds of the association between gender and reading performance can be accounted for by the difference
between boys’ and girls’ knowledge of effective reading strategies. Almost 30% of the association between socio-economic
background and reading performance can be accounted for by the difference between socio-economically advantaged and
disadvantaged students’ reported self-perception of reading competence.
• Compared to students who rarely or never read books, digital-book readers across OECD countries read for enjoyment about 3 hours more a week, print-book readers about 4 hours more, and those who balance both formats about 5 hours or more, after accounting for students’ and schools’ socio-economic background and gender.
Teachers’ practices
• Disadvantaged students and boys – who typically have lower reading performance – perceived reading activities from their teachers as less stimulating in the 49 countries/economies participating in PISA 2018.
• Reading fiction and long texts for school more frequently was positively associated with reading performance in most
countries/economies after accounting for students’ and schools’ socio-economic profiles.
• The relationship between reading performance and time spent using digital devices for schoolwork was negative in
36 countries and economies after accounting for students’ and schools’ socio-economic status. However, this relationship was
positive in Australia, Denmark, Korea, New Zealand, and the United States.
Country coverage
This publication features data on 79 countries and economies, including all OECD countries (indicated in black in the figures) and
more than 40 partner countries and economies (indicated in blue in the figures).
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by
the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the
terms of international law.
B-S-J-Z (China) refers to the four regions in China that participated in PISA 2018: Beijing, Shanghai, Jiangsu and Zhejiang.
Hong Kong (China), the Netherlands, Portugal and the United States: Data did not meet the PISA technical standards but were
accepted as largely comparable (see Annexes A2 and A4 from (OECD, 2019[1])).
In 2018, some regions in Spain conducted their high-stakes exams for tenth-grade students earlier in the year than in the past,
which resulted in the testing period for these exams coinciding with the end of the PISA testing window. Because of this overlap,
a number of students were negatively disposed towards the PISA test and did not try their best to demonstrate their proficiency.
Although the data of only a minority of students show clear signs of lack of engagement (see Annex A9 from (OECD, 2019[1])),
the comparability of PISA 2018 data for Spain with those from earlier PISA assessments cannot be fully ensured.
International averages
The OECD average corresponds to the arithmetic mean of the respective country estimates. It was calculated for most indicators
presented in this report.
The OECD total takes the OECD countries as a single entity, to which each country contributes in proportion to the number of
15-year-olds enrolled in its schools. It can be used to assess how a country compares with the OECD area as a whole.
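The distinction between the two aggregates can be sketched in a few lines of code. This is a minimal illustration only: the country names, scores and enrolment counts below are invented, not PISA data, and the real estimates also involve survey weights that this sketch omits.

```python
# Illustrative country mean scores and numbers of enrolled 15-year-olds
# (invented values, NOT actual PISA data).
estimates = {"A": 500.0, "B": 480.0, "C": 520.0}
enrolled_15yo = {"A": 1_000_000, "B": 250_000, "C": 50_000}

# OECD average: the plain arithmetic mean of the country estimates.
oecd_average = sum(estimates.values()) / len(estimates)

# OECD total: the countries are taken as a single entity, so each country's
# estimate is weighted by its share of enrolled 15-year-olds.
total_enrolled = sum(enrolled_15yo.values())
oecd_total = sum(estimates[c] * enrolled_15yo[c] for c in estimates) / total_enrolled

print(oecd_average)          # 500.0
print(round(oecd_total, 1))  # 496.9 - pulled towards the large country A
```

Note how the two statistics diverge as soon as enrolment sizes differ: the average treats every country equally, while the total is dominated by the largest systems.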
In this publication, the OECD average is generally used when the focus is on comparing performance across education systems.
In the case of some countries, data may not be available for specific indicators, or specific categories may not apply. Readers should,
therefore, keep in mind that the terms “OECD average” and “OECD total” refer to the OECD countries included in the respective
comparisons. In cases where data are not available or do not apply for all sub-categories of a given population or indicator, the
“OECD average” is not necessarily computed on a consistent set of countries across all columns of a table.
In analyses involving data from multiple years, the OECD average is always reported on consistent sets of OECD countries, and several averages may be reported in the same table. For instance, the “OECD average-37” refers to the average across all 37 OECD countries and is reported as missing if fewer than 37 OECD countries have comparable data; the “OECD average-30” includes only the 30 OECD countries that have non-missing values across all the assessments for which this average itself is reported. This restriction allows for valid comparisons of the OECD average over time.
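The consistent-set rule can be sketched as follows. This is a minimal illustration with invented data: the actual PISA trend averages are computed with survey weights and replicate variance estimation, which the sketch leaves out.

```python
# Invented country estimates per PISA cycle (NOT actual PISA data).
# Country "C" has no comparable value in 2012.
data = {
    2012: {"A": 500.0, "B": 480.0, "C": None},
    2015: {"A": 505.0, "B": 478.0, "C": 510.0},
    2018: {"A": 498.0, "B": 482.0, "C": 507.0},
}

def trend_average(data, countries):
    """Per-cycle mean over the countries that have data in ALL cycles,
    so the average is comparable over time."""
    consistent = [c for c in countries
                  if all(data[year].get(c) is not None for year in data)]
    return {year: sum(data[year][c] for c in consistent) / len(consistent)
            for year in data}

# Only A and B have values in every cycle, so C is dropped from every year:
averages = trend_average(data, ["A", "B", "C"])
print(averages)  # {2012: 490.0, 2015: 491.5, 2018: 490.0}
```

Dropping "C" from 2015 and 2018 as well, even though it has data there, is what keeps the year-to-year comparison valid.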
The number in the label used in figures and tables indicates the number of countries included in the average:
• OECD average-37/OECD average: Arithmetic mean across all OECD countries.
• OECD average-31 (Chapter 2): Arithmetic mean across all OECD countries, excluding Chile, Colombia, Estonia, Israel,
Lithuania and Slovenia.
• OECD average-31 (Chapter 4): Arithmetic mean across all OECD countries, excluding Colombia, Estonia, Lithuania, the Slovak
Republic, Slovenia and Turkey.
• OECD average-25 (Chapter 1): Arithmetic mean across all OECD countries, excluding Canada, Colombia, France, Germany,
Lithuania, Luxembourg, the Netherlands, Norway, Portugal, Turkey, the United Kingdom and the United States.
The overall average corresponds to the arithmetic mean of the respective country/economy estimates. It was calculated for some
indicators presented in this report.
Rounding figures
Because of rounding, some figures in tables may not add up exactly to the totals. Totals, differences and averages are always
calculated on the basis of exact numbers and are rounded only after calculation.
All standard errors in this publication have been rounded to one or two decimal places. Where the value 0.0 or 0.00 is shown, this
does not imply that the standard error is zero, but that it is smaller than 0.05 or 0.005, respectively.
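Both conventions can be illustrated with a short sketch (all values are invented for illustration):

```python
# Totals are calculated on exact numbers and rounded only after calculation,
# so displayed components may not add up to the displayed total.
parts = [33.333, 33.333, 33.334]
displayed_parts = [round(p, 1) for p in parts]  # each shown as 33.3
displayed_total = round(sum(parts), 1)          # computed on exact numbers: 100.0
displayed_sum = round(sum(displayed_parts), 1)  # adding the displayed values: 99.9

# A standard error displayed as 0.0 is not necessarily zero: it is merely
# smaller than 0.05 (and 0.00 means smaller than 0.005).
se = 0.041
shown_se = round(se, 1)  # displayed as 0.0 even though se > 0
```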
List of country codes - the following ISO country codes are used in some figures of Chapter 3:
Australia (AUS), Austria (AUT), Belgium (BEL), Canada (CAN), Chile (CHL), Colombia (COL), Czech Republic (CZE), Denmark (DNK), Estonia (EST), Finland (FIN), France (FRA), Germany (DEU), Greece (GRC), Hungary (HUN), Iceland (ISL), Ireland (IRL), Israel (ISR), Italy (ITA), Japan (JPN), Korea (KOR), Latvia (LVA), Lithuania (LTU), Luxembourg (LUX), Mexico (MEX), Netherlands (NLD), New Zealand (NZL), Norway (NOR), Poland (POL), Portugal (PRT), Slovak Republic (SVK), Slovenia (SVN), Spain (ESP), Sweden (SWE), Switzerland (CHE), Turkey (TUR), United Kingdom (GBR), United States (USA).
Further documentation
For further information on the PISA assessment instruments and the methods used in PISA, see the PISA 2018 Technical Report
(OECD, forthcoming[1]).
This report has StatLinks at the bottom of tables and graphs. To download the matching Excel® spreadsheet, just type the link into
your Internet browser, starting with the https://doi.org prefix, or click on the link from the e-book version.
References
[1] OECD (forthcoming), PISA 2018 Technical Report, OECD Publishing, Paris.
[2] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/5f07c754-en.
1
Digital literacy in the 21st century
This chapter discusses why reading is key for citizens and societies in the technology-rich 21st century and how PISA 2018 defined and measured reading literacy. The chapter highlights how the PISA 2018 reading framework was revised and expanded to include essential reading skills in the digital world.
Reading is increasingly embedded in a faster-paced digital and screen-based culture. News breaks in real time, 24/7, and social media reactions spread across the globe in a matter of seconds. At the same time, disinformation and fake news are jeopardising democracies, which function poorly when citizens are not well informed or, worse, misled. Disinformation is not unique to digital technologies, but the Internet spreads and amplifies its impact. More than ever, students need to learn how to think critically, assess the accuracy of information on the Internet, and solve problems on their own.
The ability to think critically is valued and emphasised in many countries and learning areas, suggesting that these skills are
highly transferable across most learning areas. For example, recent analyses from OECD Future of Education and Skills 2030
(hereafter Education 2030) suggest that critical thinking and problem-solving skills are mapped in over 60% of the curriculum on
average across countries participating in the Curriculum Content Mapping carried out in this study (OECD, 2020[2]). In addition
to critical thinking and problem-solving skills, students increasingly need to develop meta-cognitive skills and acquire knowledge
of effective reading strategies to navigate the Internet. Students need to be able to identify online risks like phishing emails.
They should be able to distinguish between fact and opinion when reading a piece of text on the Internet. Meta-cognitive skills,
however, are included in curricula to a lesser extent (OECD, 2020[2]).
Box 1.1. How has Internet use changed between 2012 and 2018?
In 1997, when the first PISA framework for reading began to be discussed, just 1.7% of the world’s population used
the Internet. By 2019, the number had grown to a global penetration rate of 53.6%, representing 4.1 billion people
(International Telecommunication Union, 2019[1]).
Since 2012, PISA has been asking students how frequently they use the Internet, both at school and outside of school.
On average across all countries and economies that distributed the optional ICT familiarity questionnaire between
2012 and 2015, the time that 15-year-olds reported spending on the Internet increased from 21 to 29 hours per week.
In PISA 2018, the total amount of time spent on the Internet increased to 35 hours per week (Figure 1.2). In other
words, this represents a 66% increase in just 6 years and almost as much time as a typical adult workweek. Of the total
time students spent on the Internet per week, around 77% was outside of school in 2018, even though the share of total time spent on the Internet at school increased from 13% to 23% between 2012 and 2018 (Figure 1.2, Tables B.1.1 and B.1.2).
Despite the clear trend, there were still substantial differences across countries in student use of the Internet in 2018.
In Japan and Korea, for instance, students reported spending 23 and 22 hours per week connected to the Internet
– which is 4 and 8 hours more than in 2015 respectively. By contrast, Denmark and Sweden’s students reported more than
45 hours per week online – which is 10 and 8 hours more than in 2015 (Figure 1.2 and Table B.1.3).
[Figure: Number of hours per week spent using the Internet, by country/economy. Countries and economies are ranked in descending order of the total number of hours per week spent using the Internet. Source: OECD, PISA 2018 Database, Tables B.1.1 and B.1.2. StatLink: https://doi.org/10.1787/888934239306]
[Figure: Total number of hours per week spent using the Internet in PISA 2012, 2015 and 2018, by country/economy. Notes: All countries and economies that participated from PISA 2012 to PISA 2018, and with available data, are shown. All differences between PISA 2018 and previous cycles are statistically significant, except the change between PISA 2015 and PISA 2018 for the Czech Republic. OECD average-25 is the arithmetic mean across all OECD countries, excluding Canada, Colombia, France, Germany, Lithuania, Luxembourg, the Netherlands, Norway, Portugal, Turkey, the United Kingdom and the United States. Countries and economies are ranked in descending order of the total number of hours per week spent using the Internet in PISA 2018. Source: OECD, PISA 2018 Database, Table B.1.3. StatLink: https://doi.org/10.1787/888934239325]
Reading is key to the growing and changing needs of an interconnected world. PISA 2018 showed that global competence
– the ability to easily move between local and global spheres – is strongly correlated with reading performance1 (OECD, 2020[3]).
This is not surprising as both reading and global competence require weighing the reliability and relevance of information,
reasoning with evidence and describing and explaining complex situations and problems. However, Education 2030 shows that global competence is explicitly articulated in only 28% of the curriculum in the countries participating in the Curriculum Content Mapping of this study (OECD, 2020[4]).
Digital literacy, on the other hand, has a stronger presence in the Education 2030 Curriculum Content Mapping (on average,
40% of content items). Estonia stands out with nearly 70% of its curriculum linked to digital literacy, followed by Korea and
Kazakhstan (just below 60%) (OECD, 2020[4]). Several countries have introduced (or are planning to introduce) one or more new
ICT-related subjects in the curriculum (OECD, 2020[2]). For example, Australia, Ireland, New Zealand, Portugal, India, Kazakhstan
and Viet Nam proposed computer science, technology or information technology as a separate subject. Australia, Chile, Estonia,
Hungary, Ireland, Japan, the Netherlands, New Zealand, Scotland (United Kingdom) and Wales (United Kingdom), Brazil, and
Kazakhstan reported the introduction of ICT as crosscutting content across multiple subjects or the entire curriculum.
Are students who had the opportunity to learn digital skills in school more likely to distinguish facts from opinions in the
PISA reading test? Which navigational skills are more strongly related to reading multiple-source items in the PISA reading test?
Do 15-year-olds spend more time reading for enjoyment on paper or digital devices? And who reads proficiently? What reading
strategies are effective in tackling inequality and gender gaps in reading performance? How is reading performance associated
with the kinds of texts and the length of texts used in school? How are schools enhancing teaching and learning in digital
environments? This report aims to answer these questions to understand better how 15-year-olds are developing reading skills
to navigate the technology-rich 21st century. Before introducing the results, the following section shows how the PISA reading
framework has been updated in 2018 to reflect recent changes in the nature of reading (OECD, 2019[5]).
Readers generate meaning in response to a text by using previous knowledge, processes, and strategies, and these processes
and strategies vary with context and purpose (Britt and Rouet, 2012[6]). For example, students may use different processes and
strategies to interpret extended pieces of continuous text such as novels or essays than when they navigate through information
on the Internet in search of facts. Even textbooks, traditionally examples of extended pieces of continuous text, are being transformed into repositories of documents with many inserted tasks and less linearity (Weisberg, 2011[7]).
Digitalisation is also profoundly transforming social media and the transmission of (mis)information (Allcott, Gentzkow and Yu,
2019[8]). Increasingly, reading requires evaluating the quality and validity of different sources, navigating through ambiguity,
distinguishing between fact and opinion, and constructing knowledge. The PISA 2018 assessment and analytical framework
reflect the information-processing strategies involved in digital reading. Box 1.2 summarises the major changes in the reading
assessment between PISA 2009 and PISA 2018, while the rest of the chapter describes the PISA 2018 reading literacy framework
more thoroughly.
Box 1.2. Changes between 2009 and 2018 in the PISA assessment of reading literacy
The PISA 2018 reading literacy framework (OECD, 2019[5]) is similar in many respects to the PISA 2009 reading literacy
framework, which with some adjustments (e.g. delivery mode) was also used in PISA 2012 and 2015. The PISA 2018 reading
framework was also designed so that the former print and digital reading assessments (PISA 2009) could be fully integrated.
As a result, there is no longer a strict delineation of tasks typical of print or digital environment. The major differences
between the 2009 and 2018 assessments are:
• A greater emphasis on multiple-source texts, i.e. texts composed of several units of text, created separately by different
authors (Rouet, Britt and Potocki, 2019[9]). These types of text are more prevalent in the information-rich digital world,
and the digital delivery of the PISA reading assessment made it possible to present them to students. While the
availability of multiple sources does not necessarily imply greater difficulty, including multiple-source units helped to
expand the range of higher-level reading processes and strategies measured by PISA. In 2018, these included searching
for information across multiple documents, integrating across texts to generate inferences, assessing the credibility of
sources, and handling conflicting information (List and Alexander, 2018[10]; Barzilai, Zohar and Mor-Hagani, 2018[11];
Magliano et al., 2017[12]; Van Meter et al., 2020[13]; Salmerón et al., 2018[14]).
• The use of adaptive testing, whereby the electronic test form that a student saw depended on his or her answers to earlier questions.
• The digital, on-screen delivery of text, which facilitated the first and second changes listed above. The 2009 assessment
was conducted on paper while the 2018 assessment was conducted (by default) on computer2. Students had to use
navigational tools to move between passages of text, as there was often too much text to fit onto one screen.
• The explicit assessment of reading fluency, defined as the ease and efficiency with which students can read text.
While a few countries/economies may have been affected more than others by these changes, the analysis in Box I.8.1
of PISA 2018 Results (Volume I) - What Students Know and Can Do (OECD, 2019[15]) shows that effects on country mean
scores were not widespread. The difference in the single- and multiple-source subscales in PISA 2018 is not correlated with
the change in reading performance between PISA 2015 and PISA 2018. Similarly, there was no correlation between the
change in countries and economies’ average reading performance between 2015 and 2018 and the estimated accuracy in
answering reading-fluency items. Therefore, it is possible to conclude that the greater emphasis on multiple-source texts
in PISA 2018 had a limited impact on changes in reading performance.
Since PISA 2000, the concept of reading has also changed to reflect the progress in the theoretical understanding of what it
means to know how to read, which encompasses cognitive, metacognitive and affective-motivational dimensions of behaviour.
For instance, reading in a digital world requires continuously evaluating the quality and validity of different sources, navigating through ambiguity, distinguishing between facts and opinions, and constructing knowledge. This increasingly requires individuals
to acquire effective strategies – to think about, monitor and adjust their activity to reach a particular goal (also known as metacognitive
reading strategies) and motivate themselves to persevere in the face of difficulties (also known as self-efficacy). Metacognitive
strategies are crucial elements that can be developed as components of reading literacy. Moreover, teachers can enhance reading
engagement and metacognitive strategies through teaching and supportive classroom practices (Guthrie, Klauda and Ho, 2013[16];
Christenson, Reschly and Wylie, 2012[17]).
PISA has recognised from its very beginning that reading is a daily activity for most people and that education systems need to prepare
students to be able to adapt to the variety of scenarios in which they will need to read as adults. These scenarios range from
their own goals and development initiatives to their experiences in further and continuing education, and to their interactions
at work, with public entities, in online communities and with society. It is not enough to be a proficient reader; students should
also be motivated to read and be able to read for a variety of purposes (Britt, Rouet and Durik, 2017[18]; van den Broek, 2011[19]).
Cognitive processes
The PISA 2018 framework defines two broad categories of reading processes: text processing and task management
(Figure 1.3). Three of the processes that readers activate when engaging with a piece of text were also identified in previous
PISA cycles: “locating information”, “understanding”, and “evaluating and reflecting”. “Reading fluency”, on the other hand, is new to the PISA 2018 assessment.
[Figure 1.3. Reading processes in the PISA 2018 framework]
Text processing:
- Locate information: access and retrieve information within a text; search for and select relevant text
- Understand: represent literal meaning; integrate and generate inferences
- Evaluate and reflect: assess quality and credibility; reflect on content and form; detect and handle conflict
Task management:
- Set goals and plans
- Monitor, regulate
PISA defines reading fluency as the ease and efficiency of reading texts for understanding. Reading fluency is an individual’s ability
to read words and text accurately and automatically and to phrase and process these words and texts in order to comprehend the
overall meaning of the text (Kuhn and Stahl, 2003[21]). Students who read fluently activate higher-level comprehension processes
that are associated with higher reading comprehension performance than those who have a weaker reading fluency (Cain and
Oakhill, 2004[22]). PISA 2018 evaluated reading fluency by presenting students with a variety of sentences, one at a time, and
asking them whether they made sense. These sentences were all relatively simple, and it was unambiguous whether they made
sense or not. Examples of these sentences include: “Six birds flew over the trees”; “The window sang the song loudly”; “The man drove the car to the store”.
Daily readers most often use texts for purposes that require the location of specific information (White, Chen and Forsyth, 2010[23]).
This is particularly the case for readers when using complex digital information such as search engines and websites. Readers
must be able to judge the relevance, accuracy and credibility of passages, modulate their reading speed and skim through the
text until they find the relevant information. The 2018 framework defines two processes related to ‘locating information’ (known
in previous frameworks as “accessing and retrieving”), and they can be found at all levels of difficulty:
a. Accessing and retrieving information within a piece of text: readers must be able to scan a single piece of text in order
to retrieve target information composed of a few words, phrases or numerical values. It often requires comprehending the
text at the phrase level through literal or close to the literal matching of elements in the question and the text. Other times, it
requires finding several pieces of information and searching for embedded information.
b. Searching for and selecting relevant text: readers must be able to select information when faced with several pieces of text
where the amount of available information often vastly exceeds the amount readers can process. In PISA 2018, text search
and selection tasks involve the use of text descriptors such as headers, source information (e.g. author, medium, date), and
embedded or explicit links such as search engine result pages.
A large number of reading activities involve the parsing and integration of extended passages of text in order to form an
understanding or comprehension of the meaning conveyed in the passage. To understand a text the reader needs to construct
a mental representation of the literal meaning of the text and integrate that construction with one’s prior knowledge through
inference processes (McNamara and Magliano, 2009[24]). The 2018 framework defines two processes related to ‘understanding’
(known in previous frameworks as “integrating and interpreting”):
a. Acquiring a representation of the literal meaning of a text: readers must be able to comprehend sentences or short
passages, often involving a direct or paraphrased match between the question and target information within the passage.
b. Constructing an integrated text: readers must be able to generate various types of inferences, ranging from simple
connecting inferences (such as the resolution of anaphora) to more complex coherence relationships (e.g. spatial, temporal,
causal or claim-argument links). Inferences may link different portions of a text, or information located in different
texts, which may in turn contain conflicting information.
24 © OECD 2021 » PISA 21st-Century Readers: Developing literacy skills in a digital world
Digital literacy in the 21st century
Competent readers should be able to reason beyond the literal or inferred meaning of the text and reflect on the content and
form of the text. For example, to distinguish between facts and opinions, readers must be able to assess the quality and validity
of the information critically. These are defined in the PISA 2018 assessment framework as ‘evaluating and reflecting’, and they
are composed of the following processes:
a. Assessing quality and credibility: readers must be able to evaluate the quality and credibility of the information in a piece
of text, often involving whether the information and the source are valid. Sometimes readers may need to look at who wrote
it, when, and for what purposes to assess the quality and credibility of the text adequately.
b. Reflecting on content and form: readers must be able to reflect on the quality and style of the writing, often involving
evaluating the author’s purposes and viewpoints. In order to do so, they may need to use their prior knowledge and experience
to be able to compare different perspectives.
c. Detecting and handling conflict: readers need to be aware and be able to deal with conflicts when facing multiple pieces of
texts with contradictory information. Handling conflict typically requires readers to assign discrepant claims to their respective
sources and to assess the soundness of the claims and/or the credibility of the sources.
Reading involves being able to adequately respond to the demands of a particular situation, set goals and strategies, monitor
progress and self-regulate those goals and strategies (Winne and Hadwin, 1998[25]). In order to do so, readers need to use
metacognitive strategies that enable the dynamic updating of goals throughout the activity. The PISA 2018 assessment
framework highlights the importance of task management processes, such as setting goals for reading (reading for pleasure,
reading for information or reading for learning) or monitoring one’s comprehension. In the context of PISA, it is not possible
to have a full evaluation of students’ task management processes. Instead, PISA focuses on those goals that readers form upon
receiving external prompts to accomplish a given task such as in school assignments. Chapter 3 of this report provides a detailed
examination of these processes using process data such as response time, the number of actions and time spent to the first
action.
Texts
The 2018 framework defines four dimensions of texts: source (single, multiple); organisational and navigational structure (static,
dynamic); format (continuous, non-continuous, mixed); and type (description, narration, exposition, argumentation, instruction,
interaction, transaction). The design of test materials that vary along these four dimensions ensures broad coverage of the
domain and representation of traditional as well as emerging reading practices.
a. Source: a source is a unit of text. Like in most traditional printed books, single-source texts are defined by having a definite
author (or group of authors), time of writing or publication date, a reference title or number and are usually presented to the
reader in isolation from other texts. Multiple-source items are defined by having different authors, or by being published at
different times, or by bearing different titles or reference numbers.
b. Organisational and navigational structure: from a single mobile phone screen to multiple windows of information, the organisation
and navigational structure of digital environments vary dramatically. The PISA 2018 framework distinguishes static texts, with a
simple organisation and low density of navigational tools (e.g. scroll bars and tabs), from dynamic texts that feature a more
complex, non-linear organisation and a higher density of navigational devices (e.g. table of contents, hyperlinks to switch
between segments of text or interactive tools such as in social networks).
c. Format: continuous texts are typically composed of sentences organised into paragraphs. These may fit into even larger
structures such as sections, chapters and books. Non-continuous texts are most frequently organised in a matrix format,
based on combinations of lists. Examples of non-continuous text objects are tables, graphs, diagrams and schedules.
Mixed texts such as articles in magazines and reports may combine both continuous and non-continuous formats.
d. Type: The type of text refers to why the text was written and how it is organised. Examples of types of texts are descriptions,
narration, exposition, argumentation, instruction or transaction.
While text format and type remained unchanged from PISA 2009, PISA 2018 computer-based assessment of reading presented
all texts on screen. Therefore, the dimension of medium (print or electronic format) that appeared in previous frameworks is
no longer relevant. On the other hand, the source dimension is related to the previous classification of environment (the text is
composed by an author or group of authors alone, or in a collaborative manner with potential contributions from the reader).
Scenarios
PISA questions or tasks are arranged in units of one or multiple texts. In most traditional PISA reading units, students are
presented with a series of unrelated passages on a range of general topics. Students answer a set of questions on each passage
and then move on to the next unrelated passage. Multiple-text units have been present in PISA since 2000 (see, e.g. the released
unit Lake Chad). However, this category was dramatically expanded in PISA 2018, and special attention was
placed on these units so that students would feel engaged in a scenario rather than a mere series of questions. PISA 2018
strengthens the assessment scenarios in which students are provided with an overarching purpose for reading a collection
of thematically related texts to complete a higher-level task (e.g. responding to some larger integrative question or writing a
recommendation based on a set of texts), along with traditional standalone PISA reading units.
The use of scenarios with thematically related texts was introduced in PISA 2018 to help students better engage in the task.
At the same time, it also allows the assessment of emergent aspects of reading, such as students’ ability to search for information,
evaluate different sources, read for comprehension and integrate across texts. Each scenario is made up of one or more tasks.
In each task, students may be asked questions about the texts contained therein ranging from traditional comprehension items
(locating information, generating an inference) to more complex tasks such as the synthesis and integration of multiple texts,
evaluating web search results or corroborating information across multiple texts. Each task is designed to assess one or more
of the processes identified in the framework. Table 1.1 presents a breakdown of the PISA 2018 reading literacy assessment by
process assessed.
Table 1.1 Approximate distribution of tasks by targeted process and text source

2015 Framework                       | 2018 Framework: Single text      | 2018 Framework: Multiple text
Accessing and retrieving (25%)       | Scanning and locating (15%)      | Searching for and selecting relevant text (10%)
Integrating and interpreting (50%)   | Literal comprehension (15%)      | Multiple-text inferential comprehension (15%)
Reflecting and evaluating (25%)      | Inferential comprehension (15%)  | Corroborating/handling conflict (10%)
In the PISA 2018 reading literacy assessment, a student might encounter an initial task in which he or she must locate a particular
document based on a search result. In the second task, the student might have to answer a question about information that is
stated explicitly in the text. Finally, in the third task, the student might need to determine if the author’s point of view in the first
text is the same as in a second text. In each case, these tasks are scaffolded so that if a student fails to find the correct document
in the first task, he or she is then provided with the correct document in order to complete the second task.
Response formats
The form in which evidence of student ability is collected – the response format – varies depending on the kinds of evidence
that are being collected, and according to the pragmatic constraints of a large-scale assessment. To ensure proper coverage
of the ability ranges, to ensure fairness given the inter-country and gender differences observed and to ensure a valid
assessment of the reflecting and evaluating process, both multiple choice and open constructed-response items continue to
be used in PISA reading literacy assessments regardless of the change in delivery mode. About one-third of the 245 items
solicited open constructed responses that were marked by humans; the rest were automatically marked selected-response questions
(e.g. multiple-choice, true/false, yes/no).
Digital reading has some benefits compared to print reading, but not without nuances. For example, technology provides
unique environments that interact in response to the actions of the learner and facilitate communication with other people
(Committee on How People Learn II: The Science and Practice of Learning et al., 2018[26]). Electronic books are typically
more cost-effective than paper books (Bando et al., 2016[27]), which facilitates the democratisation of knowledge.
And, the latest advances in screen technology are reducing eyestrain from reading digital texts even though books still
retain a comparative advantage here (Rosenfield et al., 2015[28]).
On the other hand, the most comprehensive meta-analyses have shown that a) reading from paper yielded better reading
comprehension compared to digital reading, especially when there is a time constraint (Delgado et al., 2018[29]), and
b) reading from paper is more efficient than reading from screens considering that there is better performance reading
from paper with similar time investments (Kong, Seo and Zhai, 2018[30]). Other studies have shown that while reading
comprehension is similar between print books and e-books, e-book readers were not as efficient as print readers in
ordering events and locating them in the timeline in which they occurred (Mangen, Olivier and Velay, 2019[31]).
A recent meta-analysis synthesised experimental studies comparing the reading performance of students who read the
same texts on screen and on paper. The results showed that reading from screens is negatively associated with reading
performance (g³ = -0.25). However, this may have been limited to expository texts⁴ (g = -0.32), as there was no association with
narrative texts (g = -0.04). At the same time, reading texts from screens did not show differences in reading time compared
to reading from paper (g = 0.08) (Clinton, 2019[32]). The same study showed that reading texts from screens led to less calibrated
and more overconfident predictions of performance than reading from paper (g = 0.20). In line with these findings on perceived
competence, readers have also shown weaker performance and weaker metacognitive awareness of their performance on
assessments based on reading from screens compared to paper (Ackerman and Lauterman, 2012[33]).
According to PISA data, strong readers perform well both in print and digital reading. The relationship between
socio-economic status and performance on computer-based assessments mostly reflects differences observed in
performance on paper-based assessments (OECD, 2015[34]). This implies that differences in reading performance related
to socio-economic status in the computer-based assessment of reading stem largely from differences in reading
proficiency and only marginally from differences in navigation skills. The PISA 2018 reading framework aimed to capture every
kind of contemporary reading, regardless of whether the text is printed or digital. This does not mean that traditional
aspects of reading were cast aside; rather, they were extended to cover new aspects of reading. In comparison to frameworks in previous
cycles, the PISA 2018 framework was expanded to better account for processes that are typically involved in digital reading,
such as searching and multiple source comprehension. Despite the changes in the assessment framework and the results
mentioned above, the PISA 2018 reading assessment remains comparable to previous cycles. Analysis in PISA 2018 Results
(Volume I) - What Students Know and Can Do (OECD, 2019[15]) showed changes in performance between 2015 and 2018
(e.g. see Box I.8.1 from Volume I). Yet, a few countries/economies might have been affected more than others. For example,
countries and economies whose students were relatively weaker in reading multiple-source texts might be expected to
have more negative trends between 2015 and 2018 than countries whose students were relatively stronger in reading such
texts. Nevertheless, the effects on trends in country mean scores were not widespread (see Figure I.8.2 from Volume I).
The computerised administration of PISA 2018 also provides a better understanding of how students navigate through multiple
text sources to search for and locate relevant information as well as how students evaluate the credibility of information sources.
Navigation is a critical component of digital reading, as readers “construct” their text through navigation. Thus, navigational
choices directly influence which text or portion of the text is eventually processed (OECD, 2011[35]). Chapter 3 of this report
provides item-level analyses using process (or log) data to illustrate how students navigate through the emerging aspects of
reading, such as multiple-source items.
Assessing students’ reading motivation, reading practices and awareness of reading strategies
Interest in and enjoyment of reading, together with intrinsic motivation, fuel students’ reading engagement. Engagement acts as a
potentially powerful lever for enhancing reading proficiency and narrowing gaps between groups of students. PISA data consistently show
that engagement in reading is strongly correlated with reading performance and mediates the effects of gender and socio-economic status
(measured by the PISA index of economic, social and cultural status [ESCS]; OECD, 2010[36]; OECD, 2002[37]). That is, students’
engagement in reading can mediate the effect of socio-economic status on reading performance. In PISA 2018, two other
related motivational constructs were included in the assessment framework: self-efficacy, an individual’s perceived capacity of
performing specific tasks, and self-concept, an individual’s own perceived abilities in a more general domain. Although previous
cycles measured self-efficacy and self-concept in relation to mathematics and science, PISA measured these constructs in relation
to reading for the first time in 2018.
PISA has traditionally evaluated reading practices through questionnaires by measuring how frequently students read different
types of texts in various media, including in digital environments. In PISA 2018, those practices were extended to capture the
emergent aspects of reading in digital environments. In addition to including emergent practices such as the use of e-books,
online search, or social media, PISA 2018 continued expanding the assessment of students’ awareness of reading strategies.
PISA defines metacognition as an individual’s ability to think about and control his or her reading and comprehension strategies.
While some of the dimensions of metacognition (i.e. knowledge of strategies for summarising, understanding and remembering)
remain unchanged from previous cycles, PISA 2018 also assessed students’ awareness of digital reading strategies through a new
scenario about the credibility of information.
Chapter 4 of this report focuses on students’ openness to reading with digital devices, as well as their reading practices, motivation
and attitudes towards reading. Chapter 5 focuses on the relationship between students’ reading attitudes and strategies,
socio-economic status, gender, and reading performance.
Teaching practices and classroom support for reading growth and engagement
Teachers’ scaffolding and support for students’ autonomy, competence and ownership of their tasks improve students’ reading proficiency,
awareness of strategies, and engagement in reading (Guthrie, Klauda and Ho, 2013[16]; Christenson, Reschly and Wylie, 2012[17]).
In PISA 2018, students, teachers and school principals provided information on teaching practices and classroom support to
enhance reading skills. Some of these practices included teachers’ stimulation of reading engagement, teaching practices about
the type and length of text, and school practices for using digital devices. PISA 2018 also assessed students’ opportunities to
learn reading strategies that best support the development of students’ reading skills in digital environments. This dimension is
particularly relevant in contexts where the digital divide between students is significantly large.
Chapter 2 provides a snapshot of the magnitude of the digital divide between countries and its association with reading
performance in PISA 2018. Chapter 6 focuses on teaching practices towards using digital technology for education and stimulating
digital learning environments.
1. The global competence assessment was conducted in 27 countries and economies, while the global competence module was included in
questionnaires distributed in 66 countries and economies.
2. The PISA 2018 paper-based instruments were based on the PISA 2009 reading framework and the PISA 2006 science framework. Only the
mathematics framework was common to both the paper- and computer-based tests in 2018. The paper-based form was used in nine countries:
Argentina, Jordan, Lebanon, the Republic of Moldova, the Republic of North Macedonia, Romania, Saudi Arabia, Ukraine and Viet Nam.
3. Hedges’ g is an unbiased measure of effect size corrected by sample size. As a general rule of thumb, an effect size lower than 0.20 is a small
effect, 0.5 is a medium effect, and 0.80 is a large effect.
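The description of Hedges’ g in this note can be made concrete. A standard formulation (stated here for reference; it is not given in the report itself) for two groups with means x̄₁ and x̄₂, standard deviations s₁ and s₂, and sizes n₁ and n₂ is:

```latex
% Pooled standard deviation across the two groups
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}

% Hedges' g applies a small-sample correction factor J to Cohen's d
g = \underbrace{\left(1 - \frac{3}{4(n_1 + n_2) - 9}\right)}_{J}
    \cdot \frac{\bar{x}_1 - \bar{x}_2}{s_p}
```

The factor J corrects the small upward bias of Cohen’s d in small samples; for the sample sizes typical of the studies synthesised in the meta-analyses cited above, g and d are nearly identical.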
4. In education, expository texts typically refer to fact-based educational reading materials while narrative texts include fiction and nonfiction
reading materials such as novels. Narrative texts are generally easier to read than expository texts (Graesser and McNamara, 2011[38]).
In a broader sense, expository texts encompass any text that provides a description or an explanation about something.
5. Argentina, Jordan, Lebanon, the Republic of Moldova, the Republic of North Macedonia, Romania, Saudi Arabia, Ukraine and Viet Nam assessed
their students’ knowledge and skills in PISA 2018 using paper-based instruments and no new items were developed for the paper-based test.
References
Ackerman, R. and T. Lauterman (2012), “Taking reading comprehension exams on screen or on paper? A metacognitive analysis of [33]
learning texts under time pressure”, Computers in Human Behavior, Vol. 28/5, pp. 1816-1828, http://dx.doi.org/10.1016/j.chb.2012.04.023.
Allcott, H., M. Gentzkow and C. Yu (2019), “Trends in the diffusion of misinformation on social media”, Research & Politics, Vol. 6/2, [8]
p. 1-8, http://dx.doi.org/10.1177/2053168019848554.
Bando, R. et al. (2016), “Books or Laptops? The Cost-Effectiveness of Shifting from Printed to Digital Delivery of Educational Content”, [27]
NBER Working Paper, Vol. No. w22928, https://ssrn.com/abstract=2883965.
Barzilai, S., A. Zohar and S. Mor-Hagani (2018), “Promoting Integration of Multiple Texts: a Review of Instructional Approaches and [11]
Practices”, Educational Psychology Review, Vol. 30/3, pp. 973-999, http://dx.doi.org/10.1007/s10648-018-9436-8.
Britt, M. and J. Rouet (2012), “Learning with Multiple Documents”, in Kirby, J. and M. Lawson (eds.), Enhancing the Quality of Learning, [6]
Cambridge University Press, Cambridge, http://dx.doi.org/10.1017/cbo9781139048224.017.
Britt, M., J. Rouet and A. Durik (2017), Literacy beyond Text Comprehension, Routledge, http://dx.doi.org/10.4324/9781315682860. [18]
Cain, K. and J. Oakhill (2004), “Reading Comprehension Difficulties”, in Handbook of Children’s Literacy, Springer Netherlands, Dordrecht, [22]
http://dx.doi.org/10.1007/978-94-017-1731-1_18.
Christenson, S., A. Reschly and C. Wylie (eds.) (2012), Handbook of Research on Student Engagement, Springer US, Boston, MA, [17]
http://dx.doi.org/10.1007/978-1-4614-2018-7.
Clinton, V. (2019), “Reading from paper compared to screens: A systematic review and meta‐analysis”, Journal of Research in Reading, [32]
Vol. 42/2, pp. 288-325, http://dx.doi.org/10.1111/1467-9817.12269.
Committee on How People Learn II: The Science and Practice of Learning et al. (2018), How People Learn II, National Academies Press, [26]
Washington, D.C., http://dx.doi.org/10.17226/24783.
Delgado, P. et al. (2018), “Don’t throw away your printed books: A meta-analysis on the effects of reading media on reading [29]
comprehension”, Educational Research Review, Vol. 25, pp. 23-38, http://dx.doi.org/10.1016/j.edurev.2018.09.003.
Graesser, A. and D. McNamara (2011), “Computational Analyses of Multilevel Discourse Comprehension”, Topics in Cognitive Science, Vol. [38]
3/2, pp. 371-398, http://dx.doi.org/10.1111/j.1756-8765.2010.01081.x.
Guthrie, J., S. Klauda and A. Ho (2013), “Modeling the Relationships Among Reading Instruction, Motivation, Engagement, and [16]
Achievement for Adolescents”, Reading Research Quarterly, Vol. 48/1, pp. 9-26, http://dx.doi.org/10.1002/rrq.035.
International Telecommunication Union (2019), Measuring digital development. Facts and figures 2019, [1]
https://www.itu.int/en/ITU-D/Statistics/Documents/facts/FactsFigures2019.pdf.
Kong, Y., Y. Seo and L. Zhai (2018), “Comparison of reading performance on screen and on paper: A meta-analysis”, Computers & [30]
Education, Vol. 123, pp. 138-149, http://dx.doi.org/10.1016/j.compedu.2018.05.005.
Kuhn, M. and S. Stahl (2003), “Fluency: A review of developmental and remedial practices.”, Journal of Educational Psychology, Vol. 95/1, [21]
pp. 3-21, http://dx.doi.org/10.1037/0022-0663.95.1.3.
List, A. and P. Alexander (2018), “Toward an Integrated Framework of Multiple Text Use”, Educational Psychologist, Vol. 54/1, pp. 20-39, [10]
http://dx.doi.org/10.1080/00461520.2018.1505514.
Magliano, J. et al. (2017), “The Modern Reader”, in The Routledge Handbook of Discourse Processes, Routledge, [12]
http://dx.doi.org/10.4324/9781315687384-18.
Mangen, A., G. Olivier and J. Velay (2019), “Comparing Comprehension of a Long Text Read in Print Book and on Kindle: Where in the [31]
Text and When in the Story?”, Frontiers in Psychology, Vol. 10, http://dx.doi.org/10.3389/fpsyg.2019.00038.
McNamara, D. and J. Magliano (2009), “Chapter 9 Toward a Comprehensive Model of Comprehension”, in The Psychology of Learning and [24]
Motivation, Psychology of Learning and Motivation, Elsevier, http://dx.doi.org/10.1016/s0079-7421(09)51009-2.
OECD (2020), Curriculum Overload: A Way Forward, OECD Publishing, Paris, https://dx.doi.org/10.1787/3081ceca-en. [4]
OECD (2020), PISA 2018 Results (Volume VI): Are Students Ready to Thrive in an Interconnected World?, PISA, OECD Publishing, Paris, [3]
https://dx.doi.org/10.1787/d5f68679-en.
OECD (2020), What Students Learn Matters: Towards a 21st Century Curriculum, OECD Publishing, Paris, [2]
https://dx.doi.org/10.1787/d86d4d9a-en.
OECD (2019), PISA 2018 Assessment and Analytical Framework, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b25efab8-en. [20]
OECD (2019), “PISA 2018 Reading Framework”, in PISA 2018 Assessment and Analytical Framework, OECD Publishing, Paris, [5]
https://dx.doi.org/10.1787/5c07e4f1-en.
OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, [15]
https://dx.doi.org/10.1787/5f07c754-en.
OECD (2015), Students, Computers and Learning: Making the Connection, PISA, OECD Publishing, Paris, [34]
https://dx.doi.org/10.1787/9789264239555-en.
OECD (2011), PISA 2009 Results: Students On Line: Digital Technologies and Performance (Volume VI), PISA, OECD Publishing, Paris, [35]
https://dx.doi.org/10.1787/9789264112995-en.
OECD (2010), PISA 2009 Results: Learning to Learn: Student Engagement, Strategies and Practices (Volume III), PISA, OECD Publishing, Paris, [36]
https://dx.doi.org/10.1787/9789264083943-en.
OECD (2002), Reading for Change: Performance and Engagement across Countries: Results from PISA 2000, PISA, OECD Publishing, Paris, [37]
https://dx.doi.org/10.1787/9789264099289-en.
Rosenfield, M. et al. (2015), “Cognitive demand, digital screens and blink rate”, Computers in Human Behavior, Vol. 51, pp. 403-406, [28]
http://dx.doi.org/10.1016/j.chb.2015.04.073.
Rouet, J., M. Britt and A. Potocki (2019), “Multiple-Text Comprehension”, in The Cambridge Handbook of Cognition and Education, [9]
Cambridge University Press, http://dx.doi.org/10.1017/9781108235631.015.
Salmerón, L. et al. (2018), “Chapter 4. Comprehension processes in digital reading”, in Studies in Written Language and Literacy, Learning [14]
to Read in a Digital World, John Benjamins Publishing Company, Amsterdam, http://dx.doi.org/10.1075/swll.17.04sal.
van den Broek, P. (2011), “When a reader meets a text: The role of standards of coherence in reading comprehension”, in McCrudden, [19]
M., J. (ed.), Text relevance and learning from text, Information Age Publishing,
https://www.amazon.com/Text-Relevance-Learning-Information-Publishing-ebook/dp/B01FNA2Y7I.
Van Meter, P. et al. (eds.) (2020), Handbook of Learning from Multiple Representations and Perspectives, Routledge, New York, [13]
NY, http://dx.doi.org/10.4324/9780429443961.
Weisberg, M. (2011), “Student Attitudes and Behaviors Towards Digital Textbooks”, Publishing Research Quarterly, Vol. 27/2, pp. 188-196, [7]
http://dx.doi.org/10.1007/s12109-011-9217-4.
White, S., J. Chen and B. Forsyth (2010), “Reading-Related Literacy Activities of American Adults: Time Spent, Task Types, and Cognitive [23]
Skills Used”, Journal of Literacy Research, Vol. 42/3, pp. 276-307, http://dx.doi.org/10.1080/1086296x.2010.503552.
Winne, P. and A. Hadwin (1998), “Studying as self-regulated learning.”, in Metacognition in educational theory and practice., Lawrence [25]
Erlbaum Associates Publishers, Mahwah, NJ, US.
2 Reading performance and the digital divide in PISA 2018
– On average across OECD countries, 54% of students reported being trained at school on how to recognise whether
information is biased. Among OECD countries, more than 70% of students reported receiving this training in
Australia, Canada, Denmark, and the United States. However, less than 45% of students reported receiving this training in
Israel, Latvia, the Slovak Republic, Slovenia, and Switzerland.
– PISA 2018 shows that, after accounting for per capita GDP, two factors are associated with the estimated percentage of
correct responses to the item on distinguishing facts from opinions in the PISA reading assessment: the
proportion of students who reported being taught how to detect biased information in school (R=0.60, OECD average)
and students’ digital access at home (R=0.42, all countries/economies).
Population coverage differs across countries (see Chapter 3 of PISA 2018 Results (Volume I) - What Students Know and Can
Do (OECD, 2019[2])). While the sampling standards ensured that PISA results are representative of the target population in
all adjudicated countries/economies, they cannot be readily generalised to the entire population of 15-year-olds in countries
where many young people of that age are not enrolled in lower or upper secondary school. Coverage Index 3 included in this
report indicates the proportion of 15-year-olds who were covered by the PISA sample (see Figure I.3.1 of Volume I (OECD,
2019[3])).
Differences in performance between students within the same country are, in general, larger than between-country differences in
performance (OECD, 2019[2]). For example, in every country and economy, the performance gap between the highest-scoring
5% of students and the lowest-scoring 5% of students in reading is larger than the difference in mean performance between
the highest-performing country and the lowest-performing country.
Countries with higher national incomes tend to score higher in PISA, up to a certain threshold (see Figure I.4.3 of Volume I
(OECD, 2019[2])). Approximately 44% of the variation in countries’ and economies’ mean reading performance is related to per
capita GDP. Higher-income countries are often able to spend more on education, while a lower national income
constrains other countries. This is particularly relevant when interpreting the performance of middle-income countries
such as Colombia, the Republic of Moldova, Morocco and the Philippines. At the same time, more spending on education
does not always mean better performance. Countries with higher spending per student tend to score higher in PISA, but after
a certain point spending is much less related to performance. For example, Estonia, which spends less than the OECD
average per student, was one of the top-performing OECD countries in reading, mathematics, and science in PISA 2018.
The strength of the relationship between socio-economic background and students' performance varies across education systems (OECD, 2019[4]). Socio-economically advantaged students1 usually perform better in PISA than disadvantaged students, but the gap in reading performance related to socio-economic status varies considerably across countries. In Baku (Azerbaijan), Kazakhstan and Macao (China), the percentage of variance in reading performance explained by students' socio-economic status was below 5%. In contrast, in Peru and Belarus it was at least 20%. Not all education systems succeed in achieving both excellence and equity, and frequently they sacrifice one for the other. However, PISA consistently shows that the two are not mutually exclusive. In 11 of the 25 countries and economies that scored above the OECD average in reading in PISA 2018, the strength of the relationship between student performance and socio-economic status2 was significantly below the OECD average. In Australia, Canada, Denmark, Estonia, Finland, Hong Kong (China), Japan, Korea, Macao (China), Norway and the United Kingdom, for example, average performance was higher than the OECD average while the relationship between socio-economic status and reading performance was weaker than the OECD average.
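The joint excellence-and-equity criterion described above can be sketched as a simple filter over system-level summaries. The system names, scores and thresholds below are invented for illustration, and the report's own classification relies on tests of statistical significance rather than raw cut-offs:

```python
# Hypothetical summary rows: (system, mean reading score,
# % of reading variance explained by socio-economic status).
systems = [
    ("System A", 520, 8.0),
    ("System B", 530, 17.0),
    ("System C", 480, 6.0),
]

OECD_MEAN_SCORE = 487       # PISA 2018 OECD average in reading
OECD_MEAN_ESCS_VAR = 12.0   # illustrative OECD-average strength of the SES link

# "Excellence and equity": above-average performance combined with a
# weaker-than-average relationship between SES and performance.
both = [name for name, score, escs_var in systems
        if score > OECD_MEAN_SCORE and escs_var < OECD_MEAN_ESCS_VAR]
print(both)
```

Only System A passes both conditions here: System B is high-performing but has a strong socio-economic gradient, and System C is equitable but below-average in performance.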
Whether students commonly speak the language of instruction at home is associated with how well they read in that language (OECD, 2019[4]). In many countries, students with an immigrant background who speak the language of instruction at home scored higher in reading than those who did not. Therefore, not speaking the language of instruction
represents an additional barrier to attaining high proficiency in reading. This is particularly relevant in countries such as
Brunei Darussalam, Lebanon, Luxembourg, Malta, Morocco and the Philippines, where the proportion of students who
speak a language other than the language of instruction at home is over 80%.
Performance in PISA is the result of a cumulative process. In addition to the quality of lower and upper secondary education, PISA results also reflect the quality of learning in earlier stages of education, and the cognitive, emotional and social competences students had acquired before they even entered school (OECD, 2020[5]). PISA results should also be interpreted in light of differences in how education is organised across grade levels, particularly in school systems where students progress through different types of educational institutions at the pre-primary, primary, lower secondary and upper secondary levels. In most cases, 15-year-old students have been in their current school for only two to three years. This means that much of their academic development took place earlier, in other schools, which may have little or no connection with the school in which they were enrolled when they sat the PISA test. Last but not least, school systems' stratification policies, such as grade repetition, mean that the students sampled by PISA may come from different grades (OECD, 2020[6]).
Among all participating countries and economies in PISA 2018, Beijing, Shanghai, Jiangsu and Zhejiang (China) (hereafter
“B-S-J-Z [China]”) (555 points) and Singapore (549 points) were the top performers in reading. Among OECD countries,
Estonia (523 points), Canada (520 points), Finland (520 points) and Ireland (518 points) were the top performers in reading. The mean reading performance of Korea (514 points) was similar to that of top performers such as Canada, Finland and Ireland, but significantly lower than that of Estonia. The mean reading performance of Poland was similar to that of Ireland but significantly lower than that of Estonia, Canada and Finland (Table 2.1).
Table 2.1 shows each country's/economy's mean score, and indicates for which pairs of countries/economies the differences
between the means are statistically significant. Small differences that are not statistically significant should not be overly
emphasised. For each country/economy shown in the middle column, the countries/economies whose mean scores are not
statistically significantly different are listed in the right column. For example, B-S-J-Z (China) and Singapore scored significantly
higher in reading than all other countries/economies that participated in PISA 2018. However, the mean reading performance of
B-S-J-Z (China) was not statistically significantly different from that of Singapore.
In Table 2.1, countries and economies are divided into three broad groups: those whose mean scores are not statistically different from the OECD mean (white), those whose mean scores are above the OECD mean (blue), and those whose mean scores are below the OECD mean (grey)3.
The gap in reading performance between the highest- and lowest-performing OECD countries was 111 score points. In contrast,
this gap was 216 score points between all education systems that took part in PISA 2018. This means that OECD countries
represent a relatively homogeneous group compared to all participating countries. Nevertheless, differences within countries
are typically larger than between countries. For example, the difference between the 95th and 5th percentile of performance was
327 points on average across OECD countries and 312 points on average in all countries and economies (Table B.2.1a).
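The within-country gap is simply the distance between the 95th and 5th percentiles of the student score distribution. A small sketch on simulated scores (PISA scales have a mean near 500 and a standard deviation near 100, but these draws are illustrative, not actual student data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated student scores for one education system on a PISA-like scale.
scores = rng.normal(500, 100, size=5000)

p5, p95 = np.percentile(scores, [5, 95])
gap = p95 - p5
# For a normal distribution with SD 100, this gap is roughly
# 2 * 1.645 * 100, i.e. about 329 score points, close to the
# OECD-average figure reported above.
print(f"95th-5th percentile gap: {gap:.0f} score points")
```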
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 33
2 Reading performance and the digital divide in PISA 2018
Table 2.1 [1/2] Comparing countries' and economies' performance in reading
Mean score Comparison country/economy Countries and economies whose mean score is not statistically significantly different from the comparison country's/economy's score
555 B-S-J-Z (China) Singapore
549 Singapore B-S-J-Z (China)
525 Macao (China) Hong Kong (China)1, Estonia, Finland
524 Hong Kong (China)1 Macao (China), Estonia, Canada, Finland, Ireland
523 Estonia Macao (China), Hong Kong (China)1, Canada, Finland, Ireland
520 Canada Hong Kong (China)1, Estonia, Finland, Ireland, Korea
520 Finland Macao (China), Hong Kong (China)1, Estonia, Canada, Ireland, Korea
518 Ireland Hong Kong (China)1, Estonia, Canada, Finland, Korea, Poland
514 Korea Canada, Finland, Ireland, Poland, Sweden, United States1
512 Poland Ireland, Korea, Sweden, New Zealand, United States1
506 Sweden Korea, Poland, New Zealand, United States1, United Kingdom, Japan, Australia, Chinese Taipei, Denmark, Norway, Germany
506 New Zealand Poland, Sweden, United States1, United Kingdom, Japan, Australia, Chinese Taipei, Denmark
505 United States1 Korea, Poland, Sweden, New Zealand, United Kingdom, Japan, Australia, Chinese Taipei, Denmark, Norway, Germany
504 United Kingdom Sweden, New Zealand, United States1, Japan, Australia, Chinese Taipei, Denmark, Norway, Germany
504 Japan Sweden, New Zealand, United States1, United Kingdom, Australia, Chinese Taipei, Denmark, Norway, Germany
503 Australia Sweden, New Zealand, United States1, United Kingdom, Japan, Chinese Taipei, Denmark, Norway, Germany
503 Chinese Taipei Sweden, New Zealand, United States1, United Kingdom, Japan, Australia, Denmark, Norway, Germany
501 Denmark Sweden, New Zealand, United States1, United Kingdom, Japan, Australia, Chinese Taipei, Norway, Germany
499 Norway Sweden, United States1, United Kingdom, Japan, Australia, Chinese Taipei, Denmark, Germany, Slovenia
498 Germany Sweden, United States1, United Kingdom, Japan, Australia, Chinese Taipei, Denmark, Norway, Slovenia, Belgium, France, Portugal1
495 Slovenia Norway, Germany, Belgium, France, Portugal1, Czech Republic
493 Belgium Germany, Slovenia, France, Portugal1, Czech Republic
493 France Germany, Slovenia, Belgium, Portugal1, Czech Republic
492 Portugal1 Germany, Slovenia, Belgium, France, Czech Republic, Netherlands1
490 Czech Republic Slovenia, Belgium, France, Portugal1, Netherlands1, Austria, Switzerland
485 Netherlands1 Portugal1, Czech Republic, Austria, Switzerland, Croatia, Latvia, Russia
484 Austria Czech Republic, Netherlands1, Switzerland, Croatia, Latvia, Russia
484 Switzerland Czech Republic, Netherlands1, Austria, Croatia, Latvia, Russia, Italy
479 Croatia Netherlands1, Austria, Switzerland, Latvia, Russia, Spain, Italy, Hungary, Lithuania, Iceland, Belarus, Israel
479 Latvia Netherlands1, Austria, Switzerland, Croatia, Russia, Spain, Italy, Hungary, Lithuania, Belarus
479 Russia Netherlands1, Austria, Switzerland, Croatia, Latvia, Spain, Italy, Hungary, Lithuania, Iceland, Belarus, Israel
477 Spain* Croatia, Latvia, Russia, Italy, Hungary, Lithuania, Iceland, Belarus, Israel
476 Italy Switzerland, Croatia, Latvia, Russia, Spain, Hungary, Lithuania, Iceland, Belarus, Israel
476 Hungary Croatia, Latvia, Russia, Spain, Italy, Lithuania, Iceland, Belarus, Israel
476 Lithuania Croatia, Latvia, Russia, Spain, Italy, Hungary, Iceland, Belarus, Israel
474 Iceland Croatia, Russia, Spain, Italy, Hungary, Lithuania, Belarus, Israel, Luxembourg
474 Belarus Croatia, Latvia, Russia, Spain, Italy, Hungary, Lithuania, Iceland, Israel, Luxembourg, Ukraine
470 Israel Croatia, Russia, Spain, Italy, Hungary, Lithuania, Iceland, Belarus, Luxembourg, Ukraine, Turkey
470 Luxembourg Iceland, Belarus, Israel, Ukraine, Turkey
*For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
1. Data did not meet the PISA technical standards but were accepted as largely comparable (see PISA 2018 Results (Volume I): What Students Know and Can Do,
Annexes A2 and A4).
Source: OECD, PISA 2018 Database.
https://doi.org/10.1787/888934239344
Table 2.1 [2/2] Comparing countries' and economies' performance in reading
Mean score Comparison country/economy Countries and economies whose mean score is not statistically significantly different from the comparison country's/economy's score
466 Ukraine Belarus, Israel, Luxembourg, Turkey, Slovak Republic, Greece
458 Slovak Republic Ukraine, Greece, Chile
457 Greece Ukraine, Turkey, Slovak Republic, Chile
452 Chile Slovak Republic, Greece, Malta
448 Malta Chile
439 Serbia United Arab Emirates, Romania
432 United Arab Emirates Serbia, Romania, Uruguay, Costa Rica
428 Romania Serbia, United Arab Emirates, Uruguay, Costa Rica, Cyprus, Moldova, Montenegro, Mexico, Bulgaria, Jordan
427 Uruguay United Arab Emirates, Romania, Costa Rica, Cyprus, Moldova, Mexico, Bulgaria
426 Costa Rica United Arab Emirates, Romania, Uruguay, Cyprus, Moldova, Montenegro, Mexico, Bulgaria, Jordan
424 Cyprus Romania, Uruguay, Costa Rica, Moldova, Montenegro, Mexico, Bulgaria, Jordan
424 Moldova Romania, Uruguay, Costa Rica, Cyprus, Montenegro, Mexico, Bulgaria, Jordan
421 Montenegro Romania, Costa Rica, Cyprus, Moldova, Mexico, Bulgaria, Jordan
420 Mexico Romania, Uruguay, Costa Rica, Cyprus, Moldova, Montenegro, Bulgaria, Jordan, Malaysia, Colombia
420 Bulgaria Romania, Uruguay, Costa Rica, Cyprus, Moldova, Montenegro, Mexico, Jordan, Malaysia, Brazil, Colombia
419 Jordan Romania, Costa Rica, Cyprus, Moldova, Montenegro, Mexico, Bulgaria, Malaysia, Brazil, Colombia
415 Malaysia Mexico, Bulgaria, Jordan, Brazil, Colombia
413 Brazil Bulgaria, Jordan, Malaysia, Colombia
412 Colombia Mexico, Bulgaria, Jordan, Malaysia, Brazil, Brunei Darussalam, Qatar, Albania
408 Brunei Darussalam Colombia, Qatar, Albania, Bosnia and Herzegovina
407 Qatar Colombia, Brunei Darussalam, Albania, Bosnia and Herzegovina, Argentina
405 Albania Colombia, Brunei Darussalam, Qatar, Bosnia and Herzegovina, Argentina, Peru, Saudi Arabia
403 Bosnia and Herzegovina Brunei Darussalam, Qatar, Albania, Argentina, Peru, Saudi Arabia
402 Argentina Qatar, Albania, Bosnia and Herzegovina, Peru, Saudi Arabia
401 Peru Albania, Bosnia and Herzegovina, Argentina, Saudi Arabia, Thailand
399 Saudi Arabia Albania, Bosnia and Herzegovina, Argentina, Peru, Thailand
393 Thailand Peru, Saudi Arabia, North Macedonia, Baku (Azerbaijan), Kazakhstan
393 North Macedonia Thailand, Baku (Azerbaijan)
389 Baku (Azerbaijan) Thailand, North Macedonia, Kazakhstan
387 Kazakhstan Thailand, Baku (Azerbaijan)
380 Georgia Panama
377 Panama Georgia, Indonesia
371 Indonesia Panama
359 Morocco Lebanon, Kosovo
353 Lebanon Morocco, Kosovo
353 Kosovo Morocco, Lebanon
342 Dominican Republic Philippines
340 Philippines Dominican Republic
Each item in the PISA 2018 computer-based reading assessment was assigned to either the single-source or the multiple-source text category, depending on the number of sources required to construct the correct answer. Multiple-source items are defined by having different authors, being published at different times, or bearing different titles or reference numbers. In PISA 2018, some units required only a single source to construct the answers, while in other units all questions were multiple-source, as in Rapa Nui (see Box 2.2). In other cases, the unit started with a single stimulus text and, after some initial questions, the scenario was updated to introduce a second text. It is important to bear in mind that multiple-source items in PISA are not intrinsically more difficult than items involving single texts of comparable length and complexity.
Table 2.2 shows the country/economy mean for the overall reading scale and for each of the text-source subscales. It also indicates which differences between the (standardised4) subscale means are significant, from which a country's/economy's relative strengths and weaknesses can be inferred. Standardisation was particularly important for the single- and multiple-source subscales in order to assess a country's/economy's strengths and weaknesses across the two scales relative to other countries/economies, because in the large majority of countries/economies the multiple-source scores were higher than the single-source scores. A simple difference in the subscale scores would not show which education systems were relatively stronger in each subscale. Indeed, although the mean multiple-source subscale scores in Australia and Chinese Taipei were both five score points higher than the mean single-source subscale scores, students in neither Australia nor Chinese Taipei were deemed to be relatively stronger at multiple-source reading. Small differences that are not statistically significant or practically meaningful should not be overly emphasised.
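The standardisation logic can be illustrated with z-scores: each subscale is standardised before the two are compared, so a uniform level difference between subscales drops out. The numbers below are invented, and PISA's actual standardisation uses student-level means and standard deviations across all participating countries/economies rather than this simplified system-level version:

```python
import numpy as np

# Illustrative subscale means for four hypothetical systems
# (not real PISA values).
single = np.array([505.0, 520.0, 480.0, 495.0])
multiple = np.array([510.0, 522.0, 488.0, 500.0])

# Standardise each subscale so the two scales become comparable;
# a raw difference would be misleading because multiple-source
# scores tend to be higher everywhere.
z_single = (single - single.mean()) / single.std()
z_multiple = (multiple - multiple.mean()) / multiple.std()

# Positive values: relatively stronger on multiple-source reading.
relative_strength = z_multiple - z_single
print(np.round(relative_strength, 2))
```

Note that although every system here scores higher on the multiple-source subscale in raw terms, the standardised differences are positive for some systems and negative for others, which is exactly why a relative comparison requires standardisation.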
In general, students who perform well in one aspect of reading also tend to perform well in others. The percentage of variance in the single-source subscale explained by the multiple-source subscale ranged from 80% in Kazakhstan to 94% in Malta (Table 2.2). On average, students in OECD countries were relatively stronger on the multiple-source reading subscale than students in partner countries/economies. At the same time, higher-performing countries tended to be relatively stronger on multiple-source items. Indeed, of the countries above the OECD average in the overall reading assessment, only Hong Kong (China) and Singapore were relatively stronger on single-source reading. The remaining countries/economies were either not relatively stronger in any subscale (11 of them) or relatively stronger on the multiple-source texts subscale (10 of them). Of the countries below the OECD average, 16 countries/economies were relatively stronger on the single-source text subscale, and 5 were relatively stronger on the multiple-source text subscale (Table 2.2).
HOW IS THE DIGITAL DIVIDE ASSOCIATED WITH EMERGENT ASPECTS OF READING PERFORMANCE?
Digital technologies undeniably offer great opportunities as to what, how, where and when people learn. However, digital divides mirror prevailing economic gaps and often even amplify the disadvantages of students from less wealthy backgrounds, widening existing differences in learning and outcomes (Kuhl et al., 2019[7]; UNICEF, 2017[8]). Although digital devices and the Internet are increasingly available globally, not all students have equal opportunities to access and use digital devices at home and in school. These digital divides are not only a question of having or not having physical access to a digital device but also of differences in how, when and for what purposes technology is used (Dolan, 2015[9]; Echazarra, 2018[10]). Scepticism that claims technology is not needed in school, or even has negative side effects such as Internet addiction, often conflicts with evidence showing that students who locate, browse and access different information resources, and who are knowledgeable about the context in which the information was created, perform better both in overall grades and in academic competence (Leung and Lee, 2012[11]). Needless to say, technology use for education becomes crucial during periods of school closure such as summer holidays or pandemics.
In PISA 2018, 89% of students had a computer that they could use for schoolwork at home on average across OECD countries.
More than 90% of students in about half of the countries/economies participating in PISA had a computer that they could
use for schoolwork at home. However, at the same time, not even half of the students in the Dominican Republic, Indonesia,
Morocco, the Philippines, and Viet Nam had a computer that they could use for schoolwork at home (Table B.2.2).
Table 2.2 [1/2] Comparing countries and economies on the single- and multiple-source subscales
*For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
1. Relative strengths that are statistically significant are highlighted; empty cells indicate cases where the standardised subscale score is not significantly higher
compared to other subscales, including cases in which it is lower. A country/economy is relatively stronger in one subscale than another if its standardised score,
as determined by the mean and standard deviation of student performance in that subscale across all participating countries/economies, is significantly higher
in the first subscale than in the second subscale.
2. Data did not meet the PISA technical standards but were accepted as largely comparable (see PISA 2018 Results (Volume I): What Students Know and Can Do,
Annexes A2 and A4).
Notes: Only countries and economies where PISA 2018 was delivered on computer are shown.
Although the OECD mean is shown in this table, the standardisation of subscale scores was performed according to the mean and standard deviation of students
across all PISA-participating countries/economies.
The standardised scores that were used to determine the relative strengths of each country/economy are not shown in this table.
Countries and economies are ranked in descending order of mean reading performance.
Source: OECD, PISA 2018 Database.
https://doi.org/10.1787/888934239363
Table 2.2 [2/2] Comparing countries and economies on the single- and multiple-source subscales
The biggest increase in access to a computer at home for schoolwork across OECD countries happened between 2003 and 2009, when the share rose from 78% to 92%. For most OECD countries, the percentage of students with a computer that they could use for schoolwork at home remained consistently high over the last decade, except in Japan (61%), where this percentage was comparatively lower than the OECD average (89%) and was even 8 percentage points higher in 2009 than in 2018. In Albania (71%), Georgia (78%), Kazakhstan (74%), and the Republic of Moldova (84%), however, the share of students with a computer that they could use for schoolwork at home has increased by more than 20 percentage points over the last 10 years (Table B.2.2).
Nevertheless, student’s access to a computer at home for schoolwork is substantially lower than the percentage of students
with a link to the Internet at home. In PISA 2018, on average across OECD countries, 96% of students had a connection to the
Internet at home, which is nine percentage points higher than in PISA 2009 and 33 percentage points higher than in PISA 2003.
Furthermore, Albania (81%), Jordan (84%), Kazakhstan (89%), and Thailand (82%) have practically doubled the percentage of
students who had Internet at home over the last 10 years (Table B.2.2).
Remote learning, such as that experienced by most students around the world as a consequence of the COVID-19 global health crisis, often requires, or benefits from, access to a computer linked to the Internet at home for schoolwork. Figure 2.1 shows the change between PISA 2003 and PISA 2018 in the percentage of students with a computer that can be used for schoolwork at home and access to the Internet. In PISA 2018, 88% of students had both a connection to the Internet at home and a computer that they could use for schoolwork. However, in the Dominican Republic, Indonesia, Malaysia, Mexico, Morocco, Peru, the Philippines, Thailand and Viet Nam, half or fewer of students had access to both. Students' access to a computer linked to the Internet at home for schoolwork increased by more than 28 percentage points on average across OECD countries between PISA 2003 and PISA 2018. However, a comparatively small increase (about 3 percentage points) was observed between PISA 2009 and PISA 2018 (Table B.2.2). In Hong Kong (China), Japan, Luxembourg, Qatar, Singapore and Chinese Taipei, this percentage decreased between PISA 2009 and PISA 2018 by at least 5 percentage points. However, this is due to a decrease in the percentage of students who reported having a computer that they could use for schoolwork at home rather than a decrease in Internet access at home, which remained the same or increased (Table B.2.2). The same happened, although to a lesser extent, in Macao (China), Finland, Germany, Ireland, Korea, the United Kingdom and Sweden. These results do not necessarily mean that access to digital devices is decreasing in those countries. Rather, they could mean that students are increasingly provided with other digital devices for schoolwork, such as smartphones or tablets, instead of computers. Nonetheless, not all digital devices are equally suitable for schoolwork. For example, digital devices with larger screens and a physical keyboard may help in navigating and organising content on the Internet more efficiently.
In Denmark, Iceland and Poland, 95% or more of students attending disadvantaged schools5 reported that they had a computer
linked to the Internet for doing schoolwork at home. In contrast, this percentage is lower than 20% in Indonesia, Mexico, Morocco,
Panama, Peru, the Philippines, and Viet Nam (Figure 2.2). The largest digital divide between advantaged and disadvantaged
schools is in Mexico and Peru, with a difference of more than 70 percentage points.
Similarly, more than 95% of students from rural areas in Austria, Denmark, Iceland, Malta, Poland and Switzerland reported
having both a link to the Internet at home and a computer that could be used for schoolwork, but this percentage was lower than
20% in rural areas of Indonesia, Mexico, Morocco and the Philippines. The largest digital divide between rural and urban schools
is, again, in Mexico and Peru, with a difference of 57 and 45 percentage points, respectively (Table B.2.3).
The digital divide between public and private schools is comparatively narrower than that between schools from different socio-economic backgrounds or locations. On average across OECD countries, about 92% of students from private schools
reported having a computer linked to the Internet for doing schoolwork at home compared to 87% of students from public
schools. However, in Colombia, Panama and Peru, students from private schools are almost twice as likely to have those resources
at home compared to students from public schools (Table B.2.3).
For many of the most disadvantaged students, schools are the only place where they can access and use computers linked to the Internet (OECD, 2015[12]). In Malaysia, Mexico, Morocco, Peru, the Philippines and Viet Nam, in particular, more than 80% of the most disadvantaged students have access to the Internet at school but not at home (Table B.2.4). This means that, of the disadvantaged students who have access to the Internet, four in five have access at school only.
As explained in Chapter 1, the PISA 2018 reading framework was designed to integrate print and digital reading assessments.
As a result, there is no longer a strict delineation of tasks typical of print or digital environments. For example, the text source
(i.e. single and multiple sources) is an important dimension in PISA reading literacy. Still, some single-source tasks were set in
online environments, while some multiple-source tasks had little to do with digital reading. Another important dimension in
PISA reading literacy is cognitive processes. In locating information, for example, some tasks included scanning a single piece of
text. Conversely, other tasks involved searching for and selecting relevant text from several pieces of text, which is closer to how
information is displayed in a digital environment. Therefore, studying the relationship between the digital divide and emergent
aspects of reading would require looking at particular items of the PISA reading assessment.
Figure 2.1 Change between 2009 and 2018 in access to a computer that they can use for schoolwork and a link to the
Internet at home
[Horizontal bar chart by country/economy; series: PISA 2003, PISA 2006, PISA 2009, PISA 2018; axis: 0-100%]
Notes: Statistically significant differences between PISA 2018 and earlier cycles are shown in darker tones.
Costa Rica, Georgia, Malta and Moldova conducted the PISA 2009 assessment in 2010 as part of PISA 2009+.
OECD average-31 is the arithmetic mean across all OECD countries, excluding Chile, Colombia, Estonia, Israel, Lithuania and Slovenia.
Countries and economies are ranked in descending order of the percentage of students who reported having access to the Internet and a computer that can be used for schoolwork at home in 2018.
Source: OECD, PISA 2018 Database, Table B.2.4.
https://doi.org/10.1787/888934239382
Figure 2.2 Access to a computer linked to the Internet at home for doing schoolwork, by school’s socio-economic status1
[Horizontal bar chart by country/economy, comparing disadvantaged and advantaged schools; axis: 0-100%]
1. A socio-economically disadvantaged (advantaged) school is a school whose socio-economic profile (i.e. the average socio-economic status of the students
in the school) is in the bottom (top) quarter of the PISA index of economic, social and cultural status amongst all schools in the relevant country/economy.
Note: Statistically significant values are shown in a darker tone.
Countries and economies are ranked in ascending order of the percentage of students who reported having access to the Internet and a computer that can be used
for schoolwork at home, in disadvantaged schools.
Source: OECD, PISA 2018 Database, Table B.2.5.
https://doi.org/10.1787/888934239401
On average across OECD countries, around 8.7% of students were top performers in reading, meaning that they attained Level 5 or 6 in the PISA reading test. At these levels, students are able to comprehend lengthy texts, deal with concepts that are abstract or counterintuitive, and establish distinctions between fact and opinion based on implicit cues in the content or related to the source of the information. This chapter goes one step further by paying special attention to the estimated percentage correct on the released PISA reading item that focuses on distinguishing fact from opinion, one of the most emergent aspects of reading in digital environments. Although item-level analysis is not expected to be as robust as full-scale results for cross-cultural comparisons, nor to cover the full extent of the construct, it can still provide meaningful insights into the relationships with students' outcomes.
The PISA 2018 reading assessment included one item-unit (i.e. Rapa Nui Question 3, CR551Q06) that tested whether students
can distinguish between facts and opinions (Box 2.2). Figure 2.3 shows the system-level relationship between the estimated
percentage correct in that item and the percentage of students who reported having access to a computer linked to the
Internet at home. The PISA reading item that focuses on distinguishing fact from opinion was estimated to be 47% correct6 on
average across OECD countries. The estimated percentage correct of this item was higher than 60% in Australia, Canada, the
Netherlands, New Zealand, Turkey, the United Kingdom and the United States while lower than 20% in Georgia, Indonesia, Kosovo,
Morocco, Panama, and the Philippines. Among OECD countries, the estimated percentage correct was lower than 30% in Colombia,
Costa Rica, the Czech Republic, Korea, and the Slovak Republic. As pointed out before, 88% of students in PISA 2018 had both a
connection to the Internet at home and a computer that they could use for schoolwork.
Most importantly, students’ access to a computer linked to the Internet at home for schoolwork is associated with the estimated
percentage correct in the item that focuses on distinguishing facts from opinions in the PISA reading assessment (R=0.54)7. The
partial correlation8 after accounting for per capita GDP was 0.42. Even after accounting for the country per capita GDP, access to
digital resources at home is associated with the estimated percentage correct in the item of distinguishing facts from opinions
in the PISA reading assessment.
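A system-level partial correlation of this kind can be computed by residualising both variables on the control (here, per capita GDP) and correlating the residuals. The sketch below uses synthetic country-level data, not actual PISA values; all variable names and numbers are illustrative.

```python
import numpy as np

def partial_correlation(x, y, z):
    """Correlation between x and y after removing the linear effect
    of a control variable z (here standing in for per capita GDP)."""
    # Residualise x and y on z with simple least-squares fits
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

# Illustrative country-level data (NOT actual PISA values):
rng = np.random.default_rng(0)
gdp = rng.uniform(10, 60, 40)                          # per capita GDP
access = 50 + 0.8 * gdp + rng.normal(0, 5, 40)         # % with computer + Internet
p_correct = 10 + 0.6 * access + rng.normal(0, 8, 40)   # item % correct

raw_r = np.corrcoef(access, p_correct)[0, 1]
partial_r = partial_correlation(access, p_correct, gdp)
print(round(raw_r, 2), round(partial_r, 2))
```

As in the text, the partial correlation is typically smaller than the raw one when the control variable is related to both measures.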
Figure 2.3 Relationship between access to digital resources at home and emergent aspects of reading
[Scatter plot. X-axis: percentage of students who reported having access to the Internet and a computer that can be used for schoolwork at home (OECD average: 88%); y-axis: percentage correct in the reading item of distinguishing facts from opinions (Equated P+, Rapa Nui Question 3). Quadrant lines mark above/below-average access and above/below-average item performance.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.2.4 and B.2.8.
12 https://doi.org/10.1787/888934239420
Are students who had the opportunity to learn digital skills in school more likely to distinguish facts from
opinions?
Students from more advantaged socio-economic backgrounds not only had greater access to digital devices connected to the
Internet at age 15 but were also more likely than students from lower socio-economic backgrounds to have first been exposed
to computers at age six or younger (OECD, 2015[12]). Although these factors matter, providing students with digital devices
connected to the Internet is not enough to ensure that they will become proficient in digital literacy while avoiding
online risks such as disinformation or breaches of privacy.
Adolescents’ digital skills are positively associated with both online risks and opportunities (Rodríguez-de-Dios, van Oosten and
Igartua, 2018[13]). Parents play an essential role in providing access and encouraging an appropriate use of digital devices at
home – e.g. for social support or learning goals. However, they do not always succeed in maximising online opportunities while
reducing the risks (Livingstone et al., 2017[14]). Providing equal opportunities to learn digital skills at school while reducing online
risks is not only beneficial to all students but could also help to mitigate some of the learning gaps presented in the previous
section as a result of the digital divide. PISA 2018 asked students whether, during their entire school experience, they were
taught: a) how to decide whether to trust information from the Internet; b) how to compare different web pages and decide what
information is more relevant for their schoolwork; c) to understand the consequences of making information publicly available
online; d) how to detect phishing or spam emails; and e) how to detect whether information is subjective or biased.
On average across OECD countries, 54% of students reported being trained on how to recognise whether information is
biased. Among OECD countries, more than 70% of students reported receiving this training in Australia, Canada, Denmark, and
the United States. However, less than 45% of students reported receiving this training in Israel, Latvia, the Slovak Republic,
Slovenia, and Switzerland (Table B.2.6).
Not all students had equal access to learning digital skills at school. The percentage difference in students who were taught
how to detect biased information on the Internet between students from advantaged and disadvantaged backgrounds across
OECD countries was 8 percentage points in favour of advantaged students. In Belgium, Brunei Darussalam, Denmark, Germany,
Luxembourg, Sweden, the United Kingdom and the United States, this difference is around 14 percentage points or higher
(Table B.2.6).
As pointed out before, the PISA 2018 reading assessment included one item-unit (Rapa Nui Question 3, CR551Q06) that tested
whether students can distinguish between facts and opinions (Box 2.2). The estimated percentage correct in this item is 47% on
average across OECD countries, and higher than 60% in Australia, Canada, the Netherlands, New Zealand, Turkey, the United
Kingdom and the United States. In contrast, this percentage is lower than 20% in Georgia, Indonesia, Kosovo, Morocco, Panama,
and the Philippines. On the other hand, among OECD countries, more than 70% of students in Australia, Canada, Denmark, and
the United States were taught how to detect whether the information is biased while less than 45% of students were taught this
in Colombia, Israel, Latvia, the Slovak Republic, Slovenia, and Switzerland (Figure 2.4).
The opportunity for students to learn in school how to detect whether information is subjective or biased is strongly associated
with the estimated percentage correct in the item that focuses on distinguishing facts from opinions in the PISA reading
assessment among OECD countries (R=0.68), and moderately associated among all participating countries and economies in
PISA 2018 (R=0.38)9 (Figure 2.4). The partial correlations10 after accounting for per capita GDP were 0.66 and 0.31 respectively. The partial
correlations11 after accounting for average reading performance were 0.60 and 0.32 respectively. Therefore, it is the access
students have to education on how to detect biased information in school rather than overall reading performance that is driving
a strong association with the estimated percentage correct in the item of distinguishing fact from opinion.
Figure 2.4 shows that the percentage of students in Hong Kong (China) and Singapore who had access to training in school
on how to detect biased information as well as their estimated percentage correct in the distinguishing fact from opinion item
is above the OECD average. However, students in other partner countries and economies fall below the OECD average in the
estimated percentage correct in this item. Furthermore, students in Chinese Taipei scored below the OECD average in this item
even though the proportion of students reporting that they were taught how to detect biased information in school was well
above the OECD average.
Figure 2.4 Reading item of distinguishing facts from opinions and access to training on how to detect biased
information in school
[Scatter plot. X-axis: percentage of students who were taught how to detect whether the information is subjective or biased; y-axis: percentage correct in the reading item of distinguishing facts from opinions (Equated P+, Rapa Nui Question 3; OECD average: 47%). Fitted lines: R = 0.68 for OECD countries and R = 0.38 for all countries and economies.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Table B.2.8.
12 https://doi.org/10.1787/888934239439
and above the average in the total reading score (505). However, Korea, which performed above the OECD average in reading (514),
scored below the average in this particular item (26%), while Turkey, which performed below the OECD average in reading (466), is
the country with the highest percentage correct (63%) after the United States (69%) and the United Kingdom (65%). Again, this is
a single item rather than a full-scale construct for assessing students’ capacity to distinguish between fact and opinion. Still, these
results may reflect differences in curricula across countries as well as in practices and out-of-school experiences. Learning how
to distinguish facts from opinions in school likely helps improve PISA reading scores. It is also likely to help students benefit more
fully from online resources while reducing online risks.
Figure 2.5 Reading item of distinguishing facts from opinions and reading performance
[Scatter plot. X-axis: mean reading performance score (320 to 580); y-axis: percentage correct in the reading item of distinguishing facts from opinions.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.2.1 and B.2.8.
12 https://doi.org/10.1787/888934239458
Figure 2.6 shows the system-level correlations (OECD countries) between indicators collected in PISA 2018 on access to learning
digital skills in school which have a meaningful association with the estimated percentage of correct responses in the reading
item of distinguishing facts from opinions (Rapa Nui Question 3, CR551Q06). Among these indicators, student access to school
training on how to detect whether information is biased was the indicator most strongly correlated with estimated percentage
correct in the reading item of distinguishing facts from opinions. Although other indicators such as learning how to understand
the consequences of making information publicly available online are still moderately associated with performance in this item,
these associations are weaker in magnitude.
Figure 2.6 Correlations between access to learning digital skills in school and the reading item of distinguishing facts
from opinions in OECD countries
[Four scatter plots, one per digital-skills indicator. X-axis: percentage of students; y-axis: percentage correct in the reading item of distinguishing facts from opinions. Fitted lines: R² = 0.46, 0.14, 0.11 and 0.10.]
12 https://doi.org/10.1787/888934239477
Figure 2.7 shows the same system-level correlations but among all participating countries and economies in PISA 2018. In this
case, the magnitude of the correlations is generally smaller than among OECD countries but still relevant. This is particularly the
case among students who were taught how to understand the consequences of making information publicly available and to
detect biased information.
The findings presented in this chapter highlight the importance of providing digital resources for educational purposes both
at school and at home in fostering students’ reading performance. This is also important for emergent reading aspects, such
as students’ capacity to distinguish facts from opinions. For many disadvantaged students, schools are the only place where they
can access and use computers linked to the Internet. It is, therefore, reasonable to expect that existing reading gaps among
students from different socio-economic backgrounds, including emergent aspects of reading, would be amplified during long
periods of school closures such as the ones experienced during pandemics. Altogether, these results show the crucial role of
providing equal opportunities to learn digital skills at school and its strong association with students’ performance in emergent
reading aspects. More analysis showing the association between teaching practices, access to learning digital skills at school and
emerging aspects of reading are provided in Chapter 6.
Percentage of students who reported that during their entire school experience they were taught the following:
– How to decide whether to trust information from the Internet
– How to compare different web pages and decide what information is more relevant for your schoolwork
– To understand the consequences of making information publicly available online on <Facebook©>, <Instagram©>, etc.
– How to detect whether the information is subjective or biased
[Figure 2.7: four scatter plots across all participating countries and economies. X-axis: percentage of students; y-axis: percentage correct in the reading item of distinguishing facts from opinions (Equated P+, Rapa Nui Question 3). Fitted lines: R² = 0.28, 0.15, 0.10 and 0.06.]
Notes
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS). A socio-economically disadvantaged
(advantaged) student is a student in the bottom (top) quarter of the ESCS in the relevant country/economy.
2. While Spain’s data met PISA 2018 Technical Standards, some data showed implausible response behaviour among students. Further analysis of
Spain’s data by testing time showed that some regions in Spain conducted their high-stakes exams for tenth-grade students earlier in the year
than in the past, which resulted in the testing period for these exams coinciding with the end of the PISA testing window. Because of this overlap,
a number of students were negatively disposed towards the PISA test and did not try their best to demonstrate their proficiency. Although
the data from only a minority of students showed clear signs of lack of engagement, the comparability of PISA 2018 data for Spain with those
from earlier PISA assessments cannot be fully ensured. After careful consideration of the results based on the further analysis of Spain’s data,
PISA 2018 data for reading for Spain were released in July 2020. They were, therefore, included in this report. While all data are released, Spain’s
performance results in PISA 2018 might be subject to a possible downward bias in performance results for the reasons previously explained.
3. Because the membership of the OECD has changed over time, the three categories (around, above and below the OECD mean) are not
comparable to the corresponding categories used in earlier PISA reports.
4. In order to identify relative strengths and weaknesses, the scores are first standardised by comparison to the mean and standard deviation
across all PISA-participating countries. When the standardised score in one subscale is significantly higher than that in another subscale in
a country/economy, it can be said to be relatively stronger in the first subscale compared to the average across PISA-participating education
systems.
5. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS). A socio-economically disadvantaged
(advantaged) school is a school in the bottom (top) quarter of the ESCS in the relevant country/economy.
6. Rapa Nui Question 3 is a partial credit item where non-credit is scored 0, partial credit is scored 0.5, and full credit is scored 1. Therefore, the
estimated percentage correct for full credit in this item is lower than 47% on average across OECD countries. This item was estimated to be 39%
correct on average across all PISA 2018 participating countries and economies. Rapa Nui Question 3 is a Level 5 item, meaning that
students need to be at proficiency Level 5 to have a 62% probability of getting full credit on this item (see Figure I.2.1, (OECD, 2019[2])).
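The equated-P+ arithmetic described in this note can be shown with a short sketch: the item-level "percentage correct" is the mean of the 0 / 0.5 / 1 scores expressed as a percentage, so it exceeds the full-credit rate whenever partial credit is awarded. The response vector below is invented for illustration.

```python
# Scores for a partial-credit item: 0 = no credit, 0.5 = partial, 1 = full.
scores = [1, 0.5, 0, 1, 0.5, 0.5, 0, 1, 0, 0.5]  # illustrative responses

equated_p_plus = 100 * sum(scores) / len(scores)   # mean score as a percentage
full_credit_rate = 100 * scores.count(1) / len(scores)

print(equated_p_plus)    # 50.0
print(full_credit_rate)  # 30.0
```

This is why the note points out that the full-credit percentage is lower than the 47% equated P+ reported for the item.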
7. Countries which administered the paper-based form had no available data to perform this analysis: Argentina, Jordan, Lebanon,
the Republic of Moldova, the Republic of North Macedonia, Romania, Saudi Arabia, Ukraine and Viet Nam.
8. The partial correlation is calculated using the percentage of students with access to a computer linked to the Internet at home for schoolwork
in PISA 2018 (Table B.2.2) and the percentage correct in the reading assessment items to assess the capacity to distinguish facts from opinions
(Table B.2.6), after accounting for per capita GDP (Table B3.1.4, (OECD, 2019[2])).
9. Countries which administered the paper-based form had no available data to perform this analysis: Argentina, Jordan, Lebanon,
the Republic of Moldova, the Republic of North Macedonia, Romania, Saudi Arabia, Ukraine and Viet Nam.
10. The partial correlation is calculated using the percentage of students who reported learning in school how to detect whether the information is
subjective or biased (Table B.2.6) and the percentage correct in the reading assessment items to assess the capacity to distinguish facts from
opinions (Table B.2.6), after accounting for per capita GDP (Table B3.1.4, (OECD, 2019[2])).
11. The partial correlation is calculated using the percentage of students who reported learning in school how to detect whether the information is
subjective or biased (Table B.2.6) and the percentage correct in the reading assessment items to assess the capacity to distinguish facts from
opinions (Table B.2.6), after accounting for average reading performance (Table B.2.1a).
12. The total reading score includes the item that is being correlated, so this R2 value would be slightly lower if the item is extracted from the total
reading score.
[9] Dolan, J. (2015), “Splicing the Divide: A Review of Research on the Evolving Digital Divide Among K–12 Students”, Journal of Research on Technology in Education, Vol. 48/1, pp. 16-37, http://dx.doi.org/10.1080/15391523.2015.1103147.
[10] Echazarra, A. (2018), “How has Internet use changed between 2012 and 2015?”, PISA in Focus, No. 83, OECD Publishing, Paris, https://dx.doi.org/10.1787/1e912a10-en.
[1] Klieme, E. (2020), “Policies and Practices of Assessment: A Showcase for the Use (and Misuse) of International Large Scale Assessments in Educational Effectiveness Research”, in International Perspectives in Educational Effectiveness Research, Springer International Publishing, Cham, http://dx.doi.org/10.1007/978-3-030-44810-3_7.
[7] Kuhl, P. et al. (2019), Developing Minds in the Digital Age: Towards a Science of Learning for 21st Century Education, Educational Research and Innovation, OECD Publishing, Paris, https://dx.doi.org/10.1787/562a8659-en.
[11] Leung, L. and P. Lee (2012), “Impact of Internet Literacy, Internet Addiction Symptoms, and Internet Activities on Academic Performance”, Social Science Computer Review, Vol. 30/4, pp. 403-418, http://dx.doi.org/10.1177/0894439311435217.
[14] Livingstone, S. et al. (2017), “Maximizing Opportunities and Minimizing Risks for Children Online: The Role of Digital Skills in Emerging Strategies of Parental Mediation”, Journal of Communication, Vol. 67/1, pp. 82-105, http://dx.doi.org/10.1111/jcom.12277.
[5] OECD (2020), Early Learning and Child Well-being: A Study of Five-year-Olds in England, Estonia, and the United States, OECD Publishing, Paris, https://dx.doi.org/10.1787/3990407f-en.
[6] OECD (2020), PISA 2018 Results (Volume V): Effective Policies, Successful Schools, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/ca768d40-en.
[3] OECD (2019), “Percentage of 15-year-olds covered by PISA: Coverage Index 3”, in PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/5f07c754-en.
[2] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/5f07c754-en.
[4] OECD (2019), PISA 2018 Results (Volume II): Where All Students Can Succeed, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b5fd1b8f-en.
[17] OECD (2016), PISA 2015 Results (Volume I): Excellence and Equity in Education, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264266490-en.
[12] OECD (2015), Students, Computers and Learning: Making the Connection, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264239555-en.
[13] Rodríguez-de-Dios, I., J. van Oosten and J. Igartua (2018), “A study of the relationship between parental mediation and adolescents’ digital skills, online risks and online opportunities”, Computers in Human Behavior, Vol. 82, pp. 186-198, http://dx.doi.org/10.1016/j.chb.2018.01.012.
[8] UNICEF (2017), The State of the World’s Children 2017: Children in a Digital World, https://www.unicef.org/publications/files/SOWC_2017_ENG_WEB.pdf.
3
Dynamic Navigation in PISA 2018 Reading Assessment:
Read, Explore and Interact
This chapter provides item-level analyses
using process (or log) data to illustrate
how students navigate through multiple
text sources to search for and locate
relevant information. This chapter
focuses on a scenario-based reading
unit, Rapa Nui, which was developed
with multiple-source text environments.
The relationship between navigation
skills and reading performance is also
examined in this chapter.
– More than 30% of students in B-S-J-Z (China), Korea, and Singapore tended to actively explore the whole reading unit
in both single- and multiple-source environments. These students checked different accessible pages beyond item
requirements to complete the task.
– On average, over 70% of students in 70 countries/economies demonstrated limited or no navigation. More than 15% of
students with limited navigation were found in Hungary, New Zealand, Peru, Poland, Spain, and Turkey. More than 75%
of students in Baku (Azerbaijan), Bosnia and Herzegovina, Colombia, the Dominican Republic, Kosovo, Morocco, Panama,
and Uruguay showed no navigation.
– Strictly focused navigation and actively explorative navigation are positively correlated with performance (0.69 and 0.43,
OECD average). In contrast, limited navigation and no navigation are negatively associated with performance (-0.28 and
-0.32, OECD average).
DYNAMIC NAVIGATION IN PISA 2018 READING ASSESSMENT: READ, EXPLORE AND INTERACT
The rise of digital technology has promoted the emergence of new text forms beyond traditional printed texts. Readers in
the digital age must master several emerging reading skills to understand these new text-based genres and socio-cultural
practices. They need to apply information and communication technology (ICT) knowledge to understand and operate devices;
access the texts they need using search engines, links and tabs; read from multiple sources; distinguish what is high-quality,
credible information; and corroborate information, detect potential conflicts and resolve them (OECD, 2019[1]).
Among these emerging reading skills, navigation is recognised as a key component of reading in the digital environment,
as readers “construct” their text through navigation and spend their time retrieving information from the texts they eventually target.
Evidence shows that better readers tend to minimise their visits to irrelevant pages and locate necessary pages efficiently
(OECD, 2011[2]). Stronger readers choose strategies that are suited to the demands of individual tasks (OECD, 2011[2]; Lawless
and Kulikowich, 1996[3]; Salmerón and García, 2011[4]; Salmerón, Kintsch and Kintsch, 2010[5]; Naumann et al., 2007[6]; Lawless
and Schrader, 2008[7]).
The description of readers’ navigation process demands tremendous support from log files. Data stored in log files, referred to
as process data in this chapter, contain information on the actions undertaken by test takers in terms of computer interaction
and time spent on each action during the process. This sort of data provides extra information beyond response data that
typically show correctness or incorrectness for accuracy only (He, Borgonovi and Paccagnella, 2019[8]; He, Borgonovi and
Paccagnella, 2021[9]; von Davier et al., 2019[10]).
The interactive nature of PISA computer-based assessments makes them ideal candidates for analyses based on process data
(Vörös, Kehl and Rouet, 2020[11]; Goldhammer et al., 2014[12]). Given the promise of this new data source, the fine-grained
student-level process data in log files are explored in depth in this chapter to better understand how
students navigated and allocated their time on emerging multiple-source reading items in the PISA 2018 reading assessment.
The relationship between navigation skills and performance in reading in digital environments is also examined and illustrated
through case studies in this chapter.
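As an illustration of what working with process data involves, the sketch below reconstructs a page-visit sequence and time-on-page totals from a toy event log. The event schema (timestamp, action, page) and the page names are hypothetical and far simpler than actual PISA log files.

```python
# Toy event log: (timestamp in seconds, action, page). Hypothetical schema.
events = [
    (0.0,   "visit", "blog"),
    (42.5,  "visit", "book_review"),
    (61.0,  "visit", "science_news"),
    (95.2,  "visit", "blog"),
    (120.0, "end",   None),          # end of the task
]

# Navigation sequence: the ordered pages the student visited
sequence = [page for _, action, page in events if action == "visit"]

# Time on page: the gap between one event and the next is credited
# to the page that was open during that gap
time_on_page = {}
for (t0, _, page), (t1, _, _) in zip(events, events[1:]):
    time_on_page[page] = time_on_page.get(page, 0.0) + (t1 - t0)

print(sequence)       # ['blog', 'book_review', 'science_news', 'blog']
print(time_on_page)   # cumulative seconds spent on each page
```

Sequences like these are the raw material for the navigation-behaviour categories (no, limited, strictly focused, actively explorative) discussed in this chapter.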
Dynamic texts are an emerging aspect of reading, especially in multiple-source environments. They give the reader some
decision-making power over how to read them. This kind of text generally has a more complex, non-linear1 organisation
and integrates more navigation tools. The interactive design in PISA reading assessments embeds
features of dynamic texts to support different types of texts such as authored web pages with combinations of lists, paragraphs
and, often, graphics, and message-based texts with online forms, e-mail messages and forums. Readers need to construct
their own pathways to complete any reading activity associated with dynamic texts. Students have flexibility to decide which
information is important to the individual task and switch between pages. Knowledge not only of what students’ responses
were but how they reached their responses through dynamic navigation enables a deeper understanding of students’ cognitive
process in dynamic reading.
This chapter focuses on a scenario-based reading unit, Rapa Nui (CR551), which was developed with multiple-source text
environments. It consists of three texts: a webpage from the professor’s blog, a book review, and a news article from an online
science magazine. In these multiple-text reading situations, readers must make decisions as to which of the available pieces
of text is the most important, relevant, accurate or truthful (Rouet and Britt, 2011[13]). Figure 3.1 presents the screenshot of the
first item in the Rapa Nui unit. Similar to the layout presented in the computer-based assessments in previous PISA cycles, the
assessment interface is divided into two parts: the item response area on the left side of the screen and simulation area on the
right side of the screen. The three different text sources – blog, book review and science news – are embedded using three tabs.
The contents of these three sources connect to and supplement each other to give a whole picture of Rapa Nui from different
perspectives. For further information on this unit see https://www.oecd.org/pisa/test/.
Figure 3.1 Screenshot of the first item in the Rapa Nui reading unit (CR551Q01)
The Rapa Nui unit consists of seven items ranging in levels from moderate to high difficulty (Table 3.1). The first five items
(item 1 to item 5) are items with single-source requirements where students are instructed to complete the task with reference to
a single page. Navigation to the accessible pages is optional. The last two items (item 6 and item 7) are items with multiple-source
requirements (see Figure 3.2 for a screenshot of item 6). Each item instructs students to refer to all three sources to complete the
task, requiring navigation to other pages. As a reminder, Rapa Nui was included in the second unit of the testlet as being of high
difficulty2. The non-response rate was expected to be high, especially towards the end of the test when students might run out
of time3. Specifically, the average non-response rate in the Rapa Nui unit was 16%. The non-response rate was the highest, 31%,
in the last item (CR551Q11) of this unit (Table B.3.1).
Embedded sources, that is, references to other authors or texts, are included in the Rapa Nui unit. When students scroll down to
the end of the blog on the first page of this unit, two hyperlinks appear at the end of the passage. The multiple-source structure
is activated by clicking the hyperlinks. The other two tabs (Book Review and Science News) appear in the navigation bar for further
navigation. If students miss the hyperlinks on the first page, “Book Review” and “Science News” are later activated by default in
item 3 and item 4.
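The tab-activation behaviour described above can be modelled as a small state sketch. This is purely illustrative, not the actual PISA delivery code, and the assumption that item 3 activates the Book Review tab while item 4 activates the Science News tab is an inference from the text.

```python
# Illustrative model (NOT the actual PISA delivery code) of tab activation
# in the Rapa Nui unit: only the blog is visible at first; the other tabs
# appear either when the student clicks the hyperlinks at the end of the
# blog, or by default once the later items are reached.
tabs = {"blog": True, "book_review": False, "science_news": False}

def click_hyperlinks(tabs):
    """Student clicks the two hyperlinks at the end of the blog passage."""
    tabs["book_review"] = True
    tabs["science_news"] = True

def reach_item(tabs, item):
    """Default activation for students who missed the hyperlinks.
    The item-to-tab mapping is an assumption for illustration."""
    if item == 3:
        tabs["book_review"] = True
    if item == 4:
        tabs["science_news"] = True

reach_item(tabs, 3)
print(tabs)  # Book Review is now available even without clicking the links
```

The point of the model is that the multiple-source structure is guaranteed to be available by the time the multiple-source items (6 and 7) are administered, whichever path the student took.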
Figure 3.2 Screenshot of an item with multiple-source requirement in the Rapa Nui reading unit (CR551Q10)
Table 3.1 Item characteristics and difficulty in the Rapa Nui reading unit

Item | Item order | Item difficulty | Item difficulty level | Source required for item (single/multiple) | Response format
CR551Q01 | Item 1 | 559 | Level 4 | Single | Simple Multiple Choice
CR551Q06 | Item 3 | 654 | Level 5 | Single | Complex Multiple Choice
CR551Q08 | Item 4 | 634 | Level 5 | Single | Simple Multiple Choice
CR551Q09 | Item 5 | 597 | Level 4 | Single1 | Simple Multiple Choice
CR551Q10 | Item 6 | 665 | Level 5 | Multiple | Complex Multiple Choice

1. The source required for item CR551Q09 could be classified as requiring only a single source; however, the item contained evidence that supported the overall theory, which is akin to working with multiple sources.
Source: OECD, PISA 2018 Database.
The studies presented in this chapter focus on a total of 76 270 students from 70 countries and economies who responded to
the Rapa Nui reading unit (CR551)4. It is necessary to note that the multistage adaptive testing design (MSAT; Yamamoto, Shin
and Khorramdel, 2019[14]) was applied to the PISA 2018 reading domain for the first time. In accordance with the routing rule in
the MSAT, students who performed better in the previous stage have a higher chance of being allocated to a more difficult testlet.
The Rapa Nui unit was intended to be of moderate-to-high difficulty and was labelled as being of high difficulty in the testlet.
Therefore, the subsample used in this study could be slightly biased towards students who have higher reading skills than the
average level.
As shown in Figure 3.3, though the subsample (N=76 270) covers a broad range of reading proficiency levels, from
“Below Level 1c” to “Level 6” (see note below Table 3.2 for a description of these proficiency levels), their overall average reading
performance score is 517 points, higher than the average reading score of the full sample (460 points) from the
70 countries and economies. The distribution of this subsample is slightly shifted to the right compared with the full-sample
distribution, signalling a bias towards students who have higher reading performance skills than the average level.
As a result, background variables such as gender and socio-economic status (measured by the PISA index of economic, social and
cultural status [ESCS]) also showed as slightly biased. Of the 76 270 students, 52% of students were girls (with student weights
computed), slightly higher than the 50% of girls in the full sample. The index of economic, social and cultural status (ESCS)
was -0.05 in the subsample of Rapa Nui respondents, which was a bit higher than that of the full sample: -0.28 (Table B.3.10).
Note that all computations in the current study are based on 10 plausible values and student weights with 80 replicates.
The distribution of reading proficiency levels of students who responded to the Rapa Nui unit is presented in Table 3.2.
Figure 3.3 Distribution of reading performance of students who responded to the Rapa Nui reading unit
[Histogram. X-axis: reading performance score (100 to 980 score points); y-axis: number of students (up to 2 000).]
Source: OECD, PISA 2018 Database.
12 https://doi.org/10.1787/888934239515
Table 3.2 Overall average of students’ reading proficiency levels in the Rapa Nui unit
1. Reading proficiency levels are defined based on reading performance plausible values: Level 6: Above 698.32 score points; Level 5: From 625.61 to less than
698.32 score points; Level 4: From 552.89 to less than 625.61 score points; Level 3: From 480.18 to less than 552.89 score points; Level 2: From 407.47 to
less than 480.18 score points; Level 1a: From 334.75 to less than 407.47 score points; Level 1b: From 262.04 to less than 334.75 score points; Level 1c: From
189.33 to less than 262.04 score points; Below Level 1c: Less than 189.33 score points. Refer to PISA 2018 Results (Volume I) - What Students Know and Can
Do, Chapter 5.
2. Overall average is computed on students who responded to the Rapa Nui unit.
Note: The computation is based on 10 plausible values and student weights with 80 replicates.
Source: OECD, PISA 2018 Database.
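The proficiency-level cut scores listed in note 1 amount to a simple threshold lookup, sketched below. The lower bound of each level is inclusive (“from X to less than Y”); treating a score of exactly 698.32 as Level 6 is an assumption, since the note labels Level 6 only as “above” that value.

```python
# Threshold lookup for the reading proficiency levels defined in note 1.
# Checked from the highest level down; each lower bound is inclusive.
CUT_SCORES = [
    (698.32, "Level 6"), (625.61, "Level 5"), (552.89, "Level 4"),
    (480.18, "Level 3"), (407.47, "Level 2"), (334.75, "Level 1a"),
    (262.04, "Level 1b"), (189.33, "Level 1c"),
]

def proficiency_level(score):
    """Map a reading performance score (score points) to a proficiency level."""
    for lower_bound, label in CUT_SCORES:
        if score >= lower_bound:
            return label
    return "Below Level 1c"
```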
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 55
3 Dynamic Navigation in PISA 2018 Reading Assessment: Read, Explore and Interact
a. Locate information: to access and retrieve information within a text, and to search for and select relevant texts.
b. Understand: to comprehend the literal meaning of passages and integrate different portions of the text together.
c. Evaluate and reflect: to assess the quality and credibility of information extracted from the text, reflect on the content, form
opinions, and detect and handle conflicting information from multiple texts.
Competent readers can adapt to the purpose of each reading task, and that purpose plays a latent role in coordinating the different
cognitive processes. As a purpose-driven activity, reading is always performed with goals in mind (Anmarkrud et al., 2013[16]; Rouet, Britt
and Durik, 2017[17]; Vidal-Abarca, Mañá and Gil, 2010[18]). Good navigation can be characterised as navigational behaviour that is
consistent with these goals and that supports the whole cognitive process.
To describe students’ navigation behaviour, the sequences of pages visited by students, the time spent on each page, and
fine-grained action sequences (e.g. mouse movements and clicks) in the process of solving each task are extracted from the log files
recorded by the test administration platform.
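The extraction step described above can be sketched as follows. The event format (a time-ordered list of timestamped page identifiers per student) is a simplifying assumption for illustration; the real PISA log files use their own schema.

```python
# Illustrative sketch: turn one student's raw log events into a page-visit
# sequence with per-page times. The (timestamp, page_id) event format is an
# assumption, not the actual PISA log schema.
def page_visits(events):
    """events: list of (timestamp_seconds, page_id), sorted by time.
    Returns [(page_id, seconds_on_page), ...]; the final visit has no end
    event, so its duration is None."""
    visits = []
    # Each page's duration is the gap until the next page event.
    for (t0, page), (t1, _next_page) in zip(events, events[1:]):
        visits.append((page, t1 - t0))
    if events:
        visits.append((events[-1][1], None))  # duration unknown for final page
    return visits
```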
A first measure of students’ navigation activity is the length of navigation sequences, which corresponds to the number of
transitions between different pages recorded in log files. The number of pages visited beyond the default initial page may be
associated with students’ engagement and skills in information-locating and assessing (e.g. Naumann, 2015[10]; Sahin and Colvin,
2020[11]; Hahnel et al., 2016[12]). As navigation behaviour may not be the same for single- and multiple-source items, this measure
was computed separately for the two environments. Some single-source items in a digital reading environment hardly require any
navigation (i.e. a single short page of text presented on a computer screen). In contrast, longer sequences are often required
to solve more complex tasks (e.g. He et al., 2019; Han et al., 2019; Tang et al., 2020[13]). A multiple-source reading task typically
involves comparing text information, locating information in different sources and understanding the content.
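A minimal version of this first indicator, counting transitions beyond the default initial page separately for single- and multiple-source items, might look like the sketch below; the item identifiers and the metadata set are illustrative assumptions.

```python
# Sketch of the navigation-length indicator: number of page transitions beyond
# the default initial page, split by item environment. Item ids are made up.
def navigation_length(visited_pages_by_item, multi_source_items):
    """visited_pages_by_item: {item_id: [page ids in visit order]}.
    multi_source_items: set of item ids with multiple-source requirements.
    Returns (transitions in single-source items, in multiple-source items)."""
    single = multi = 0
    for item, pages in visited_pages_by_item.items():
        transitions = max(len(pages) - 1, 0)  # first entry is the default page
        if item in multi_source_items:
            multi += transitions
        else:
            single += transitions
    return single, multi
```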
A second measure focuses on students’ navigation quality and strategies: for instance, whether students actively executed
explorative navigation beyond the pages required by the item, or strictly followed the item instructions and visited only the
required pages. The non-linear reading structure in PISA 2018 lets students judge whether, and to which pages, they need to
navigate through the whole reading unit with dynamic texts. This means that students could access pages beyond the page
required to solve the current task. Test-takers may have decided to explore the given task to prepare themselves for later
questions, even though they were aware that the current question did not require them to do so. Alternatively, test-takers could
strictly follow the task instructions and read only the pages required to solve the tasks. These students are expected to actively
navigate to relevant pages in items with multiple-source requirements and to limit their navigation in items with single-source
requirements.
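One possible way to operationalise these navigation-quality groups is sketched below. The exact classification rules used in the study are not fully spelled out in the text, so the conditions in this function are simplifying assumptions.

```python
# Assumed operationalisation of the four navigation behaviour groups:
# - "no navigation": no transitions in any item;
# - "limited navigation": transitions only in single-source items;
# - "strictly focused": multiple-source navigation confined to required pages;
# - "actively explorative": visits beyond the pages required by the items.
def classify(nav_single, nav_multi, visited_extra_pages):
    """nav_single / nav_multi: transition counts in single-/multiple-source
    items; visited_extra_pages: True if the student opened pages not required
    by any item."""
    if nav_single == 0 and nav_multi == 0:
        return "no navigation"
    if nav_multi == 0:
        return "limited navigation"
    if visited_extra_pages:
        return "actively explorative navigation"
    return "strictly focused navigation"
```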
In addition to navigation quantity and quality, a third measure examines the time information extracted from the
navigation process, e.g. the time spent on each page and the transition time between pages. Competent readers spend enough time
on relevant pages to successfully understand the content. Quick switches between pages and frequent back-and-forth clicks
suggest quick skimming or unfocused navigational behaviour. In this study, an a priori assumption has been made that a quick
transition of less than three seconds is considered an ineffective page visit5.
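The three-second rule can be applied directly to the per-page durations to compute a share of effective visits, for example (the function name and data shape are illustrative):

```python
# The a priori rule from the text: a page visit shorter than three seconds is
# counted as ineffective.
THRESHOLD_S = 3.0

def effective_ratio(durations, threshold=THRESHOLD_S):
    """durations: seconds spent on each visited page for one student.
    Returns the share of visits lasting at least `threshold` seconds."""
    if not durations:
        return None  # the student executed no navigation at all
    return sum(d >= threshold for d in durations) / len(durations)
```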
Table 3.3 summarises the navigation behaviour indicators developed for this study. Note that all measures were computed for
single- and multiple-source items separately. The indicators were analysed at the system level and within countries, and
associated with reading performance.
Table 3.3 A summary of navigation indicators developed in the Rapa Nui study
Navigation quantity: number of pages visited
Navigation quality: navigation behaviour and strategy
Navigation time: median time spent on initial page
Note: All measures are computed for single- and multiple-source items separately.
Table 3.4 Overall average of time spent on instruction page in Rapa Nui unit, by students’ reading proficiency
levels
1. Reading proficiency levels are defined based on reading performance plausible values: Level 6: Above 698.32 score points; Level 5: From 625.61
to less than 698.32 score points; Level 4: From 552.89 to less than 625.61 score points; Level 3: From 480.18 to less than 552.89 score points; Level
2: From 407.47 to less than 480.18 score points; Level 1a: From 334.75 to less than 407.47 score points; Level 1b: From 262.04 to less than 334.75
score points; Level 1c: From 189.33 to less than 262.04 score points; Below Level 1c: Less than 189.33 score points. Refer to PISA 2018 Results
(Volume I) - What Students Know and Can Do, Chapter 5.
2. Overall average is computed on students who responded to the Rapa Nui unit.
Source: OECD, PISA 2018 Database.
Figure 3.5 shows students’ navigation quantity in items with single- and multiple-source requirements in the Rapa Nui unit
across 70 countries and economies. On average, students in OECD countries visited 1.08 pages in items with multiple-source
requirements and 0.76 pages in items with single-source requirements after the default first pages. By this simple measure,
East Asian countries and economies (Korea, Singapore, B-S-J-Z (China), Chinese Taipei, Macao (China), Hong Kong (China) in
decreasing order of their mean value on this index) stand out for having the highest average number of page visits. A gap as
substantial as four pages has been observed between countries and economies with the highest and lowest number of pages
visited in this unit (Figure 3.5 and Table B.3.2).
Figure 3.5 Average number of pages visited in items with single- and multiple-source requirements in Rapa Nui unit
[Figure: two panels of horizontal bar charts – “Items with single-source requirement” and “Items with multiple-source requirement” – showing the average number of pages visited (0 to 5) for each country/economy, ranked in descending order from Korea (highest) to Morocco (lowest).]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: This figure shows the average rank of students in the international comparison of students taking the same test unit of Rapa Nui.
Countries and economies are ranked in a descending order of the total average number of pages visited beyond default initial pages (sum of number of pages visited
in single-source and multiple-source items).
Source: OECD, PISA 2018 Database, Table B.3.2.
StatLink: https://doi.org/10.1787/888934239534
Students’ average dynamic navigation behaviour – quantified by the indices of overall navigation activity – is positively
associated with students’ reading performance scores in items with both single-source and multiple-source requirements.
As shown in Figure 3.6, the quantity of navigation activity is more strongly correlated with reading performance in multiple-source
items (r=0.75) than in single-source items (r=0.65), and shows strong linear relationships in both (R²=0.57 in multiple-source
items and R²=0.42 in single-source items). The reason could be that students usually execute longer navigation sequences in
multiple-source items, as expected. In single-source items, students are not required to execute a navigation sequence to
complete the task unless they click on hyperlinks that activate the multiple-source environment in specific items6. The percentage
of no-navigation behaviour in multiple-source items has a strong negative correlation (r=-0.65) with students’ reading
performance scores (Table 3.5).
Students’ navigation quantity and reading performance are also positively associated within each country, though not as strongly
as at the system level. As at the system level, the association between navigation length and reading performance is stronger
in multiple-source than in single-source items. No significant correlation was found between the number of page visits in
single-source items and reading performance in Belgium, Denmark, Latvia, Luxembourg, Switzerland and Turkey. No significant
correlation was found in Spain for either single- or multiple-source items, which could reflect the low-engagement issues that
have been identified in this country8 (Table B.3.2). The correlation between the number of pages visited and reading performance
within each country is reported in Table B.3.2.
Figure 3.6 Overall navigation quantity in single- and multiple- source items
[Figure: reading performance score (y-axis) plotted against the percentile rank of navigation quantity (x-axis), separately for single-source (left) and multiple-source (right) items; overall and OECD averages are marked.]
Figure 3.7 presents the task-oriented navigation according to the four behaviour categories defined above. On average across
OECD countries, approximately one out of five students took the strategy of strictly focused navigation when solving the Rapa Nui
unit. As shown in Table 3.5, the percentage of students in this navigation group shows the highest correlation with the reading
performance score (r=0.73 across all participating countries and r=0.69 among OECD countries). On average, more than 27% of
students in B-S-J-Z (China), Hong Kong (China), Japan, Korea, Russia, Singapore, Chinese Taipei and the United Kingdom were in
the strictly-focused group (Table B.3.9). These students tended to be the most selective in their dynamic navigation, carefully
choosing the pages that were relevant to the tasks and limiting irrelevant page visits.
[Figure 3.7: percentage of students in each navigation behaviour group (0 to 100%), by country/economy.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: This figure shows the average rank of students in the international comparison of students taking the same test unit of Rapa Nui.
Countries and economies are ranked in descending order of the percentage of students in strictly focused navigation group.
Source: OECD, PISA 2018 Database, Table B.3.9.
StatLink: https://doi.org/10.1787/888934239572
Table 3.5 Correlations between percentage of students in navigation behaviour groups and performance score

                  Strictly focused   Actively explorative   Limited navigation   No navigation
OECD average        0.69 (0.12)        0.43 (0.09)            -0.28 (0.09)        -0.32 (0.14)
Overall average     0.73 (0.09)        0.59 (0.09)             0.17 (0.09)        -0.65 (0.15)
More than 15% of students were also found in the actively explorative navigation group in East Asian countries and economies
as well as the Netherlands, New Zealand, the United Arab Emirates, the United Kingdom and the United States (Table B.3.9).
These students tended to actively explore the whole reading unit in both items with single- and multiple-source requirements.
They checked different accessible pages beyond the required pages to complete the task.
Limited navigation behaviours were widely found in most countries and economies. More than 15% of students were in this group
in Hungary, New Zealand, Peru, Poland, Spain and Turkey. These students tended to navigate in single-source items, even though
those items did not require it, but showed no further navigation in multiple-source items. Notably, New Zealand had a high
proportion of students in both the actively explorative and limited navigation groups but a relatively low proportion (44%) in
the no-navigation group, 15 percentage points lower than the overall average (59%) across 70 countries. Students in New Zealand
thus showed a wide variety of navigation patterns (Table B.3.9).
Over 50% of students did not execute any navigation in the Rapa Nui reading unit, suggesting unfamiliarity with dynamic text
environments or possibly fatigue or low motivation in the last unit of the test. More than 75% of students in Baku (Azerbaijan),
Bosnia and Herzegovina, Colombia, the Dominican Republic, Kosovo, Morocco, Panama and Uruguay showed no navigation.
Among these students, 92% responded to at least one item in the Rapa Nui unit, and nearly half responded to all seven items,
though they did not navigate to any pages beyond the initial one (Table B.3.3). This phenomenon was observed more often in
countries with lower proficiency levels.
Note that both the strictly-focused and actively explorative navigation groups exhibited efficient reading processes and
are associated with higher reading performance than the limited-navigation and no-navigation groups. This finding holds
for most countries except Panama, where students in the limited-navigation group showed slightly higher average reading
performance than the actively explorative and strictly-focused groups, by 10 and 4 score points respectively; however, this
difference was not statistically significant. In B-S-J-Z (China) and Singapore, the proportions of students in the actively
explorative and strictly-focused navigation groups were both above 30%. In contrast, Denmark, Japan, Latvia and Russia show
a higher proportion of students in the strictly-focused navigation group (approximately 15 percentage points more) than in the
actively explorative navigation group (Figure 3.7 and Table B.3.9).
As the scatter plot in Figure 3.8 shows, the navigation quantity and proportion of students who employed effective navigation
strategies (i.e., actively explorative and strictly focused) showed positive associations in multiple-source items. A positive
correlation was also shown in the actively explorative group against the number of pages visited in single-source items while no
significant association was found in the strictly-focused group.
Approximately 30% of students in Japan and Russia were categorised in the strictly focused group, while nearly 10% belonged
to the actively explorative or limited-navigation groups in these two countries. This suggests that students in these two
countries did not navigate very much but focused on navigation that was highly related to the tasks. In contrast,
B-S-J-Z (China) and Korea showed the highest quantity of page visits. The greatest share of students in these two countries
belonged to the actively explorative navigation group, suggesting that their navigation sequences were long and went beyond
what the task requirements strictly demanded (Tables B.3.2 and B.3.9).
A high positive correlation was found between the percentage of students in the two efficient navigation categories
(i.e., strictly-focused navigation and actively explorative navigation) by country/economy and the country average performance
score. The percentage of students without any navigation activities showed a strong negative correlation with the country reading
score. The correlation between the percentage of students in the limited-navigation group and country reading score was a weak
positive one across all countries and economies but a negative correlation across OECD countries. The possible reason for this
inconsistency could be a mixture of non-responses in this group (Table 3.5).
Figure 3.8 Correlations between navigation quantity and navigation behaviour groups
[Figure: percentage of students in the strictly focused (blue dots) and actively explorative (yellow dots) navigation groups (y-axis, 0 to 40%) plotted against the mean number of pages visited (x-axis), for single-source (left) and multiple-source (right) items; trend-line R² values range from 0.39 to 0.90. Overall averages are marked.]
Notes: Each blue dot represents the intersection between the mean number of pages visited of a country/economy and percentage of students in strictly
focused navigation group.
Each yellow dot represents the intersection between the mean number of pages visited of a country/economy and percentage of students in actively explorative
navigation group.
Source: OECD, PISA 2018 Database, Tables B.3.2 and B.3.9.
StatLink: https://doi.org/10.1787/888934239591
Box 3.2. Which students are likely to activate the multiple-source environment on their own?
In some reading units like Rapa Nui, students can activate the multiple-source environment earlier than the default
design. If students read through the first page (shown in item 1) by scrolling down to the bottom of the passage, two
hyperlinks become clickable that activate the new tabbed pages. Students who make this self-activation need to satisfy at
least two conditions: (1) being motivated enough to scroll down to the end of the passage; and (2) having enough computer
skills to understand that the hyperlinks are clickable.
Almost a quarter of students across the 70 countries and economies activated the multiple-source environment by clicking the
hyperlinks before it was activated by default. The percentage of students who did so is reported in Figure 3.7 (shown next to
the country names). A high positive correlation of 0.82 was found between the percentage of students who activated the
multiple-source environment and the percentage of students who executed navigation in single-source items.
Students who have higher reading skills, especially those with higher evaluation and reflection subskills, are more likely
to use this function. Students who self-activated both pages (Book Review and Science News) have on average 40 more
score points in reading than students who never used this function (Table B.3.8).
Around 28% of boys activated the multiple-source environment by clicking the hyperlinks, approximately 9 percentage points
more than girls (Table B.3.8).
Navigation that is too quick or too slow does not support efficient and effective reading. How long students spend on the default
initial page may shape their navigation strategies. Too short a transition time between pages suggests insufficient time
spent reading the visited page, signalling quick skimming, aimless exploration, low engagement in navigation, and possibly poor
understanding of the reading goal. A good navigator is expected to spend at least a certain amount of time (e.g. three seconds)
on each page for the visit to be considered effective navigation, that is, to spend the time required to grasp the information they
are looking for rather than simply transitioning back and forth. They are also expected to spend sufficient time on the default
initial pages in single-source items, and not to spend all of their time merely on the initial page in multiple-source items.
The first measurement focuses on how much time students spent on the default initial page (before any navigation is
executed). This variable describes how much time students needed to comprehend the reading task and form the reading goal.
Students’ concentration on the initial page would help them decide whether further navigation was needed and where
to navigate.
Figure 3.9 presents the average time students spent on the default initial page by countries and economies on single-source and
multiple-source items in the Rapa Nui unit. Students on average spent more time reading the initial page in single-source items
than multiple-source items. The median time spent on the initial page in items with single-source requirements ranges from
46 to 89 seconds across countries/economies with an overall average of 64 seconds and OECD average of 65 seconds.
Comparatively, the median duration of time students spent on the initial pages in items with multiple-source requirements is
53 seconds – around 10 seconds shorter than in the single-source items (Table B.3.4).
Interestingly, students in countries with high reading proficiency levels spent a moderate amount of time reading the initial page
in both single- and multiple-source items. In contrast, students in countries with low reading proficiency levels spent either a very
long or very short time on the default initial pages.
A moderately high correlation (r=0.55) was also found between the time spent on the initial page in single- and multiple-source
items at the country level7. However, as in the system-level analysis, no linear association was found between the time spent on
the initial page and students’ reading performance within countries. The median time spent on initial pages within each country
is reported in Table B.3.4.
Figure 3.9 Median time spent on initial reading pages in items with single- and/or multiple- source requirements by
countries/economies
[Figure: median time on the initial page in multiple-source items (y-axis, seconds) plotted against the median time on the initial page in single-source items (x-axis, 40 to 90 seconds), by country/economy (three-letter codes); R² = 0.57. Overall and OECD averages are marked.]
Note: The correspondence of the country codes can be found in the Reader’s guide.
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Table B.3.4.
StatLink: https://doi.org/10.1787/888934239610
The second time measurement is the ratio of the time spent on the initial page to the student’s total response time on each item
throughout the Rapa Nui unit. A good navigator is expected to spend sufficient time reading the default initial page in items with
single-source requirements, but to spend proportionally less time on the initial page in items with multiple-source requirements
and allocate time to other related pages in order to collect information comprehensively.
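This second time measure reduces to a simple proportion per student and item, for example (the page identifiers and data shape are illustrative):

```python
# Sketch of the second time measure: share of an item's total response time
# spent on the default initial page. Page ids are made up for illustration.
def initial_page_time_ratio(time_on_pages, initial_page):
    """time_on_pages: {page_id: seconds} for one student on one item.
    Returns the fraction of total time spent on the initial page."""
    total = sum(time_on_pages.values())
    if total == 0:
        return None  # no recorded reading time for this item
    return time_on_pages.get(initial_page, 0.0) / total
```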
Figure 3.10 presents countries’ mean ratio of time spent on the initial reading page in items with single- and multiple-source
requirements. On average, students spent approximately 97% of their total response time on the initial page in items with
single-source requirements. This percentage falls to 88% in items with multiple-source requirements (Table B.3.5). The ratio of
time spent on the initial reading page in single- and multiple-source items is positively correlated at the country level (r=0.79),
with a strong linear relationship (R²=0.62).
A negative correlation was found between mean reading performance and the average percentage of time spent on the initial
reading page in single- and multiple-source items by country/economy (-0.66 and -0.75 respectively), as shown in Figure 3.11.
This suggests that, on average, the higher students’ reading scores, the lower the share of time they spent on the initial
reading page, thus allocating more time to exploring other pages. Students in countries and economies with higher reading
performance, such as B-S-J-Z (China), Hong Kong (China), Japan, Korea, Macao (China), Singapore and Chinese Taipei, spent on
average less than 80% of the total time on the initial pages in multiple-source items, while students with lower reading
performance spent over 90% of the total time on the initial pages (Table B.3.5).
The system-level correlation is also supported by the country-level results, though the latter are much weaker. A stronger
negative correlation was found between reading performance and the ratio of time spent on the initial pages in multiple-source
items than in single-source items. Higher correlations (below -0.35) in multiple-source items were found in Brunei Darussalam,
the Dominican Republic, Indonesia, Qatar and the United Arab Emirates, suggesting that this indicator is more predictive of
students’ performance scores in these countries. The average ratio of time on the initial pages, and its correlation with average
reading performance in single- and multiple-source items within each country, are reported in Table B.3.5.
Figure 3.10 Average ratio of time spent on initial reading page with single- and/or multiple- source requirements by
countries/economies
[Figure: average ratio of time on the initial page in multiple-source items (y-axis, 0.40 to 1.10) plotted against the corresponding ratio in single-source items, by country/economy; R² = 0.62. Overall and OECD averages are marked.]
Figure 3.11 Association between reading performance and average ratio of time spent on initial reading page with
single- and/or multiple- source requirements by countries/economies
[Figure: reading performance score (y-axis, 200 to 600) plotted against the average ratio of time on the initial page in single-source items (left, x-axis 0.93 to 1.00; R² = 0.43) and multiple-source items (right, x-axis 0.65 to 1.00; R² = 0.56); overall and OECD averages are marked.]
In addition to the time spent on the initial pages, the time interval between navigation steps was computed to describe students’
behaviour during transitions and to examine whether the reading on each visited page was effective. A very quick switch between
pages does not leave enough time for reading. Fast back-and-forth behaviour may also reflect a skimming strategy
(i.e. quickly scanning a page to spot key information), which is actively taught in some countries (e.g. Germany), or a possible
loss of the reading goal in the process.
The third time-related measurement is the ratio of effective navigation (at least three seconds between two adjacent transitions,
as defined in the current study) to the total number of transitions. Quick switches between pages may produce ineffective
navigation, which was observed more often in countries with lower reading performance, suggesting a possible loss of reading
goals, a lack of motivation, difficulties in understanding the instructions, or unfamiliarity with page locations and the
multiple-source environment.
Figure 3.12 presents countries’ mean ratio of effective visits in dynamic navigation for single- and multiple-source items.
The average ratio across OECD countries is 0.93 in single-source items and 0.92 in multiple-source items. This suggests that
most students navigated effectively in reading tasks, but that 7% and 8% of navigation steps in single- and multiple-source items
respectively were too quick to be considered effective transitions. Students switched quickly between pages slightly more often
in multiple-source items than in single-source items. Students in Colombia, the Dominican Republic and Sweden show a relatively
low ratio of effective navigation (mean ratio below 0.86), suggesting that they switched quickly between pages more often than
their peers in other countries. The ratio of effective page transitions in single- and multiple-source items within each country
is reported in Table B.3.6.
Figure 3.12 Average ratio of effective visits in dynamic navigation with single- and/or multiple- source requirements
by countries/economies
[Figure: ratio of effective visits in multiple-source items (y-axis, 0.60 to 1.00) plotted against the ratio of effective visits in single-source items, by country/economy; R² = 0.90. Overall and OECD averages are marked.]
Not only the quantity but also the quality of navigation shows a strong association with students’ reading performance.
Consistent with the analysis above, the four navigation behaviour groups – no navigation, limited navigation, actively explorative
navigation and strictly focused navigation – are mapped against average reading performance scores.
Figure 3.13 shows the distribution of navigation behaviour categories by reading proficiency level. The no-navigation category
accounts for the largest share at the lower proficiency levels (Level 1c and Below Level 1c), but only around 10% at the highest
proficiency level. As reading proficiency increases, the proportions of students in the actively explorative and strictly-focused
navigation groups grow, with the actively explorative group accounting for the largest share (over 40%) at the highest
proficiency level (Level 6).
1. Reading proficiency levels are defined based on reading performance plausible values: Level 6: Above 698.32 score points; Level 5: From 625.61 to less than
698.32 score points; Level 4: From 552.89 to less than 625.61 score points; Level 3: From 480.18 to less than 552.89 score points; Level 2: From 407.47 to
less than 480.18 score points; Level 1a: From 334.75 to less than 407.47 score points; Level 1b: From 262.04 to less than 334.75 score points; Level 1c: From
189.33 to less than 262.04 score points; Below Level 1c: Less than 189.33 score points. Refer to PISA 2018 Results (Volume I) - What Students Know and Can
Do, Chapter 5.
Source: OECD, PISA 2018 Database.
[Figure 3.13: 100% stacked bars showing, at each reading proficiency level, the share of students in the actively explorative, strictly focused, limited and no-navigation groups; the no-navigation share dominates the lowest levels and shrinks to around 10% at Level 6.]
I read books more often on digital devices
(e.g. ereader, tablet, smartphone, computer) PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 67
3 Dynamic Navigation in PISA 2018 Reading Assessment: Read, Explore and Interact
Figure 3.14 shows that overall average reading performance varies significantly with navigation behaviour (OECD and overall averages shown as reference lines). A consistent pattern of decreasing performance with decreasing activeness of navigation was found in most countries and economies (Table B.3.9). A substantial 66 score-point difference separated students who actively navigated among pages from those who executed no navigation activities.
Among the sample in this study, the 11% of students who belonged to the actively explorative navigation group had the highest average performance score in reading in digital environments (Table B.3.9). These students actively navigated through both
single- and multiple-source items. Their navigation in items with single-source requirements was beyond the required number
of pages to complete the reading task. Not only was the required page read but other accessible pages as well. Such navigation
did not seem relevant to the current reading task but may have helped students get a general overview of the whole reading
unit in advance and prepare them for the collection of information that may be required later in the items with multiple-source
requirements (in the dynamic-text unit, the items with single-source requirements were always located before the items with
multiple-source requirements). Their propensity for evaluating further pages beyond the current task was reflected in a higher success rate in the multiple-source items located later in the unit. This may also help explain why the actively explorative group scored higher on average than the strictly focused group, whose navigation patterns strictly followed task instructions.
The pages that may not have been immediately relevant to a specific task could still be regarded as potentially relevant to the whole unit. For instance, one Rapa Nui item specified “Refer to the Professor’s Blog”, making the “Professor’s Blog” the directly relevant page; however, the other two pages in the unit were still highly linked to its content.
Figure 3.14 Average reading performance score by navigation behaviour group
(Vertical axis: reading score, from 450 to 575; groups: actively explorative navigation, strictly focused navigation, limited navigation, no navigation.)
Over one-fifth of students belonged to the strictly focused navigation group, which ranked second in average performance. Students in this group followed the task instructions exactly, navigating only in the multiple-source items and focusing on the required page alone in the single-source items. They did not make the effort to explore other accessible pages even though they had the chance to do so, which suggests that they were probably unfamiliar with the dynamic-text environment, lacked motivation for further exploration, misunderstood the task or did not know how the platform worked.
A large gap in reading score separated the limited-navigation group from the previous two groups. Students in this group made only optional navigation in single-source items and limited navigation in multiple-source items. Even though their navigation was not effective enough to locate the required information, their willingness to explore was still helpful in completing the reading task: compared to the group with no navigation activity at all, which was also the group with the lowest reading performance score, some exploration and navigation seemed better than none.
An analysis of test-takers’ behaviour found that 16% of the students who responded to the Rapa Nui unit clicked on sentences in paragraphs, images or other areas while reading through the passage and/or judging whether a statement was fact or opinion. For instance, students clicked on “pg2p2” (the second sentence in paragraph 2) to match the third statement on the list. Interestingly, these students were more likely to achieve higher scores than their peers who clicked randomly on the interface or made no clicks.
Also notable, students with higher reading proficiency tended to click on the sentences in the paragraphs more frequently, whereas random clicks on other areas showed no significant association with proficiency level (see Figure 3.15).
The action of scanning sentences in the paragraph provides evidence of students’ high engagement in solving the task.
Figure 3.15 Clicking behaviour by reading proficiency level
(Vertical axis: ratio, from 0.1 to 0.6.)
Note: Reading proficiency levels are defined based on reading performance plausible values, as in the note to Figure 3.13 above; refer to PISA 2018 Results (Volume I) - What Students Know and Can Do, Chapter 5.
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239762
Figure 3.17 illustrates the association between students’ reading proficiency level and the ratio of time spent on the default initial page. The two curves – the ratio of time spent on the initial page in single-source and in multiple-source items – both display a negative correlation with reading performance: as reading proficiency increased, students spent a smaller share of the whole navigation process on the initial page. This tendency was more pronounced in the multiple-source items. Students at Level 1a or below spent almost all their time reading the default initial page, while students at Level 6 spent only two-thirds of their time on the initial page and one-third navigating to other pages. Because much less navigation was expected in the single-source items, the ratio of time allocated to the initial page shows only a marginal 5 percentage-point drop from the lowest to the highest proficiency level.
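The ratio discussed here is straightforward to compute from process-data logs. A sketch of the idea (the `(page, seconds)` event format is a hypothetical simplification of the actual PISA log files, not their real schema):

```python
def initial_page_time_ratio(visits, initial_page="blog"):
    """Share of total task time spent on the default initial page.

    visits: ordered (page_id, seconds) pairs for one student on one item.
    This event format is a simplified stand-in for the real log files.
    """
    total = sum(seconds for _, seconds in visits)
    on_initial = sum(seconds for page, seconds in visits if page == initial_page)
    return on_initial / total if total else 0.0

# e.g. 60 of 100 seconds spent on the blog (initial) page -> ratio 0.6
log = [("blog", 40), ("book", 10), ("blog", 20), ("sci", 30)]
print(initial_page_time_ratio(log))
```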
Figure 3.16 Association between reading performance and time spent on the initial page
(Series: overall average (single), OECD average (single), overall average (multiple), OECD average (multiple); horizontal axis: reading proficiency level, from Below Level 1c to Level 6.)
Note: Reading proficiency levels are defined based on reading performance plausible values, as in the note to Figure 3.13 above; refer to PISA 2018 Results (Volume I) - What Students Know and Can Do, Chapter 5.
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239724
Figure 3.17 Association between reading performance and average ratio of time spent on the initial page during the whole navigation process
(Vertical axis: ratio, from 0.70 to 1.05; series: overall average (multiple), OECD average (multiple); horizontal axis: reading proficiency level, from Below Level 1c to Level 6.)
Notes: The average ratio of time indicates the proportion of time that students spent on reading the default initial page throughout the entire process in solving
the tasks.
Reading proficiency levels are defined based on reading performance plausible values, as in the note to Figure 3.13 above; refer to PISA 2018 Results (Volume I) - What Students Know and Can Do, Chapter 5.
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239743
In summary, students’ reading performance is strongly associated with their navigation activities, not only in terms of the quantity and quality of navigation but also the time spent navigating. The number of page visits in both single- and multiple-source items is positively correlated with reading performance, and students who actively navigated in both item types obtained the highest reading scores. Even though these students may have navigated beyond the required pages in the single-source items, active exploration helped them gain a better overview of the whole reading unit and locate and collect information in advance, before the multiple-source items were activated. In addition, students at the highest reading proficiency levels, on average, did not switch quickly between pages, and they tended to allocate a smaller proportion of time to the initial page during the whole reading and navigation process, reserving time instead for navigating to other pages.
A data-driven investigation of the sequence of page transitions and of the time spent on each transition provides insight into students’ navigation strategies. Specifically, the pages that students visited, and the time spent on each page, can be extracted at item level and recorded sequentially. This case study draws on sequence data from one multiple-source item (CR551Q11) in the Rapa Nui unit to illustrate how students’ navigation strategies can be tracked in solving a multiple-source reading item (see Annex C1 for an illustration of how students spent their time and employed navigation activities across the Rapa Nui unit).
A sequence clustering analysis was conducted to identify students’ typical navigation strategies in two stages: first, clustering on the navigation sequence (i.e. the sequence of page transitions) and, second, clustering on the navigation time sequence (i.e. the time spent on each page transition). In essence, the distance between each pair of sequences was computed; the smaller the distance, the more similar the sequences. Sequences with high similarity were categorised into homogenous groups (Tang et al., 2020[22]; Ulitzsch et al., 2021[25]; Dong and Pei, 2007[26]; He et al., forthcoming[27]). Details of the sequence-mining methodology used in this case study are provided in Annex C2.
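The pairwise-distance-then-group logic can be sketched with a plain edit distance and greedy single-link grouping. This is a toy stand-in, not the actual method used in the study (which is detailed in Annex C2); the page codes and the distance threshold are illustrative:

```python
def levenshtein(a, b):
    """Edit distance between two page-visit strings."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            cur[j] = min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + cost)
        prev = cur
    return prev[n]

def cluster(seqs, max_dist=1):
    """Greedy single-link grouping: a sequence joins the first cluster
    containing a member within max_dist, else it starts a new cluster."""
    clusters = []
    for s in seqs:
        for c in clusters:
            if any(levenshtein(s, t) <= max_dist for t in c):
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters

# toy page-visit sequences: B = blog, K = book review, S = science news
paths = ["BK", "BSK", "BS", "BSKBK", "BK", "BS"]
groups = cluster(paths)
print(len(groups))  # the long revisit path "BSKBK" forms its own group
```

The real analysis uses more sophisticated distance measures (e.g. dynamic time warping for the time sequences), but the principle – distance matrix, then grouping of similar sequences – is the same.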
A subsample of 17 126 students who showed at least one navigation activity in the Rapa Nui question CR551Q11 was included in this study. In the first stage, four typical navigation paths (page sequence clusters P1 to P4) employed in this item were derived (Figure 3.18).
Figure 3.18 Four typical navigation path clusters (P1 to P4)
(Panels show the page visited – Start, Blog, Book, Sci, End – at each step, STEP 0 to STEP 6.)
Notes: “Blog” indicates the blog page, “Book” the book review page and “Sci” the science news page.
The x-axis indicates the sequence of steps (from step 0 to step 6) and the y-axis the predefined page-visit sequence (1-start, 2-blog page, 3-book review page, 4-science news page, 5-end).
How to read this chart (taking Cluster Page Sequence 1 as an example): students in Cluster Page Sequence 1 (P1) started the sequence at the first step, transited to the book review page at the second step, navigated to that one page only, and ended the navigation at the third step.
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239781
• P1: Students started from the initial page (the blog page), then navigated to the book review page, and ended the navigation sequence at the third step.
• P2: Students started from the initial page (the blog page), navigated to the scientific news page, then transitioned to the book review page, showed a back-and-forth navigation pattern at step 4, and ended the navigation at step 5.
• P3: Students started from the initial page (the blog page), navigated to the scientific news page, and ended the navigation sequence at the third step.
• P4: Students started from the initial page (the blog page), navigated to the scientific news page, transitioned to the book review page and then back to the blog page. After revisiting the book review page at step 5, they ended the navigation at step 6.
The students in P2 who adopted a multiple-page navigation strategy with a focus on the scientific news and book review
pages got the highest average reading score (601 points). In contrast, students in P1 who only navigated to the book
review page obtained the lowest reading score (575 points) (Table B.3.7).
Analogously, in the second stage, four representative time allocation sequence patterns (time sequence clusters T1 to T4) were derived (Figure 3.19).
Figure 3.19 Four representative time allocation sequence clusters (T1 to T4)
(Panels show the time in seconds spent at each transition, PAGE 0 to PAGE 2.)
Note: The vertical axis indicates the time in seconds that students spent in navigation, while the horizontal axis indicates the transition steps. The first transition time is defined as the interval between the item start and the first page click, that is, the time spent on the initial page.
How to read this chart: students in Cluster Time Sequence 1 spent 25 seconds on the default initial page, made a very quick page switch of only 2 seconds at the second transition, then remained on the third page for a long time (75 seconds on average) until the end.
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239800
• T1: Students spent a short time (25 seconds on average) on the initial page and made a quick page transition to the
second page, then remained on the third page for a long time (80 seconds on average) until the end.
• T2: Students spent a long time (80 seconds on average) on the initial page but a short time on the second page
(20 seconds on average) until the end.
• T3: Students started with a short time on the initial page (20 seconds on average), then transited to the second page
and remained there for a long time (160 seconds on average) until the end.
• T4: Students started with a short time on the initial page (10 seconds on average), then transited to the second page, where they stayed a short time (50 seconds on average) until the end.
The students in T3 who stayed for a long time on the transited page got the highest average reading score (599 points)
while students in T4 who stayed a short time on the transited page after a quick switch from the initial page got the
lowest reading score (565 points) (Table B.3.7).
As shown in Figure 3.20, the joint behaviour patterns in page sequence and time allocation were explored further. Cluster PT14 – the combination of P1 and T4, i.e. a short time on both the initial and the transited page as well as a short navigation path – resulted in the lowest reading performance score. In contrast, Cluster PT43 – a long time spent on the transited page and multiple navigations across different pages – showed the highest reading performance score. A substantial reading performance gap (55 score points) was found between these two groups. This finding stresses the importance of examining students’ strategies in the reading and problem-solving process. Such an investigation also helps teachers understand what strategies students use in solving literacy tasks and better support students’ learning in reading, navigation and information-gathering.
Figure 3.20 Distribution of reading performance scores by clusters of page and time sequence
(Vertical axis: reading score, from 550 to 630; horizontal axis: clusters (page*time), PT11 to PT44. Scores across clusters range from 556 to 612.)
Note: The PT clusters indicate the joint combination of page clusters (“P”) and time clusters (“T”).
Source: OECD, PISA 2018 Database.
StatLink: https://doi.org/10.1787/888934239819
1. Linear reading is the traditional mode of reading as a sequential reading process, for instance, reading left to right, from start to finish.
In contrast, non-linear reading describes a reader jumping from section to section and often not needing to finish any particular reading
selection.
2. The Rapa Nui unit (CR551) is located in two highly difficult testlets, R21H and R25H, each consisting of two reading units. Specifically, R21H consists of CR543 and CR551 while R25H consists of CR544 and CR551. The Rapa Nui unit has a fixed position as the second unit in both the R21H and R25H testlets, and the reading unit right before it is also designed as a multiple-source environment. Therefore, prior knowledge of the multiple-source format can be assumed to be equal, and no position effect needs to be considered in this study.
3. Because the Rapa Nui unit is the last unit in R21H and R25H, a high non-response rate was expected, with students possibly running out of time at the end of the test. The non-response rate could be higher in Design A than in Design B: in Design A (administered to 75% of students), the Rapa Nui unit is at the very end of the reading test, with both testlets, R21H and R25H, at Stage 2, whereas in Design B (administered to 25% of students) the Rapa Nui unit is at the end of Stage 1, where fatigue and running out of time are less likely.
4. Process data are not available for the nine countries that administered paper-based assessments in PISA 2018; they are therefore not included in the current study.
5. In the Survey of Adult Skills, a product of the OECD Programme for the International Assessment of Adult Competencies (PIAAC), a 5-second threshold is used to identify effectively missing responses (Weeks, von Davier and Yamamoto, 2016[29]; OECD, 2016[30]): if a person “solved” an item in less than 5 seconds, the item typically could not have been solved in an effective way. The three-second threshold set in this study was considered a plausible minimum for effective reading; any transition under 3 seconds is labelled a fast transition.
6. Students who clicked on the hyperlinks to self-activate the multiple-source environment intended to visit the newly emerged tabs. This could
result in a longer navigation sequence for these students in single-source items.
7. Students who spent more than 1800 seconds (30 minutes) on the initial page in either single or multiple environments are labelled as outliers.
A total of 18 students were detected as outliers, accounting for 0.002% of the whole sample used in Chapter 3. Students were generally
expected to complete each cluster within 30 minutes in PISA, though the multistage adaptive testing (MSAT) in reading is restricted to
60 minutes (including 3-minute reading fluency). See Table B.3.4.
8. Refer to PISA 2018 results (Volume I) - What Students Know and Can Do, Annex A9 (OECD, 2019[28]) regarding a note about Spain in PISA 2018
https://www.oecd.org/pisa/PISA2018-AnnexA9-Spain.pdf.
References
Anmarkrud, Ø. et al. (2013), “Task-oriented reading of multiple documents: online comprehension processes and offline products”, [16]
Instructional Science, Vol. 41/5, pp. 873-894, http://dx.doi.org/10.1007/s11251-013-9263-8.
Dong, G. and J. Pei (2007), Sequence Data Mining, Springer US, Boston, MA, http://dx.doi.org/10.1007/978-0-387-69937-0. [26]
Goldhammer, F. et al. (2014), “The time on task effect in reading and problem solving is moderated by task difficulty and skill: Insights [12]
from a computer-based large-scale assessment.”, Journal of Educational Psychology, Vol. 106/3, pp. 608-626,
http://dx.doi.org/10.1037/a0034716.
Hahnel, C. et al. (2016), “Effects of linear reading, basic computer skills, evaluating online information, and navigation on reading digital [21]
text”, Computers in Human Behavior, Vol. 55, pp. 486-500, http://dx.doi.org/10.1016/j.chb.2015.09.042.
Han, Z., Q. He and M. von Davier (2019), “Predictive Feature Generation and Selection Using Process Data From PISA Interactive [23]
Problem-Solving Items: An Application of Random Forests”, Frontiers in Psychology, Vol. 10, http://dx.doi.org/10.3389/fpsyg.2019.02461.
He, Q., F. Borgonovi and M. Paccagnella (2021), “Leveraging process data to assess adults’ problem-solving skills: Using sequence [9]
mining to identify behavioral patterns across digital tasks”, Computers & Education, Vol. 166, p. 104170,
http://dx.doi.org/10.1016/j.compedu.2021.104170.
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 75
3 Dynamic Navigation in PISA 2018 Reading Assessment: Read, Explore and Interact
He, Q., F. Borgonovi and M. Paccagnella (2019), “Using process data to understand adults’ problem-solving behaviour in the [8]
Programme for the International Assessment of Adult Competencies (PIAAC): Identifying generalised patterns across multiple tasks with
sequence mining”, OECD Education Working Papers, No. 205, OECD Publishing, Paris, https://dx.doi.org/10.1787/650918f2-en.
He, Q. et al. (forthcoming), Quantifying and Clustering Visit-Revisit Patterns Using Dynamic Time Warping Model. [27]
Lawless, K. and J. Kulikowich (1996), “Understanding Hypertext Navigation through Cluster Analysis”, Journal of Educational Computing [3]
Research, Vol. 14/4, pp. 385-399, http://dx.doi.org/10.2190/dvap-de23-3xmv-9mxh.
Lawless, K. and P. Schrader (2008), “Where do we go now? Understanding research on navigation in complex digital environments”, in [7]
Coiro, J., M. Knobel, D. Leu and C. Lankshear (eds.), Handbook of Research on New Literacies, Lawrence Erlbaum, Mahwah, NJ, pp. 267-296.
Liao, D., Q. He and H. Jiao (2020), Using Log Files to Identify Sequential Patterns in PIAAC Problem Solving Environments by U.S. Adults’ [24]
Employment-Related Variables, National Center for Education Statistics (NCES) commissioned research report,
https://static1.squarespace.com/static/51bb74b8e4b0139570ddf020/t/5e41972c89f5bb0179fcb999/1581356847181/2020_Liao_He_Jiao_
Log-Files-Sequential-Patterns.pdf.
Naumann, J. (2015), “A model of online reading engagement: Linking engagement, navigation, and performance in digital reading”, [19]
Computers in Human Behavior, Vol. 53, pp. 263-277, http://dx.doi.org/10.1016/j.chb.2015.06.051.
Naumann, J. et al. (2007), “Signaling in expository hypertexts compensates for deficits in reading skill.”, Journal of Educational Psychology, [6]
Vol. 99/4, pp. 791-807, http://dx.doi.org/10.1037/0022-0663.99.4.791.
OECD (2019), PISA 2018 Assessment and Analytical Framework, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b25efab8-en. [1]
OECD (2019), “PISA 2018 Reading Framework”, in PISA 2018 Assessment and Analytical Framework, OECD Publishing, Paris, [15]
https://dx.doi.org/10.1787/5c07e4f1-en.
OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, [28]
https://dx.doi.org/10.1787/5f07c754-en.
OECD (2016), Technical Report of the Survey of Adult Skills, Second Edition, [30]
https://www.oecd.org/skills/piaac/PIAAC_Technical_Report_2nd_Edition_Full_Report.pdf.
OECD (2011), PISA 2009 Results: Students On Line, OECD, http://dx.doi.org/10.1787/9789264112995-en. [2]
Rouet, J. and M. Britt (2011), “Relevance processes in multiple document comprehension”, in M. T. McCrudden, J. P. Magliano and [13]
G. Schraw (eds.), Text Relevance and Learning from Text, IAP Information Age Publishing, pp. 19-52.
Rouet, J., M. Britt and A. Durik (2017), “RESOLV: Readers’ Representation of Reading Contexts and Tasks”, Educational Psychologist, Vol. [17]
52/3, pp. 200-215, http://dx.doi.org/10.1080/00461520.2017.1329015.
Sahin, F. and K. Colvin (2020), “Enhancing response time thresholds with response behaviors for detecting disengaged examinees”, [20]
Large-scale Assessments in Education, Vol. 8/1, http://dx.doi.org/10.1186/s40536-020-00082-1.
Salmerón, L. and V. García (2011), “Reading skills and children’s navigation strategies in hypertext”, Computers in Human Behavior, Vol. [4]
27/3, pp. 1143-1151, http://dx.doi.org/10.1016/j.chb.2010.12.008.
Salmerón, L., W. Kintsch and E. Kintsch (2010), “Self-Regulation and Link Selection Strategies in Hypertext”, Discourse Processes, Vol. [5]
47/3, pp. 175-211, http://dx.doi.org/10.1080/01638530902728280.
Tang, X. et al. (2020), “Latent Feature Extraction for Process Data via Multidimensional Scaling”, Psychometrika, Vol. 85/2, pp. 378-397, [22]
http://dx.doi.org/10.1007/s11336-020-09708-3.
Ulitzsch, E. et al. (2021), “Combining Clickstream Analyses and Graph-Modeled Data Clustering for Identifying Common Response [25]
Processes”, Psychometrika, http://dx.doi.org/10.1007/s11336-020-09743-0.
Vidal-Abarca, E., A. Mañá and L. Gil (2010), “Individual differences for self-regulating task-oriented reading activities.”, Journal of [18]
Educational Psychology, Vol. 102/4, pp. 817-826, http://dx.doi.org/10.1037/a0020062.
von Davier, M. et al. (2019), “Developments in Psychometric Population Models for Technology-Based Large-Scale Assessments: [10]
An Overview of Challenges and Opportunities”, Journal of Educational and Behavioral Statistics, Vol. 44/6, pp. 671-705, http://dx.doi.
org/10.3102/1076998619881789.
Vörös, Z., D. Kehl and J. Rouet (2020), “Task Characteristics as Source of Difficulty and Moderators of the Effect of Time-on-Task in Digital [11]
Problem-Solving”, Journal of Educational Computing Research, Vol. 58/8, pp. 1494-1514, http://dx.doi.org/10.1177/0735633120945930.
Weeks, J., M. von Davier and K. Yamamoto (2016), “Using response time data to inform the coding of omitted responses”, Psychological [29]
Test and Assessment Modeling, Vol. 58/4, pp. 671-701,
https://www.psychologie-aktuell.com/fileadmin/download/ptam/4-2016_20161219/06_Weeks.pdf.
Yamamoto, K., H. Shin and L. Khorramdel (2019), “Introduction of multistage adaptive testing design in PISA 2018”, OECD Education [14]
Working Papers, No. 209, OECD Publishing, Paris, https://dx.doi.org/10.1787/b9435d4b-en.
4 The interplay between digital devices, enjoyment, and reading performance
– Approximately one-third of students reported that they rarely or never read books, another third reported reading books
more often in paper format than on digital devices, about 15% that they read more often on digital devices, and about
13% that they read equally often in paper format and on digital devices.
– Compared to students who rarely or never read books, digital-book readers read for enjoyment about 3 hours more
a week, print-book readers about 4, and those who balance both formats about 5 hours or more a week after accounting
for students’ and schools’ socio-economic background and gender.
– Students who reported reading more often or equally often in paper format and on digital devices reported more than
1 standard deviation more enjoyment than those who reported that they rarely or never read books.
– Compared to students who rarely or never read books, students in OECD countries who reported reading books more
often on paper scored 49 points more in reading while students who reported reading books more often on digital
devices scored only 15 points more after accounting for students’ and schools’ socio-economic profile and gender.
DO 15-YEAR-OLDS SPEND MORE TIME READING FOR ENJOYMENT THAN TWO DECADES AGO?
In PISA, students who read for enjoyment have typically been stronger performers in reading (OECD, 2010[1]). Reading
engagement and performance are mutually dependent (Nurmi et al., 2003[2]). Students who regularly read for enjoyment have
more opportunities to improve their reading skills through practice. They also perceive themselves as more competent and
motivated readers (Smith et al., 2012[3]; Sullivan and Brown, 2015[4]). At the same time, students with more difficulties reading
and who perceive themselves as less competent readers are less motivated to read for enjoyment.
As in previous cycles of PISA, the contextual questionnaire distributed in PISA 2018 allowed the measurement of reading
for enjoyment. It asked students whether they agree (“strongly disagree”, “disagree”, “agree”, “strongly agree”) with several
statements about their attitudes towards reading, including “I read only if I have to”; “Reading is one of my favourite hobbies”;
and “I read only to get information that I need.” Students’ responses to these questions were summarised in an index of
enjoyment of reading. The index is standardised to have a mean of 0 and a standard deviation of 1 across OECD countries.
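Standardising an index to a mean of 0 and a standard deviation of 1 is a plain z-score transformation. A sketch over a pooled sample (PISA standardises across OECD countries and uses survey weights, which this simplification omits):

```python
import statistics

def standardise(values):
    """Rescale values to mean 0 and (population) standard deviation 1."""
    mu = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return [(v - mu) / sd for v in values]

# after rescaling, the index averages 0 with unit spread
z = standardise([1.0, 2.0, 3.0, 4.0])
```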
In PISA 2018, approximately half of the students (49%) in OECD countries agreed or strongly agreed with the statement
“I read only if I have to” and one in four (28%) students agreed or strongly agreed that reading is a waste of time (Table B.4.1).
The index of reading enjoyment might be particularly sensitive to cross-cultural differences in response style. Therefore, comparisons within countries are more advisable than comparisons between countries (see Box 4.1).
The indicators analysed in this report are based on students’, teachers’, and principals’ reports, which are susceptible
to several possible measurement errors: memory decay; social desirability (the tendency to respond in a manner
that is more acceptable in one’s own social and cultural context); reference-group bias (what the comparison group
is); and response-style bias (e.g. straight-lining, over-reporting, modesty, heaping, acquiescence). These biases
can operate differently in different cultural contexts, thus limiting the cross-country comparability of responses
(Benítez, Van de Vijver and Padilla, 2019[5]; Van de Vijver et al., 2019[6]; van Hemert, Poortinga and van de Vijver, 2007[7];
Lee, 2020[8]). Above all, readers should be particularly cautious when interpreting indicators with a strong subjective
component such as enjoyment of reading, perceived competence in reading (chapter 5), and teachers’ stimulation of
reading engagement (chapter 6), which are more likely to be influenced by cultural norms and the response style of the
respondent. Therefore, the results should be interpreted with caution when comparing countries/economies’ means or
interpreting system-level relationships that do not hold within-country.
In order to minimise the risk of misleading interpretations, a number of reliability and invariance analyses of the PISA
indices used in this report have been carried out (see Annex A1 for more details), providing readers with an indication of
how reliable cross-country comparisons are.
In all PISA-participating countries and economies in 2018, girls reported much higher levels of reading enjoyment than boys
(Table B.4.2). On average across OECD countries, the difference in reading for enjoyment between boys and girls was larger than
half a standard deviation. The largest gender gap in reading for enjoyment was observed in Germany, Hungary, and Italy while
the smallest gender gap was observed in Indonesia and Korea (Table B.4.2).
Nevertheless, gender is not the only factor associated with differences in the reading enjoyment index within countries. On average
across OECD countries, the difference in reading for enjoyment between students from advantaged and disadvantaged
socio-economic backgrounds1 was about half a standard deviation in favour of advantaged students. Students enrolled in
general programmes reported one-third of a standard deviation more enjoyment of reading than students enrolled in vocational
programmes on average across OECD countries (Table B.4.2). The vast majority of variance in this index lies within schools, and
only 6% constitutes between-schools variance on average across OECD countries (Table B.4.3).
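The within/between-school split reported above can be illustrated with a simple ANOVA-style decomposition; the numbers below are made up, and PISA's published shares come from multilevel models with survey weights.

```python
import numpy as np

def between_school_share(values_by_school):
    """Share of total variance in an index that lies between schools:
    variance of school means around the grand mean divided by the total
    variance of all student values."""
    all_values = np.concatenate(values_by_school)
    grand_mean = all_values.mean()
    between = sum(len(v) * (np.mean(v) - grand_mean) ** 2
                  for v in values_by_school) / all_values.size
    return between / all_values.var(ddof=0)

# Illustrative enjoyment-index values for students in three schools:
# school means barely differ, so most variance is within schools
schools = [np.array([-0.5, 0.1, 0.4]),
           np.array([-0.2, 0.0, 0.3]),
           np.array([-0.4, 0.2, 0.1])]
share = between_school_share(schools)   # a small fraction of the total
```

A share near zero, as in this toy example, means that students with high and low enjoyment of reading are found side by side within the same schools rather than sorted across schools.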
Figure 4.1 shows the change in reading performance associated with a one-unit increase in the reading enjoyment index after
accounting for students’ and schools’ socio-economic profile and gender. Reading for enjoyment was positively associated
with reading performance in all PISA-participating countries and economies after accounting for students’ and schools’
socio-economic profile2 and gender. However, the strength of the association varies across countries and economies. In Ireland,
Macao (China) and Chinese Taipei this change in reading performance is over 30 score points, while in Kazakhstan, Lithuania and
Morocco it is smaller than 11 score points (Table B.4.1).
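A regression of the kind behind Figure 4.1 can be sketched with ordinary least squares on simulated data; the coefficients and noise level below are invented for illustration, and PISA's published estimates additionally use plausible values and replicate weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated student-level data (illustrative, not PISA values)
enjoyment = rng.normal(0.0, 1.0, n)   # index of enjoyment of reading
escs = rng.normal(0.0, 1.0, n)        # socio-economic profile (ESCS)
girl = rng.integers(0, 2, n)          # 1 = girl, 0 = boy
score = 480 + 25 * enjoyment + 30 * escs + 20 * girl + rng.normal(0, 60, n)

# OLS of reading score on the enjoyment index, controlling for ESCS and gender
X = np.column_stack([np.ones(n), enjoyment, escs, girl])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
# beta[1] estimates the score-point change associated with a one-unit
# increase in the enjoyment index, net of the controls (about 25 here)
```

The slope on the enjoyment index plays the role of the "score-point difference" plotted in Figure 4.1, one such regression being run per country or economy.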
The index of enjoyment of reading decreased between 2009 and 2018 on average across OECD countries (Figure 4.2).
For example, approximately 8 percentage points more students reported in PISA 2018 than in PISA 2009 that they only read
if they have to. This trend, however, varies across countries and economies. In approximately one-third of countries
and economies with available data on this index, students enjoyed reading less, while in another one-third reading enjoyment increased.
The most pronounced decline was observed in Germany, Finland and Norway, where the index of enjoyment of reading decreased
by around 0.30 or more of a standard deviation over the last decade. However, in Bulgaria, Colombia, Costa Rica, Mexico, Russia,
the Slovak Republic and Uruguay, the reading enjoyment index increased by at least 0.2 of a standard deviation (Table B.4.4a).
The elements that compose the index of reading enjoyment have changed since PISA 2000. However, some items remain identical.
Approximately 13 percentage points more students in OECD countries reported in PISA 2018 than in PISA 2000 that they only
read if they have to, while the corresponding increase between PISA 2009 and PISA 2018 was about 8 percentage points. The decline,
therefore, has been steady, if not more pronounced, in recent years. In the Czech Republic, Denmark, Finland, Iceland, Indonesia,
Mexico, Peru and Sweden, this difference since PISA 2000 widens to at least 20 percentage points (Tables B.4.4a and B.4.4b).
To summarise, students who read for enjoyment achieved higher scores in the PISA reading test than those who do not.
Over the last decade, however, students have reported enjoying reading less. Could it be that they also read fewer hours?
As in previous cycles of PISA, the contextual questionnaire distributed in PISA 2018 asked students how much time they usually
spend reading for enjoyment. This includes books, magazines, newspapers, websites, blogs, and emails. It asked students to
select one of the following responses: “I do not read for enjoyment”; “30 minutes or less a day”; “More than 30 minutes to less than
60 minutes a day”; “1 to 2 hours a day”; and “More than 2 hours a day”. Students’ responses to this question were summarised in
an index of hours spent reading for enjoyment per week.
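One way to turn these daily response categories into a weekly-hours index is to assign each category a representative daily value and multiply by seven. The midpoints below (and the 3-hour cap on the open-ended top category) are an illustrative assumption, not the scaling PISA actually applied.

```python
# Assumed hours-per-day midpoints for each response category
DAILY_HOURS = {
    "I do not read for enjoyment": 0.0,
    "30 minutes or less a day": 0.25,
    "More than 30 minutes to less than 60 minutes a day": 0.75,
    "1 to 2 hours a day": 1.5,
    "More than 2 hours a day": 3.0,   # open-ended category, capped here
}

def weekly_hours(response):
    """Convert a questionnaire response into estimated hours per week."""
    return DAILY_HOURS[response] * 7

weekly_hours("1 to 2 hours a day")   # 10.5 hours per week
```

Any such mapping is coarse at the top end; the choice of cap mainly affects comparisons involving the heaviest readers.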
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 79
4 The interplay between digital devices, enjoyment, and reading performance
Figure 4.1 Change in reading performance associated with a one-unit increase in the index of enjoyment of reading, after accounting for
students’ and schools’ socio-economic profile1 and gender
[Chart: score-point difference for each participating country and economy; 0 to 35 score points]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: All score-point differences are statistically significant.
Countries and economies are ranked in descending order of the change in reading performance.
Source: OECD, PISA 2018 Database, Table B.4.1.
StatLink: https://doi.org/10.1787/888934239838
Figure 4.2 Change between 2009 and 2018 in the enjoyment of reading
2009 2018
[Chart: mean index of enjoyment of reading in 2009 and 2018 for each participating country and economy, including the OECD average; mean index values range from -0.80 to 0.80]
Notes: Statistically significant differences between PISA 2018 and PISA 2009 are marked in a darker tone.
Costa Rica, Georgia, Malta and Moldova conducted the PISA 2009 assessment in 2010 as part of PISA 2009+.
Countries and economies are ranked in ascending order of the change between 2009 and 2018 (PISA 2018 - PISA 2009) in the index of enjoyment of reading.
Source: OECD, PISA 2018 Database, Table B.4.4a.
StatLink: https://doi.org/10.1787/888934239857
The number of hours 15-year-old students spent reading for enjoyment decreased between 2000 and 2009 and, although at a
slower rate, increased between 2009 and 2018 on average across OECD countries (Figure 4.3). This is in contrast to the decrease
in enjoyment of reading between 2009 and 2018 (Figure 4.2). Although reading for enjoyment and time spent reading are
moderately correlated (r = 0.55, OECD average), there are some interesting differences at the country level. Of the 13 countries
and economies that reported reading significantly fewer hours in 2018 than in 2009, 11 of them also reported decreased
enjoyment of reading and two reported no significant differences. However, of the 32 countries/economies that reported reading
significantly more hours in 2018 than in 2009, almost half (13 countries/economies) reported no differences or significantly less
enjoyment of reading (Tables B.4.4a and B.4.8).
Students in Austria, Hungary, Portugal, Serbia, and Thailand – as well as the OECD average level – showed significantly less
enjoyment but more hours of reading in 2018 compared to 2009 (Tables B.4.4a and B.4.8). In other words, these results
suggest that fewer hours of reading are associated with less enjoyment (or no change), while long hours of reading do not always
translate into more enjoyment. At the same time, PISA 2018 Results (Volume I) - What Students Know and Can Do (Box I.1.1,
(OECD, 2019[9])) showed that students reported reading less for leisure and reading fewer books of fiction, magazines or
newspapers because they want to (as opposed to because they have to). Instead, they read more to fulfil practical needs, and
they read more online in the form of chats, online news or websites containing practical information. This is likely associated
with more time spent reading and the stagnation of enjoyment. For example, students in Hungary, Serbia, and Thailand – which
showed significantly less enjoyment but more hours of reading in 2018 compared to 2009 – reported reading fewer fiction books
in PISA 2018 compared to PISA 2009. Also, the percentage of students in Austria, Hungary, Portugal, and Serbia who read fiction
books several times a month or more because they want to was below the OECD average in PISA 2018 (Tables I.B1.57, I.B1.58 and
I.B1.59 from Volume I, (OECD, 2019[9])).
Figure 4.3 Change between 2000 and 2018 in time spent reading for enjoyment
[Chart: hours spent reading for enjoyment per week in 2000, 2009 and 2018 for each participating country and economy, including the OECD average-31; vertical axis from 0 to 9 hours]
The most pronounced decline in time spent reading for enjoyment was observed in Chile and Peru between 2000 and 2009 with
a drop of at least 2 hours a week. On the other hand, the most pronounced increase was observed in Hong Kong (China) between
2000 and 2018 with an almost 3-hour increase, and Russia between 2009 and 2018 with an increase of 2 ½ hours per week
(Table B.4.8). In PISA 2018, on average across OECD countries, 83% of students reported reading 60 minutes or less per day, while
11% of students reported reading between 1 and 2 hours and only 6% more than 2 hours a day (Table B.4.5).
In almost all PISA-participating countries and economies in 2018, girls reported spending more hours a week reading for
enjoyment than boys, about two hours more on average. In Brunei Darussalam and Georgia this gender difference is as large as
around 5 hours, while in Baku (Azerbaijan) and Korea it is not statistically significant. Students from more advantaged
socio-economic backgrounds also read, on average, more hours than students from less advantaged socio-economic backgrounds.
This is particularly the case in Belarus, Bulgaria, the Philippines, Thailand and Ukraine, where advantaged students read,
on average, at least two and a half hours more a week than disadvantaged students do.
However, in B-S-J-Z (China), Hong Kong (China) and Macao (China) this difference is not statistically significant. Students enrolled
in general programmes reported spending more hours a week reading for enjoyment than students enrolled in vocational
programmes on average across OECD countries (Table B.4.6). The vast majority of variance in this index lies within schools, and
only 2% is between-schools variance on average across OECD countries (Table B.4.7).
In conclusion, the index of reading for enjoyment has significantly declined on average across OECD countries over the last decade.
Still, the amount of time spent reading for pleasure has significantly increased. Enjoyment of reading and time spent reading
for enjoyment are moderately correlated (r = 0.55, OECD average). However, the association between enjoyment of reading and
reading performance is higher (r = 0.32, OECD average) than the association between time spent reading for enjoyment and
reading performance (r = 0.19, OECD average). This finding suggests that how much students enjoy reading matters more than
how many hours students spend reading for enjoyment. It is important to consider that since the reading enjoyment index was
introduced in PISA 2000, the nature of reading has dramatically changed: students read in an increasingly digital environment.
Therefore, it is important to keep in mind that the most recent measures of reading for leisure might be better interpreted as
capturing the enjoyment of written online communication rather than of reading paper books.
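The comparison of correlations drawn above can be sketched as follows; the simulated effect sizes are loosely inspired by the OECD-average values in the text but are not the actual PISA data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two variables."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))

# Simulated data in which enjoyment relates to performance more strongly
# than time spent reading does (illustrative magnitudes only)
rng = np.random.default_rng(1)
n = 5000
enjoyment = rng.normal(0.0, 1.0, n)
time_spent = 0.55 * enjoyment + rng.normal(0.0, np.sqrt(1 - 0.55 ** 2), n)
performance = 0.32 * enjoyment + 0.05 * time_spent + rng.normal(0.0, 0.9, n)

r_enjoyment = pearson_r(enjoyment, performance)    # the larger correlation
r_time = pearson_r(time_spent, performance)        # the smaller correlation
```

In this construction, as in the report's figures, the two predictors are themselves moderately correlated, which is why the two correlations with performance cannot be read as independent effects.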
It is still too early to predict how these trends will evolve in the future. As shown in the next section, many factors shaping the
direction of these trends may be related to a change in what students are reading and how they are reading.
DO 15-YEAR-OLDS SPEND MORE TIME READING FOR ENJOYMENT ON PAPER OR DIGITAL DEVICES?
The rapid digitalisation of communication over the last decade is changing what 15-year-olds do and read. There are two competing
theories about the relationship between new and legacy media use: a) the displacement theory assumes that teenagers spend
a relatively fixed amount of time on media consumption and, therefore, that time spent on one medium decreases the time spent
on other media; b) the complementary theory assumes that media use has an additive effect and, therefore, that time spent on one
medium changes other media use minimally or even increases the total time spent on all media (Twenge, Martin and Spitzberg, 2019[10]).
Displacement of media use occurs when individuals seek to fulfil their uses and gratifications through digital media rather than
legacy media. For example, students read more online in the form of chats, online news, and searching for practical information,
and fewer fiction books, magazines, and newspapers (OECD, 2019[9]). In Ireland, for example, the percentage of students who
read newspapers several times a month or more because they wanted to decreased by 43 percentage points between 2009
and 2018 while reading the news online increased by 44 percentage points. The way students interact with online formats is
also changing. In Japan, for example, the percentage of students who read emails several times a week or more decreased
by 62 percentage points between 2009 and 2018 while chatting online increased by 77 percentage points (Figure 4.5).
The percentage of students in Indonesia, Kazakhstan and Thailand who chat online, read online news and search for information
online increased by around 40 percentage points over the last decade. Students in some OECD countries may have
already experienced this transition, yet the OECD average still rose by 12 to 20 percentage points on the indicators previously
mentioned (Table B.4.10). Does this mean that the amount of time students spend reading for enjoyment in print is being
overtaken by digital devices? Or, rather, that the time spent reading on digital devices is complementary to time spent reading
on paper?
PISA 2018 did not ask students to describe a main mode of reading – paper or digital – for various types of reading for enjoyment.
Still, it did ask students which of the following statements best describes how they read books: a) “I rarely or never
read books”; b) “I read books more often in paper format”; c) “I read books more often on digital devices” (e.g. e-reader, tablet,
smartphone, computer); and d) “I read books equally often in paper format and on digital devices.” Students’ responses not only
throw light on their enjoyment of reading; the reading format may also play a role in how much they enjoy reading.
But it is not the only factor: previous reading experience, and home and school learning environments, also affect reading
enjoyment (Sullivan and Brown, 2015[4]).
On average across OECD countries, print-book readers reported chatting online as much as non-print-book readers. A smaller
share of print-book readers takes part in online discussions (about 6 percentage points less) than non-print-book readers,
while they are more likely to search for information online on a particular topic (about 6 percentage points more) or for
practical information online (about 4 percentage points more). In other words, students who reported reading more often in print
engage in digital activities such as chatting online as frequently as non-print-book readers, and they engage more frequently in
activities related to reading for information. Results for each participating country and economy in PISA 2018 can be consulted in
Table B.4.9.
Approximately one-third of students reported that they rarely or never read books, another third reported reading books more
often in paper format than on digital devices, about 15% that they read more often on digital devices, and about 13% that they
read equally often in paper format and on digital devices. More than 40% of students in Hong Kong (China), Indonesia, Malaysia,
Chinese Taipei and Thailand, reported reading books more often on digital devices. In contrast, more than 45% of students in
Japan, Korea, Slovenia and Turkey reported reading books on paper more often than on digital devices (Table B.4.11).
Students who read books on digital devices more often are more likely to be students with an immigrant background
(20% of immigrant students compared to 14% of non-immigrant students), from a socio-economically disadvantaged background
(16% of disadvantaged students compared to 13% of advantaged students), and boys (15% of boys compared to 14% of girls)
(Table B.4.12). The variance in the percentage of students who read books on digital devices more often lies almost exclusively
within schools and marginally between-schools (less than 2%) (Table B.4.13).
Among OECD countries, however, in Belgium, Chile, France and the United Kingdom girls are more likely to read books on digital
devices than boys. In Colombia and Mexico, socio-economically advantaged students are more likely to read books on digital
devices than disadvantaged students. The largest gender difference was observed in Albania, where 27% of boys compared to
14% of girls reported reading books more often on digital devices. In Morocco, advantaged students are twice as likely as
disadvantaged students to read books on digital devices (Table B.4.12). These results may reflect a more pronounced decrease in
access to print books at home among students from lower socio-economic backgrounds (Figure 4.4). On average across OECD countries,
socio-economically disadvantaged students in 2018 had approximately half the number of books at home that they had
in 2000, while advantaged students had essentially the same number. In addition, the even larger share of students with an
immigrant background who read on digital devices may be related to the print books available at home being in their
home language rather than in the language in which they took the PISA test.
Figure 4.4 Change in number of books between 2000, 2009, and 2018, by socio-economic status
[Chart: average number of books at home for disadvantaged and advantaged students in 2000, 2009 and 2018; vertical axis (number of books) from 0 to 300]
Notes: Differences between advantaged and disadvantaged students for each cycle are all statistically significant.
Changes in the number of books between 2009 and 2018, and between 2000 and 2009, are all statistically significant.
Source: OECD, PISA 2018 Database, Tables B.4.14a and B.4.14b.
StatLink: https://doi.org/10.1787/888934239895
In the following analysis, students who reported reading between 1 and 2 hours a day and those who reported reading more than
2 hours a day for enjoyment were combined, in order to focus on the higher end of the reading-time spectrum. They represent 17%
of students on average across OECD countries (Table B.4.5). Some 28% of girls who read at least 1 hour a day reported reading
books equally often in paper format and on digital devices compared to 22% of boys. In Switzerland and Uruguay, girls surpassed
boys by more than 14 percentage points in this comparison. The largest gender differences among students who read at least
1 hour a day were observed in Bosnia and Herzegovina, Greece, Lithuania, Montenegro, Serbia, the Slovak Republic, and Slovenia,
where girls reported reading more often in paper format compared to boys (Table B.4.15).
Numerous studies have shown that younger generations might be quite familiar with technology; however, “digital natives” are not
necessarily always equipped with adequate skills in terms of access to and use of digital information (OECD, 2011[11]; Breakstone
et al., 2018[12]; Macedo-Rouet et al., 2019[13]; McGrew et al., 2018[14]). Students’ reading habits and preferences have changed
over the past decades because of changes in the digitalisation of communication. Teenagers increasingly read for enjoyment on
digital devices. Yet, printed books still have a fair share of readers, especially among avid readers.
Students who reported reading books more often on digital devices read about 3 hours more a week than those who reported
that they rarely or never read books, while students who reported reading books more often in paper format read about 4 hours
more a week, on average across OECD countries. Most importantly, students who reported reading books equally often in paper
format and on digital devices read about 5 hours or more a week than those who reported that they rarely or never read books,
after accounting for students’ and schools’ socio-economic background and gender (Figure 4.6).
Figure 4.5[1/3] Change between 2009 and 2018 in what students read
2009 2018
[Charts: percentage of students who reported reading e-mails and chatting online several times a week or more in 2009 and 2018, for each participating country and economy, including the OECD average; 0 to 100%]
Notes: Values that are statistically significant are marked in a darker tone.
Costa Rica, Georgia, Malta and Moldova conducted the PISA 2009 assessment in 2010 as part of PISA 2009+.
Countries and economies are ranked in ascending order of difference in percentage of students who engaged in the activities several times a week or more.
Source: OECD, PISA 2018 Database, Table B.4.10.
StatLink: https://doi.org/10.1787/888934239914
Figure 4.5[2/3] Change between 2009 and 2018 in what students read
Notes: Values that are statistically significant are marked in a darker tone.
Costa Rica, Georgia, Malta and Moldova conducted the PISA 2009 assessment in 2010 as part of PISA 2009+.
Countries and economies are ranked in ascending order of difference in percentage of students who engaged in the activities several times a week or more.
Source: OECD, PISA 2018 Database, Table B.4.10.
12 https://doi.org/10.1787/888934239914
Figure 4.5[3/3] Change between 2009 and 2018 in what students read
2009 2018
Notes: Values that are statistically significant are marked in a darker tone.
Costa Rica, Georgia, Malta and Moldova conducted the PISA 2009 assessment in 2010 as part of PISA 2009+.
Countries and economies are ranked in ascending order of difference in percentage of students who engaged in the activities several times a week or more.
Source: OECD, PISA 2018 Database, Table B.4.10.
StatLink: https://doi.org/10.1787/888934239914
Figure 4.6 Time spent reading for enjoyment per week and format of reading
Difference between students who read books in the following way and those who “rarely or never read books”, OECD average
Before accounting for students' and schools' socio-economic profile¹, and gender
After accounting for students' and schools' socio-economic profile, and gender
[Chart: difference in time spent reading for enjoyment a week (hours), 0 to 6, for three categories: “I read books more often on digital devices (e.g. e-reader, tablet, smartphone, computer)”; “I read books more often in paper format”; “I read books equally often in paper format and on digital devices”. Students who reported reading books equally often in paper format and on digital devices read about 5 hours or more a week than those who reported that they rarely or never read books.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
Note: All values are statistically significant.
Source: OECD, PISA 2018 Database, Table B.4.16.
StatLink: https://doi.org/10.1787/888934239933
In addition to teachers, parents are also important role models for reading habits. In PISA 2018, students whose
parents enjoy reading the most have a higher index of reading enjoyment3. A one-unit increase in the index of
parental reading enjoyment is associated, on average, with a 0.05 increase in student reading enjoyment for boys, and
0.11 for girls (Figure 4.7). In Belgium, Hong Kong (China), Korea and Macao (China), variations in the index of parental
reading enjoyment are associated with variations in student reading enjoyment that are not significantly different
between boys and girls. By contrast, in Croatia, the Dominican Republic, Luxembourg, Malta and Portugal,
variations in parental reading enjoyment are associated only with variations in girls’ reading enjoyment.
On average across OECD countries, students who talk to their parents about what they read or go with their parents
to a bookstore or library at least once a week have a higher index of reading enjoyment by 0.13 and 0.10, respectively.
Parents play a crucial role in conveying positive attitudes towards reading at home beginning in a child’s early
years. The day-to-day activities that parents undertake are highly correlated with children’s early learning and
social-emotional development (OECD, 2020[15]). Examples of these include reading to them almost every day when
they are children and providing them with books. Furthermore, PISA data suggest that having parents who are seen
reading, or who endorse the view that reading is pleasurable, is associated with children’s reading activities at home,
their reading motivation and their achievement.
Figure 4.7 Relationship between students’ and parents’ enjoyment of reading, and students’
Change in the index of students’ enjoyment of reading associated with one-unit increase of the following variables,
based on students’ and parents’ reports
[Chart: change in the index of students’ enjoyment of reading associated with the index of parents’ enjoyment of reading and with the interaction between gender and parents’ enjoyment of reading, for each participating country/economy and the overall average; values range from -0.05 to 0.10]
In absolute terms, students in Georgia, Hong Kong (China), Italy and Kosovo who read books equally often in paper format and
on digital devices spent on average 9 hours or more a week reading (Figure 4.8). In relative terms, students in Hong Kong (China),
Italy and Japan who reported reading books equally often in paper format and on digital devices read more than 6.5 hours more a
week than those who reported that they rarely or never read books, after accounting for students’ and schools’ socio-economic
background and gender (Table B.4.16).
In conclusion, students in OECD countries who use both formats read close to 2 hours more a week than those who use only one
of the reading formats (i.e. the number of hours students read using both formats minus the average time of students
who read books more often in paper format and those who read more often on digital devices). In other words, students who gave
the highest frequency ratings for both formats are the heaviest readers. It is true that the overall time students spend on digital
devices doing online activities other than reading for enjoyment, such as social networks or games, is time taken away from
print reading (Twenge, Martin and Spitzberg, 2019[10]). But these data suggest that time spent reading for enjoyment on digital
devices may not always displace time spent reading for leisure in print.
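The "close to 2 hours" figure can be reconstructed from the rounded OECD-average differences reported for Figure 4.6; the values below are those approximations, not exact table entries.

```python
# Approximate differences in weekly reading hours relative to students
# who rarely or never read books (rounded OECD averages from Figure 4.6)
digital_more_often = 3.0
paper_more_often = 4.0
both_formats = 5.0          # reported as "about 5 hours or more"

# Both-format readers versus the average of the two single-format groups
extra = both_formats - (digital_more_often + paper_more_often) / 2
# extra == 1.5 hours; since the both-formats figure is "5 or more",
# the gap is close to 2 hours
```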
Figure 4.8
[Charts: hours a week spent reading, by format of book reading, for each participating country and economy, including the OECD average]
Countries and economies are ranked in descending order of the number of hours students spent in reading books equally often in paper format and on digital
devices.
Source: OECD, PISA 2018 Database, Table B.4.16.
StatLink: https://doi.org/10.1787/888934239971
Another relevant question is whether digital technologies help to improve the reading experience. Figure 4.9 shows the difference in the reading enjoyment index between students who reported rarely or never reading books and those who read books more often digitally, more often on paper, or equally often on paper and on digital devices. The average results across OECD countries show a clear relationship between reading print books and enjoyment, regardless of whether students read equally often on paper and on digital devices or more often on paper. Moreover, these differences were significantly lower after accounting for students' and schools' socio-economic status and gender (Table B.4.17).
Figure 4.9. Difference in the index of enjoyment of reading between students who read books in the following ways and those who "rarely or never read books", OECD average
[Bar chart: differences before and after accounting for students' and schools' socio-economic profile¹ and gender, for "I read books more often on digital devices (e.g. e-reader, tablet, smartphone, computer)", "I read books more often in paper format" and "I read books equally often in paper format and on digital devices". Students who reported reading more often in paper format, or equally often in paper format and on digital devices, reported more than one standard deviation more enjoyment than those who reported that they rarely or never read books.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
Note: All values are statistically significant.
Source: OECD, PISA 2018 Database, Table B.4.17.
https://doi.org/10.1787/888934239990
Figure 4.10 shows the differences in the index of enjoyment of reading between students who read books equally often in print and on digital devices, or more often in paper format, and students who read books more often on digital devices. In all countries/economies participating in PISA 2018, students who read books equally often in paper format and on digital devices, or more often in paper format, reported higher scores in the reading enjoyment index than students who read books more often on digital devices, after accounting for students' and schools' socio-economic status and students' gender. As previously shown (Figure 4.6), students who read books equally often in print and on digital devices are the heaviest readers and are therefore expected to be the ones who enjoy reading the most. However, students who read books equally often in paper format and on digital devices reported higher scores in the reading enjoyment index even after accounting for time spent reading for enjoyment, in all countries and economies.
On average across OECD countries, this score-point difference in favour of students who read equally often in both formats or more often in print was almost half a standard deviation after accounting for students' and schools' socio-economic profile, gender, and time spent reading. Students in France, Japan and Norway who read equally often in both formats or more often in paper format scored at least 0.60 of a standard deviation more in the reading enjoyment index than those who read more often on digital devices. In the Dominican Republic, Jordan, Morocco, Panama, the Philippines, and Viet Nam, however, these differences were smaller than 0.20 of a standard deviation (Table B.4.17).
Figure 4.10. Difference in the index of enjoyment of reading between students who read books equally often in print and on digital devices, or more often in paper format, and students who read books more often on digital devices
[Chart: mean index difference (0 to 1) by country/economy, before and after accounting for students' and schools' socio-economic profile¹ and gender.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
Note: Statistically significant values are shown in darker tones.
Countries and economies are ranked in descending order of the difference in the index of enjoyment of reading, after accounting for students' and schools' socio-economic profile, and gender.
Source: OECD, PISA 2018 Database, Table B.4.17.
https://doi.org/10.1787/888934240009
PISA 2018 shows that students who reported they would like to work as ICT professionals or technicians in the future were more in touch with the digital environment than other students who also chose science-related careers. Compared to students who chose careers as health professionals, students who chose ICT professional careers more often reported reading books (20% compared to 14%) or the news (48% compared to 43%) on digital devices (Tables B.4.23b, B.4.23c, B.4.24b and B.4.24c). They were also the highest share of students who reported reading e-mails (47% compared to 39%) and taking part in online discussion groups or forums (31% compared to 21%) (Tables B.4.22b and B.4.22c). On the other hand, students who chose health professional careers read books more often in paper format (46% compared to 28%). They were also the highest share of students who reported searching for practical information online (61% compared to 56%). In the use of digital devices for school, the share of 'future' technicians playing simulations at school was 10 percentage points higher than that of 'future' health professionals (38% compared to 28%) (Tables B.4.25b and B.4.25d). All of these differences are statistically significant.
At the same time, students who chose ICT professions reported reading for enjoyment less frequently than students who chose other science-related careers (-0.20, on average across OECD countries). In contrast, students who chose health professional careers as future jobs reported reading for enjoyment the most frequently (0.24). Compared to future ICT professionals, the highest share of students who reported that reading is one of their favourite hobbies (43% compared to 30%) and who talk about books with other people (46% compared to 31%) were students who chose health professional careers. In contrast, the highest share of students who reported reading only if they had to, or only to get information, were students who chose technician or ICT professional careers (more than 50% compared to 39% for health professions) (Figure 4.11 and Tables B.4.20a-d). It is important to bear in mind that girls typically report reading for enjoyment more frequently than boys, so professional careers chosen mostly by girls (e.g. health) are expected to have higher enjoyment levels than professional careers chosen mostly by boys (e.g. technicians or ICT professionals). Nonetheless, the largest gender gap is observed among those who chose ICT careers, where girls showed the largest reading frequency for enjoyment. On average across OECD countries, 60% of 'future' ICT girls (compared to 27% of boys) agreed or strongly agreed that reading is one of their favourite hobbies (Figure 4.12 and Tables B.4.21a-d).
These results highlight how, even among students with science-related career expectations, the interplay between
digital environments, motivation and enjoyment of reading is different. As students’ professional expectations differ so
do their preferences for paper or digital format, level of enjoyment of reading, and ultimately their behaviour towards
reading. Students interested in ICT already have higher exposure to digital devices at the age of 15 but they read for
enjoyment less frequently, especially boys. Given the close relationship between reading enjoyment and performance,
further attention in this area is needed.
Figure 4.11. Percentage of students who expect to work in the following careers, and who agreed or strongly agreed on the following reading behaviours, OECD average
[Bar chart, 0-60%, by expected career (health professional; science and engineering professional; science-related technician¹; ICT professional), for the statements "I read only if I have to", "Reading is one of my favourite hobbies", "I like talking about books with other people", "For me, reading is a waste of time" and "I read only to get information that I need".]
1. Because of too few observations to provide reliable estimates in many OECD countries, the OECD average considers 27 OECD countries.
Careers are ranked in ascending order of the percentage of students who reported "I read only if I have to".
Source: OECD, PISA 2018 Database, Tables B.4.20a-d.
https://doi.org/10.1787/888934240028
[Figure 4.12: mean index of enjoyment of reading (roughly -0.40 to 0.60), by gender, for students expecting to work as health professionals, science and engineering professionals, science-related technicians¹ and ICT professionals².]
1. Because of too few observations to provide reliable estimates in many OECD countries, the OECD average for all students considers 27 OECD countries; for boys it considers 19 OECD countries and for girls it considers 10 OECD countries.
2. Because of too few observations to provide reliable estimates in many OECD countries for girls, the OECD average considers 9 OECD countries.
Note: All gender differences are statistically significant.
Careers are ranked in descending order of the mean index of enjoyment of reading.
Source: OECD, PISA 2018 Database, Tables B.4.20a-d and B.4.21a-d.
https://doi.org/10.1787/888934240047
Figure 4.13 shows the system-level relationship between reading performance and book formats. Education systems in which
a higher percentage of students read books more often on paper perform better in reading than education systems in which
students read books more often using digital devices. A similar relationship can be observed within countries and economies.
In all countries and economies except the Dominican Republic and Morocco, where there are no differences, students who reported reading paper books scored higher in reading than students who rarely or never read books, once students' and schools' socio-economic profile and students' gender had been accounted for. In comparison to students who rarely or never read, students in OECD countries who reported reading books more often on paper scored 49 points more in reading, while students who reported reading books more often on digital devices scored only 15 points more (Table B.4.16). These results are aligned
with recent meta-analyses that revealed that reading on paper resulted in better comprehension than reading the same text on
a screen (Delgado et al., 2018[21]; Clinton, 2019[22]).
Figure 4.14 shows the system-level relationship between reading performance and the format news is read in. Education systems
in which a higher percentage of students read the news more often on digital devices perform better in reading than education
systems in which students do not follow the news at all. This result is consistent across all participating countries and economies
with available data on the ICT questionnaire, once students' and schools' socio-economic profile and students' gender had
been accounted for. As previously pointed out, less than 5% of students across the OECD reported reading the news on paper
compared to 41% who reported reading the news more often on digital devices (Table B.4.18).
Figure 4.13 Correlations between reading performance and the format of reading books
[Scatter plots: reading performance (370-510 score points) against the percentage of students who reported that each statement best describes how they read books (0-50%), for three series: "I read books more often in paper format", "I read books equally often in paper format and on digital devices" and "I read books more often on digital devices"; R² = 0.15, 0.21 and 0.29.]
Figure 4.14 Correlations between reading performance and the format of reading the news
[Scatter plots: reading performance (350-550 score points) against the percentage of students who reported that each statement best describes how they read the news (0-70%); series include "I read the news more often in paper format" and "I do not follow the news at all"; R² = 0.34, 0.19 and 0.25.]
Box 4.4. What are the common characteristics among strong reading performers?
Stronger reading performers are more likely to be female students and students from higher socio-economic
backgrounds (OECD, 2019[9]; OECD, 2019[20]). However, these are not the only characteristics that define strong
readers. Previous PISA results showed that, although students who read fiction are more likely to achieve high scores,
it is students who read a wide variety of material who perform particularly well in reading (OECD, 2010[1]). In the last
decade, there has been considerable debate as to what format, type, and length of reading may be most effective in
fostering reading skills and improving reading performance (Wolf, 2018[23]; Firth et al., 2019[24]).
PISA 2018 shows that strong readers tend to read books on paper or balance their reading time between paper and digital formats (Table B.4.16). At the same time, stronger readers tend to read the news more often on digital devices or balance their reading time between paper and digital. In other words, it seems that the most proficient readers are able to effectively optimise the use of digital technology depending on the activity. For example, strong readers use digital devices to read for information, such as the news (Table B.4.18), or to browse the Internet for schoolwork (Table B.6.16), while still enjoying reading a good book on paper. Most of the high performers in reading also read longer pieces of text for school (Table B.6.11a) and different types of texts, including fiction books such as novels or short stories, and texts with diagrams and graphs (Table B.6.8a).
In conclusion, PISA 2018 data suggest that digital devices are increasingly displacing print media, particularly in activities most closely tied to reading for information (e.g. newspapers, magazines). Yet print-book readers still spend more hours a week than digital-book readers reading diverse kinds of materials for pleasure (e.g. books, magazines, newspapers, websites), and the heaviest book readers balance their reading time between paper and digital. Compared to digital-book readers, print-book readers tend to perform better in reading and spend more time reading for enjoyment in all countries/economies participating in PISA 2018. Therefore, the potential benefit of using technology to enhance students' reading experience seems bigger in activities related to reading for information than in reading books. Chapter 6 of this report provides some insights into how teaching practices can enhance reading in digital environments.
Notes
1. A socio-economically disadvantaged (advantaged) student is a student in the bottom (top) quarter of the PISA index of economic, social and
cultural status (ESCS) in the relevant country/economy.
2. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
3. In PISA 2018, 17 countries and economies distributed the parental questionnaire: 9 OECD countries – Belgium, Chile, Germany, Ireland,
Italy, Korea, Luxembourg, Mexico, and Portugal; and 8 partner countries and economies – Brazil, Croatia, the Dominican Republic, Georgia,
Hong Kong (China), Macao (China), Malta, and Panama.
References
Benítez, I., F. Van de Vijver and J. Padilla (2019), "A Mixed Methods Approach to the Analysis of Bias in Cross-cultural Studies", Sociological Methods & Research, pp. 1-34, http://dx.doi.org/10.1177/0049124119852390. [5]
Breakstone, J. et al. (2018), "Why we need a new approach to teaching digital literacy", Phi Delta Kappan, Vol. 99/6, pp. 27-32, http://dx.doi.org/10.1177/0031721718762419. [12]
Clinton, V. (2019), "Reading from paper compared to screens: A systematic review and meta-analysis", Journal of Research in Reading, Vol. 42/2, pp. 288-325, http://dx.doi.org/10.1111/1467-9817.12269. [22]
Delgado, P. et al. (2018), "Don't throw away your printed books: A meta-analysis on the effects of reading media on reading comprehension", Educational Research Review, Vol. 25, pp. 23-38, http://dx.doi.org/10.1016/j.edurev.2018.09.003. [21]
Firth, J. et al. (2019), "The 'online brain': how the Internet may be changing our cognition", World Psychiatry, Vol. 18/2, pp. 119-129, http://dx.doi.org/10.1002/wps.20617. [24]
Kardefelt-Winther, D. (2019), "Children's time online and well-being outcomes", in Burns, T. and F. Gottschalk (eds.), Educating 21st Century Children: Emotional Wellbeing in the Digital Age, Educational Research and Innovation, OECD Publishing, Paris, https://dx.doi.org/10.1787/b7f33425-en. [16]
Lee, J. (2020), "Non-cognitive characteristics and academic achievement in Southeast Asian countries based on PISA 2009, 2012, and 2015", OECD Education Working Papers, No. 233, OECD Publishing, Paris, https://dx.doi.org/10.1787/c3626e2f-en. [8]
Macedo-Rouet, M. et al. (2019), "Are frequent users of social network sites good information evaluators? An investigation of adolescents' sourcing abilities / ¿Son los usuarios frecuentes de las redes sociales evaluadores competentes? Un estudio de las habilidades de los adolescentes para identificar, evaluar y hacer uso de las fuentes", Infancia y Aprendizaje, pp. 1-38, http://dx.doi.org/10.1080/02103702.2019.1690849. [13]
Margaryan, A., A. Littlejohn and G. Vojt (2011), "Are digital natives a myth or reality? University students' use of digital technologies", Computers & Education, Vol. 56/2, pp. 429-440, http://dx.doi.org/10.1016/j.compedu.2010.09.004. [25]
McGrew, S. et al. (2018), "Can Students Evaluate Online Sources? Learning From Assessments of Civic Online Reasoning", Theory & Research in Social Education, Vol. 46/2, pp. 165-193, http://dx.doi.org/10.1080/00933104.2017.1416320. [14]
Nurmi, J. et al. (2003), "The role of success expectation and task-avoidance in academic performance and satisfaction: Three studies on antecedents, consequences and correlates", Contemporary Educational Psychology, Vol. 28/1, pp. 59-90, http://dx.doi.org/10.1016/s0361-476x(02)00014-0. [2]
OECD (2020), Early Learning and Child Well-being: A Study of Five-year-Olds in England, Estonia, and the United States, OECD Publishing, Paris, https://dx.doi.org/10.1787/3990407f-en. [15]
OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/5f07c754-en. [9]
OECD (2019), PISA 2018 Results (Volume II): Where All Students Can Succeed, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b5fd1b8f-en. [20]
OECD (2019), PISA 2018 Results (Volume III): What School Life Means for Students' Lives, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/acd78851-en. [19]
OECD (2017), PISA 2015 Results (Volume III): Students' Well-Being, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264273856-en. [17]
OECD (2015), Students, Computers and Learning: Making the Connection, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264239555-en. [18]
OECD (2011), PISA 2009 Results: Students On Line: Digital Technologies and Performance (Volume VI), PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264112995-en. [11]
OECD (2010), PISA 2009 Results: Learning to Learn: Student Engagement, Strategies and Practices (Volume III), PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264083943-en. [1]
Smith, J. et al. (2012), "Students' self-perception of reading ability, enjoyment of reading and reading achievement", Learning and Individual Differences, Vol. 22/2, pp. 202-206, http://dx.doi.org/10.1016/j.lindif.2011.04.010. [3]
Sullivan, A. and M. Brown (2015), "Reading for pleasure and progress in vocabulary and mathematics", British Educational Research Journal, Vol. 41/6, pp. 971-991, http://dx.doi.org/10.1002/berj.3180. [4]
Twenge, J., G. Martin and B. Spitzberg (2019), "Trends in U.S. Adolescents' media use, 1976-2016: The rise of digital media, the decline of TV, and the (near) demise of print", Psychology of Popular Media Culture, Vol. 8/4, pp. 329-345, http://dx.doi.org/10.1037/ppm0000203. [10]
Van de Vijver, F. et al. (2019), "Invariance analyses in large-scale studies", OECD Education Working Papers, No. 201, OECD Publishing, Paris, https://dx.doi.org/10.1787/254738dd-en. [6]
van Hemert, D., Y. Poortinga and F. van de Vijver (2007), "Emotion and culture: A meta-analysis", Cognition and Emotion, Vol. 21/5, pp. 913-943, http://dx.doi.org/10.1080/02699930701339293. [7]
Wolf, M. (2018), Reader, Come Home: The Reading Brain in a Digital World, Harper, New York. [23]
5 Strategies to tackle inequality and gender gaps in reading performance
– Approximately 40% of students on average across OECD countries responded that clicking on the link of a phishing email
was somewhat appropriate or very appropriate.
– Disadvantaged students perceived the PISA reading assessment as more difficult even after accounting for students' reading scores in the 70 countries and economies that participated in PISA 2018.
– The OECD average change in reading performance associated with a one-unit increase in the index of perception of the PISA test's difficulty was 30 points after accounting for students' and schools' socio-economic status. The corresponding change for the index of knowledge of strategies for assessing the credibility of sources was 36 points.
– Almost 30% of the association between socio-economic background and reading performance can be accounted for
by the difference between socio-economically advantaged and disadvantaged students’ reported self-perception of
reading competence.
– Almost two-thirds of the association between gender and reading performance can be accounted for by the difference
between boys’ and girls’ knowledge of effective reading strategies.
PISA 2018 asked students about their general self-efficacy, the results of which have been extensively discussed in earlier volumes of PISA (OECD, 2019[5]; OECD, 2019[6]). In addition, for the first time, PISA 2018 asked students about their self-concept in reading through six statements: "I am a good reader", "I am able to understand difficult texts", "I read fluently", "I have always had difficulty with reading", "I have to read a text several times before completely understanding it", and "I find it difficult to answer questions about a text". The first three, positively worded, statements were combined into an index of perceived competence in reading (screadcomp), and the last three, negatively worded, items into an index of perceived difficulty in reading (screaddiff). Positive values in the index of perceived competence in reading (screadcomp) mean that the student reported higher self-concept than did the average student across OECD countries. Positive values in the index of perceived difficulty in reading (screaddiff) mean that the student reported lower self-concept than did the average student across OECD countries (OECD, 2020[7]).
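As a rough illustration of how such questionnaire indices are scaled, the sketch below combines three hypothetical Likert-coded item responses into a single score standardised to mean 0 and standard deviation 1. This is only a conceptual approximation: PISA's actual indices are derived with IRT scaling, and the function name, data and sample here are invented for illustration.

```python
import numpy as np

def build_index(item_scores):
    """Average Likert-coded items (1-4) across columns and standardise so the
    pooled sample has mean 0 and standard deviation 1. Illustrative only:
    PISA's real indices are produced with IRT scaling, not a simple mean."""
    raw = item_scores.mean(axis=1)
    return (raw - raw.mean()) / raw.std()

# Simulated responses of 1 000 students to three positively worded items
rng = np.random.default_rng(0)
items = rng.integers(1, 5, size=(1000, 3))
screadcomp = build_index(items)  # analogue of the perceived-competence index
print(screadcomp.mean(), screadcomp.std())  # approximately 0 and 1
```

A student with a positive score on this analogue index agreed with the positively worded statements more than the average student in the sample, mirroring the interpretation described above.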
PISA 2018 also included a new reading self-efficacy index linked to the PISA task itself called index of perception of difficulty of
the PISA test (pisadiff). In this task, students were asked to report the extent to which they agree (i.e. strongly disagree, disagree,
agree, and strongly agree) with the following statements about the PISA reading tasks they had just completed: “Many texts were
too difficult for me”, “There were many words I could not understand”, “I was lost when I had to navigate between different pages”.
This index can be regarded as a proxy for measuring self-efficacy in reading. Positive values in this index mean that the student
reported lower self-efficacy than did the average student across OECD countries.
Being able to navigate through different pages to understand a text is particularly important for digital literacy as students
often face similar challenges when navigating through information on the Internet. However, almost one in five students across
OECD countries reported feeling lost in the PISA test when navigating through different pages. Approximately one out of every
two students in Indonesia, Thailand, and the Philippines reported these difficulties while less than 15% did so in B-S-J-Z (China),
Belarus, Denmark, Finland, Germany, Hungary, Ireland, Italy, Lithuania, Russia, and Spain. Approximately 17% of students in
OECD countries agreed or strongly agreed that many texts in the PISA reading assessment were too difficult for them.
Similarly, about 18% agreed or strongly agreed that they could not understand many words (Table B.5.1).
The index of perception of difficulty of the PISA test (pisadiff) is moderately correlated with the index of perceived difficulty in
reading (screaddiff) (r = 0.49, OECD average), and negatively correlated with the index of perceived competence in reading
(screadcomp) (r = -0.37, OECD average). As will be further discussed in this section, student perception of difficulty of the PISA
test (pisadiff) is also the indicator of perception of competence included in PISA that is most strongly associated with reading performance. While this chapter prioritises the novel and reading-related aspects not described in previous volumes of PISA,
such as the new measure of self-efficacy in reading, the annex tables include detailed results for student perceived competence
in reading, student perceived difficulty in reading, and general self-efficacy (Tables B.5.2 and B.5.3).
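The correlations cited above are ordinary product-moment correlations between student-level indices. The sketch below reproduces that computation on simulated data; the coefficients 0.5 and -0.4 are arbitrary choices that merely produce a moderate positive and a moderate negative association, while the real values (r = 0.49 and r = -0.37) come from the PISA 2018 database.

```python
import numpy as np

# Simulated student-level index values (not PISA data)
rng = np.random.default_rng(1)
n = 5000
pisadiff = rng.normal(size=n)                      # perception of PISA test difficulty
screaddiff = 0.5 * pisadiff + rng.normal(size=n)   # positively related index
screadcomp = -0.4 * pisadiff + rng.normal(size=n)  # negatively related index

# Pearson correlations between the indices
r_diff = np.corrcoef(pisadiff, screaddiff)[0, 1]
r_comp = np.corrcoef(pisadiff, screadcomp)[0, 1]
print(f"r(pisadiff, screaddiff) = {r_diff:.2f}")   # moderate positive
print(f"r(pisadiff, screadcomp) = {r_comp:.2f}")   # moderate negative
```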
Austria, Denmark, and Germany were the countries whose students had the lowest perception of difficulty (-0.35 or lower).
In contrast, students in the Philippines, Thailand, and Viet Nam reported the highest perception of difficulty (0.87 or higher).
And despite performing above the OECD average in reading, students of several countries and economies considered the
reading test more difficult than the OECD average. These include some East Asian education systems such as B-S-J-Z (China),
Hong Kong (China), Japan, Korea, Macao (China) and Chinese Taipei, which might be more sensitive to modesty bias. This is
attributed to the fact that in East Asian cultures people value being modest more than in Western cultures (Van de gaer et al.,
2012[8]).
Students’ perception of competence typically varies as a function of different students’ characteristics (Figure 5.1). In all
participating countries and economies in PISA 2018, students from a lower socio-economic background1 perceived the PISA
reading assessment as more difficult (Tables B.5.4a and B.5.4c) and reported a lower perception of competence (Table B.5.4b).
On average across OECD countries, the difference in the index of perception of the difficulty of the PISA test between students
from advantaged and disadvantaged socio-economic backgrounds2 was about half of a standard deviation (0.52) in favour of
advantaged students – i.e. they perceived the test as less difficult. Students in France, Luxembourg, New Zealand, and Singapore,
in particular, reported the largest socio-economic gap in this index (0.74 or higher) while the Dominican Republic, Indonesia, and
Thailand, reported the smallest socio-economic gap (lower than 0.20).
Although not as pronounced as between students from different socio-economic status, there are also differences in the
perception of competence depending on students’ immigration backgrounds. On average across OECD countries, the difference
in the index of perception of the difficulty of the PISA test between students from an immigrant and a non-immigrant background was about one-quarter of a standard deviation in favour of students with a non-immigrant background – i.e. they perceived the test as less difficult. Students from an immigrant background in Finland, Iceland, and Mexico, in particular, perceived the
PISA reading test as more difficult compared to non-immigrant students. However, immigrant students in Brunei Darussalam,
Qatar, and the United Arab Emirates, perceived the PISA reading test as less complicated than non-immigrant students
(Figure 5.1 and Table B.5.4a). It is important to bear in mind that students from an immigrant background in Brunei, Qatar, and
the United Arab Emirates are, on average, stronger performers in reading than their non-immigrant counterparts (OECD, 2019[5]).
In other words, it is important to take into account that perceptions of competence and performance are mutually reinforcing
so when higher-performing students receive and process performance feedback, their perception of competence tends to be
higher.
It would be reasonable to expect that disadvantaged students' higher perception of difficulty is due in part to their lower performance in reading. However, disadvantaged students still perceived the PISA reading assessment as more difficult even after accounting for students' reading scores in nearly all countries and economies that participated in PISA 2018; the only exceptions were Belgium, the Dominican Republic, Indonesia, Saudi Arabia, and Thailand (Table B.5.4a).
The gender differences in students' perception of competence are smaller in magnitude than those related to socio-economic background, yet they are present in about half of the countries. Girls in Iceland, the Netherlands, and Sweden, in particular, perceived the PISA reading test as more complicated than boys did, while boys perceived it to be more difficult in Albania, Greece, and Kosovo. The vast majority of the variance in this index lies within schools; only 6% is between-school variance, on average across OECD countries (Tables B.5.4a and B.5.5a).
Countries in which the average student perceived the PISA reading test to be more challenging are also the countries where
students tended to have lower scores in the PISA reading assessment. For example, Denmark and Germany scored above the OECD average in reading performance and were also the countries where students had the lowest perception of difficulty (around -0.35 points or lower). On the other hand, Indonesia, the Philippines, and Thailand scored below the OECD average and their students reported the highest perception of difficulty (0.8 points or higher). These relationships are also observed within countries and economies, in the overall reading score as well as in the single- and multiple-source reading subscales
(Figure 5.2; Tables B.5.6, B.5.7 and B.5.8). Nonetheless, as previously pointed out, modesty bias might be playing a role in the list
of countries that appear in the upper right part of the figure as they are predominantly from East Asia (Van de gaer et al., 2012[8]).
Figure 5.1 Index of perception of difficulty of the PISA reading test, by student characteristics
[Chart: differences in the index by student characteristics, including girls - boys (column B), with positive, negative and non-significant differences marked; countries/economies run from Indonesia† (highest index) to Germany (lowest). Summary counts: countries/economies with a positive difference – A: 17, B: 0, C: 29; with no difference – A: 38, B: 0, C: 32.]
Note: One dagger (†) means that the share of immigrant students in the country is less than 10%.
Countries and economies are ranked in descending order of the index of perception of difficulty of the PISA test.
Source: OECD, PISA 2018 Database, Tables B.5.1 and B.5.4a.
https://doi.org/10.1787/888934240104
On average across OECD countries, students in the top quarter of the index of perception of difficulty of the PISA test scored 95 points – approximately one standard deviation – lower in reading than students in the bottom quarter of this index. Note that students in the top quarter of this index reported a higher perception of difficulty and, therefore, lower reading self-efficacy. The corresponding score-point difference for both the single- and multiple-source reading subscales was 97 points (Tables B.5.6, B.5.7 and B.5.8).
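The quarter-to-quarter comparison works as sketched below, with invented (index, score) pairs; the 95-point OECD figure comes from the PISA tables, not from this toy data.

```python
from statistics import fmean

# Hypothetical (perceived_difficulty_index, reading_score) pairs; illustrative only.
students = [(-1.2, 560), (-0.8, 540), (-0.5, 530), (-0.3, 515),
            (-0.1, 505), (0.0, 500), (0.2, 490), (0.4, 480),
            (0.6, 470), (0.8, 455), (1.0, 445), (1.3, 430)]

students.sort(key=lambda s: s[0])   # order by perceived difficulty
q = len(students) // 4              # quarter size (here 3 students)
bottom_quarter = students[:q]       # lowest perceived difficulty
top_quarter = students[-q:]         # highest perceived difficulty

# Mean reading score of the bottom quarter minus that of the top quarter.
gap = fmean(s[1] for s in bottom_quarter) - fmean(s[1] for s in top_quarter)
print(f"bottom-minus-top quarter reading gap: {gap:.0f} points")
```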
Figure 5.2 Relationship between the perception of the difficulty of the PISA reading test and performance in ‘multiple’ source text
[Scatter plot: reading ‘multiple’ source text performance (y-axis, roughly 375 to 600 score points) against the index of perception of difficulty of the PISA test (x-axis); R² = 0.35. Reference lines mark the OECD averages (490 score points; index value 0.01). One quadrant is labelled ‘Above-average reading performance and below-average perception of difficulty of the PISA test’. Country labels not reproduced.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.5.1 and B.5.8.
12 https://doi.org/10.1787/888934240123
Figure 5.3 shows the change in reading performance associated with a one-unit increase in the index of perception of difficulty of the PISA test. Figure 5.4 does the same for the single- and multiple-source text subscales. The relationship is statistically significant in all countries and economies that participated in PISA 2018, but its magnitude varies across countries. On average across OECD countries, the change in reading performance associated with a one-unit increase in the index of perception of difficulty of the PISA test was 30 points after accounting for students’ and schools’ socio-economic status (Table B.5.6). In Australia, Ireland, Malta, New Zealand, and the United States, this change is at least 40 points after accounting for students’ and schools’ socio-economic status; in Indonesia and Thailand, however, the score-point difference is lower than 10 points. Students’ perception of self-efficacy in reading is also strongly associated with the single- and multiple-source subscales of reading after accounting for socio-economic status (Figure 5.4, Tables B.5.7 and B.5.8).
When interpreting these results, it is important to bear in mind that as a result of the Multistage Adaptive Testing (MSAT) used in
PISA 2018, students who answered the computer-based assessment were assigned to different booklets based on their ability at
the beginning of the cognitive test (see PISA 2018 Technical Report for further details). Nevertheless, the results presented above
hold even after accounting for the MSAT effect (Table B.5.6).
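The "change per one-unit increase, after accounting for socio-economic status" figures are regression coefficients. A minimal sketch of how such an adjusted slope can be obtained, via the Frisch-Waugh residualisation trick, follows; the data are invented and the helper names are mine, not PISA's.

```python
from statistics import fmean

def slope(x, y):
    """OLS slope of y on x (covariance over variance)."""
    mx, my = fmean(x), fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def residuals(x, y):
    """Residuals of y after removing its linear association with x."""
    b, mx, my = slope(x, y), fmean(x), fmean(y)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

# Hypothetical per-student data (illustrative only): perceived-difficulty
# index, socio-economic status (ESCS) and reading score.
difficulty = [-1.0, -0.5, 0.0, 0.5, 1.0, -0.8, 0.3, 0.9]
escs = [0.8, 0.4, 0.0, -0.3, -0.9, 0.5, -0.1, -0.6]
reading = [545, 520, 495, 470, 440, 535, 485, 450]

raw = slope(difficulty, reading)
# Frisch-Waugh: partial ESCS out of both variables, then regress the
# residuals on each other, mimicking "after accounting for socio-economic status".
adjusted = slope(residuals(escs, difficulty), residuals(escs, reading))
print(f"raw slope: {raw:.1f}, ESCS-adjusted slope: {adjusted:.1f}")
```

Because perceived difficulty and ESCS are correlated in this toy sample, the adjusted slope is smaller in magnitude than the raw one, the same qualitative pattern as the before/after bars in Figure 5.3.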
Figure 5.3 Relationship between the perception of the difficulty of the PISA reading test and reading performance
Score-point difference in reading associated with a one-unit increase in the index of perception of the difficulty of the PISA
reading test
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Bar chart: score-point differences for each country/economy, shown in two columns; values range from about -60 to 0 score points. Country labels not reproduced.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Notes: All score-point differences are statistically significant.
When interpreting these results, it is important to bear in mind that as a result of the Multistage Adaptive Testing (MSAT) used in PISA 2018, students who
answered the computer-based assessment were assigned at the beginning of the cognitive test to different booklets depending on their ability (see PISA 2018
Technical Report for further details). Nevertheless, the results presented above hold even after accounting for the MSAT effect (see Table B.5.6).
Countries and economies are ranked in ascending order of the score-point difference in reading, after accounting for students’ and schools’ socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.5.6.
12 https://doi.org/10.1787/888934240142
Figure 5.4 Relationship between the perception of the difficulty of the PISA reading test and single- and multiple-source scores
Score-point difference in each reading subscale associated with a one-unit increase in the index of perception of the difficulty of the PISA reading test
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Bar chart: score-point differences for each country/economy in the multiple-source (left) and single-source (right) subscales; values range from about -60 to 0 score points. Country labels not reproduced.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: All score-point differences are statistically significant.
Countries and economies are ranked in ascending order of the multiple-source score-point difference, after accounting for students’ and schools’ socio-economic
profile.
Source: OECD, PISA 2018 Database, Tables B.5.7 and B.5.8.
12 https://doi.org/10.1787/888934240161
A measure of performance and difficulty-perception mismatch was developed based on an approach similar to the one used to measure academic resilience in PISA4. This indicator shows the share of students who scored in the bottom quarter of reading in their country/economy and who were also in the bottom quarter of reported difficulty of the PISA test in that country/economy (i.e. low performers who reported that the test was easier than most of their peers did). The share of low performers who displayed this performance and difficulty-perception mismatch ranges from below 10% in Ireland, Moldova, Portugal, Romania, Singapore, and the United States to more than 20% in the Dominican Republic, Indonesia, Jordan, Panama, the Philippines, and Thailand. On average across OECD countries, 13.4% of low-performing students reported finding the PISA test easier than most of their peers. In Chinese Taipei and Japan, both countries with an average reading score above the OECD average, around 17% of low performers reported finding the PISA test easier than most of their peers (Table B.5.23). It is important to bear in mind that in top-performing countries/economies, even low-performing students may encounter fewer difficulties5 in the PISA reading tasks than top-performing students in low-performing countries/economies.
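The mismatch indicator described above can be sketched as follows. The records and the cut-off convention (strictly below the first-quartile value) are illustrative assumptions, not the PISA methodology in detail.

```python
# Hypothetical (reading_score, perceived_difficulty) records; illustrative only.
students = [(380, -0.9), (400, 0.6), (410, 0.8), (420, -1.1),
            (450, 0.1), (470, 0.2), (480, -0.2), (490, 0.3),
            (500, -0.4), (520, 0.0), (540, -0.7), (560, -0.5)]

def bottom_quarter_cut(values):
    """Cut-off below which a value falls in the bottom quarter."""
    ordered = sorted(values)
    return ordered[len(ordered) // 4]

score_cut = bottom_quarter_cut([s for s, _ in students])
diff_cut = bottom_quarter_cut([d for _, d in students])

# Low performers who nevertheless reported the test as easier than most
# peers: bottom quarter of reading AND bottom quarter of perceived difficulty.
low_performers = [s for s in students if s[0] < score_cut]
mismatch = [s for s in low_performers if s[1] < diff_cut]
share = len(mismatch) / len(low_performers)
print(f"mismatch share among low performers: {share:.0%}")
```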
Students’ perceptions of how good (or bad) they are at reading and, more generally, of how competent they are, have important ramifications. They influence how well students motivate themselves, set goals and persevere in the face of difficulties (Fang et al., 2018[9]). These are critical qualities for improving reading skills, which require persistent practice (Peura et al., 2019[10]). Students who thought that the PISA reading tasks were easy yet did not perform well might have overestimated their competence, particularly their reading skills. Students who perceive themselves as more competent than they really are may lack the motivation and persistence needed to develop their reading skills, which may explain, to some extent, their weaker performance in the PISA test. Students need to be able to gauge what they actually know and what they can do with that knowledge. Intensified teacher feedback, more opportunities for peer assessment, systematic review of past performance, and the development of self-appraisal skills may help students better calibrate their perception of competence with their actual performance (Dunning, Heath and Suls, 2004[11]).
Figure 5.5 Perceived difficulty of the PISA test across levels of performance
Difference in the proportion of boys and girls according to their perception of difficulty of the PISA test and their
performance in reading (girls-boys), OECD average
[Bar chart: percentage-point difference (girls - boys), from -8 to 8, in four groups: low performers who perceived the PISA test as easier (bottom quarter) or as more difficult (top quarter), and high performers likewise. Positive bars indicate more girls than boys; negative bars, more boys than girls.]
Note: All differences between girls and boys are statistically significant.
Source: OECD, PISA 2018 Database, Table B.5.21.
12 https://doi.org/10.1787/888934240180
Students’ awareness of reading strategies (i.e. how students monitor and manage reading tasks) is fundamental to the cognitive processing of texts. Meta-cognitive strategies consist of an individual’s ability to think about, monitor and adjust their activity toward a particular goal. Numerous studies have found a positive association between meta-cognitive strategies and reading proficiency
(Artelt, Schiefele and Schneider, 2001[12]; Cantrell et al., 2010[13]; Artelt and Schneider, 2015[14]). Previous PISA cycles have shown
that meta-cognition is a robust predictor of reading achievement even after accounting for gender and socio-economic status
(OECD, 2010[15]).
As in previous PISA cycles, students in PISA 2018 were asked to evaluate the effectiveness of different reading strategies in
understanding and memorising a text (Table B.5.9) as well as strategies for summarising information (Table B.5.10). In the first
task, students were asked what strategies would be more useful for remembering the information in a text. Examples of these
strategies include “I concentrate on the parts of the text that are easy to understand” or “I quickly read through the text twice”.
In the second task, students were asked what strategies would be more useful for writing a summary of a long and rather difficult
two-page text about fluctuations in the water level of a lake in Africa. Examples of these strategies include “I write a summary.
Then I check that each paragraph is covered in the summary because the content of each paragraph should be included” or
“I try to copy out accurately as many sentences as possible”.
For the first time, PISA 2018 also collected information about knowledge of reading strategies linked explicitly to the goal of assessing the credibility of sources (Table B.5.11). In this task, students were asked what strategies would be most
appropriate for responding to a spam email (Box 5.2). The reading task presented to students read as follows: “You have received
a message in your inbox from a well-known mobile phone operator telling you that you are one of the winners of a smartphone.
The sender asks you to click on the link to fill out a form with your data so they can send you the smartphone”. Examples of these
strategies include “Click on the link to fill out the form as soon as possible” or “Delete the email without clicking on the link”.
The index of knowledge of reading strategies for assessing the credibility of sources (metaspam) is moderately correlated with the
index of knowledge of reading strategies for understanding and remembering (undrem) (r = 0.32, OECD average), and the index
of knowledge of reading strategies for writing a summary (metasum) (r = 0.39, OECD average). The index of knowledge of reading
strategies for understanding and remembering and the index of knowledge of reading strategies for writing a summary are also
moderately correlated (r = 0.47, OECD average). Assessing the credibility of sources is particularly relevant in digital reading and
when reading multiple texts online. This section therefore pays special attention to the index of knowledge of reading strategies for assessing the credibility of sources; for country-level results on the two other reading-strategy indices, consult the annex tables (Tables B.5.9, B.5.10 and B.5.11).
Approximately 40% of students in OECD countries responded that clicking on the link to fill out the form as soon as possible
was somewhat appropriate or very appropriate. About 31% of students in OECD countries reported that deleting the email
without clicking on the link would be very appropriate. Students in Denmark, Germany, Ireland, Japan, the Netherlands, and
the United Kingdom scored the highest in the index of knowledge of reading strategies for assessing the credibility of sources
across all participating countries and economies in PISA 2018 (higher than 0.20 points). In contrast, students in Baku (Azerbaijan),
Indonesia, Kazakhstan, the Philippines, and Thailand had the lowest scores in this index (lower than -0.65 points). Among OECD
countries, students in Chile, Colombia, Hungary, Korea, Mexico, and Turkey had the lowest scores in this index (lower than
-0.20 points) (Table B.5.11).
Students’ knowledge of reading strategies typically varies by student characteristics (Figure 5.6). In all countries and economies that participated in PISA 2018, students from advantaged socio-economic backgrounds scored higher in the index of knowledge of reading strategies for assessing the credibility of sources than students from disadvantaged socio-economic backgrounds. Socio-economic disparities were wider in this index than in the two other reading-strategy indices: on average across OECD countries, the difference between students from advantaged and disadvantaged socio-economic backgrounds was close to half a standard deviation (0.45) in favour of the advantaged students, compared with 0.37 in undrem and 0.42 in metasum. Germany, Luxembourg, Portugal, Switzerland and the United States, in particular, reported the largest socio-economic gaps (0.65 points or higher) in this index across all countries and economies that participated in PISA 2018. In contrast, Albania, Baku (Azerbaijan), Kazakhstan, and Macao (China) reported the smallest socio-economic gaps (lower than 0.15 points). Among OECD countries, Canada, Estonia, Iceland, Ireland, Italy, Korea, Latvia, and Lithuania reported the smallest socio-economic gaps (lower than 0.35 points) (Tables B.5.12a-c).
Girls also predominantly reported better knowledge of reading strategies than boys in the three indices included in PISA 2018. On average across OECD countries, the gender difference, although still present, was smaller in the index of knowledge of strategies for assessing the credibility of sources (one-fifth of a standard deviation) than in the indices of knowledge of strategies for understanding and remembering and for summarising information (both around one-third of a standard deviation). The largest gender differences in the index of knowledge of strategies for assessing the credibility of sources were observed in the Czech Republic, Finland, Hong Kong (China), Iceland, Japan, and Korea (at least one-quarter of a standard deviation). On the other hand, in 16 countries and economies, including the OECD countries Chile, Colombia, and Mexico, the gender difference was not statistically significant (Tables B.5.12a-c).
On average across OECD countries, students from an immigrant background reported lower knowledge of effective reading strategies than non-immigrant students in the three indices included in PISA 2018. In Australia, Canada, and New Zealand, however, immigrant students reported higher levels on the undrem and metasum indices. Immigrant and non-immigrant students showed the same level on the undrem index in 17 OECD countries, on the metasum index in 11, and on the metaspam index in 10. The widest disparities related to immigrant background were observed in the index of knowledge of strategies for assessing the credibility of sources (about one-fifth of a standard deviation).
Between 8% and 10% of the variance in the three indices of knowledge of reading strategies lies between schools on average across OECD countries (Tables B.5.13a-c). In Austria, Germany and the Netherlands, the between-school variance in the index of knowledge of reading strategies for assessing the credibility of sources is higher than 15%, while in Albania, Georgia, and Kosovo it is lower than 2% (Table B.5.13c). Although the proportion of variation that lies between schools is slightly higher than for other indices presented in this report (e.g. enjoyment of reading or reading self-efficacy), the variance in the indices of knowledge of reading strategies still lies mostly within schools.
Countries in which the average student is more aware of effective strategies for assessing the credibility of sources are also those in which students tend to perform better in the PISA reading assessment, including the single- and multiple-source reading subscales (Table B.5.16). Within countries and economies, on average across OECD countries, students in the top quarter of the index of knowledge of strategies for assessing the credibility of sources scored 114 points higher in reading than students in the bottom quarter of this index. The corresponding score-point differences for the single- and multiple-source reading subscales were 116 and 115 points, respectively.
Figure 5.6 Index of knowledge of reading strategies for assessing the credibility of sources, by student
characteristics
[Chart: countries/economies ranked in descending order of the index, with differences shown for panels A (girls - boys), B (advantaged - disadvantaged students) and C (immigrant - non-immigrant students); legend: positive difference, negative difference, difference is not significant, missing values. Summary counts: positive difference: A 60, B 76, C 8; no difference: A 16, B 0, C 33; negative difference: A 0, B 0, C 27. Country labels not reproduced.]
Note: One dagger (†) means that the share of immigrant students in the country is less than 10%.
Countries and economies are ranked in descending order of the index of knowledge of reading strategies for assessing the credibility of sources.
Source: OECD, PISA 2018 Database, Tables B.5.11 and B.5.12c.
12 https://doi.org/10.1787/888934240199
The PISA 2018 reading assessment included one item-unit (Rapa Nui Question 3, CR551Q06) that tested whether students can
distinguish between facts and opinions when presented with multiple texts (see Chapter 2, Box 2.2). Figure 5.7 illustrates how
education systems in which the average student is aware of effective strategies for assessing the credibility of sources are also
those in which the estimated percentage correct in the reading item of distinguishing facts from opinions is higher.
Figure 5.7 Relationship between the reading item of distinguishing facts from opinions and the index of knowledge of
reading strategies for assessing the credibility of sources
[Scatter plot: percentage correct in the capacity to distinguish facts from opinions (Equated P+, Rapa Nui Question 3; y-axis, values around 60% to 80%) against the index of knowledge of reading strategies for assessing the credibility of sources (x-axis, -1.00 to 0.50). One quadrant is labelled ‘Above-average item reading performance and above-average awareness of effective strategies for assessing the quality and credibility of sources’ and includes the United States, Australia, the United Kingdom, New Zealand, Singapore, Canada, the Netherlands and Denmark; most other country labels not reproduced.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.2.8 and B.5.11.
12 https://doi.org/10.1787/888934240218
Figure 5.8 shows the change in reading performance associated with a one-unit increase in the index of knowledge of strategies for assessing the credibility of sources within countries and economies. Figure 5.9 shows the same for the single- and multiple-source text reading subscales. The OECD average change in reading performance associated with a one-unit increase in this index is 36 points after accounting for students’ and schools’ socio-economic status. In Australia, Finland, New Zealand, and Sweden, this change was at least 45 points in reading after accounting for students’ and schools’ socio-economic status; in Albania, Baku (Azerbaijan), the Philippines, and Thailand, however, the score-point difference was about 11 points or less. Nonetheless, these differences are statistically significant in all countries and economies that participated in PISA 2018 (Table B.5.16).
Students’ knowledge of reading strategies for assessing the credibility of sources is also strongly associated with the single- and multiple-source subscales of reading after accounting for socio-economic status. This supports the idea that students who are able to think about, monitor and adjust their activity when assessing the credibility of sources will similarly master single- and multiple-source texts (Figure 5.9 and Table B.5.16).
Figure 5.8 Relationship between knowledge of reading strategies for assessing the credibility of sources and reading performance
Change in reading performance associated with a one-unit increase in the index of knowledge of reading strategies for assessing the credibility of sources
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Bar chart: score-point differences for each country/economy, shown in two columns; values range from 0 to about 60 score points. Country labels not reproduced.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: All score-point differences are statistically significant.
Countries and economies are ranked in descending order of the score-point difference in reading performance, after accounting for students’ and schools’
socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.5.16.
12 https://doi.org/10.1787/888934240237
Figure 5.9 Relationship between knowledge of reading strategies for assessing the credibility of sources, and
single- and multiple-source scores
Score-point difference in each reading subscale associated with a one-unit increase in the index of knowledge of reading
strategies for assessing the credibility of sources
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Bar chart: multiple-source (left panel) and single-source (right panel) score-point differences for each country/economy; values range from 0 to about 60 score points. Country labels not reproduced.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: All score-point differences are statistically significant.
Countries and economies are ranked in descending order of the multiple-source score-point difference, after accounting for students’ and schools’ socio-economic
profile.
Source: OECD, PISA 2018 Database, Table B.5.16.
12 https://doi.org/10.1787/888934240256
1. No navigation: students who did not navigate in either single- or multiple-source items;
2. Limited navigation: students who navigated only in single-source items but not in multiple-source items;
3. Strictly focused navigation: students who strictly followed the item instructions, actively navigating in multiple-source items only and limiting navigation in single-source items; and
4. Actively explorative navigation: students who actively navigated in both single- and multiple-source items.
PISA 2018 results show that, on average across OECD countries, students who have a better knowledge of effective reading strategies are also more likely to navigate actively and exploratively across single- and multiple-source items in the PISA reading assessment. This is the case for the three reading strategies assessed in PISA 2018: understanding and memorising a text, summarising information and, in particular, assessing the credibility of sources (Tables B.5.24, B.5.25 and B.5.26). On the index of knowledge of reading strategies for assessing the credibility of sources, students who actively navigated and explored the Rapa Nui unit scored three times higher than students who did not navigate at all. These differences remain significant even after accounting for students’ and schools’ socio-economic profile in 70% of the countries/economies with available data (42 of 60); no statistically significant differences were observed in 18 countries/economies (Table B.5.26).
Girls predominantly reported better knowledge of reading strategies than boys in the three indices included in PISA 2018 (Tables B.5.12a-c). This is also the case across every type of navigation behaviour (Tables B.5.27, B.5.28 and B.5.29). Still, both boys and girls who actively navigated and explored the Rapa Nui unit have better knowledge of reading strategies for assessing the credibility of sources than students with no or limited navigation. Moreover, gender differences in reading strategies are narrower among students with more active navigation than among those with no navigation (Figure 5.10).
These results highlight the importance of teaching and learning effective reading strategies to bolster reading in digital
environments. As reading in digital environments requires many more self-organisational skills, students may benefit
from knowing effective reading strategies and how to assess information critically.
Figure 5.10 Index of knowledge of reading strategies for assessing the credibility of sources, by navigation
behaviours and gender
OECD average
[Bar chart: mean index values (0 to 0.6) for boys, girls and the girls - boys difference, across four navigation behaviours: no navigation, limited navigation, strictly focused navigation and actively explorative navigation.]
Note: All differences between girls and boys are statistically significant.
Source: OECD, PISA 2018 Database, Table B.5.29.
12 https://doi.org/10.1787/888934240275
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 113
5 Strategies to tackle inequality and gender gaps in reading performance
In PISA 2018, the OECD average change in reading performance associated with a one-unit increase in the PISA index of economic,
social and cultural status was 36 points after accounting for gender. At the same time, girls outperformed boys in reading by
25 points after accounting for students’ socio-economic background (Figure 5.11, and Figure 5.12). To what extent are students’
self-perception of reading competence and reading strategies in PISA 2018 mediators of socio-economic and gender inequalities
in reading performance?
Almost 30%6 of the difference in reading performance between socio-economically advantaged and disadvantaged students
is the indirect result of disparities in socio-economically advantaged and disadvantaged students’ reported self-perception
of reading competence, on average across OECD countries (Figure 5.11 and Table B.5.17). However, only about 10% of the
difference in reading performance between boys and girls is the indirect result of disparities in boys’ and girls’ reported
self-perception of reading competence. In other words, almost one-third of the association between socio-economic background
and reading performance can be accounted for by the differences in students’ perception of reading competence across
socio-economic backgrounds. Gender differences in students’ perception of competence, however, are relatively smaller in
magnitude as pointed out earlier.
Figure 5.11 Student’s self-perception of reading competence as a mediator of the relationship between socio-economic background, gender, and reading performance
OECD average
[Figure: path diagrams of the effects of socio-economic status and gender on reading performance. Panel a) shows the total effects (gender effect: 25.1 score points); panel b) shows the effects when accounting for the indirect effect of self-perception of reading competence (gender effect: 23.3 score points).]
Notes: Socio-economic status is measured by the PISA index of economic, social and cultural status (ESCS); gender = girl.
Total socio-economic status effect represents the score-point change in reading performance that is associated with a one-unit change in socio-economic status
when accounting for gender.
Total gender effect represents the score-point change in reading performance that is associated with a one-unit change in gender when accounting for socio-economic status.
Socio-economic status effect when accounting for the indirect effect of perceived competence in reading represents the score-point change in reading
performance that is associated with a one-unit increase in ESCS when accounting for gender, the index of perceived competence in reading (SCREADCOMP), the
index of perceived difficulty in reading (SCREADDIFF), and the index of perception of difficulty of the PISA test (PISADIFF).
Gender effect when accounting for the indirect effect of self-perception of reading competence represents the score-point change in reading performance
that is associated with being a girl when accounting for ESCS, the index of perceived competence in reading (SCREADCOMP), the index of perceived difficulty in
reading (SCREADDIFF), and the index of perception of difficulty of the PISA test (PISADIFF).
Source: OECD, PISA 2018 Database.
On average across OECD countries, about 32%7 of the difference in reading performance between socio-economically advantaged
and disadvantaged students is the indirect result of disparities in socio-economically advantaged and disadvantaged students’
reported knowledge of effective reading strategies (Figure 5.12 and Table B.5.18). However, about 65% of the difference in reading
performance between boys and girls is the indirect result of disparities in boys’ and girls’ reported knowledge of effective reading
strategies. In other words, almost two-thirds of the association between gender and reading performance can be accounted for
by the difference between boys’ and girls’ knowledge of effective reading strategies.
These results imply that the impact of socio-economic disparities and gender on reading performance is likely to be reduced by
aligning students’ perception of reading competence with their actual reading competence (see Box 5.1), and teaching effective
reading strategies to navigate digital environments (see Box 5.2).
Among the indices of perception of competence included in PISA 2018, reading self-efficacy (i.e. the index of perception of
difficulty of the PISA reading test) is a comparatively potent mediator of the association between students’ socio-economic
status and reading performance.
Figure 5.12 Student’s knowledge of reading strategies as a mediator of the relationship between socio-economic background, gender, and reading performance
OECD average
[Figure: path diagrams of the effects of socio-economic status and gender on reading performance. Panel a) shows the total effects (gender effect: 25.1 score points); panel b) shows the effects when accounting for the indirect effect of knowledge of reading strategies (socio-economic status effect: 24.4 score points; gender effect: 8.7 score points).]
Notes: Socio-economic status is measured by the PISA index of economic, social and cultural status (ESCS); gender = girl.
Total socio-economic status effect represents the score-point change in reading performance that is associated with a one-unit change in socio-economic status
when accounting for gender.
Total gender effect represents the score-point change in reading performance that is associated with a one-unit change in gender when accounting for socio-economic status.
Socio-economic status effect when accounting for the indirect effect of perceived competence in reading represents the score-point change in reading
performance that is associated with a one-unit increase in ESCS when accounting for gender, the index of perceived competence in reading (SCREADCOMP), the
index of perceived difficulty in reading (SCREADDIFF), and the index of perception of difficulty of the PISA test (PISADIFF).
Gender effect when accounting for the indirect effect of knowledge of reading strategies represents the score-point change in reading performance that is
associated with being a girl when accounting for ESCS, the indices of knowledge of reading strategies for understanding and remembering (UNDREM), for
writing a summary (METASUM), and for assessing the credibility of sources (METASPAM).
Source: OECD, PISA 2018 Database.
Figure 5.13 shows the change in reading performance associated with a one-unit increase in the indices of self-perception of
reading competence and knowledge of reading strategies after accounting for students’ and schools’ socio-economic profile
and gender, and the effect of the rest of the indices. Among the indices included in this analysis, students’ knowledge of
effective strategies for assessing sources’ credibility had the strongest unique association with performance after accounting for
socio-economic status, gender, and the rest of the variables, followed by the indices of knowledge of reading strategies for
summarising information, of reading self-efficacy, and of perceived competence in reading (Table B.5.19).
All of these variables together explained 47% of the variance in reading performance on average across OECD countries.
This set of variables explains a similar share of the variance in the single-source and multiple-source reading subscales (Table B.5.19).
The total explained variance in reading performance drops by about one-fourth (from 47% to 36%) when the indices of awareness of
effective reading strategies (undrem, metasum, metaspam) are dropped, but by only about 15% (from 47% to 40%) when students’
self-perception of reading competence (pisadiff, screaddiff, screadcomp) or the background variables (students’ and schools’
socio-economic profile, and gender) are dropped. This indicates that the indices of awareness of effective reading strategies
account for a larger share of unique variance (Table B.5.20).
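The logic above (fit the full model, drop a block of predictors, and compare the explained variance) can be sketched with ordinary least squares on synthetic data; all variable names and values below are illustrative stand-ins, not PISA data.

```python
# Sketch of the variance-decomposition logic described above: compare the
# R-squared of a full OLS model with the R-squared after dropping a block
# of predictors; the gap is the block's unique explained variance.
# All data below are synthetic and purely illustrative.
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R-squared of an ordinary-least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 5000
strategies = rng.normal(size=(n, 3))   # stand-ins for undrem, metasum, metaspam
background = rng.normal(size=(n, 2))   # stand-ins for ESCS and gender
y = strategies @ [8.0, 6.0, 10.0] + background @ [5.0, 4.0] + rng.normal(scale=30, size=n)

r2_full = r_squared(np.hstack([strategies, background]), y)
r2_drop = r_squared(background, y)     # strategy block dropped
# r2_full - r2_drop is the unique variance of the strategy indices.
```

Because the blocks are simulated as independent, the drop in R-squared directly reflects the strategy block's unique contribution; in the PISA analysis the predictors are correlated, which is why unique variance is smaller than total variance explained.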
These results imply that students with the same socio-economic status who have better knowledge of effective reading strategies
and better self-efficacy in reading are more likely to be proficient readers. In other words, regardless of socio-economic status and
gender, students’ self-perceptions and awareness of effective reading strategies play a decisive role in reading performance. First,
both sets of indicators are effective mediators of the association between socio-economic status, gender, and reading performance.
Second, the indices of awareness of effective reading strategies are more promising levers for change, as they account for a larger
share of unique variance. These findings are particularly important for education policies. Contrary to socio-economic
status, which cannot be changed, knowledge of effective reading strategies can be taught. There is strong evidence that teaching
and supportive classroom practices can enhance meta-cognitive reading strategies (Christenson, Reschly and Wylie, 2012[22];
Guthrie, Klauda and Ho, 2013[23]; Reeve, 2012[24]; Autin and Croizet, 2012[25]). For instance, a recent study shows that classroom
training on sourcing skills can help improve teenagers’ critical thinking when comprehending multiple documents (Pérez et al.,
2018[26]). Students’ general self-concept and reading self-efficacy are also likely to be enhanced through school-based social and
emotional learning programmes (Corcoran et al., 2018[27]; Smithers et al., 2018[28]).
Figure 5.13 Perception of reading competence, knowledge of reading strategies, socio-economic status and gender as predictors of reading performance
Change in reading performance associated with a one-unit increase in the following indices after accounting for students’ and
schools’ socio-economic profile, gender, and the effect of the rest of the indices, OECD average
[Figure: bar chart of score-point differences in reading, on a scale from 0 to 25]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. The change in reading performance associated with a one-unit increase in the index of perception of difficulty of the PISA test (pisadiff) was inverted for easier
interpretation. A positive score, therefore, means higher reading self-efficacy.
3. The change in reading performance associated with a one-unit increase in the index of perceived difficulty in reading (screaddiff) was inverted for easier
interpretation. A positive score, therefore, means a lower perception of difficulty in reading.
Indices are ranked in descending order of the change in reading performance, after accounting for students’ and schools’ socio-economic profile, gender, and the
effect of the rest of the indices.
Source: OECD, PISA 2018 Database, Table B.5.19.
https://doi.org/10.1787/888934240294
Notes
1. The socio-economic status is measured by the PISA index of economic, social and cultural status (ESCS).
2. A socio-economically disadvantaged (advantaged) student is a student in the bottom (top) quarter of the PISA index of economic, social and
cultural status (ESCS) in the relevant country/economy.
3. When interpreting these results, it is important to bear in mind that as a result of the Multistage Adaptive Testing (MSAT) there were three stages
to the PISA 2018 reading assessment: Core, Stage 1 and Stage 2. Students first saw a short non‑adaptive Core stage, which consisted of between
7 and 10 items. The vast majority of these items (at least 80% and always at least 7 items) were automatically scored. Students’ performance
in this stage was provisionally classified as low, medium or high, depending on the number of correct answers to these automatically scored
items. Nevertheless, the results presented above hold even after accounting for the MSAT effect (Table B.5.6).
4. Academically resilient students are disadvantaged students who are in the bottom quarter of the PISA index of economic, social and cultural
status (ESCS) in their own country/economy but who score in the top quarter of reading in that country/economy.
5. Students were asked to report the extent to which they agree (i.e. strongly disagree, disagree, agree, and strongly agree) with the following
statements about the PISA reading tasks they had just completed: “Many texts were too difficult for me”, “There were many words I could not
understand”, “I was lost when I had to navigate between different pages”.
6. This value is calculated as 1 minus the result of dividing the ESCS coefficient of equation b by the ESCS coefficient of equation a, multiplied
by 100 (see Figure 5.11).
7. This value is calculated as 1 minus the result of dividing the ESCS coefficient of equation b by the ESCS coefficient of equation a, multiplied
by 100 (see Figure 5.12).
References
Artelt, C., U. Schiefele and W. Schneider (2001), “Predictors of reading literacy”, European Journal of Psychology of Education, Vol. 16/3, [12]
pp. 363-383, http://dx.doi.org/10.1007/bf03173188.
Artelt, C. and W. Schneider (2015), “Cross-Country Generalizability of the Role of Metacognitive Knowledge in Students’ Strategy Use and [14]
Reading Competence”, Teachers College Record, Vol. 117/1, pp. 1-32, https://www.tcrecord.org/content.asp?contentid=17695.
Autin, F. and J. Croizet (2012), “Improving working memory efficiency by reframing metacognitive interpretation of task difficulty.”, Journal [25]
of Experimental Psychology: General, Vol. 141/4, pp. 610-618, http://dx.doi.org/10.1037/a0027478.
Bong, M. and E. Skaalvik (2003), “Academic Self-Concept and Self-Efficacy: How Different Are They Really?”, Educational Psychology Review, [1]
Vol. 15/1, pp. 1-40, http://dx.doi.org/10.1023/a:1021302408382.
Cantrell, S. et al. (2010), “The impact of a strategy-based intervention on the comprehension and strategy use of struggling adolescent [13]
readers.”, Journal of Educational Psychology, Vol. 102/2, pp. 257-280, http://dx.doi.org/10.1037/a0018212.
Chamorro-Premuzic, T. and A. Furnham (2008), “Personality, intelligence and approaches to learning as predictors of academic [20]
performance”, Personality and Individual Differences, Vol. 44/7, pp. 1596-1603, http://dx.doi.org/10.1016/j.paid.2008.01.003.
Christenson, S., A. Reschly and C. Wylie (eds.) (2012), Handbook of Research on Student Engagement, Springer US, Boston, MA, [22]
http://dx.doi.org/10.1007/978-1-4614-2018-7.
Corcoran, R. et al. (2018), “Effective universal school-based social and emotional learning programs for improving academic achievement: [27]
A systematic review and meta-analysis of 50 years of research”, Educational Research Review, Vol. 25, pp. 56-72,
http://dx.doi.org/10.1016/j.edurev.2017.12.001.
Dunning, D., C. Heath and J. Suls (2004), “Flawed Self-Assessment”, Psychological Science in the Public Interest, Vol. 5/3, pp. 69-106, [11]
http://dx.doi.org/10.1111/j.1529-1006.2004.00018.x.
Fang, J. et al. (2018), “The Big-Fish-Little-Pond Effect on Academic Self-Concept: A Meta-Analysis”, Frontiers in Psychology, Vol. 9, [9]
http://dx.doi.org/10.3389/fpsyg.2018.01569.
Guthrie, J., S. Klauda and A. Ho (2013), “Modeling the Relationships Among Reading Instruction, Motivation, Engagement, and [23]
Achievement for Adolescents”, Reading Research Quarterly, Vol. 48/1, pp. 9-26, http://dx.doi.org/10.1002/rrq.035.
Kankaraš, M. and J. Suarez-Alvarez (2019), “Assessment framework of the OECD Study on Social and Emotional Skills”, OECD Education [16]
Working Papers, No. 207, OECD Publishing, Paris, https://dx.doi.org/10.1787/5007adef-en.
Marsh, H. and R. Craven (2006), “Reciprocal Effects of Self-Concept and Performance From a Multidimensional Perspective: Beyond [4]
Seductive Pleasure and Unidimensional Perspectives”, Perspectives on Psychological Science, Vol. 1/2, pp. 133-163,
http://dx.doi.org/10.1111/j.1745-6916.2006.00010.x.
O’Connor, M. and S. Paunonen (2007), “Big Five personality predictors of post-secondary academic performance”, Personality and [21]
Individual Differences, Vol. 43/5, pp. 971-990, http://dx.doi.org/10.1016/j.paid.2007.03.017.
OECD (2020), PISA 2018 Technical Report, PISA, OECD Publishing, Paris. [7]
OECD (2019), PISA 2018 Results (Volume II): Where All Students Can Succeed, PISA, OECD Publishing, Paris, [5]
https://dx.doi.org/10.1787/b5fd1b8f-en.
OECD (2019), PISA 2018 Results (Volume III): What School Life Means for Students’ Lives, PISA, OECD Publishing, Paris, [6]
https://dx.doi.org/10.1787/acd78851-en.
OECD (2010), PISA 2009 Results: Learning to Learn: Student Engagement, Strategies and Practices (Volume III), PISA, OECD Publishing, Paris, [15]
https://dx.doi.org/10.1787/9789264083943-en.
Pérez, A. et al. (2018), “Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical [26]
source dimensions”, Learning and Instruction, Vol. 58, pp. 53-64, http://dx.doi.org/10.1016/j.learninstruc.2018.04.006.
Peura, P. et al. (2019), “Reading self-efficacy and reading fluency development among primary school children: Does specificity of self- [10]
efficacy matter?”, Learning and Individual Differences, Vol. 73, pp. 67-78, http://dx.doi.org/10.1016/j.lindif.2019.05.007.
Reeve, J. (2012), “A Self-determination Theory Perspective on Student Engagement”, in Handbook of Research on Student Engagement, [24]
Springer US, Boston, MA, http://dx.doi.org/10.1007/978-1-4614-2018-7_7.
Retelsdorf, J., O. Köller and J. Möller (2011), “On the effects of motivation on reading performance growth in secondary school”, [2]
Learning and Instruction, Vol. 21/4, pp. 550-559, http://dx.doi.org/10.1016/j.learninstruc.2010.11.001.
Roberts, B., K. Walton and W. Viechtbauer (2006), “Patterns of mean-level change in personality traits across the life course: [18]
A meta-analysis of longitudinal studies.”, Psychological Bulletin, Vol. 132/1, pp. 1-25, http://dx.doi.org/10.1037/0033-2909.132.1.1.
Smithers, L. et al. (2018), “A systematic review and meta-analysis of effects of early life non-cognitive skills on academic, psychosocial, [28]
cognitive and health outcomes”, Nature Human Behaviour, Vol. 2/11, pp. 867-880, http://dx.doi.org/10.1038/s41562-018-0461-x.
Soto, C. (2015), “The Little Six Personality Dimensions From Early Childhood to Early Adulthood: Mean-Level Age and Gender Differences [19]
in Parents’ Reports”, Journal of Personality, Vol. 84/4, pp. 409-422, http://dx.doi.org/10.1111/jopy.12168.
Suarez-Alvarez, J. et al. (2020), “Editorial: Bridging the Gap Between Research and Policy in Fostering Social and Emotional Skills”, [17]
Frontiers in Psychology, Vol. 11, http://dx.doi.org/10.3389/fpsyg.2020.00426.
Suárez-Álvarez, J., R. Fernández-Alonso and J. Muñiz (2014), “Self-concept, motivation, expectations, and socioeconomic level as [3]
predictors of academic performance in mathematics”, Learning and Individual Differences, Vol. 30, pp. 118-123,
http://dx.doi.org/10.1016/j.lindif.2013.10.019.
Van de gaer, E. et al. (2012), “The Reference Group Effect”, Journal of Cross-Cultural Psychology, Vol. 43/8, pp. 1205-1228, [8]
http://dx.doi.org/10.1177/0022022111428083.
6 Teaching and learning literacy skills in a digital world
This chapter discusses how teachers’ stimulation of students’ reading engagement has changed in recent years and its association with students’ reading enjoyment and performance. It also examines the relationship between the type and length of texts used for teaching and learning and reading performance, and how schools are using digital devices to leverage the potential of technology.
– In 49 participating countries and economies in PISA 2018, students from a lower socio-economic status perceived less
stimulation from their teachers to engage in reading. In 49 participating countries and economies in PISA 2018, girls
perceived more stimulation from their teachers to read than boys.
– Reading fiction texts more frequently was positively associated with reading performance in 55 countries and economies
after accounting for students’ and schools’ socio-economic profiles. Reading digital texts more frequently, however, was
negatively associated with reading performance after accounting for students’ and schools’ socio-economic profiles.
– Students who had to read longer pieces of text for school (101 pages or more) achieved 31 points more in reading
than those who reported reading smaller pieces of text (10 pages or less) after accounting for students’ and schools’
socio-economic profiles and students’ gender. Yet, this was not the case in Greece, Japan, and Korea.
– Students in Australia, Denmark, New Zealand, Sweden and the United States reported spending more than 1 hour
a week using digital devices during and outside classroom lessons for lessons in the language the PISA test was taken
in. In contrast, students in Slovenia and Chinese Taipei reported spending about 23 and 24 minutes a week, respectively, while
students in Japan reported spending only 10 minutes a week.
HAS TEACHERS’ STIMULATION OF STUDENTS’ READING ENGAGEMENT CHANGED OVER THE LAST NINE YEARS?
Together with teachers’ enthusiasm, their stimulation of students’ reading engagement is the teaching practice most strongly
associated with students’ enjoyment of reading after accounting for socio-economic status, reading performance and other
teaching practices (OECD, 2019[1]). As shown in Chapter 4 of this report, students’ enjoyment of reading has significantly
decreased between 2009 and 2018 in one-third of the countries/economies – including at the OECD average – while in another
one-third it has increased. Does teachers’ stimulation of students’ reading engagement mirror that trend?
The contextual questionnaire distributed in PISA 2018 asked students how often (“never or hardly ever”, “in some lessons”,
“in most lessons”, “in all lessons”) the following things happened in their language of instruction lessons: “The teacher
encourages students to express their opinion about a text”; “The teacher helps students relate the stories they read to their
lives”; “The teacher shows students how the information in texts builds on what they already know”; “The teacher poses
questions that motivate students to participate actively”. The index teachers’ stimulation of students’ reading engagement is
standardised to have a mean of 0 and a standard deviation of 1 across OECD countries. Positive values in the index indicate
that the teaching practices are used more frequently. The index teachers’ stimulation of students’ reading engagement
might be sensitive to cross-cultural differences so caution should be taken when comparing average results across countries
(see Box 4.1, Chapter 4).
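The standardisation described above is a plain z-score transformation; the sketch below uses made-up index values, not the PISA database.

```python
# The index is standardised to mean 0 and standard deviation 1 across
# OECD countries; this is a z-score transformation, sketched here with
# hypothetical raw index scores.
import statistics

def standardise(values: list) -> list:
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

raw = [1.0, 2.0, 2.0, 3.0, 4.0]   # hypothetical raw index scores
z = standardise(raw)
# After standardisation the mean is 0 and the (population) SD is 1;
# positive values indicate more frequent use of the teaching practices.
```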
In PISA 2018, about one in ten students (11%) on average across OECD countries reported that their teacher never or hardly
ever encourages students to express their opinion about a text. However, about one in four students (23%) reported that their
teacher never or hardly ever helps students relate the stories they read to their lives. The first teaching practice is more general
and covers a broad range of opinions about a text while the second is more specific to students’ lives. These are, respectively,
the most and the least frequent teaching practices stimulating reading engagement across OECD countries (Table B.6.1).
Teachers’ stimulation of students’ reading engagement typically varies with student and school characteristics
(Figure 6.1). In 49 participating countries and economies in PISA 2018, students from a lower socio-economic status perceived
less stimulation from their teachers to engage in reading (Table B.6.2). On average across OECD countries, the difference in the
index of teachers’ stimulation of reading engagement between students from advantaged and disadvantaged socio-economic
backgrounds was 0.15 in favour of advantaged students. Students in Australia, Belarus, B-S-J-Z (China), Singapore, and the
United States, in particular, reported the largest socio-economic gap in this index (higher than 0.30). In contrast, students
from a lower socio-economic status in Argentina, Mexico, Morocco, Panama, and Peru perceived more stimulation from their
teachers to engage in reading (more than 0.10).
Figure 6.1 Index of teacher’s stimulation of reading engagement perceived by student, by student characteristics
[Figure: for each country/economy, the mean index of teacher’s stimulation of reading engagement perceived by student and the differences in the index between groups of students (including girls - boys), marked as positive differences (A), negative differences (B) or missing values (C); 48 countries/economies show a positive difference in columns A and B, and 14 in column C.]
Note: One dagger (†) means that the share of immigrant students in the country is less than 10%.
Countries and economies are ranked in ascending order of teacher’s stimulation of reading engagement perceived by student.
Source: OECD, PISA 2018 Database, Tables B.6.1 and B.6.2.
https://doi.org/10.1787/888934240313
In 49 participating countries and economies in PISA 2018, girls perceived more stimulation from their teachers to engage in
reading (Table B.6.2). Students in Bosnia and Herzegovina, the Philippines, and Serbia, in particular, reported the largest gender
gap in favour of girls (around 0.25 or higher). In only two countries (Korea and Sweden) did boys perceive more stimulation from
their teachers to engage in reading than girls did (higher than 0.10). Interestingly, immigrant students in Latvia, Qatar, and
Singapore reported higher stimulation from their teachers than non-immigrant students (more than 0.15), while in Bulgaria,
Indonesia, and the Philippines, the gap was in favour of non-immigrant students (about 0.45 or more).
On average across OECD countries, the difference in the index of teachers’ stimulation of students’ reading engagement between
advantaged and disadvantaged schools was approximately one-tenth of a standard deviation, and significantly in favour of
advantaged schools. This was also the case in 33 countries and economies that participated in PISA 2018. However, in
36 countries and economies with available data on this index there was no difference, and in eight of them, disadvantaged schools
scored significantly higher. Nevertheless, when analysing school-level indicators it is important to take into account that the vast
majority of the variance in this index lies within schools; only 6% is between-school variance on average across OECD countries.
In the Czech Republic, Hungary, Israel, Latvia, and Panama the between-school variance in this index is 10%, while in Indonesia,
Baku (Azerbaijan), and Luxembourg it is lower than 2% (Table B.6.3).
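The within/between-school split mentioned above can be sketched as a simple variance decomposition; the schools and index values below are made up for illustration (equal school sizes are assumed so that the decomposition is exact).

```python
# Sketch of the within/between-school variance split: the between-school
# share is the variance of school means over the total variance.
# The schools and index values below are made up.
import statistics

schools = {
    "school_A": [0.2, 0.4, -0.1, 0.3],
    "school_B": [0.1, 0.5, 0.0, 0.2],
    "school_C": [-0.2, 0.3, 0.1, 0.0],
}
all_values = [v for vals in schools.values() for v in vals]
total_var = statistics.pvariance(all_values)
overall_mean = statistics.fmean(all_values)

# Between-school variance: squared deviations of school means (equal sizes).
between_var = statistics.fmean(
    [(statistics.fmean(vals) - overall_mean) ** 2 for vals in schools.values()]
)
share_between = between_var / total_var   # e.g. 0.06 would mean 6% between schools
```

With equal school sizes, the within-school variance (the mean of the per-school variances) and the between-school variance add up exactly to the total variance, which is the decomposition the 6% figure refers to.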
It would be reasonable to expect teachers to stimulate most the students who need it most, as strong readers are already more
engaged in reading. However, teachers’ stimulation of reading engagement
is typically associated with students’ enjoyment of reading as well as reading performance. The association between teachers’
stimulation of reading engagement and students’ enjoyment of reading is positive in all participating countries and economies
in PISA 2018, and with reading performance in 61 countries and economies after accounting for students’ and schools’
socio-economic profile (Tables B.6.4 and B.6.5).
The following items of the PISA 2018 teachers’ stimulation of students’ reading engagement index were also administered in
PISA 2009: “The teacher encourages students to express their opinion about a text”; “The teacher helps students relate the
stories they read to their lives”; and “The teacher shows students how the information in texts builds on what they already
know”. On average across OECD countries, more students reported these indicators occurred in most or all lessons in PISA 2018
than in PISA 2009 (Figure 6.2). The most pronounced change was observed in Japan and Korea where the three indicators of
the teachers’ stimulation of students’ reading engagement index increased the most, i.e., an increase of 30 to 38 percentage
points for Korea, and 14 to 23 percentage points for Japan. In contrast, the most pronounced decline was observed in Georgia
and Russia where the three indicators of the teachers’ stimulation of students’ reading engagement index decreased the most,
i.e., a decrease of 8 to 19 percentage points for Georgia, and 10 to 14 percentage points for Russia (Table B.6.6).
Figure 6.2 Change between 2009 and 2018 in teachers’ stimulation of reading engagement
Percentage of students who reported that in their language-of-instruction lessons the teaching practice occurred in “most” or
“all” lessons, OECD average
[Figure: bar chart comparing the percentage of students in 2009 and 2018, on a scale from 0% to 60%, for three practices: “The teacher shows students how the information in texts builds on what they already know”, “The teacher helps students relate the stories they read to their lives”, and “The teacher encourages students to express their opinion about a text”]
Note: All differences between 2009 and 2018 are statistically significant.
Items are ranked in descending order of the difference between 2009 and 2018.
Source: OECD, PISA 2018 Database, Table B.6.6.
https://doi.org/10.1787/888934240332
To summarise, students’ reported reading for enjoyment across OECD countries has decreased over the last decade even though
indicators of teachers’ stimulation of students’ reading engagement have significantly increased. However, there are differences
across countries. For example, if the change in students’ responses to the “teacher encourages students to express their opinion
about a text” is compared with the change in responses to “student’s enjoyment of reading”, both trends move in the same positive
direction in Chile, Japan, Korea, and Macao (China), while in Hungary, Lithuania, Montenegro, Serbia, Sweden, and Thailand
both move in the same negative direction. In the other 55 countries and economies, the trends do not align in either direction
(Table B.4.4a and Table B.6.6). This could be, in part, because teachers know that teenagers are less engaged
in reading and they feel the need to stimulate them more.
WHICH LEARNING AND TEACHING PRACTICES, IN TERMS OF TYPE AND LENGTH OF TEXTS, ARE MOST STRONGLY
ASSOCIATED WITH READING PERFORMANCE?
Some experts would argue that we are moving from a word-based culture into a far faster-paced digital- and screen-based one
(Wolf, 2018[2]). News is available in real time, 24/7, and social media reactions spread across the globe in a matter of seconds. It is no longer rare to see the length of an online text (in minutes) displayed before the topic has even been introduced (e.g. in online newspapers), to listen to 18-minute inspirational talks (e.g. TED talks), or to limit our thoughts to 280 characters on Twitter (about 56 words). The increasing
production and consumption of content are resulting in the more rapid exhaustion of individuals’ attention, and the staying power
of topics in the collective memory is shorter than ever before (Lorenz-Spreen et al., 2019[3]). Over the last few years, there has been
a lot of debate about the extent to which digital immersion is shaping not only our behaviour but also our brains (Firth et al., 2019[4];
Wolf, 2018[2]; Carr, 2010[5]). Are students still able to read long pieces of text? Are schools still asking students to read fiction books?
Is there an association between the length and type of texts read in school and reading performance in PISA?
PISA 2018 asked students how often they read the following types of texts for school during the last month: texts that include
diagrams or maps, fiction (novels, short stories), texts that include tables or graphs, and digital texts including links. Approximately
85% of students reported that they read fiction texts for school at least once during the previous month while 66% reported
reading digital texts with the same frequency (Table B.6.7) on average across OECD countries. Figure 6.3 shows the score-point
difference in reading performance between students who reported reading texts for school “two or more times” during the last month and those who reported “once or none” for different types of texts (texts that include diagrams or maps, fiction, texts that
include tables or graphs, and digital texts including links). On average across OECD countries, a higher frequency of reading fiction texts, texts that include tables and graphs, and texts that include diagrams is positively associated with reading performance after accounting for students’ and schools’ socio-economic profile. Digital texts, on the other hand, show a negative association with reading performance after accounting for students’ and schools’ socio-economic profile; however, the magnitude of this difference is comparatively smaller than for the other three types of text.
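The “before/after accounting for socio-economic profile” comparisons used throughout this chapter amount to adding the ESCS index as a covariate in a regression of reading scores on a reading-frequency indicator. The sketch below, with synthetic data, illustrates only that adjustment logic; the actual PISA estimation uses plausible values and replicate weights, which are deliberately omitted here.

```python
import numpy as np

def score_point_difference(scores, frequent_reader, escs, adjust=False):
    """Score-point gap between frequent and infrequent readers.

    If adjust=True, the gap is the coefficient of the reader indicator
    in an OLS regression that also includes the ESCS index, i.e. the
    difference "after accounting for socio-economic profile".
    (Illustrative only: PISA's published estimates use plausible
    values and replicate weights.)
    """
    scores = np.asarray(scores, dtype=float)
    d = np.asarray(frequent_reader, dtype=float)  # 1 = "two or more times"
    if not adjust:
        return scores[d == 1].mean() - scores[d == 0].mean()
    X = np.column_stack([np.ones_like(d), d, np.asarray(escs, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return beta[1]  # adjusted gap in score points

# Synthetic data in which ESCS raises both scores and reading frequency,
# so the adjusted gap is smaller than the raw gap.
rng = np.random.default_rng(0)
n = 2000
escs = rng.normal(0, 1, n)
freq = (escs + rng.normal(0, 1, n) > 0).astype(int)
scores = 487 + 10 * freq + 25 * escs + rng.normal(0, 80, n)
raw = score_point_difference(scores, freq, escs)
adj = score_point_difference(scores, freq, escs, adjust=True)
assert adj < raw  # accounting for ESCS shrinks the gap
```

The shrinkage from `raw` to `adj` mirrors why the two bars in Figure 6.3 differ for each type of text.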
Figure 6.3 Reading performance, by the type of text read for school
Score-point difference between “two or more times” and “once or none” during the previous month, OECD average
[Bar chart of score-point differences, before and after accounting for students’ and schools’ socio-economic profile¹, for four types of text: fiction (e.g. novels, short stories); texts that include tables or graphs; texts that include diagrams or maps; and digital texts including links. Annotation: reading fiction texts such as novels, or texts that include tables, graphs or diagrams, is associated with students’ reading skills.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
Note: All score-point differences are statistically significant.
Items are ranked in descending order of the score-point differences after accounting for students’ and schools’ socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.6.8a.
StatLink: https://doi.org/10.1787/888934240351
Reading fiction texts more frequently was positively associated with reading performance in 55 countries and economies after
accounting for students’ and schools’ socio-economic profiles. In contrast, 16 countries and economies showed no significant differences, including the following OECD countries: Chile, Colombia, the Czech Republic, Denmark, Finland, France, Hungary, Ireland, Latvia, New Zealand,
and Portugal. In Estonia, Mexico, Morocco, Panama, Qatar, and the Slovak Republic differences favoured students who read less
fiction – yet the score-point difference was less than 10 points (Table B.6.8a).
Although students seem to read less for leisure and to read fewer fiction books (OECD, 2019[6]), reading fiction texts such as novels, or texts that include tables, graphs, and diagrams, is still associated with students’ reading skills (Tables B.6.8a-c).
PISA results also show that students who reported reading fiction books for school during the last month are more likely to
have also reported reading fiction books because they wanted to. This relationship is observed in all participating countries and
economies in PISA (Table B.6.9). Moreover, education systems in which more students reported reading fiction books for school more than once a month are also those with more students who reported reading fiction books more than once a month because they wanted to (Figure 6.4). These results suggest that assigning books to read for school encourages reading for
pleasure outside of school. At the same time, it is also possible that students who frequently read fiction books for pleasure ask
teachers for more fiction books to read for school.
Figure 6.4 System-level relationship between reading fiction for school and reading fiction for pleasure
Percentage of students who reported reading fiction books, more than once a month
[Scatter plot, one point per country/economy: the percentage of students who reported reading fiction books more than once a month because they wanted to (OECD average: 29%), plotted against the percentage who reported reading fiction books for school more than once a month, on the horizontal axis from 30% to 100% (OECD average: 63%). R = 0.55 for all countries and economies; R = 0.41 for OECD countries. After accounting for GDP and reading performance¹: R = 0.49 for all countries and economies, and R = 0.42 for OECD countries.]
PISA 2018 also asked students how many pages the longest piece of text they had to read for their test language lessons (i.e. lessons in the language in which they took the test) was during the last academic year: “one page or less”, “between 2 and 10 pages”, “between 11 and 50 pages”, “between 51 and 100 pages”, “between 101 and 500 pages”, and “more than 500 pages”. Across OECD countries, more students (43%) reported that the longest piece of text they had to read for school was 101 pages or more than reported any other length of text (Table B.6.10). In Canada, Denmark, Finland, and the United Kingdom more than 70%
of students reported that the longest piece of text they had to read for school was 101 pages or more (Figure 6.5). In contrast,
less than 6% of students did so in Japan, Jordan, Morocco, Saudi Arabia, Thailand, and Viet Nam. These comparisons could
be marginally affected by what translation studies refer to as the expansion rate. For example, Swedish is often more concise
than English while Finnish or German have a slightly higher expansion rate. Therefore, a 100-page text in English could be a
125-page text in Finnish. However, it is unlikely that the expansion rate could account for the differences presented here, especially
between the top and bottom parts of the graph. Some students may also have interpreted the question as asking about the longest piece of text they had to read for their test language lessons excluding school assignments set during, for example, winter or summer breaks.
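The expansion-rate caveat above is simple arithmetic: even a generous rate cannot bridge the reported gaps. A minimal sketch, with hypothetical rates:

```python
# Rough illustration of the "expansion rate" point: the same text is
# longer in some languages than in others. The rate used below is a
# hypothetical value for illustration, not a measured one.
def equivalent_pages(pages_in_english: float, expansion_rate: float) -> float:
    """Pages a text of `pages_in_english` would occupy after translation,
    given an expansion rate relative to English (1.0 = same length)."""
    return pages_in_english * expansion_rate

# A 100-page English text at a 25% expansion rate is about 125 pages --
# nowhere near enough to explain gaps such as 70% versus 6% of students
# reporting texts of more than 100 pages.
assert equivalent_pages(100, 1.25) == 125.0
```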
Figure 6.5 Length of the longest piece of text that students had to read for school
[Stacked bar chart, one bar per country/economy (0 to 100% on each side), of the percentage of students who reported that the longest piece of text they had to read for school was 10 pages or less, between 11 and 100 pages, or 101 pages or more. Finland, Denmark and Canada appear at the top.]
Countries and economies are ranked in descending order of the percentage of students who had to read 101 pages or more, for school.
Source: OECD, PISA 2018 Database, Table B.6.10.
StatLink: https://doi.org/10.1787/888934240389
The results also showed that there are differences between the longest piece of text boys and girls reported reading for school.
For example, in 50 countries/economies, there are differences between boys and girls reporting that the longest piece of text
they had to read for school was 10 pages or less (Table B.6.10). In 20 countries/economies, more boys than girls reported
reading texts of 10 pages or less while in 30 of them more girls than boys reported reading these texts. No differences were
observed in 27 countries/economies. In Macao (China), 11 percentage points more girls than boys reported reading texts of 10 pages or less, while around 6 percentage points more boys than girls reported reading texts of more than 100 pages. In Belarus, Romania, and Ukraine, more than 4 percentage points more boys than girls reported reading texts of 10 pages or less, while over 10 percentage points more girls than boys reported reading texts of more than 100 pages. Most of these differences seem to reflect practices within schools, or even within the same class, rather than different practices between schools. On average
across OECD countries, approximately 13% of the variance in the average length of the longest piece of text read for school lies
between schools (Table B.6.12a).
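The 13% figure is the share of total variance in text length that lies between schools rather than within them. A rough plug-in version of that decomposition, with made-up school data (the report's estimates come from a fitted multilevel model, not this raw calculation):

```python
import numpy as np

def between_school_variance_share(values_by_school):
    """Share of total variance that lies between schools: the variance
    of school means around the grand mean, divided by the variance of
    the pooled observations. A simple plug-in version of the
    decomposition behind statements such as "about 13% of the variance
    lies between schools"."""
    pooled = np.concatenate([np.asarray(v, float) for v in values_by_school])
    grand_mean = pooled.mean()
    n = pooled.size
    between = sum(len(v) * (np.mean(v) - grand_mean) ** 2
                  for v in values_by_school) / n
    total = pooled.var()  # population variance of pooled observations
    return between / total

# Two hypothetical schools with similar text lengths within each school
# but very different means: almost all variance is between schools.
share = between_school_variance_share([[10, 12, 11], [48, 52, 50]])
assert share > 0.9
```

A share of 13%, as reported, means the opposite pattern: most of the variation in assigned text length occurs among students of the same school.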
Figure 6.6 shows the PISA reading score for students who reported that the longest piece of text they had to read for school
during the last academic year was 10 pages or less, between 11 and 100 pages, or 101 pages or more. The relationship between
reading longer pieces of text for school and reading performance is clear. On average across OECD countries, students who had
to read longer pieces of text for school (101 pages or more) scored between 14 and 55 points higher in reading than those who reported reading shorter pieces of text (between 11 and 100 pages, and 10 pages or less, respectively). The magnitude of the difference between students who reported reading texts of 101 pages or more and those who reported reading 10 pages or less was 31 points after accounting for students’ and schools’ socio-economic profiles and students’ gender, on average across OECD countries (Table B.6.11a).
Figure 6.6 Reading performance, by the length of text read for school
OECD average
[Bar chart of PISA reading scores (scale roughly 350 to 550) for three categories: 10 pages or less, between 11 and 100 pages, and 101 pages or more. Annotation: reading longer pieces of text for school is associated with students’ reading skills.]
Figure 6.7 The length of text read for school, by proficiency levels and gender
Percentage of students who reported that during the last academic year the longest piece of text that they had to read for school was
101 pages or more, OECD average
[Bar chart, by gender (girls and boys, 0 to 70%), of the percentage of students who reported reading 101 pages or more, for students who scored below Level 2 in reading (low achievers)*, at Levels 2, 3 or 4, and at Level 5 or above (top performers).]
When looking by performance level, more than half of top performers in reading (Level 5 or above) reported reading 101 pages
or more for school across OECD countries while only about one in four students reported reading that amount for school among
low performers (below Level 2). The gender difference among low performers is statistically significant: there are 4 percentage points more boys than girls. However, the gender differences among top performers were not statistically significant (Figure 6.7 and Table B.6.10).
Figure 6.8 shows the score-point differences in reading between the students who reported that the longest piece of text they
had to read for school during the last academic year was “101 pages or more”, and those who reported “10 pages or less”.
In 60 countries and economies (or four in five), students who reported that the longest piece of text they had to read for school was 101 pages or more scored significantly higher in reading than those who reported 10 pages or less, after accounting for students’ and schools’ socio-economic profile and students’ gender. In 10 countries/economies, the differences were not statistically significant. In six other education systems, the differences were in favour of students who reported that the longest piece of text was 10 pages or less. The negative differences in these six education systems were also observed between the categories “10 pages or less” and “11 and 100 pages”, except in Japan, where there are no significant differences between these two categories. In Australia, Belarus, Finland, Kazakhstan, Russia, Ukraine and the United Kingdom, the score-point differences are at least 60 points in favour of longer pieces of text after accounting for students’ and schools’ socio-economic profile and students’ gender. In contrast, students in Jordan, Morocco, and Saudi Arabia who reported that the longest piece of text they had to read for school was 101 pages or more scored at least 30 points lower in reading – yet 5% or fewer students reported reading 101 pages or more in those countries (Table B.6.11a).
Four of the top-performing countries and economies in PISA – Japan, Korea, Macao (China), and Singapore – either did not show differences in terms of the length of texts used for school or showed them in a negative direction (i.e. favouring shorter texts). However, it is important to take into account that only 9% of students in Korea, 4% in Japan, 13% in Macao (China) and 16% in Singapore reported that the longest piece of text they had to read for school during the last academic year was more than 100 pages (Table B.6.10). There is no single standard text length for school practice across all countries/economies. Nevertheless, the results are consistent when correlating the average length of the longest piece of text read for school with performance
(Figure 6.9 and Table B.6.12a).
PISA 2018 also asked teachers how many pages constituted the longest piece of text their students had to read for their
reading and/or language lessons during the last academic year. Although teachers and students were asked the same question,
it is important to note that the following results represent the average number of pages assigned by teachers in the school where
the student was enrolled, not necessarily what his/her actual teacher assigned. Teachers’ and students’ reports are strongly
correlated (r = 0.85, system-level) among the 19 countries and economies with available data. This means that countries and
economies where students reported reading on average longer pieces of text for their reading/language lessons were also the
countries/economies where their teachers reported the same. For example, Chile, Spain, the United Kingdom, and the United States appeared at the top of both indicators.
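The system-level correlation (r = 0.85) is an ordinary Pearson correlation computed over one pair of country-level averages per system. A sketch with hypothetical numbers, not the actual PISA data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired series,
    e.g. one (student-report, teacher-report) pair per system."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

# Hypothetical system-level averages of the longest text length (pages):
# one value per country from student reports, one from teacher reports.
students = [46, 115, 80, 60, 95, 30, 70]
teachers = [45, 60, 75, 55, 90, 35, 65]
r = pearson_r(students, teachers)
assert -1.0 <= r <= 1.0
```

With 19 systems, as in the report, a single `pearson_r` call over the 19 paired averages yields the system-level coefficient; the within-country correlation of 0.13 discussed next is computed over students instead.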
Within countries, however, the average correlation across OECD countries2 was 0.13, indicating that teachers and students did not always agree on the number of pages reported. For example, students in Chinese Taipei reported reading, on average, more than twice the number of pages reported by their test language teachers (46 versus 115). Conversely, students in Albania,
the Dominican Republic, and Panama reported about half the number of pages reported by teachers. In schools attended by
15-year-olds, the average text length reported by teachers in Korea and Macao (China) was quite consistent with that reported
by students (45 versus 47 pages in Korea, and 48 versus 60 pages in Macao (China)). Based on teachers’ reports, the association between the longest piece of text read for test language lessons and reading performance remains negative in Macao (China); in Korea, however, the association – which was non-significant based on students’ reports – becomes significant. In other words, schools where teachers on average reported that their students read longer pieces of text for their test language lessons performed better in reading (Tables B.6.12a and B.6.12b).
The content of texts is probably as crucial as, if not more crucial than, their length. It is important to bear in mind that in some countries/economies it may be more common to assign books for students to read, while in others teachers may focus more on a textbook, which is a compilation of short texts. Nevertheless, these results show that most of the high performers in reading read longer pieces of text for school. At the same time, schools with stronger readers may assign them longer readings than schools with less proficient readers; in other words, schools possibly assign longer readings to stronger readers because they can handle them. Although it is not possible to determine causal relationships between these factors, the results show that this association is observed in the vast majority of countries/economies participating in PISA.
Figure 6.8 Reading performance, by the length of the text read for school
Score-point differences between students who reported that the longest piece of text that they had to read for school was “101
pages or more” and those who reported “10 pages or less”
Before accounting for students' and schools' socio-economic profile¹ and gender
After accounting for students' and schools' socio-economic profile and gender
[Horizontal bar chart, one bar per country/economy, ranked from Ukraine, the United Kingdom, Belarus and Russia at the top to Japan, Macao (China), Jordan, Morocco and Saudi Arabia at the bottom; score-point differences range from about -80 to 140. Annotation: students who reported that the longest piece of text they had to read for school was 101 pages or more scored higher on reading performance than those who reported 10 pages or less.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: Statistically significant differences are shown in a darker tone.
Countries and economies are ranked in descending order of the score-point differences in reading, after accounting for students’ and schools’ socio-economic profile
and gender.
Source: OECD, PISA 2018 Database, Table B.6.11a.
StatLink: https://doi.org/10.1787/888934240446
Figure 6.9 System-level relationship between reading performance and the average length of the longest piece of
text read for school
[Scatter plot of reading performance (OECD average: 487 score points) against the average length of the longest piece of text read for school (number of pages). R = 0.51 for all countries and economies; a separate correlation is shown for OECD countries. B-S-J-Z (China) and Singapore appear at the top of the performance axis; Spain¹ is among the countries shown.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.2.1 and B.6.12a.
StatLink: https://doi.org/10.1787/888934240465
The PISA 2018 ICT familiarity questionnaire asked students whether digital devices had been used in the previous month for
learning and teaching during school lessons, and how much time they spent using digital devices during classroom lessons and
outside of classroom lessons for the different subjects. On average across OECD countries with available data, 37% of students
reported that both the teacher and students used digital devices for learning or teaching during the last month, 25% reported that
only teachers used them, 11% reported that only students did and 26% reported that they did not use them at all (Figure 6.10).
More than 90% of students in Australia, Denmark, Finland, New Zealand, Sweden and the United States reported that they used
digital devices for learning and teaching. In comparison, 73% of students in Japan and 54% in Morocco and Panama reported they
did not use them at all in the previous month (Table B.6.15).
The average time per week that students spent using digital devices during and outside of classroom lessons for test language lessons across OECD countries was 41 minutes. Students in Australia, New Zealand, Sweden and the United States reported spending more than 1 hour a week, and students in Denmark reported about 2 hours a week. In contrast, students in Slovenia and Chinese Taipei reported spending about 23 and 24 minutes a week, respectively, while students in Japan reported spending only 10 minutes a week (Table B.6.15).
Figure 6.10 Frequency of use of digital device for teaching and learning in test language lessons
Percentage of students who reported that during the last month a digital device has been used for learning and teaching,
OECD average
[Stacked horizontal bar, 0 to 100%: used by both the teacher and students; used only by the teacher; used only by students; not used.]
Figure 6.11 shows the system-level relationship between reading performance and time spent using digital devices for school.
There is no association between the average amount of time per week students spent using digital devices for schoolwork and
student reading performance at the system level. However, a positive relationship is observed at the system level when the
time spent using digital devices for school is compared with the test item that assesses emergent aspects of reading such as
distinguishing facts from opinions (CR551Q06, see Box 2.1 in Chapter 2). This means that education systems where students
reported spending a greater amount of time per week using digital devices for schoolwork were more likely to solve the item
about distinguishing facts from opinions after accounting for GDP.
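“After accounting for GDP” can be read as a partial correlation: correlate the residuals of both variables after regressing each on GDP. This is one standard construction and may not match the report's exact procedure; the data below are simulated.

```python
import numpy as np

def partial_corr(x, y, control):
    """Correlation between x and y after accounting for a control
    variable (e.g. GDP per capita): correlate the residuals of x and y
    from simple linear regressions on the control."""
    x, y, c = (np.asarray(v, dtype=float) for v in (x, y, control))
    X = np.column_stack([np.ones_like(c), c])
    def residuals(v):
        beta, *_ = np.linalg.lstsq(X, v, rcond=None)
        return v - X @ beta
    rx, ry = residuals(x), residuals(y)
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

# Simulated example: if x and y co-move only through the control,
# the partial correlation is near zero even when the raw correlation
# between x and y is large.
rng = np.random.default_rng(1)
gdp = rng.normal(0, 1, 500)
x = gdp + rng.normal(0, 0.5, 500)
y = gdp + rng.normal(0, 0.5, 500)
assert abs(partial_corr(x, y, gdp)) < 0.2
```

Applied at the system level, the report's positive R of 0.33 therefore reflects a relationship between device time and the facts-versus-opinions item that survives after GDP has been residualised out.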
Figure 6.11 Reading performance and time spent using digital devices for school
[Scatter plot of reading performance against minutes per week spent using digital devices for school (0 to 120 minutes; OECD average: 41 minutes). R = 0.11 for all countries and economies. Using instead the reading item on distinguishing facts from opinions (CR551Q06): R = 0.33 for all countries and economies and R = 0.37 for OECD countries, after accounting for GDP.]
1. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Source: OECD, PISA 2018 Database, Tables B.2.1 and B.6.15.
StatLink: https://doi.org/10.1787/888934240503
Within countries, the relationship between reading performance and time spent using digital devices for schoolwork was generally
negative but with a few exceptions. This negative association, after accounting for students’ and schools’ socio-economic status,
was observed in 36 countries and economies, while in Australia, Denmark, Korea, New Zealand, and the United States this relationship was positive after accounting for students’ and schools’ socio-economic status. When looking at the programmes in which students are enrolled, no differences were observed, on average across OECD countries, in the amount of time digital devices were used for schoolwork between students attending general and vocational programmes. However, in Costa Rica, Japan, Korea,
Luxembourg, and Turkey, students in general programmes spent, on average, more time using digital devices for schoolwork
than students enrolled in vocational programmes. In contrast, students attending vocational programmes in 11 countries and
economies spent more time using digital devices for schoolwork than students in general programmes. No differences were
observed in the remaining 16 countries/economies with available data (Table B.6.15).
The great variability across countries in the relationship between time spent using digital devices for school and reading
performance suggests that how digital devices are used may matter more than how much time is spent on them. The PISA 2018
ICT familiarity questionnaire asked students how often they use digital devices for a wide range of activities at school. Figure 6.12
shows that the most common use of digital devices at school across OECD countries is for browsing the Internet for schoolwork
(75% of students reported doing this activity between once a month and every day). The least common use was playing simulations at school (34% of students reported doing this activity between once a month and every day). There are also considerable
differences across countries in the use of digital devices at school. For example, more than 90% of students in Japan and
70% in Korea reported that they never did homework on a school computer while only 22% of students in the United States and
15% in Denmark reported the same. When looking at the use of digital devices for browsing the Internet for schoolwork, more
than half of students in Japan and Korea also reported never using them while this percentage in Denmark and Sweden is just
3% (Table B.6.14).
Percentage of students who reported using digital devices for the following activities at school, at least once a month,
OECD average
[Bar chart showing, for each activity, the bottom country/economy, the OECD average and the top country/economy (0 to 100%). Activities, in ascending order of the OECD-average percentage: playing simulations at school; posting my work on the school’s website; doing homework on a school computer; using learning apps or learning websites; downloading, uploading or browsing material from the school’s website (e.g. <intranet>); using school computers for group work and communication with other students; practicing and drilling, such as for foreign language learning or mathematics; using email at school; <chatting online> at school; browsing the Internet for schoolwork.]
Items are ranked in ascending order of the percentage of students within OECD average.
Source: OECD, PISA 2018 Database, Table B.6.14.
StatLink: https://doi.org/10.1787/888934240522
How digital devices are used at school matters: some activities are positively associated with reading performance while others are negatively associated. Figure 6.13 shows the relationship between reading performance and the type of school activities done on digital devices, on average across OECD countries. Students who reported browsing the Internet for schoolwork scored 6 points higher in reading than those who reported never doing so, after accounting for students’ and schools’ socio-economic profiles. In contrast, students who reported posting work on the school’s website or playing simulations at school scored around 45 points lower in reading than those who reported never doing so.
Figure 6.13 Relationship between reading performance and the type of school activities done on digital devices
Score-point difference in reading between students who reported using digital devices for the following activities at school
compared to those who reported that never did, OECD average
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Bar chart of score-point differences for each activity, before and after accounting for students’ and schools’ socio-economic profile¹, ranging from about -55 to 20 points. Browsing the Internet for schoolwork is positively associated with reading performance; the other activities, from playing simulations at school (most negative) through posting my work on the school’s website, doing homework on a school computer, downloading, uploading or browsing material from the school’s website, using learning apps or websites, using school computers for group work, practicing and drilling, using email at school, to <chatting online> at school, are negatively associated.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
Note: All score-point differences are statistically significant.
Items are ranked in ascending order of the score-point differences in reading, after accounting for students’ and schools’ socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.6.16.
StatLink: https://doi.org/10.1787/888934240541
There are, however, remarkable differences across countries. In 18 countries/economies, the association between browsing
the Internet for schoolwork and reading performance is positive – after accounting for students’ and schools’ socio-economic
profiles. In contrast, in 17 countries/economies, this association is negative and in another 17 the association is not statistically
significant. However, all countries/economies showed a negative relationship between playing simulations at school and
reading performance even after accounting for students’ and schools’ socio-economic profiles (Figure 6.14 and Figure 6.15).
The rest of the digital activities previously mentioned also presented negative relationships with reading performance across the
vast majority of countries/economies (Table B.6.16).
It is probably not surprising to find Australia, Denmark, New Zealand, and the United States as some of the countries with
the strongest positive association between browsing the Internet for schoolwork and reading performance (Table B.6.16).
These countries also showed that about half or more of their students do this activity every day or almost every day (Table B.6.14).
At the same time, these countries showed a positive relationship between reading performance and time spent using digital
devices for schoolwork (Table B.6.15). The only exception is Korea. Despite showing a positive relationship between reading
performance and time spent using digital devices for schoolwork, the association between browsing the Internet for schoolwork
and reading performance was not significant. Most students in Korea reported that they never or hardly ever use digital devices
at school for any of the activities mentioned above. This suggests that other uses of digital devices might be related to the positive
association between reading performance and time using digital devices in Korea.
It is important to bear in mind that the negative correlation between the use of digital devices for school and reading performance
might be subject to selection bias and that students undertaking these activities may not necessarily represent the full population
of students. Nonetheless, these results suggest that digital devices are more helpful for some school activities than for others, and that their use may sometimes displace instructional activities that would be better done without them (Falck, Mang and Woessmann, 2017[8]; OECD, 2019[9]). For example, browsing the Internet for schoolwork may be a more effective way of reading for information than performing the same activity without digital devices.
Playing simulations at school or posting work on the school’s website could perhaps displace other activities more beneficial
to student reading outcomes. Doing homework on a school computer is also negatively associated with reading performance.
Although this association holds after accounting for students’ and schools’ socio-economic profile, other factors could also play a
role. For example, prior knowledge is often an important moderator in this relationship as students with more reading difficulties
may spend more time doing homework. In addition, whether students do homework autonomously is more important than the
time spent doing homework (Fernández-Alonso, Suárez-Álvarez and Muñiz, 2015[10]), and doing it at school could imply doing
homework with the help of other peers or teachers.
132 © OECD 2021 » PISA 21st-Century Readers: Developing literacy skills in a digital world
Figure 6.14 Reading performance and browsing the Internet for schoolwork
Score-point difference in reading of students who reported using digital devices for browsing the Internet for schoolwork
compared to those who reported that they never did it
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Chart; vertical axis: score-point difference, from -60 to 80. Annotation: students who reported using digital devices for browsing the Internet for schoolwork scored higher on reading performance than those who reported that they never did it.]
[Countries and economies listed along the horizontal axis, from Sweden to Morocco.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: Statistically significant differences are shown in a darker tone.
Countries and economies are ranked in ascending order of the score-point differences in reading, after accounting for students’ and schools’ socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.6.16.
12 https://doi.org/10.1787/888934240560
Figure 6.15 Reading performance and playing simulations at school
Score-point difference in reading of students who reported using digital devices for playing simulations compared to those who reported that they never did it
Before accounting for students' and schools' socio-economic profile¹
After accounting for students' and schools' socio-economic profile
[Chart; vertical axis: score-point difference, from 0 to -100. Annotation: in all countries and economies, students who reported using digital devices for playing simulations at school scored lower on reading performance than those who reported that they never did it.]
[Countries and economies listed along the horizontal axis, from Italy to Finland.]
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Readers’ guide of this report or PISA 2018 Results (Volume I): What Students Know and Can Do, Annex A9.
Note: All score-point differences in reading are statistically significant.
Countries and economies are ranked in descending order of the score-point differences in reading, after accounting for students’ and schools’ socio-economic profile.
Source: OECD, PISA 2018 Database, Table B.6.16.
12 https://doi.org/10.1787/888934240579
6 Teaching and learning literacy skills in a digital world
The magnitude of these negative associations also holds when comparing students attending general programmes with those attending vocational programmes, except for playing simulations at school and chatting online at school, where the effect is smaller yet still statistically significant. Browsing the Internet for schoolwork, however, only has a significant association with performance among students in general programmes (Table B.6.17).
In conclusion, the amount of time teachers spend using digital devices in teaching and learning activities is often negatively associated
with reading performance. According to the data analysed here, few have managed to integrate digital devices in teaching and
learning activities effectively, and they are the exception rather than the rule. The association between time spent using digital
devices and reading performance is only positive in Australia, Denmark, Korea, New Zealand, and the United States. The results
also imply that how digital devices are used may matter more than how much time is spent on them. Browsing the Internet for schoolwork is the digital-device activity most strongly related to reading performance on average across OECD countries. The countries listed above (except Korea) are among those with the highest share of students who browse most days or every day and show a strong association with reading performance. Posting work on the school’s website and playing simulations
at school are the most negatively associated with reading performance on average across OECD countries, and Korea is among
the countries with the highest share of students who do not do such activities (Table B.6.14).
Integrating digital devices into regular teaching and learning activities may also be important for emergent aspects of
reading such as improving students’ critical thinking when comprehending multiple texts online. For example, as extensively
discussed in Chapter 2, learning how to detect biased information in school is not only likely to contribute to better reading
skills in digital environments but would also help to maximise online opportunities while reducing online risks for all students.
Classroom interventions aimed at developing students’ assessment of information reliability have proven effective in improving students’ critical thinking when comprehending multiple documents (Pérez et al., 2018[11]). Providing access and promoting
the use of digital tools does not automatically lead to better results. Digital technologies have the potential to amplify teaching
and learning. This is all the more so when they are combined with innovative teaching and learning methods (e.g. gamification or computational thinking; see Paniagua and Istance, 2018[12]), quality teacher professional learning (e.g. through developing teachers’ digital literacy; see Boeskens, Nusche and Yurita, 2020[13]; Minea-Pic, 2020[14]), and instructional designs that leverage the pedagogical effect of technology (Sung, Chang and Liu, 2016[15]). Finally, yet importantly, the use
of digital technologies should also be aligned with health-promoting activities that are compatible with enhancing physical and
mental well-being (Burns and Gottschalk, 2019[16]; Burns and Gottschalk, 2020[17]).
Notes
1. The partial correlation is calculated using the percentage of students who reported reading fiction books for school more than once a month
(the sum of categories 3 and 4) and the percentage of students who reported reading fiction books more than once a month because they wanted to (the sum of categories 4 and 5, Table B.6.9), after accounting for per capita GDP (Figure I.4.3 from PISA 2018 Results (Volume I) - What
Students Know and Can Do (OECD, 2019[6])), and reading performance (Table B.2.1).
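The partial correlation described in Note 1 amounts to correlating the residuals of each percentage after regressing it on per capita GDP. A minimal sketch with made-up country-level figures (the variable names and values are illustrative, not PISA data):

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear effect of z:
    regress each of x and y on z, then correlate the residuals."""
    Z = np.column_stack([np.ones(len(z)), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Illustrative (made-up) data for 50 countries: share of students reading
# fiction for school, share reading fiction because they wanted to, and
# per capita GDP (thousands).
rng = np.random.default_rng(1)
gdp = rng.normal(40, 10, 50)
for_school = 30 + 0.3 * gdp + rng.normal(0, 5, 50)
wanted_to = 25 + 0.2 * gdp + 0.4 * for_school + rng.normal(0, 5, 50)

print(round(partial_corr(for_school, wanted_to, gdp), 2))
```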
2. The Teacher questionnaire for PISA 2018 was conducted in seven OECD countries: Chile, Germany, Korea, Portugal, Spain, the United Kingdom,
and the United States.
References
Boeskens, L., D. Nusche and M. Yurita (2020), “Policies to support teachers’ continuing professional learning: A conceptual framework [13]
and mapping of OECD data”, OECD Education Working Papers, No. 235, OECD Publishing, Paris, https://dx.doi.org/10.1787/247b7c4d-en.
Burns, T. and F. Gottschalk (eds.) (2020), Education in the Digital Age: Healthy and Happy Children, Educational Research and Innovation, [17]
OECD Publishing, Paris, https://dx.doi.org/10.1787/1209166a-en.
Burns, T. and F. Gottschalk (eds.) (2019), Educating 21st Century Children: Emotional Well-being in the Digital Age, Educational Research [16]
and Innovation, OECD Publishing, Paris, https://dx.doi.org/10.1787/b7f33425-en.
Carr, N. (2010), The Shallows: What the Internet Is Doing to Our Brains, W. W. Norton & Company. [5]
Falck, O., C. Mang and L. Woessmann (2017), “Virtually No Effect? Different Uses of Classroom Computers and their Effect on Student [8]
Achievement”, Oxford Bulletin of Economics and Statistics, Vol. 80/1, pp. 1-38, http://dx.doi.org/10.1111/obes.12192.
Fernández-Alonso, R., J. Suárez-Álvarez and J. Muñiz (2015), “Adolescents’ homework performance in mathematics and science: [10]
Personal factors and teaching practices.”, Journal of Educational Psychology, Vol. 107/4, pp. 1075-1085,
http://dx.doi.org/10.1037/edu0000032.
Firth, J. et al. (2019), “The “online brain”: how the Internet may be changing our cognition”, World Psychiatry, Vol. 18/2, pp. 119-129, [4]
http://dx.doi.org/10.1002/wps.20617.
Lorenz-Spreen, P. et al. (2019), “Accelerating dynamics of collective attention”, Nature Communications, Vol. 10/1, [3]
http://dx.doi.org/10.1038/s41467-019-09311-w.
Minea-Pic, A. (2020), “Innovating teachers’ professional learning through digital technologies”, OECD Education Working Papers, No. 237, [14]
OECD Publishing, Paris, https://dx.doi.org/10.1787/3329fae9-en.
OECD (2020), PISA 2018 Results (Volume V): Effective Policies, Successful Schools, PISA, OECD Publishing, Paris, [7]
https://dx.doi.org/10.1787/ca768d40-en.
OECD (2019), OECD Skills Outlook 2019: Thriving in a Digital World, OECD Publishing, Paris, https://dx.doi.org/10.1787/df80bc12-en. [9]
OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, [6]
https://dx.doi.org/10.1787/5f07c754-en.
OECD (2019), PISA 2018 Results (Volume III): What School Life Means for Students’ Lives, PISA, OECD Publishing, Paris, [1]
https://dx.doi.org/10.1787/acd78851-en.
Paniagua, A. and D. Istance (2018), Teachers as Designers of Learning Environments: The Importance of Innovative Pedagogies, Educational [12]
Research and Innovation, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264085374-en.
Pérez, A. et al. (2018), “Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical [11]
source dimensions”, Learning and Instruction, Vol. 58, pp. 53-64, http://dx.doi.org/10.1016/j.learninstruc.2018.04.006.
Sung, Y., K. Chang and T. Liu (2016), “The effects of integrating mobile devices with teaching and learning on students’ learning [15]
performance: A meta-analysis and research synthesis”, Computers & Education, Vol. 94, pp. 252-275,
http://dx.doi.org/10.1016/j.compedu.2015.11.008.
Wolf, M. (2018), Reader, Come Home: The Reading Brain in a Digital World, Harper, New York. [2]
7
Developing literacy skills in a digital world:
Implications for education policy and practice
Before the printing press, knowledge spread orally and handwritten books were available only to wealthy elites. The printing
press allowed for the mass production of printed books, making written information widely available, and encouraging and
incentivising the wide development of reading skills. Still, the production of books remained in the hands of the few, not the many.
Texts were usually carefully curated at the very least, if not subject to authoritative endorsement.
Digital technologies created another revolution of the written word in the 21st century. Suddenly, everyone could become
a journalist or a publisher. Literacy in the 20th century was about extracting and processing pre-coded information; in the
21st century, it is about constructing and validating knowledge. In the past, teachers could tell students to look up information
in an encyclopaedia, and to rely on that information as accurate and true. Nowadays, Google presents them with millions of
answers, and nobody tells them what is right or wrong, true or untrue. The more knowledge technology allows us to search
and access, the more important it becomes to develop deep understanding and the capacity to navigate ambiguity, to triangulate
viewpoints, and to make sense of content.
The fact that advancements in reading literacy, as measured by PISA, have fallen sharply behind the evolution of the nature of
information has profound consequences in a world where virality seems sometimes privileged over quality in the distribution
of information. In the “post-truth” climate in which we now find ourselves, assertions that “feel right” but have no basis in fact
become accepted as fact. Algorithms that sort us into groups of like-minded individuals create social media echo chambers
that amplify our views, and leave us insulated from opposing arguments that may alter our beliefs. These virtual bubbles
homogenise opinions and polarise our societies; and they can have a significant – and adverse – impact on democratic processes.
Those algorithms are not a design flaw; they are how social media work. Attention is scarce, but information is abundant. We are living in a digital bazaar where anything not built for the network age is cracking apart under its pressure.
This is the age of acceleration, a speeding-up of human experience through the compound impact of disruptive forces on every
aspect of our lives. It is also a time of political contestation. The priority of the wider international community is to reconcile the
needs and interests of individuals, communities and nations within an equitable framework based on open borders and markets
and a sustainable future. But where disruption has brought a sense of dislocation, political forces emerge that are offering closed
borders, the protection of traditional jobs and a promise to put the interests of today’s generation over those of the future.
The fake news phenomenon can significantly amplify those forces.
The question is then: how can we live successfully in this new world of information? To what extent do we approach this issue from
a consumer protection angle; that is, working on it from the supply side? Or do we approach it from a skills or demand side angle;
that is, strengthening people’s capacity to better navigate information? PISA offers important insights on the latter. The PISA 2018
reading framework was devised to include essential reading skills in a digital world. This report aims to understand better how
15-year-old students are developing reading skills to navigate the technology-rich 21st century. This chapter offers a synthesis
of the findings. It focuses on policies and practices that can harness digitalisation to create better learning opportunities and
counter some of digitalisation’s disruptive effects in and for education.
While interpretative skills are needed to read printed books, digital readers must also employ new techniques simply to access the information they read. Readers now navigate through multiple sources of text. They must be more selective in what they read, given the vast quantities of information available at the click of a button. Digital readers not only need to follow linear information structures but also to construct their own texts by selecting and assessing information from various sources. Reading in a digital world
requires continuously evaluating the quality and validity of different sources, navigating through ambiguity and constructing
knowledge. Individuals can benefit from effective strategies that help them think about, monitor and adjust their reading for a
particular goal (also known as metacognitive reading strategies). These strategies can also help readers’ motivation to persevere
in the face of difficulties (also known as self-efficacy).
Reading in a digital world is even more challenging given that the increasing production and consumption of media content rapidly
exhausts people’s attention. Real-time 24/7 news and social media reactions spread across the globe in a matter of seconds. It is no longer rare to see an online article’s estimated reading time (in minutes) displayed before the topic has even been introduced (e.g. in online newspapers), to listen to 18-minute inspirational talks (e.g. TED talks), or to limit our thoughts to 280 characters on Twitter (about 56 words).
PISA 2018 was conducted before the COVID-19 pandemic. The findings discussed in this report do not reflect the impact of
the pandemic but they are useful when considering that a) the digital divide, exacerbated by school disruptions, will have likely
amplified the learning gaps discussed in this report, b) where students are learning at home and on their own, it becomes even
more urgent to develop advanced reading skills such as critical reading to prepare for the demands of an increasingly volatile,
uncertain, and ambiguous world, and c) understanding the challenges education systems faced before the pandemic may help
them act on solving those issues more effectively and become more resilient.
ARE STUDENTS WHO HAD THE OPPORTUNITY TO LEARN DIGITAL SKILLS IN SCHOOL MORE LIKELY TO
DISTINGUISH FACTS FROM OPINIONS IN THE PISA READING TEST?
PISA data show that Internet use among 15-year-olds rose from 21 hours a week in PISA 2012 to 35 hours a week in PISA 2018. This represents a 66% increase in just six years and is almost as much time as the average adult workweek in OECD countries.
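As a quick check of the arithmetic behind the 66% figure:

```python
# Weekly Internet use reported in PISA 2012 and PISA 2018 (hours).
hours_2012 = 21
hours_2018 = 35

# Relative increase: (new - old) / old.
increase = (hours_2018 - hours_2012) / hours_2012
print(f"{increase:.1%}")  # prints 66.7%
```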
Yet, only about half of students across OECD countries reported being trained at school on how to recognise whether information
is biased. An average of approximately 40% of students in OECD countries responded that clicking on the link of a phishing
e-mail was somewhat appropriate or very appropriate. Around 8.7% of students were top performers in reading, meaning
that they attained Level 5 or 6 in the PISA reading test. At these levels, students are able to comprehend lengthy texts, deal
with concepts that are abstract or counterintuitive, and establish distinctions between fact and opinion, based on implicit cues
pertaining to the content or source of the information. This report goes one step further and pays special attention to the
estimated percentage correct in the PISA reading released item – Rapa Nui - that focuses on distinguishing fact from opinion
as one of the most emergent aspects of reading in digital environments. The PISA reading item of distinguishing fact from
opinion was estimated to be 47% correct1 on average across OECD countries. This means that students need to be at least at Level 5 in the PISA test to be likely to obtain full credit on this item. The estimated percentage correct of this item was higher
than 60% in Australia, Canada, the Netherlands, New Zealand, Turkey, the United Kingdom and the United States while lower than
20% in Georgia, Indonesia, Kosovo, Morocco, Panama, and the Philippines. Among OECD countries, the estimated percentage
correct was lower than 30% in Colombia, Costa Rica, the Czech Republic, Korea, and the Slovak Republic. The United States was
the country with the highest percentage correct in this item (69%) and was above the average in the total reading score (505).
However, Korea, which performed above the OECD average in reading, scored below the average on this particular item, while Turkey, which performed below the OECD average in reading, had the highest percentage correct (63%) after the United States (69%) and the United Kingdom (65%).
PISA 2018 shows that education systems with a higher proportion of students who were taught digital skills in school have a
higher percentage correct in the reading item of distinguishing facts from opinions. The same happened with education systems
with a higher proportion of students who reported having a computer for schoolwork linked to the Internet. These associations
still hold even after accounting for the country per capita GDP. In Hong Kong (China) and Singapore, the percentage of students
who had access to training on how to detect biased information in school and their percentage correct in the reading item of
distinguishing fact from opinion was above the OECD average. However, students in Chinese Taipei scored below the OECD
average in this item even though the proportion of students reporting that they were taught how to detect biased information in
school was well above the OECD average.
These results may reflect differences in curriculum across countries but practice and out-of-school experiences may also be
explanations. Parents can play an essential role in providing access and encouraging appropriate use of digital devices at home,
and conveying positive attitudes towards reading. For instance, students whose parents enjoy reading the most tend to report
that they read for enjoyment more frequently than those whose parents enjoy reading the least. However, for many of the most
disadvantaged students, schools are the only way to learn and practice digital skills. The next section provides further details on
this matter.
Bottom line
Education systems with a higher proportion of students who were taught how to detect biased information in school
were more likely to distinguish fact from opinion in the PISA reading assessment, even after accounting for country per
capita GDP.
ARE SCHOOLS READY TO COMPENSATE FOR THE DIGITAL DIVIDE AND LEVERAGE THE POTENTIAL OF
TECHNOLOGY?
Digital technologies offer great opportunities as to what, how, where, and when people learn. However, digital divides mirror
or even amplify prevailing socio-economic gaps. Remote learning, which most students around the world experienced because
of the COVID-19 health crisis, often requires or benefits from having access to a computer linked to the Internet at home for
schoolwork. In PISA 2018, 88% of students had both a connection to the Internet at home and a computer that they could use
for schoolwork – 28 percentage points more than in PISA 2003. However, in the Dominican Republic, Indonesia, Malaysia, Mexico,
Morocco, Peru, the Philippines, Thailand, and Viet Nam, half or less of students had access to both. This percentage was lower
than 20% in rural areas of Indonesia, Mexico, Morocco and the Philippines. Four in five disadvantaged students in Malaysia, Mexico, Morocco, Peru, the Philippines and Viet Nam do not have access to the Internet at home and can only access it at school.
The situation is also worrisome when looking at OECD countries. Socio-economically disadvantaged students in 2018 had
approximately half of the books at home they used to have in 2000 while advantaged students had essentially the same
number. This may explain why the average student from a disadvantaged socio-economic background is more likely to report reading books on digital devices than in paper format, a pattern observed in 30 countries/economies. In B-S-J-Z (China),
Hong Kong (China), and Chinese Taipei, the socio-economic difference in reading books on digital devices more often than in
paper format is at least 10 percentage points. Colombia and Mexico are the only exceptions among OECD countries, where socio-economically advantaged students are more likely to read books on digital devices than disadvantaged students.
In summary, disadvantaged students from OECD countries are increasingly losing the cultural capital of having books in their
home-learning environments, and many of the most disadvantaged students across all participating countries/economies in
PISA 2018 can only access computers linked to the Internet at school.
Providing access to digital technologies at school does not automatically lead to better results. In fact, the amount of time
teachers spend using digital devices in teaching and learning activities is often negatively associated with reading performance.
According to the data analysed in this report, few have managed to integrate digital devices into teaching and learning activities
effectively, and they are the exception rather than the rule. The association between time spent using digital devices and reading
performance is only positive in Australia, Denmark, Korea, New Zealand, and the United States. These countries (except Korea)
are among the countries with the highest share of students who browse the Internet for schoolwork most days or every day and
show a strong association with reading performance. Posting work on the school’s website and playing simulations at school are
the most negatively associated with reading performance on average across OECD countries, and Korea is among the countries
with the highest share of students who do not do such activities. PISA results are correlational, so it is impossible to determine whether these activities result in lower performance or whether low-performing schools use these approaches more frequently.
In either case, this only provides part of the picture. Many other potential benefits of digital technologies fall outside what
PISA 2018 measured but are no less important. For instance, PISA 2022 will provide more insights into how schools used digital
technologies to provide learning opportunities during school disruptions due to the pandemic2. The only thing that seems clear
so far is that integrating digital devices into regular teaching and learning activities is still challenging, and providing access to
digital technologies does not automatically lead to better results. How schools, teachers, and students use digital devices matters
more than how much time is spent on them. The following sections provide further insights into what policies and practices can
lead to better results.
Bottom line
Many of the most disadvantaged students in PISA 2018 can only access computers linked to the Internet at school, but,
unfortunately, providing access to digital technologies at school does not automatically lead to better results.
In other words, proficient readers were more likely to explore the given task to prepare themselves for later questions even
though they were aware that the question did not require them to do so. They were better positioned to locate and collect
information in advance before the more complex multiple-source items were activated. Proficient readers spent enough time
on relevant pages to understand the content. For example, they did not quickly switch between pages. Instead, they allocated
a smaller proportion of time to the initial page and reserved more time for navigating more demanding pages. In short, they
actively regulated their cognition and behaviour to achieve their goal in this particular task.
Students in PISA 2018 were asked to evaluate the effectiveness of different metacognitive reading strategies in understanding
and memorising a text, summarising information, and assessing sources’ credibility. PISA 2018 data shows that, on average
across OECD countries, students who have better knowledge of effective reading strategies are also more likely to show actively
explorative navigation across single- and multiple-source items in the PISA reading assessment. That is not all. These reading
strategies also correlate with overall reading performance. Education systems in which the average student is more aware of
effective reading strategies are also those in which students perform better in the PISA reading assessment. The OECD average
change in reading performance associated with a one-unit increase in the index of knowledge of reading strategies for assessing
the credibility of sources is 36 points after accounting for students’ and schools’ socio-economic status. These findings are
consistent across every country and economy participating in PISA 2018. As reading in digital environments requires many
more self-organisational skills, students benefit from knowing effective metacognitive reading strategies and how to assess
information critically.
Furthermore, when comparing students with similar socio-economic status, those who have better knowledge of effective
reading strategies are more likely to be proficient readers. Knowledge of effective reading strategies is an important mediator in the association between socio-economic status, gender, and reading performance. Concretely, the index of effective reading strategies for assessing the credibility of sources is the one most strongly associated with reading performance after accounting
for background variables while the other two reading strategies (i.e. the indices of student knowledge of reading strategies
for understanding and memorising a text and summarising information) are also associated with reading performance. These
findings are particularly important for education policies and practices. Unlike socio-economic status, which cannot easily be changed, knowledge of effective metacognitive reading strategies can be taught3. For instance, in Austria, Belgium, the Czech
Republic, France, Germany, Luxembourg, the Netherlands, and Switzerland, students attending advantaged schools are more
than two-thirds of a standard deviation ahead in the Index of knowledge of reading strategies for assessing the credibility of
sources compared to students attending disadvantaged schools. Empirical studies have shown that classroom interventions
aimed at developing students’ assessment of information reliability have proven to be effective in improving students’ critical
thinking when comprehending multiple documents4.
Bottom line
Stronger readers tend to have a better knowledge of reading strategies and are more likely to actively explore and navigate
single- and multiple-source items in the PISA reading assessment.
WHAT DOES THE INTERPLAY BETWEEN ENJOYMENT, READING PERFORMANCE AND DIGITAL DEVICES
MEAN FOR STUDENTS?
Students’ reading habits and preferences have changed over the past decades because of the digitalisation of
communication. PISA 2018 data suggest that digital devices are increasingly displacing print media, particularly in activities most
closely tied to reading for information (e.g. newspapers, magazines). In Ireland, for example, the percentage of students who
read print newspapers several times a month or more because they wanted to decreased by 43 percentage points between
2009 and 2018, while reading the news online increased by 44 percentage points. Displacement sometimes even happens within digital
reading activities. In Japan, for example, the percentage of students who read e-mails several times a week or more decreased by
62 percentage points between 2009 and 2018 while chatting online increased by 77 percentage points.
PISA 2018 asked students what describes best how they read books: more often in paper format, more often on digital devices,
or equally often in paper format and on digital devices. Students who reported reading books more often in paper than digital
format perform better in reading and spend more time reading for enjoyment in all participating countries/economies in
PISA 2018. However, this does not mean that print-book readers do not read online. Print-book readers read diverse kinds of
reading materials for pleasure (e.g. books, magazines, newspapers, websites, etc.) more hours a week than digital-book readers,
and the biggest book readers balance their reading time between paper and digital. Compared to students who rarely or never
read books, digital-book readers on average across OECD countries read for enjoyment about 3 hours more a week, print-book
readers about 4, and those who balance both formats about 5 hours or more a week after accounting for students’ and schools’
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 141
7 Developing reading skills in a digital world: Implications for education policy and practice
socio-economic background and gender. Moreover, print-book readers reported chatting online as much as non-print-book
readers. These findings suggest two things. First, time spent reading for enjoyment on digital devices may not always
displace time spent reading for enjoyment in print. Second, the potential benefit of using technology to enhance students’
reading experience seems greater in activities related to reading for information and meeting practical needs than in reading
books for pleasure.
In summary, strong readers tend to read books in paper format or balance their reading time between paper and digital. At the
same time, strong readers tend to read the news more often on digital devices, but not exclusively. In other words, the most
proficient readers seem able to optimise their use of digital technology depending on the activity. Strong readers will use digital
devices to read the news or browse the Internet for schoolwork while still enjoying a book on paper.
Bottom line
The most proficient readers seem to optimise their use of digital technology depending on the activity (e.g. reading news online,
browsing the Internet for schoolwork) while still enjoying reading a book on paper.
The number of students in OECD countries who reported reading for enjoyment has dropped over the last decade, even though
indicators of teachers’ stimulation of students’ reading engagement have significantly increased. One might expect teachers to
stimulate most the students who need it most, since strong readers are already more engaged in reading. However, in
49 countries/economies participating in PISA 2018, boys and students from lower socio-economic backgrounds (who typically
have lower reading performance) perceived less stimulation from their teachers in reading activities.
Students are reading fewer and fewer fiction books. For example, at least 8 percentage points more students reported reading
fiction books several times a month or more in PISA 2009 than in PISA 2018 in Canada, Finland, Kazakhstan, New Zealand,
Sweden, and Thailand. Moreover, less than 20% of students reported reading fiction books several times a month or more in
Belgium, Finland, the Netherlands, Norway, and Slovenia. PISA 2018 data also show that students who reported reading fiction
books for school during the last month are more likely to have also reported reading fiction books because they wanted to.
In addition, most of the high performers in reading also reported reading longer pieces of text for school. These results suggest
that teachers’ assignments to read books for school may actually encourage students to read for pleasure outside of school.
Education systems that aim to improve students’ resilience should understand why disadvantaged students still perceived the
PISA reading assessment as more difficult than advantaged students did even after accounting for students’ reading scores.
This perception-of-difficulty gap between advantaged and disadvantaged students is largest in B-S-J-Z (China), Luxembourg
and Singapore: close to half a standard deviation after accounting for reading performance. There is also a paradox: boys
reported that they found the PISA reading test easier than girls did, even though boys scored 25 points lower than girls in reading
after accounting for students’ socio-economic background. For some students, there is a gap between their perception of their
competency level (in this case, in reading) and their actual competency level, and this gap may hamper their motivation and
perseverance in developing their reading skills. The relationship between perceived competence and performance is mutually
reinforcing: when higher-performing students receive and process performance feedback, their perception of competence tends
to rise. Teachers’ feedback can be beneficial in helping poorer readers gain a better sense of their strengths and weaknesses.
As automation, artificial intelligence, and robotics continue to seep into the workplace, tomorrow’s schools will need to help
students develop skills that will be difficult for machines to replicate. This is crucial for students’ labour market prospects and
well-being. To become proficient readers in a digital world, students need strong reading foundations but also the ability to think
critically, monitor and adjust their behaviour for a particular goal and motivate themselves to persevere in the face of difficulties.
In conclusion, the countries and economies that will be the most successful in fostering proficient readers in a digital world
are those that mobilise learning opportunities across the reading spectrum, encompassing both digital technologies and
traditional print reading. This will enable students to learn to think critically and to develop the metacognitive and self-efficacy
skills needed to navigate the technology-rich 21st century.
Bottom line
Countries and economies that foster proficient readers in a digital world are those that offer learning opportunities that
respond to students’ diverse needs.
Notes
1. Rapa Nui Question 3 is a partial-credit item where no credit is scored 0, partial credit is scored 0.5, and full credit is scored 1. Therefore, the
estimated percentage correct for full credit on this item is lower than 47% on average across OECD countries. This item was estimated to be 39%
correct on average across all PISA 2018 participating countries and economies. Rapa Nui Question 3 is a Level 5 item, meaning that students
need to be proficient at Level 5 to have a 62% probability of receiving full credit on this item.
2. Bertling, J., et al. (2020), “A tool to capture learning experiences during Covid-19: The PISA Global Crises Questionnaire Module”, OECD Education
Working Papers, No. 232, OECD Publishing, Paris, https://doi.org/10.1787/9988df4e-en.
3. Autin, F. and J. Croizet (2012), “Improving working memory efficiency by reframing metacognitive interpretation of task difficulty.”, Journal of
Experimental Psychology: General, Vol. 141/4, pp. 610-618, http://dx.doi.org/10.1037/a0027478.
4. Pérez, A. et al. (2018), “Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical source
dimensions”, Learning and Instruction, Vol. 58, pp. 53-64, http://dx.doi.org/10.1016/j.learninstruc.2018.04.006.
ANNEX A
PISA 2018 technical background
The table in Annex A is available online
ANNEX A1
Construction of indices
Several PISA measures reflect indices that summarise responses from students, their parents, teachers or school representatives
(typically principals) to a series of related questions. The questions were selected from a larger pool on the basis of theoretical
considerations and previous research. The PISA 2018 Assessment and Analytical Framework (OECD, 2019[1]) provides an in-depth
description of this conceptual framework. Item response theory modelling was used to confirm the theoretically expected behaviour
of the indices and to validate their comparability across countries. For this purpose a joint model across all countries was estimated.
Item fit (RMSD) was evaluated separately for each item and each group (country by language). This procedure is in line with the
PISA 2015 scaling approach. For a detailed description of other PISA indices and details on the methods, see the PISA 2015 Technical
Report (OECD, 2017[2]) and the PISA 2018 Technical Report (OECD, forthcoming[3]).
There are three types of indices: simple indices, new scale indices and trend scale indices.
Simple indices are the variables that are constructed through the arithmetic transformation or recoding of one or more items in
exactly the same way across assessments. Here, item responses are used to calculate meaningful variables, such as the recoding
of the four-digit ISCO-08 codes into “Highest parents’ socio-economic index (HISEI)” or teacher-student ratio based on information
from the school questionnaire.
New scale indices and trend scale indices are variables constructed through the scaling of multiple items. Unless otherwise
indicated, each index was scaled using a two-parameter item response model (a generalised partial credit model was used in the case
of items with more than two categories) and values of the index correspond to weighted likelihood estimates (WLE) (Warm, 1989[4]).
For details on how each scale index was constructed, see the PISA 2018 Technical Report (OECD, forthcoming[3]). In general, the
scaling was done in two stages:
1. The item parameters were estimated based on all students from equally-weighted countries and economies; only cases with
a minimum number of three valid responses to items that are part of the index were included. In the case of trend indices,
a common calibration linking procedure was used: countries/economies that participated in both PISA 2009 and PISA 2018
contributed both samples to the calibration of item parameters; each cycle and, within each cycle, each country/economy
contributed equally to the estimation.1
2. For new scale indices, the weighted likelihood estimates were then standardised so that the mean of the index value for the OECD
student population was zero and the standard deviation was one (countries were given equal weight in the standardisation
process).
Sequential codes were assigned to the different response categories of the questions in the sequence in which they appeared
in the student, school or parent questionnaires. Where indicated in this section, these codes were inverted for the purpose of
constructing indices or scales. Negative values for an index do not necessarily imply that students responded negatively to the
underlying questions. A negative value merely indicates that the respondents answered less positively than all respondents
did on average across OECD countries. Likewise, a positive value on an index indicates that the respondents answered more
favourably, or more positively, on average, than respondents in OECD countries did. Terms enclosed in brackets < > in the
following descriptions were replaced in the national versions of the student, school and parent questionnaires by the appropriate
national equivalent. For example, the term <qualification at ISCED level 5A> was translated in the United States into “Bachelor’s
degree, post-graduate certificate program, Master’s degree program or first professional degree program”. Similarly, the term
<classes in the language of assessment> in Luxembourg was translated into “German classes” or “French classes”, depending on
whether students received the German or French version of the assessment instruments.
In addition to simple and scaled indices described in this annex, there are a number of variables from the questionnaires that
were used in this report and correspond to single items not used to construct indices. These non-recoded variables have the prefix
“ST” for items in the student questionnaire and “SC” for items in the school questionnaire. All the context
questionnaires, and the PISA international database, including all variables, are available through www.oecd.org/pisa.
Career expectations
In PISA 2018, students were asked to answer a question (ST114) about “what kind of job [they] expect to have when [they] are
about 30 years old”. Answers to this open-ended question were coded to four-digit ISCO codes (ILO, 2007), in variable BSMJ.
This variable was used to derive several indices related to career expectations.
Science-related career expectations are defined as those career expectations whose realisation requires further engagement
with the study of science beyond compulsory education, typically in formal tertiary education settings. The classification of careers
into science-related and non-science-related is based on the four-digit ISCO-08 classification of occupations.
Only professionals (major ISCO group 2) and technicians/associate professionals (major ISCO group 3) were considered to fit
the definition of science-related career expectations. In a broad sense, several managerial occupations (major ISCO group 1)
are clearly science-related; these include research and development managers, hospital managers, construction managers,
and other occupations classified under production and specialised services managers (sub-major group 13). However, even
when science-related experience and training are important requirements of a managerial occupation, these are not
entry-level jobs, and 15-year-old students with science-related career aspirations would not expect to be in such a position
by age 30.
Several skilled agriculture, forestry and fishery workers (major ISCO group 6) could also be considered to work in science-related
occupations. The United States O*NET OnLine (2019[5]) classification of science, technology, engineering and mathematics (STEM)
occupations indeed includes them. These occupations, however, do not typically require formal science-related training or study
after compulsory education. Thus, only major occupation groups that require ISCO skill levels 3 and 4 were included amongst
science-related occupational expectations.
Amongst professionals and technicians/associate professionals, the boundary between science-related and non-science-related
occupations is sometimes blurred, and different classifications draw different lines. The groups considered science-related include:
• Health professionals: All health professionals in sub-major group 22 (e.g. doctors, nurses, veterinarians), with the exception
of traditional and complementary medicine professionals (minor group 223).
• ICT professionals: All information and communications technology professionals (sub-major group 25).
• Science technicians and associate professionals, including:
– physical and engineering science technicians (minor group 311)
– life science technicians and related associate professionals (minor group 314)
– air traffic safety electronic technicians (3155)
– medical and pharmaceutical technicians (minor group 321), except medical and dental prosthetic technicians (3214)
– telecommunications engineering technicians (3522).
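As an illustration only, the groups listed above can be encoded as a simple check on four-digit ISCO-08 codes. This is a sketch, not the official PISA derivation, and the list above is described as non-exhaustive ("including"), so the function below may be incomplete:

```python
def is_science_related(isco: str) -> bool:
    """Rough check of whether a four-digit ISCO-08 code (as a string)
    falls in the science-related groups listed above (illustrative)."""
    if isco.startswith("22") and not isco.startswith("223"):
        return True                       # health professionals, excl. traditional medicine
    if isco.startswith("25"):
        return True                       # ICT professionals
    if isco[:3] in {"311", "314"} or isco == "3155":
        return True                       # physical/engineering and life science technicians
    if isco.startswith("321") and isco != "3214":
        return True                       # medical/pharmaceutical technicians, excl. prosthetic
    return isco == "3522"                 # telecommunications engineering technicians

print(is_science_related("2211"))  # generalist medical practitioners → True
print(is_science_related("3214"))  # medical/dental prosthetic technicians → False
```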
Each scenario consists of (a) a stem which is a reading task and (b) a set of strategies. Students were asked to rate the strategies
regarding their usefulness for solving the reading task. All strategies have also been rated by reading experts regarding their
usefulness via multiple pairwise comparisons. This rating resulted in a hierarchy of all strategies for each task and it was based on
all the pairs agreed upon by at least 80% of the experts. For the new scenario METASPAM (based on question ST166), for example,
the experts’ ratings resulted in the following order: Q02HA, Q04HA, Q05HA > Q01HA, Q03HA.
Based on this rating order, pairwise rules were then created to construct a score for each student indicating the number of times
they chose a more useful over a less useful strategy. The final score assigned to each student for each task ranges
from 0 to 1 and can be interpreted as the proportion of the total number of expert pairwise relations that are consistent with the
student’s ordering. The higher the score, the more often a student chose an expert-validated strategy
over a less useful one. For METASPAM, this order yields 6 (3 × 2) pairwise rules, namely Q04HA > Q01HA,
Q04HA > Q03HA, Q02HA > Q01HA, Q02HA > Q03HA, Q05HA > Q01HA, and Q05HA > Q03HA. Consequently, a student following
4 of these rules receives a score of 4/6 = 0.67. A similar procedure was carried out for the remaining two metacognition tasks.
For UNDREM (based on question ST164), the expert-rated strategy order was Q03IA, Q04IA, Q05IA > Q01IA, Q02IA, Q06IA. For
METASUM (based on question ST165), the expert-rated strategy order was Q04IA, Q05IA > Q01IA, Q03IA > Q02IA.
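The scoring procedure above can be sketched as follows. This is an illustration, not PISA's implementation: a task's expert hierarchy is written as a list of tiers, every strategy in an earlier tier being rated above every strategy in a later tier, and the student ratings below are invented:

```python
from itertools import product

def pairwise_rules(tiers):
    """Derive the expert pairwise rules from a tiered hierarchy."""
    rules = []
    for i, higher in enumerate(tiers):
        for lower in tiers[i + 1:]:
            rules.extend(product(higher, lower))
    return rules

def strategy_score(ratings, tiers):
    """Proportion of expert pairwise relations consistent with the student's
    ratings (a higher rating means the student judged the strategy more useful)."""
    rules = pairwise_rules(tiers)
    return sum(ratings[hi] > ratings[lo] for hi, lo in rules) / len(rules)

# METASPAM hierarchy from the text: Q02HA, Q04HA, Q05HA > Q01HA, Q03HA.
metaspam = [["Q02HA", "Q04HA", "Q05HA"], ["Q01HA", "Q03HA"]]
ratings = {"Q01HA": 2, "Q03HA": 1, "Q02HA": 5, "Q04HA": 4, "Q05HA": 6}
print(strategy_score(ratings, metaspam))  # → 1.0 (all 6 rules satisfied)
```

The three-tier METASUM order (Q04IA, Q05IA > Q01IA, Q03IA > Q02IA) yields 8 rules under the same function.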
If a student had a missing value on one or more items of the question, a missing score was assigned. Finally, all three indices were
standardised to have an OECD mean of 0 and a standard deviation of 1.
Scaling of indices related to the PISA index of economic social and cultural status
The PISA index of economic, social and cultural status (ESCS) was derived, as in previous cycles, from three variables
related to family background: parents’ highest level of education (PARED), parents’ highest occupational status (HISEI), and
home possessions (HOMEPOS), including books in the home. PARED and HISEI are simple indices, described above. HOMEPOS is
a proxy measure for family wealth.
Household possessions
In PISA 2018, students reported the availability of 16 household items at home (ST011), including three country-specific household
items that were seen as appropriate measures of family wealth within the country’s context. In addition, students reported the
number of possessions and books at home (ST012, ST013). HOMEPOS is a summary index of all household and possession items
(ST011, ST012 and ST013).
Computation of ESCS
For the purpose of computing the PISA index of economic, social and cultural status (ESCS), values for students with missing
PARED, HISEI or HOMEPOS were imputed with predicted values plus a random component based on a regression on the other
two variables. If there were missing data on more than one of the three variables, ESCS was not computed and a missing value
was assigned for ESCS.
In previous cycles, the PISA index of economic, social and cultural status was derived from a principal component analysis of
standardised variables (each variable has an OECD mean of zero and a standard deviation of one), taking the factor scores for the
first principal component as measures of the PISA index of economic, social and cultural status. In PISA 2018, ESCS is computed
by attributing equal weight to the three standardised components. As in PISA 2015, the three components were standardised
across all countries and economies (both OECD and partner countries/economies), with each country/economy contributing
equally (in cycles prior to 2015, the standardisation and principal component analysis was based on OECD countries only).
As in every previous cycle, the final ESCS variable was transformed so that 0 is the score of an average OECD student and 1 is the
standard deviation across equally weighted OECD countries.
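The rules above can be sketched in a few lines. This is a minimal illustration, not the official code: it assumes the three components have already been standardised across equally weighted countries, and that a student with exactly one missing component has had it imputed by regression (plus a random component) beforehand, as described in the text:

```python
def compute_escs(pared, hisei, homepos):
    """Equal-weight ESCS from the three standardised components.
    Returns None where more than one component is missing (ESCS not computed);
    a single missing value is assumed to have been imputed already."""
    components = [pared, hisei, homepos]
    if sum(c is None for c in components) > 1:
        return None                       # more than one missing: no ESCS
    return sum(components) / 3.0          # equal weight to each component

print(compute_escs(0.5, -0.2, 0.3))  # → 0.2
```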
Internal consistency refers to the extent to which the items that make up an index are inter-related. Cronbach’s Alpha was used
to check the internal consistency of each scale within the countries/economies and to compare it amongst countries/economies.
The coefficient of Cronbach’s Alpha ranges from 0 to 1, with higher values indicating higher internal consistency. Similar,
high values across countries/economies indicate that the scale was measured reliably across them. Commonly
accepted cut-off values are 0.9 for excellent, 0.8 for good, and 0.7 for acceptable internal consistency. In the PISA 2018 context,
indices were always omitted for countries and economies with values below 0.6, and for some countries and economies with
values between 0.6 and 0.7.
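For illustration, Cronbach's Alpha for one scale can be computed with the standard formula (this is the textbook computation on hypothetical data, not PISA's exact implementation):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's Alpha for a 2-D array: rows = respondents,
    columns = the items making up one index."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - sum_item_var / total_var)
```

Perfectly correlated items give an Alpha of 1; weakly related items push it toward 0.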
Table A1.1, available online (see below), presents the Cronbach’s Alpha for the main scaled indices in this report. Based on these
results, the following indices were omitted in the figures and flagged in the tables from individual countries/economies:
• Self-efficacy (RESILIENCE): Viet Nam.
• Perception of reading competence (SCREADCOMP): Belarus, the Russian Federation.
• Perception of difficulty in reading (SCREADDIFF): Indonesia, Malaysia, Morocco, Saudi Arabia, Viet Nam.
• Enjoyment of reading (JOYREAD): Jordan, Morocco.
PISA 2018 also examined the cross-country comparability of scaled indices through the invariance of item parameters. The idea
was to test whether the item parameters of an index could be assumed to be the same (invariant) across groups of participating
countries and language groups. In a first step, groups were defined based on samples of at least 300 students responding to the
same language-version questionnaire in a country. In a second step, international and student parameters were estimated based
on students across all groups. In a third step, the root mean square deviance (RMSD) item-fit statistic was calculated for each
group and item. Values close to zero signal a good item fit, indicating that the international model describes student responses
within individual groups accurately. Any group receiving a value above 0.3 was flagged and a group-specific item parameter was
calculated. Steps 2 and 3 were then repeated until all items exhibited RMSD values below 0.3. The RMSD values will be reported
in the forthcoming PISA 2018 Technical Report. Amongst the main indices examined in this report, some needed just one round
to ensure that all items exhibited acceptable levels of RMSD, whereas other indices needed several iterations:
• One round: perception of difficulty of the PISA test, perception of difficulty in reading, teacher’s stimulation of reading
engagement.
• Several rounds: self-efficacy (2 rounds), perception of reading competence (2 rounds), enjoyment of reading (2 rounds),
enjoyment of reading – trend – (4 rounds), parents’ enjoyment of reading (2 rounds).
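The iterative flagging loop described above can be sketched schematically. This is a toy illustration of the control flow only: the real procedure re-estimates the IRT model each round, whereas here re-estimation is stubbed by assuming a (group, item) pair given its own parameter fits well:

```python
CUTOFF = 0.3  # RMSD threshold used in the text

def iterate_item_fit(initial_rmsd):
    """initial_rmsd: dict mapping (group, item) to RMSD under the
    international parameters. Returns the pairs that end up with
    group-specific parameters once all RMSD values fall below the cutoff."""
    group_specific = set()
    while True:
        # Stubbed re-estimation: a pair with its own parameter is assumed to fit.
        rmsd = {pair: (0.0 if pair in group_specific else value)
                for pair, value in initial_rmsd.items()}
        flagged = {pair for pair, value in rmsd.items() if value > CUTOFF}
        if not flagged:
            return group_specific          # all items fit: done
        group_specific |= flagged          # flagged pairs get their own parameter
```

An index needing "one round" corresponds to the loop exiting immediately; "several rounds" corresponds to repeated flagging.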
In addition to country-specific omissions, some indices were also omitted for all countries. With regard to this report, the original
plan was to produce an index of metacognitive reading strategies combining “Understanding and remembering” (UNDREM,
ST164), “Summarising” (METASUM, ST165) and “Assessing credibility” (METASPAM, ST166). However, the composite index was
omitted because it showed low internal consistency and low invariance of item parameters. Consequently, metacognitive reading
strategies are analysed individually in the report, and caution should be exercised when comparing country and economy means
on these individual indices, as cross-cultural comparability cannot be guaranteed.
Note
1. PISA expert groups identified a few indices that should be scaled to make index values directly comparable between PISA 2009 and
PISA 2018. These indices include DISCLIMA, JOYREAD and JOYREADP. For these trend indices, a common calibration linking procedure was
used. Countries/Economies that participated in both PISA 2009 and PISA 2018 contributed both samples to the calibration of item parameters.
Each country/economy contributed equally to the estimation in each cycle. Trend indices were equated so that the mean and standard
deviation of rescaled PISA 2009 estimates and of the original estimates included in the PISA 2009 database, across OECD countries, matched.
Trend indices are therefore reported on the same scale as used in PISA 2009, so that values can be directly compared to those included in the
PISA 2009 database.
References
O*NET OnLine (2019), All STEM Disciplines, https://www.onetonline.org/find/quick?s=all+STEM+disciplines (accessed on 2 October 2019). [5]
OECD (2019), PISA 2018 Assessment and Analytical Framework, OECD Publishing, Paris, https://dx.doi.org/10.1787/b25efab8-en. [1]
OECD (2017), PISA 2015 Technical Report, OECD Publishing, Paris, http://www.oecd.org/pisa/data/2015-technical-report/. [2]
OECD (forthcoming), PISA 2018 Technical Report, OECD Publishing, Paris. [3]
Warm, T. (1989), “Weighted likelihood estimation of ability in item response theory”, Psychometrika, Vol. 54/3, pp. 427-450,
http://dx.doi.org/10.1007/BF02294627. [4]
ANNEX A2
Technical notes on analyses in this report
In many cases, readers are primarily interested in whether a given value in a particular country is different from a second value
in the same or another country, e.g. whether girls in a country perform better than boys in the same country. In the tables and
figures used in this report, differences are labelled as statistically significant when a difference of that size or larger, in either
direction, would be observed less than 5% of the time, if there were actually no difference in corresponding population values.
Similarly, the risk of reporting an association as significant when there is, in fact, no correlation between two measures is contained
at 5%.
Throughout the report, significance tests were undertaken to assess the statistical significance of the comparisons made.
Similarly, differences between other groups of students (e.g. non-immigrant students and students with an immigrant background,
or socio-economically advantaged and disadvantaged students) were tested for statistical significance. The definitions of the
subgroups can, in general, be found in the tables and the text accompanying the analysis. All differences marked in bold in the
tables presented in Annex B of this report are statistically significant at the 95% level.
Statistical significance of differences between subgroup means, after accounting for other variables
For many tables, subgroup comparisons were performed both on the observed difference (“before accounting for other variables”)
and after accounting for other variables, such as the PISA index of economic, social and cultural status of students. The adjusted
differences were estimated using linear regression and tested for significance at the 95% confidence level. Significant differences
are marked in bold.
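An adjusted difference of this kind can be illustrated with a small regression sketch (hypothetical data; not PISA's replicate-weight machinery): regress the outcome on a subgroup indicator plus the control variable, and read the indicator's coefficient as the difference after accounting for the control:

```python
import numpy as np

def adjusted_difference(y, group, control):
    """Coefficient of the group indicator in a regression of y on an
    intercept, the group indicator, and one control (e.g. ESCS)."""
    X = np.column_stack([np.ones(len(y)), group, control])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Toy data built as y = 5 + 10*group + 2*control, so the
# adjusted difference between the two groups is exactly 10.
y = np.array([5.0, 7.0, 19.0, 21.0])
group = np.array([0, 0, 1, 1])
control = np.array([0, 1, 2, 3])
```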
Statistical significance of performance differences between the top and bottom quarters of PISA indices and scales
Differences in average performance between the top and bottom quarters of the PISA indices and scales were tested for
statistical significance. Figures marked in bold indicate that performance between the top and bottom quarters of students on
the respective index is statistically significantly different at the 95% confidence level.
Although the student samples were drawn from within a sample of schools, the school sample was designed to optimise
the resulting sample of students, rather than to give an optimal sample of schools. It is therefore preferable to analyse the
school-level variables as attributes of students (e.g. in terms of the share of 15-year-old students affected), rather than as
elements in their own right.
Most analyses of student and school characteristics are therefore weighted by student final weights (or their sum, in the case of
school characteristics), and use student replicate weights for estimating standard errors.
In PISA 2018, as in PISA 2012 and 2015, multilevel models use weights at both the student and school levels. The purpose
of these weights is to account for differences in the probabilities of students being selected in the sample. Since PISA applies a
two-stage sampling procedure, these differences are due to factors at both the school and the student levels. For the multilevel
models, student final weights (W_FSTUWT) were used. Within-school weights correspond to student final weights, rescaled
to amount to the sample size within each school. Between-school weights correspond to the sum of final student weights
(W_FSTUWT) within each school.
Analyses based on teacher responses to the teacher questionnaires are weighted by student weights. In particular, in order to
compute averages and shares based on teacher responses, final teacher weights were generated so that the sum of teacher
weights within each school was equal to the sum of student weights within the same school. The same procedure was used to
generate replicate teacher weights in analogy with the student replicate weights in the database. All teachers within a school have
the same weight. For the computation of means, this is equivalent to aggregating teacher responses to the school level through
simple, unweighted means, and then applying student weights to these school-level aggregates.
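The teacher-weight rule described above can be sketched as follows (hypothetical data structures; all teachers in a school share one weight, chosen so that the school's teacher weights sum to its student weights):

```python
def teacher_weight_per_school(student_weights, n_teachers):
    """student_weights: dict school -> list of student final weights;
    n_teachers: dict school -> number of responding teachers.
    Returns the common weight given to every teacher in each school."""
    return {school: sum(weights) / n_teachers[school]
            for school, weights in student_weights.items()}

# Each of the 4 teachers in school "A" gets 6.0 / 4 = 1.5, so the
# teacher weights sum to the school's student weights (1.5 + 2.0 + 2.5).
w = teacher_weight_per_school({"A": [1.5, 2.0, 2.5]}, {"A": 4})
print(w["A"])  # → 1.5
```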
ANNEX B
Results for countries and economies
Chapter 1: https://doi.org/10.1787/888934240617
Chapter 2: https://doi.org/10.1787/888934240636
Chapter 3: https://doi.org/10.1787/888934240655
Chapter 4: https://doi.org/10.1787/888934240674
Chapter 5: https://doi.org/10.1787/888934240693
Chapter 6: https://doi.org/10.1787/888934240712
Table B.1.3 [1/2] Time spent on the Internet in total in 2012, 2015, 2018
Based on students' reports
Columns: Time (in hours per week¹) spent on the Internet, for PISA 2012, PISA 2015 and PISA 2018 (Hours, S.E. for each); change in time spent between PISA 2015 and PISA 2012, PISA 2018 and PISA 2012, and PISA 2018 and PISA 2015 (Dif., S.E. for each)
Australia 28 (0.2) 35 (0.3) 40 (0.3) 7 (0.4) 12 (0.3) 5 (0.4)
OECD
1. Students responded in intervals: no time; 1-30 minutes per day; 31-60 minutes per day; 1-2 hours per day; 2-4 hours per day; 4-6 hours per day; and more than 6 hours per day. These responses were converted to the average number of minutes in each interval (0, 15.5, 45.5, 90.5, 180.5, 300.5 and 420.5), multiplied by 5 if they refer to a school day and by 2 if they refer to a weekend day, and divided by 60 to convert them into hours. The numbers in this table are therefore the number of hours per week students spent on the Internet in total (outside and inside of school).
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240617
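The interval-to-hours conversion described in note 1 can be sketched as follows; the response labels used as dictionary keys are illustrative, while the midpoints and the 5-school-day / 2-weekend-day scaling come from the note itself.

```python
# Midpoints (in minutes) of the response intervals listed in note 1.
MIDPOINTS_MIN = {
    "no time": 0.0,
    "1-30 minutes": 15.5,
    "31-60 minutes": 45.5,
    "1-2 hours": 90.5,
    "2-4 hours": 180.5,
    "4-6 hours": 300.5,
    "more than 6 hours": 420.5,
}

def weekly_internet_hours(school_day_response, weekend_day_response):
    """Hours per week spent on the Internet: the school-day midpoint
    counts 5 times, the weekend-day midpoint twice, then minutes are
    converted to hours."""
    minutes = (MIDPOINTS_MIN[school_day_response] * 5
               + MIDPOINTS_MIN[weekend_day_response] * 2)
    return minutes / 60.0
```

For example, a student reporting 1-2 hours on school days and 2-4 hours on weekend days is assigned (90.5 × 5 + 180.5 × 2) / 60 ≈ 13.6 hours per week.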
Table B.1.3 [2/2] Time spent on the Internet in total in 2012, 2015, 2018
Based on students' reports
Columns: Time (in hours per week¹) spent on the Internet, for PISA 2012, PISA 2015 and PISA 2018 (Hours, S.E. for each); change in time spent between PISA 2015 and PISA 2012, PISA 2018 and PISA 2012, and PISA 2018 and PISA 2015 (Dif., S.E. for each)
Albania m m m m 25 (0.4) m m m m m m
Partners
Argentina m m m m m m m m m m m m
Baku (Azerbaijan) m m m m m m m m m m m m
Belarus m m m m m m m m m m m m
Bosnia and Herzegovina m m m m m m m m m m m m
Brazil m m 33 (0.4) † 36 (0.4) m m m m 3 (0.5) †
Brunei Darussalam m m m m 31 (0.3) m m m m m m
B-S-J-Z (China) m m m m m m m m m m m m
Bulgaria m m 37 (0.4) 40 (0.4) m m m m 3 (0.6)
Costa Rica 19 (0.4) 36 (0.5) 41 (0.5) 17 (0.6) 22 (0.6) 5 (0.7)
Croatia 21 (0.3) 29 (0.4) 36 (0.4) 8 (0.5) 15 (0.5) 7 (0.5)
Cyprus m m m m m m m m m m m m
Dominican Republic m m 24 (0.4) 28 (0.6) m m m m 4 (0.7)
Georgia m m m m 29 (0.4) m m m m m m
Hong Kong (China) 21 (0.3) 23 (0.3) 29 (0.4) 2 (0.4) 8 (0.5) 6 (0.5)
Indonesia m m m m m m m m m m m m
Jordan 16 (0.3) m m m m m m m m m m
Kazakhstan m m m m 26 (0.3) m m m m m m
Kosovo m m m m m m m m m m m m
Lebanon m m m m m m m m m m m m
Macao (China) 22 (0.2) 26 (0.2) 30 (0.2) 4 (0.3) 8 (0.3) 4 (0.3)
Malaysia m m m m m m m m m m m m
Malta m m m m 34 (0.3) m m m m m m
Moldova m m m m m m m m m m m m
Montenegro m m m m m m m m m m m m
Morocco m m m m 20 (0.5) m m m m m m
North Macedonia m m m m m m m m m m m m
Panama m m m m 26 (0.5) m m m m m m
Peru m m 18 (0.4) m m m m m m m m
Philippines m m m m m m m m m m m m
Qatar m m m m m m m m m m m m
Romania m m m m m m m m m m m m
Russia 25 (0.4) 32 (0.5) 35 (0.5) 7 (0.7) 10 (0.6) 3 (0.7)
Saudi Arabia m m m m m m m m m m m m
Serbia 21 (0.3) m m 38 (0.4) m m 17 (0.5) m m
Singapore 21 (0.3) 29 (0.4) 35 (0.3) 9 (0.5) 15 (0.4) 6 (0.5)
Chinese Taipei 18 (0.3) 26 (0.4) 30 (0.4) 8 (0.5) 12 (0.5) 4 (0.6)
Thailand m m 28 (0.5) 37 (0.4) m m m m 8 (0.6)
Ukraine m m m m m m m m m m m m
United Arab Emirates m m m m m m m m m m m m
Uruguay 23 (0.3) 35 (0.4) 42 (0.4) † 12 (0.5) 19 (0.5) † 7 (0.6) †
Viet Nam m m m m m m m m m m m m
Table B.2.2 [1/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018, 2006 and 2018, and 2009 and 2018 (% dif., S.E. for each)
Austria 93.0 (0.5) 95.7 (0.3) 97.5 (0.2) 95.5 (0.3) 2.5 (0.6) -0.2 (0.5) -2.0 (0.4)
Belgium 87.2 (0.5) 93.4 (0.3) 96.6 (0.2) 93.0 (0.3) 5.9 (0.6) -0.4 (0.5) -3.5 (0.4)
Canada 93.3 (0.3) 96.3 (0.2) 97.1 (0.2) 93.6 (0.2) 0.3 (0.4) -2.7 (0.3) -3.5 (0.3)
Chile m m 54.3 (2.1) 74.0 (1.2) 82.5 (0.7) m m 28.2 (2.2) 8.5 (1.4)
Colombia m m 31.0 (2.1) 47.0 (1.7) 62.0 (1.3) m m 31.0 (2.4) 15.0 (2.1)
Czech Republic 76.7 (0.8) 86.8 (0.8) 95.4 (0.3) 94.6 (0.4) 18.0 (0.9) 7.8 (0.9) -0.8 (0.6)
Denmark 93.3 (0.5) 98.3 (0.2) 99.0 (0.1) 98.3 (0.2) 5.1 (0.5) 0.0 (0.3) -0.7 (0.3)
Estonia m m 82.9 (0.7) 88.2 (0.5) 86.9 (0.5) m m 4.0 (0.9) -1.3 (0.7)
Finland 87.9 (0.6) 95.3 (0.3) 98.6 (0.2) 94.2 (0.3) 6.3 (0.6) -1.2 (0.4) -4.4 (0.4)
France 78.6 (0.9) 86.1 (0.7) 93.8 (0.4) 90.8 (0.4) 12.2 (1.0) 4.7 (0.9) -3.0 (0.6)
Germany 91.0 (0.6) 95.4 (0.4) 97.0 (0.3) 92.0 (0.6) 1.0 (0.8) -3.4 (0.7) -5.0 (0.6)
Greece 52.8 (1.4) 74.0 (0.9) 87.7 (0.8) 89.1 (0.5) 36.3 (1.5) 15.1 (1.0) 1.4 (0.9)
Hungary 67.6 (0.9) 85.3 (0.8) 92.4 (0.7) 91.4 (0.5) 23.8 (1.0) 6.1 (1.0) -1.0 (0.9)
Iceland 96.8 (0.3) 98.1 (0.2) 98.5 (0.2) 96.1 (0.4) -0.7 (0.4) -2.0 (0.4) -2.5 (0.4)
Ireland 79.9 (0.9) 87.9 (0.6) 93.1 (0.5) 86.0 (0.6) 6.2 (1.1) -1.8 (0.8) -7.1 (0.8)
Israel m m 90.0 (0.9) 92.7 (0.7) 93.3 (0.5) m m 3.3 (1.0) 0.6 (0.8)
Italy 78.0 (0.8) 89.4 (0.5) 94.9 (0.2) 90.0 (0.5) 12.0 (1.0) 0.7 (0.7) -4.9 (0.5)
Japan 46.2 (1.0) 62.5 (0.9) 69.1 (0.9) 61.3 (0.9) 15.2 (1.4) -1.2 (1.3) -7.7 (1.3)
Korea 95.1 (0.4) 97.2 (0.3) 95.7 (0.3) 90.3 (0.5) -4.8 (0.6) -6.9 (0.6) -5.4 (0.6)
Latvia 44.0 (1.6) 72.6 (1.0) 89.5 (0.8) 94.3 (0.4) 50.3 (1.6) 21.6 (1.1) 4.8 (0.9)
Lithuania m m 80.1 (0.7) 93.0 (0.5) 96.2 (0.3) m m 16.1 (0.8) 3.2 (0.6)
Luxembourg 90.1 (0.4) 93.2 (0.4) 97.2 (0.3) 92.7 (0.3) 2.6 (0.5) -0.4 (0.5) -4.5 (0.4)
Mexico 33.2 (1.8) 42.0 (1.3) 48.2 (0.9) 56.8 (1.2) 23.6 (2.2) 14.8 (1.8) 8.7 (1.5)
Netherlands 95.9 (0.4) 97.4 (0.3) 98.4 (0.2) 95.2 (0.4) -0.7 (0.6) -2.1 (0.5) -3.2 (0.5)
New Zealand 87.3 (0.6) 93.3 (0.5) 94.1 (0.4) 91.9 (0.5) 4.6 (0.8) -1.4 (0.7) -2.2 (0.6)
Norway 93.6 (0.4) 96.8 (0.3) 98.1 (0.2) 96.6 (0.3) 3.0 (0.5) -0.2 (0.4) -1.4 (0.3)
Poland 60.3 (1.2) 79.6 (0.8) 94.0 (0.4) 96.5 (0.3) 36.2 (1.2) 16.8 (0.9) 2.4 (0.5)
Portugal 74.7 (1.2) 86.1 (0.8) 97.1 (0.3) 93.5 (0.4) 18.8 (1.3) 7.4 (1.0) -3.6 (0.5)
Slovak Republic 57.1 (1.3) 77.1 (1.0) 91.9 (0.6) 91.9 (0.5) 34.8 (1.4) 14.8 (1.1) 0.0 (0.8)
Slovenia m m 96.8 (0.3) 98.4 (0.2) 96.8 (0.3) m m 0.0 (0.4) -1.6 (0.3)
Spain 79.0 (0.9) 88.1 (0.6) 93.4 (0.4) 91.4 (0.3) 12.5 (0.9) 3.4 (0.7) -2.0 (0.5)
Sweden 94.9 (0.4) 97.8 (0.2) 98.4 (0.2) 94.9 (0.4) 0.0 (0.5) -2.9 (0.4) -3.5 (0.4)
Switzerland 86.6 (0.6) 96.0 (0.2) 97.4 (0.2) 95.3 (0.4) 8.8 (0.7) -0.7 (0.4) -2.1 (0.4)
Turkey 23.3 (1.9) 38.2 (1.7) 60.6 (1.2) 67.2 (1.2) 44.0 (2.2) 29.0 (2.1) 6.6 (1.8)
United Kingdom 91.4 (0.5) 95.2 (0.5) 97.7 (0.2) 91.9 (0.4) 0.5 (0.6) -3.2 (0.6) -5.7 (0.5)
United States 87.5 (0.7) 88.8 (1.1) 90.1 (0.7) 88.1 (0.8) 0.6 (1.1) -0.7 (1.4) -2.0 (1.1)
OECD average-31 77.7 (0.2) 86.5 (0.1) 91.9 (0.1) 90.0 (0.1) 12.2 (0.2) 3.5 (0.2) -2.0 (0.1)
OECD average-37 m m 84.2 (0.1) 90.4 (0.1) 89.4 (0.1) m m 5.2 (0.2) -1.0 (0.1)
Table B.2.2 [2/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018, 2006 and 2018, and 2009 and 2018 (% dif., S.E. for each)
Argentina m m 48.6 (2.4) 65.4 (1.7) 71.6 (0.9) m m 23.0 (2.6) 6.2 (1.9)
Baku (Azerbaijan) m m m m m m 68.0 (0.8) m m m m m m
Belarus m m m m m m 93.6 (0.4) m m m m m m
Bosnia and Herzegovina m m m m m m 89.7 (0.5) m m m m m m
Brazil 27.2 (1.6) 36.0 (1.2) 52.6 (1.2) 59.4 (0.9) 32.3 (1.8) 23.5 (1.5) 6.9 (1.5)
Brunei Darussalam m m m m m m 67.6 (0.6) m m m m m m
B-S-J-Z (China) m m m m m m 74.3 (0.9) m m m m m m
Bulgaria m m 65.8 (1.7) 87.7 (1.2) 90.1 (0.6) m m 24.3 (1.8) 2.4 (1.3)
Costa Rica m m m m 62.8 (1.3) 73.1 (1.1) m m m m 10.2 (1.7)
Croatia m m 84.1 (0.8) 93.5 (0.5) 91.2 (0.4) m m 7.1 (0.8) -2.3 (0.6)
Cyprus m m m m m m 89.8 (0.5) m m m m m m
Dominican Republic m m m m m m 44.5 (2.0) m m m m m m
Georgia m m m m 49.1 (1.4) 78.3 (0.7) m m m m 29.2 (1.5)
Hong Kong (China) 92.7 (0.5) 97.0 (0.3) 97.9 (0.3) 88.1 (0.6) -4.6 (0.8) -8.9 (0.7) -9.8 (0.7)
Indonesia 7.9 (0.9) 14.4 (2.1) 19.9 (1.9) 33.5 (1.6) 25.6 (1.8) 19.1 (2.7) 13.6 (2.4)
Jordan m m 59.2 (1.2) 72.8 (1.1) 66.3 (0.9) m m 7.1 (1.5) -6.5 (1.4)
Kazakhstan m m m m 52.7 (1.6) 74.2 (0.7) m m m m 21.5 (1.8)
Kosovo m m m m m m 82.1 (0.6) m m m m m m
Lebanon m m m m m m 68.7 (1.0) m m m m m m
Macao (China) 89.4 (1.0) 94.7 (0.4) 97.5 (0.2) 91.9 (0.4) 2.6 (1.1) -2.8 (0.6) -5.5 (0.5)
Malaysia m m m m 56.8 (1.5) 50.6 (1.3) m m m m -6.2 (2.0)
Malta m m m m 97.1 (0.4) 93.7 (0.4) m m m m -3.4 (0.5)
Moldova m m m m 52.6 (1.1) 84.3 (0.6) m m m m 31.7 (1.3)
Montenegro m m 59.9 (0.7) 84.3 (0.5) 88.6 (0.4) m m 28.8 (0.8) 4.3 (0.6)
Morocco m m m m m m 45.7 (1.5) m m m m m m
North Macedonia m m m m m m 92.3 (0.4) m m m m m m
Panama m m m m 46.3 (2.5) 60.4 (1.3) m m m m 14.1 (2.8)
Peru m m m m 37.7 (1.8) 52.8 (1.3) m m m m 15.1 (2.2)
Philippines m m m m m m 40.8 (1.2) m m m m m m
Qatar m m 87.7 (0.4) 92.2 (0.3) 81.9 (0.3) m m -5.8 (0.5) -10.3 (0.4)
Romania m m 61.0 (1.7) 83.4 (1.1) 88.4 (0.9) m m 27.4 (1.9) 5.0 (1.4)
Russia 29.1 (1.7) 58.7 (1.6) 79.0 (1.4) 93.5 (0.5) 64.5 (1.7) 34.8 (1.7) 14.6 (1.5)
Saudi Arabia m m m m m m 73.5 (1.1) m m m m m m
Serbia m m 72.3 (1.1) 88.6 (0.7) 93.4 (0.4) m m 21.1 (1.2) 4.9 (0.8)
Singapore m m m m 94.4 (0.3) 88.2 (0.4) m m m m -6.2 (0.5)
Chinese Taipei m m 89.2 (0.6) 90.7 (0.5) 80.6 (0.5) m m -8.7 (0.8) -10.2 (0.7)
Thailand 26.3 (1.0) 40.6 (1.1) 54.5 (1.3) 53.5 (1.5) 27.2 (1.8) 12.9 (1.8) -1.0 (2.0)
Ukraine m m m m m m 89.2 (0.7) m m m m m m
United Arab Emirates m m m m 92.5 (0.4) 88.3 (0.3) m m m m -4.2 (0.5)
Uruguay 45.6 (1.1) 56.7 (1.1) 75.0 (0.7) 81.5 (0.8) 35.9 (1.4) 24.9 (1.4) 6.5 (1.1)
Viet Nam m m m m m m 41.6 (1.8) m m m m m m
Table B.2.2 [3/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018, 2006 and 2018, and 2009 and 2018 (% dif., S.E. for each)
Austria 69.4 (1.0) 80.0 (0.8) 95.4 (0.4) 98.0 (0.2) 28.6 (1.0) 18.0 (0.8) 2.6 (0.4)
Belgium 74.8 (0.8) 89.1 (0.5) 96.4 (0.3) 99.0 (0.1) 24.2 (0.8) 9.9 (0.5) 2.6 (0.3)
Canada 88.8 (0.3) 94.0 (0.3) 96.8 (0.2) 98.4 (0.1) 9.6 (0.3) 4.4 (0.3) 1.6 (0.2)
Chile m m 30.2 (1.8) 55.5 (1.5) 88.4 (0.6) m m 58.2 (1.9) 32.9 (1.6)
Colombia m m 15.6 (1.0) 31.4 (1.5) 67.1 (1.3) m m 51.5 (1.7) 35.7 (2.0)
Czech Republic 49.1 (0.9) 66.4 (0.9) 92.3 (0.5) 99.0 (0.2) 49.9 (0.9) 32.5 (0.9) 6.7 (0.5)
Denmark 83.4 (0.8) 95.7 (0.3) 98.9 (0.2) 99.7 (0.1) 16.3 (0.8) 4.0 (0.4) 0.8 (0.2)
Estonia m m 80.7 (1.1) 96.2 (0.4) 99.5 (0.1) m m 18.8 (1.1) 3.2 (0.4)
Finland 76.7 (0.8) 92.6 (0.4) 99.0 (0.1) 99.6 (0.1) 22.9 (0.8) 7.0 (0.4) 0.7 (0.2)
France 55.9 (1.3) 73.0 (1.0) 92.2 (0.6) 98.5 (0.2) 42.5 (1.3) 25.5 (1.1) 6.3 (0.6)
Germany 73.5 (0.8) 87.5 (0.6) 95.8 (0.3) 98.0 (0.2) 24.5 (0.9) 10.5 (0.7) 2.2 (0.4)
Greece 35.3 (1.4) 53.4 (1.2) 71.4 (1.1) 95.7 (0.4) 60.4 (1.5) 42.3 (1.2) 24.3 (1.1)
Hungary 26.0 (0.9) 50.7 (1.3) 85.7 (0.9) 98.5 (0.2) 72.5 (0.9) 47.8 (1.3) 12.8 (0.9)
Iceland 92.3 (0.5) 97.7 (0.2) 98.7 (0.2) 99.4 (0.1) 7.2 (0.5) 1.8 (0.3) 0.8 (0.2)
Ireland 66.2 (1.2) 80.5 (0.8) 92.8 (0.5) 98.8 (0.2) 32.6 (1.2) 18.3 (0.8) 6.0 (0.5)
Israel m m 84.1 (0.8) 85.6 (1.0) 96.2 (0.4) m m 12.1 (0.9) 10.6 (1.0)
Italy 62.4 (1.0) 72.2 (0.6) 87.5 (0.3) 97.2 (0.2) 34.8 (1.0) 24.9 (0.6) 9.7 (0.4)
Japan 60.5 (1.1) 74.9 (1.1) 81.5 (0.8) 95.3 (0.3) 34.8 (1.2) 20.4 (1.1) 13.8 (0.9)
Korea 93.1 (0.5) 96.5 (0.3) 96.9 (0.4) 97.4 (0.2) 4.3 (0.5) 0.9 (0.4) 0.5 (0.4)
Latvia 16.3 (0.9) 52.3 (1.2) 81.4 (1.1) 98.9 (0.2) 82.6 (0.9) 46.6 (1.3) 17.6 (1.1)
Lithuania m m 56.7 (1.1) 85.8 (0.7) 98.7 (0.2) m m 41.9 (1.1) 12.9 (0.7)
Luxembourg 75.4 (0.7) 86.7 (0.5) 97.4 (0.2) 96.8 (0.2) 21.5 (0.7) 10.2 (0.5) -0.6 (0.3)
Mexico 18.4 (1.6) 23.3 (1.1) 35.4 (0.9) 67.9 (1.5) 49.6 (2.2) 44.7 (1.9) 32.5 (1.8)
Netherlands 89.0 (0.8) 96.5 (0.4) 99.1 (0.2) 98.9 (0.2) 9.9 (0.8) 2.4 (0.4) -0.2 (0.2)
New Zealand 82.1 (0.8) 89.4 (0.6) 91.7 (0.5) 97.4 (0.2) 15.3 (0.8) 8.0 (0.7) 5.7 (0.5)
Norway 87.6 (0.7) 95.6 (0.5) 99.0 (0.2) 99.3 (0.1) 11.6 (0.7) 3.7 (0.5) 0.3 (0.2)
Poland 34.2 (0.9) 51.3 (1.1) 85.4 (0.8) 99.4 (0.1) 65.2 (0.9) 48.1 (1.1) 14.0 (0.8)
Portugal 47.5 (1.3) 58.1 (1.4) 91.1 (0.7) 98.3 (0.2) 50.8 (1.3) 40.2 (1.4) 7.2 (0.7)
Slovak Republic 17.4 (0.7) 40.2 (1.1) 85.4 (0.8) 97.9 (0.3) 80.5 (0.8) 57.7 (1.2) 12.5 (0.9)
Slovenia m m 85.9 (0.5) 96.6 (0.3) 99.4 (0.1) m m 13.4 (0.6) 2.8 (0.3)
Spain 49.8 (1.4) 65.8 (1.0) 84.8 (0.8) 97.9 (0.1) 48.1 (1.4) 32.1 (1.0) 13.1 (0.8)
Sweden 89.6 (0.5) 96.7 (0.3) 98.5 (0.2) 99.0 (0.2) 9.4 (0.6) 2.3 (0.3) 0.5 (0.3)
Switzerland 79.1 (0.9) 93.4 (0.3) 98.1 (0.2) 98.7 (0.2) 19.6 (0.9) 5.4 (0.4) 0.7 (0.3)
Turkey 14.4 (1.4) 24.6 (1.3) 53.0 (1.2) 76.6 (1.2) 62.2 (1.8) 52.0 (1.8) 23.6 (1.7)
United Kingdom 80.7 (0.6) 90.4 (0.6) 97.2 (0.2) 99.2 (0.1) 18.5 (0.7) 8.8 (0.6) 2.0 (0.3)
United States 81.8 (0.9) 85.1 (1.2) 89.3 (0.7) 96.4 (0.3) 14.6 (0.9) 11.2 (1.2) 7.1 (0.8)
OECD average-31 63.1 (0.2) 75.7 (0.2) 89.2 (0.1) 96.6 (0.1) 33.5 (0.2) 20.9 (0.2) 7.4 (0.1)
OECD average-37 m m 72.9 (0.1) 86.9 (0.1) 95.7 (0.1) m m 22.8 (0.2) 8.8 (0.1)
Table B.2.2 [4/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018, 2006 and 2018, and 2009 and 2018 (% dif., S.E. for each)
Argentina m m 29.9 (2.2) 50.9 (2.0) 83.4 (0.7) m m 53.5 (2.3) 32.5 (2.1)
Baku (Azerbaijan) m m m m m m 85.5 (0.6) m m m m m m
Belarus m m m m m m 98.4 (0.2) m m m m m m
Bosnia and Herzegovina m m m m m m 96.4 (0.3) m m m m m m
Brazil 23.3 (1.5) 38.8 (1.1) 58.3 (1.1) 90.8 (0.5) 67.6 (1.6) 52.0 (1.2) 32.5 (1.2)
Brunei Darussalam m m m m m m 80.8 (0.5) m m m m m m
B-S-J-Z (China) m m m m m m 93.0 (0.5) m m m m m m
Bulgaria m m 59.0 (1.7) 85.5 (1.1) 97.4 (0.3) m m 38.3 (1.8) 11.9 (1.2)
Costa Rica m m m m 40.3 (1.4) 83.0 (0.8) m m m m 42.7 (1.6)
Croatia m m 71.1 (0.9) 86.8 (0.7) 99.1 (0.1) m m 28.1 (0.9) 12.3 (0.7)
Cyprus m m m m m m 97.7 (0.2) m m m m m m
Dominican Republic m m m m m m 78.4 (1.0) m m m m m m
Georgia m m m m 50.3 (1.3) 93.2 (0.5) m m m m 42.9 (1.4)
Hong Kong (China) 88.4 (0.8) 96.9 (0.3) 98.0 (0.3) 97.6 (0.2) 9.2 (0.8) 0.6 (0.3) -0.4 (0.3)
Indonesia 2.6 (0.4) 4.3 (0.6) 8.3 (0.9) 46.8 (1.5) 44.2 (1.5) 42.5 (1.6) 38.5 (1.7)
Jordan m m 29.7 (1.1) 30.2 (1.1) 83.9 (0.7) m m 54.2 (1.3) 53.6 (1.3)
Kazakhstan m m m m 35.2 (1.5) 89.0 (0.6) m m m m 53.8 (1.6)
Kosovo m m m m m m 92.9 (0.4) m m m m m m
Lebanon m m m m m m 84.6 (0.8) m m m m m m
Macao (China) 67.0 (1.5) 89.4 (0.5) 97.1 (0.2) 98.8 (0.2) 31.8 (1.5) 9.4 (0.5) 1.7 (0.3)
Malaysia m m m m 45.1 (1.7) 76.8 (1.0) m m m m 31.7 (1.9)
Malta m m m m 97.9 (0.3) 97.6 (0.3) m m m m -0.4 (0.4)
Moldova m m m m 50.3 (1.1) 93.0 (0.5) m m m m 42.7 (1.2)
Montenegro m m 54.3 (0.7) 69.9 (0.7) 95.5 (0.3) m m 41.1 (0.8) 25.6 (0.7)
Morocco m m m m m m 54.1 (1.6) m m m m m m
North Macedonia m m m m m m 98.8 (0.2) m m m m m m
Panama m m m m 37.6 (3.0) 67.8 (1.1) m m m m 30.2 (3.2)
Peru m m m m 25.0 (1.5) 57.0 (1.2) m m m m 31.9 (1.9)
Philippines m m m m m m 49.3 (1.4) m m m m m m
Qatar m m 81.3 (0.5) 89.4 (0.3) 95.0 (0.2) m m 13.7 (0.5) 5.5 (0.4)
Romania m m 32.1 (1.9) 69.9 (1.5) 96.4 (0.4) m m 64.3 (1.9) 26.5 (1.6)
Russia 13.9 (1.0) 34.4 (1.6) 56.0 (1.5) 98.3 (0.2) 84.4 (1.0) 63.9 (1.6) 42.3 (1.5)
Saudi Arabia m m m m m m 95.4 (0.4) m m m m m m
Serbia m m 51.6 (1.2) 64.1 (1.0) 97.9 (0.2) m m 46.3 (1.2) 33.8 (1.1)
Singapore m m m m 95.4 (0.3) 98.3 (0.1) m m m m 2.9 (0.3)
Chinese Taipei m m 91.8 (0.5) 93.0 (0.4) 95.6 (0.2) m m 3.8 (0.6) 2.7 (0.5)
Thailand 18.7 (1.1) 23.3 (1.2) 35.8 (1.3) 82.4 (0.8) 63.7 (1.3) 59.0 (1.4) 46.5 (1.5)
Ukraine m m m m m m 97.7 (0.3) m m m m m m
United Arab Emirates m m m m 91.0 (0.5) 95.5 (0.2) m m m m 4.5 (0.5)
Uruguay 35.6 (1.2) 40.3 (1.1) 60.5 (0.8) 87.5 (0.8) 51.9 (1.4) 47.2 (1.3) 27.0 (1.1)
Viet Nam m m m m m m 76.9 (1.4) m m m m m m
Table B.2.2 [5/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Access to the Internet and a computer that can be used for schoolwork at home
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018 (PISA 2018 - PISA 2003), 2006 and 2018 (PISA 2018 - PISA 2006), and 2009 and 2018 (PISA 2018 - PISA 2009) (% dif., S.E. for each)
Australia 84.0 (0.6) 91.3 (0.4) 95.2 (0.3) 92.9 (0.4) 8.9 (0.7) 1.6 (0.5) -2.3 (0.5)
OECD
Austria 68.4 (1.1) 79.0 (0.8) 94.4 (0.4) 94.1 (0.3) 25.7 (1.1) 15.1 (0.9) -0.3 (0.5)
Belgium 73.6 (0.8) 87.9 (0.5) 95.2 (0.3) 92.5 (0.3) 18.9 (0.9) 4.6 (0.6) -2.7 (0.5)
Canada 88.0 (0.3) 93.4 (0.3) 95.8 (0.2) 92.9 (0.2) 4.9 (0.4) -0.5 (0.4) -2.9 (0.3)
Chile m m 29.5 (1.8) 55.0 (1.5) 75.4 (0.8) m m 45.9 (2.0) 20.4 (1.7)
Colombia m m 14.9 (1.1) 30.4 (1.5) 54.4 (1.3) m m 39.4 (1.7) 23.9 (2.0)
Czech Republic 48.4 (0.9) 65.9 (1.0) 91.5 (0.5) 94.3 (0.4) 45.9 (1.0) 28.4 (1.1) 2.8 (0.6)
Denmark 81.8 (0.8) 95.1 (0.4) 98.4 (0.2) 98.1 (0.2) 16.3 (0.8) 3.0 (0.4) -0.3 (0.3)
Estonia m m 75.4 (1.1) 86.7 (0.6) 86.6 (0.5) m m 11.2 (1.2) -0.1 (0.8)
Finland 75.9 (0.8) 92.0 (0.4) 98.2 (0.2) 94.1 (0.3) 18.2 (0.8) 2.1 (0.6) -4.2 (0.4)
France 54.8 (1.3) 71.4 (1.0) 90.4 (0.6) 90.1 (0.5) 35.4 (1.3) 18.8 (1.1) -0.2 (0.7)
Germany 72.7 (0.8) 86.3 (0.6) 94.4 (0.4) 90.9 (0.6) 18.3 (1.0) 4.7 (0.8) -3.5 (0.7)
Greece 33.3 (1.4) 51.7 (1.2) 70.1 (1.1) 87.4 (0.6) 54.1 (1.5) 35.8 (1.3) 17.3 (1.3)
Hungary 25.4 (0.9) 49.6 (1.3) 84.8 (0.9) 90.7 (0.5) 65.3 (1.0) 41.1 (1.4) 5.9 (1.1)
Iceland 91.9 (0.5) 96.7 (0.3) 98.1 (0.2) 95.8 (0.4) 3.9 (0.6) -0.9 (0.5) -2.3 (0.4)
Ireland 64.0 (1.2) 78.0 (0.9) 89.8 (0.6) 85.5 (0.6) 21.6 (1.4) 7.5 (1.0) -4.3 (0.8)
Israel m m 82.9 (0.9) 84.8 (1.0) 91.6 (0.5) m m 8.7 (1.0) 6.8 (1.1)
Italy 61.1 (1.0) 71.2 (0.6) 86.3 (0.3) 88.6 (0.5) 27.5 (1.1) 17.4 (0.8) 2.3 (0.6)
Japan 39.9 (1.0) 59.1 (0.9) 66.6 (1.0) 60.2 (0.9) 20.3 (1.4) 1.1 (1.3) -6.4 (1.3)
Korea 91.3 (0.5) 95.7 (0.4) 94.1 (0.4) 89.3 (0.5) -2.0 (0.7) -6.4 (0.6) -4.9 (0.7)
Latvia 15.6 (0.9) 50.5 (1.2) 80.5 (1.1) 93.7 (0.4) 78.1 (1.0) 43.2 (1.3) 13.2 (1.2)
Lithuania m m 56.3 (1.1) 85.2 (0.7) 95.7 (0.3) m m 39.3 (1.2) 10.4 (0.8)
Luxembourg 74.1 (0.7) 84.8 (0.5) 95.9 (0.3) 90.9 (0.4) 16.8 (0.8) 6.1 (0.6) -5.1 (0.5)
Mexico 17.3 (1.6) 22.6 (1.1) 34.3 (0.9) 50.5 (1.3) 33.2 (2.1) 27.8 (1.7) 16.2 (1.6)
Netherlands 87.7 (0.9) 94.9 (0.5) 97.7 (0.3) 94.5 (0.5) 6.8 (1.0) -0.4 (0.7) -3.2 (0.6)
New Zealand 80.9 (0.8) 88.4 (0.7) 90.4 (0.5) 90.9 (0.5) 10.0 (1.0) 2.5 (0.8) 0.5 (0.7)
Norway 86.2 (0.7) 94.6 (0.5) 97.7 (0.2) 96.4 (0.3) 10.2 (0.7) 1.9 (0.6) -1.2 (0.4)
Poland 33.6 (0.9) 50.0 (1.1) 84.4 (0.8) 96.1 (0.3) 62.5 (1.0) 46.1 (1.2) 11.7 (0.9)
Portugal 47.0 (1.3) 57.8 (1.4) 90.7 (0.7) 92.7 (0.5) 45.8 (1.3) 34.9 (1.4) 2.0 (0.8)
Slovak Republic 16.9 (0.7) 37.5 (1.1) 83.2 (0.9) 91.3 (0.6) 74.4 (0.9) 53.8 (1.3) 8.1 (1.1)
Slovenia m m 85.1 (0.6) 95.8 (0.3) 96.4 (0.3) m m 11.3 (0.6) 0.7 (0.4)
Spain 48.9 (1.4) 65.1 (1.0) 83.5 (0.8) 90.6 (0.3) 41.7 (1.4) 25.6 (1.0) 7.1 (0.8)
Sweden 88.9 (0.6) 95.8 (0.3) 97.7 (0.3) 94.2 (0.4) 5.3 (0.7) -1.6 (0.5) -3.5 (0.5)
Switzerland 74.3 (0.9) 91.7 (0.3) 96.4 (0.2) 94.6 (0.4) 20.3 (1.0) 2.8 (0.5) -1.8 (0.4)
Turkey 14.0 (1.4) 21.9 (1.4) 50.9 (1.3) 61.5 (1.4) 47.5 (1.9) 39.5 (1.9) 10.6 (1.8)
United Kingdom 80.0 (0.7) 89.8 (0.6) 96.3 (0.3) 91.7 (0.4) 11.7 (0.8) 1.9 (0.7) -4.6 (0.5)
United States 80.9 (0.9) 83.8 (1.3) 87.2 (0.8) 86.6 (0.9) 5.7 (1.3) 2.7 (1.5) -0.6 (1.2)
OECD average-31 61.3 (0.2) 74.0 (0.2) 87.4 (0.1) 88.8 (0.1) 27.5 (0.2) 14.8 (0.2) 1.4 (0.2)
OECD average-37 m m 71.3 (0.1) 85.1 (0.1) 87.9 (0.1) m m 16.6 (0.2) 2.9 (0.2)
Table B.2.2 [6/6] Change between 2009 and 2018 in the percentage of students with access to the Internet and having
a computer that they can use for schoolwork at home
Results based on students' self-reports
Percentage of students who reported having:
Access to the Internet and a computer that can be used for schoolwork at home
Columns: PISA 2003, PISA 2006, PISA 2009, PISA 2018 (%, S.E. for each); change between 2003 and 2018 (PISA 2018 - PISA 2003), 2006 and 2018 (PISA 2018 - PISA 2006), and 2009 and 2018 (PISA 2018 - PISA 2009) (% dif., S.E. for each)
Albania m m m m 25.6 (1.4) 65.5 (1.1) m m m m 40.0 (1.8)
Partners
Argentina m m 28.2 (2.1) 50.1 (2.0) 65.6 (0.9) m m 37.4 (2.3) 15.5 (2.2)
Baku (Azerbaijan) m m m m m m 63.2 (0.9) m m m m m m
Belarus m m m m m m 92.9 (0.4) m m m m m m
Bosnia and Herzegovina m m m m m m 88.4 (0.5) m m m m m m
Brazil 22.4 (1.5) 31.1 (1.2) 46.0 (1.2) 58.1 (0.9) 35.7 (1.8) 27.0 (1.5) 12.1 (1.6)
Brunei Darussalam m m m m m m 59.3 (0.6) m m m m m m
B-S-J-Z (China) m m m m m m 72.6 (1.0) m m m m m m
Bulgaria m m 55.5 (1.9) 84.0 (1.2) 89.5 (0.6) m m 34.0 (2.0) 5.5 (1.4)
Costa Rica m m m m 39.9 (1.4) 66.6 (1.2) m m m m 26.6 (1.9)
Croatia m m 69.1 (0.9) 85.0 (0.7) 90.9 (0.4) m m 21.8 (1.0) 5.8 (0.8)
Cyprus m m m m m m 89.0 (0.5) m m m m m m
Dominican Republic m m m m m m 41.4 (1.8) m m m m m m
Georgia m m m m 44.2 (1.4) 76.8 (0.7) m m m m 32.7 (1.6)
Hong Kong (China) 86.3 (0.8) 95.8 (0.3) 97.1 (0.3) 87.6 (0.7) 1.3 (1.1) -8.2 (0.7) -9.5 (0.7)
Indonesia 1.9 (0.4) 3.2 (0.5) 6.4 (0.8) 23.6 (1.5) 21.7 (1.6) 20.4 (1.6) 17.2 (1.7)
Jordan m m 26.7 (1.1) 29.5 (1.1) 61.3 (1.0) m m 34.6 (1.5) 31.8 (1.4)
Kazakhstan m m m m 31.0 (1.6) 69.9 (0.8) m m m m 38.8 (1.8)
Kosovo m m m m m m 77.9 (0.8) m m m m m m
Lebanon m m m m m m 62.2 (1.2) m m m m m m
Macao (China) 66.4 (1.4) 87.8 (0.6) 95.8 (0.3) 91.3 (0.4) 24.9 (1.5) 3.5 (0.7) -4.5 (0.5)
Malaysia m m m m 41.8 (1.7) 45.5 (1.4) m m m m 3.7 (2.2)
Malta m m m m 95.7 (0.4) 92.5 (0.4) m m m m -3.2 (0.6)
Moldova m m m m 46.7 (1.2) 83.1 (0.7) m m m m 36.4 (1.4)
Montenegro m m 52.3 (0.7) 68.7 (0.7) 86.9 (0.4) m m 34.6 (0.8) 18.1 (0.8)
Morocco m m m m m m 35.3 (1.6) m m m m m m
North Macedonia m m m m m m 91.9 (0.4) m m m m m m
Panama m m m m 36.1 (3.1) 51.9 (1.4) m m m m 15.8 (3.4)
Peru m m m m 22.6 (1.6) 44.7 (1.3) m m m m 22.1 (2.0)
Philippines m m m m m m 30.5 (1.3) m m m m m m
Qatar m m 77.7 (0.5) 86.3 (0.4) 79.7 (0.3) m m 2.0 (0.6) -6.6 (0.5)
Romania m m 31.3 (1.9) 68.6 (1.5) 87.3 (0.9) m m 56.0 (2.1) 18.7 (1.8)
Russia 13.2 (1.0) 32.2 (1.6) 53.6 (1.5) 92.7 (0.6) 79.5 (1.2) 60.5 (1.7) 39.1 (1.6)
Saudi Arabia m m m m m m 72.3 (1.1) m m m m m m
Serbia m m 50.9 (1.2) 63.6 (1.0) 92.7 (0.4) m m 41.7 (1.3) 29.0 (1.1)
Singapore m m m m 93.5 (0.3) 87.7 (0.5) m m m m -5.8 (0.5)
Chinese Taipei m m 86.5 (0.6) 88.7 (0.6) 79.1 (0.6) m m -7.4 (0.8) -9.6 (0.8)
Thailand 16.5 (0.9) 23.1 (1.2) 35.4 (1.3) 50.2 (1.5) 33.7 (1.8) 27.1 (1.9) 14.9 (2.0)
Ukraine m m m m m m 88.4 (0.8) m m m m m m
United Arab Emirates m m m m 87.4 (0.6) 85.8 (0.4) m m m m -1.6 (0.7)
Uruguay 34.8 (1.1) 37.7 (1.1) 58.6 (0.9) 76.3 (1.0) 41.5 (1.5) 38.6 (1.5) 17.7 (1.3)
Viet Nam m m m m m m 39.4 (1.9) m m m m m m
All students
Columns (each reporting % and S.E.):
1. How to use keywords when using a search engine such as <Google©>, <Yahoo©>, etc.
2. How to decide whether to trust information from the Internet
3. How to compare different web pages and decide what information is more relevant for your schoolwork
4. To understand the consequences of making information publicly available online on <Facebook©>, <Instagram©>, etc.
5. How to use the short description below the links in the list of results of a search
6. How to detect whether the information is subjective or biased
7. How to detect phishing or spam emails
Australia 70.1 (0.5) 83.7 (0.4) 77.9 (0.5) 87.6 (0.3) 63.3 (0.5) 74.5 (0.5) 53.2 (0.5)
OECD
Austria 55.6 (0.9) 66.6 (1.1) 55.4 (1.0) 78.6 (0.6) 44.7 (0.7) 50.7 (0.8) 45.2 (1.0)
Belgium 56.1 (0.9) 75.0 (0.6) 63.3 (0.7) 74.8 (0.7) 42.9 (0.6) 62.2 (0.7) 32.4 (0.9)
Canada 61.8 (0.6) 78.7 (0.5) 72.5 (0.7) 81.4 (0.4) 54.5 (0.7) 70.2 (0.7) 38.4 (0.6)
Chile 48.2 (1.0) 65.2 (0.9) 62.2 (0.9) 69.9 (0.9) 46.3 (0.9) 49.7 (0.9) 33.3 (0.9)
Colombia 56.3 (1.0) 71.9 (0.6) 65.1 (0.8) 79.6 (0.8) 56.4 (0.9) 45.0 (1.0) 41.8 (1.0)
Czech Republic 59.7 (0.9) 65.7 (1.0) 59.0 (0.9) 66.7 (0.8) 50.3 (0.7) 50.2 (0.8) 47.3 (1.2)
Denmark 73.6 (0.9) 90.4 (0.6) 84.5 (0.7) 85.9 (0.6) 56.6 (0.9) 73.6 (0.8) 37.9 (0.9)
Estonia 63.0 (0.9) 74.8 (0.7) 67.1 (0.9) 78.2 (0.7) 56.5 (0.9) 54.1 (0.8) 53.0 (0.8)
Finland 70.6 (0.8) 83.3 (0.7) 78.4 (0.7) 85.6 (0.6) 52.2 (0.8) 60.5 (0.9) 47.2 (0.8)
France 68.0 (0.8) 74.5 (0.7) 69.2 (0.8) 78.6 (0.7) 50.1 (0.6) 49.9 (0.6) 36.3 (0.7)
Germany 49.4 (1.0) 54.3 (0.8) 46.5 (0.9) 73.9 (0.8) 37.2 (0.8) 48.7 (0.8) 25.3 (0.8)
Greece 58.5 (0.9) 66.5 (0.9) 60.1 (0.8) 71.9 (0.7) 49.7 (0.8) 51.3 (0.8) 47.6 (0.8)
Hungary 57.9 (1.0) 61.7 (0.9) 52.8 (0.9) 81.8 (0.7) 52.8 (0.8) 46.2 (0.9) 49.4 (1.0)
Iceland 57.6 (0.8) 70.5 (0.7) 66.8 (0.9) 76.1 (0.7) 54.5 (0.9) 51.5 (0.8) 41.5 (0.9)
Ireland 44.3 (0.9) 58.2 (0.9) 45.7 (0.8) 83.1 (0.7) 35.1 (0.8) 59.1 (0.9) 28.0 (0.9)
Israel 48.7 (1.0) 58.4 (1.0) 53.0 (1.1) 77.2 (0.8) 41.8 (1.0) 43.3 (0.9) 38.1 (0.8)
Italy 44.2 (0.9) 57.9 (0.8) 57.3 (0.8) 60.4 (0.9) 31.7 (0.9) 49.0 (0.7) 27.3 (0.8)
Japan 74.2 (0.9) 88.9 (0.5) 64.2 (1.0) 86.9 (0.5) 42.8 (1.0) 66.2 (0.8) 57.5 (1.2)
Korea 49.7 (0.9) 55.0 (1.0) 51.3 (0.7) 46.2 (0.8) 42.6 (0.7) 49.1 (0.8) 34.7 (0.8)
Latvia 47.9 (0.8) 56.5 (0.9) 54.2 (0.9) 74.7 (0.8) 47.3 (0.7) 38.4 (0.8) 48.3 (0.7)
Lithuania 62.1 (0.7) 67.2 (0.7) 56.7 (0.8) 74.2 (0.6) 57.3 (0.7) 54.7 (0.8) 54.1 (0.7)
Luxembourg 54.3 (0.7) 63.3 (0.6) 58.2 (0.7) 77.0 (0.6) 45.1 (0.6) 46.6 (0.7) 37.2 (0.7)
Mexico 67.5 (0.9) 83.2 (0.7) 78.1 (0.8) 81.3 (0.7) 68.0 (0.8) 62.3 (0.9) 45.6 (0.9)
Netherlands 58.4 (1.0) 72.1 (1.1) 61.6 (1.1) 67.6 (0.9) 44.8 (1.0) 61.3 (1.1) 28.4 (1.0)
New Zealand 66.8 (0.9) 78.0 (0.7) 74.1 (0.7) 81.8 (0.6) 58.5 (0.8) 65.0 (0.7) 46.4 (0.7)
Norway 40.9 (0.9) 82.4 (0.8) 73.3 (0.8) 78.0 (0.8) 42.3 (0.8) 47.7 (1.0) 21.6 (0.7)
Poland 35.8 (1.1) 39.5 (0.9) 40.5 (1.0) 78.5 (0.8) 39.0 (0.9) 48.4 (0.9) 48.7 (1.0)
Portugal 56.8 (1.0) 64.4 (1.0) 62.6 (0.9) 78.3 (0.7) 54.3 (1.0) 54.6 (0.9) 54.1 (0.9)
Slovak Republic 49.6 (0.9) 63.2 (1.0) 55.3 (0.9) 58.4 (0.8) 44.6 (0.8) 43.5 (0.9) 44.4 (1.0)
Slovenia 47.7 (0.6) 55.0 (0.7) 52.4 (0.7) 75.9 (0.7) 45.5 (0.8) 40.2 (0.7) 48.6 (0.7)
Spain 40.4 (0.5) 67.3 (0.5) 58.1 (0.6) 82.4 (0.4) 36.2 (0.4) 45.9 (0.5) 35.8 (0.6)
Sweden 51.7 (1.1) 92.5 (0.5) 87.3 (0.6) 76.5 (0.7) 52.0 (1.0) 62.9 (1.0) 34.4 (1.0)
Switzerland 54.2 (1.1) 59.7 (1.0) 53.6 (1.1) 73.6 (0.8) 42.6 (1.1) 44.2 (1.1) 32.8 (1.0)
Turkey 33.9 (1.2) 56.2 (1.3) 53.0 (1.2) 47.5 (1.3) 37.2 (1.3) 48.9 (1.0) 27.1 (1.2)
United Kingdom 57.0 (0.9) 74.9 (0.7) 61.8 (0.8) 89.8 (0.4) 51.9 (0.9) 67.6 (0.8) 52.9 (1.2)
United States 75.0 (0.9) 87.5 (0.7) 82.5 (0.9) 84.4 (0.7) 66.5 (0.8) 78.8 (0.8) 49.0 (1.0)
OECD average 55.9 (0.1) 69.3 (0.1) 62.6 (0.1) 75.8 (0.1) 48.5 (0.1) 54.5 (0.1) 41.2 (0.1)
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. Countries that administered the paper-based form have no available data for this item: Argentina, Jordan, Lebanon, the Republic of Moldova, the Republic of North Macedonia, Romania, Saudi Arabia, Ukraine and Viet Nam.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240636
Albania 74.8 (0.8) 71.9 (0.8) 80.2 (0.6) 74.9 (0.7) 72.9 (0.6) 79.6 (0.6) 57.2 (1.0)
Partners
Argentina 48.9 (0.8) 64.9 (0.7) 56.4 (0.9) 64.4 (0.8) 40.0 (0.8) 37.8 (0.8) 29.1 (0.7)
Baku (Azerbaijan) 57.2 (0.9) 63.7 (0.8) 63.3 (0.9) 50.4 (0.7) 62.8 (0.8) 61.5 (0.8) 45.5 (0.9)
Belarus 52.1 (1.2) 64.2 (1.0) 55.2 (0.9) 68.6 (1.1) 52.2 (1.0) 53.3 (0.9) 54.3 (1.0)
Bosnia and Herzegovina 56.0 (0.9) 63.1 (0.8) 56.0 (0.8) 67.2 (0.8) 51.4 (0.8) 49.1 (0.8) 53.5 (0.9)
Brazil 41.9 (0.6) 52.0 (0.6) 57.0 (0.7) 48.9 (0.8) 41.3 (0.7) 45.7 (0.6) 22.4 (0.7)
Brunei Darussalam 71.8 (0.5) 70.4 (0.5) 62.9 (0.6) 66.0 (0.5) 57.2 (0.6) 37.4 (0.6) 40.9 (0.5)
B-S-J-Z (China) 59.3 (1.1) 72.0 (1.0) 58.1 (1.1) 72.8 (0.8) 56.1 (1.1) 58.6 (0.8) 57.2 (1.1)
Bulgaria 60.7 (1.1) 68.3 (0.7) 64.1 (0.9) 65.4 (0.8) 58.7 (0.9) 57.4 (0.7) 57.0 (1.0)
Costa Rica 49.7 (0.8) 70.0 (0.9) 63.2 (0.8) 75.7 (0.6) 52.2 (0.9) 39.9 (0.7) 32.7 (0.7)
Croatia 57.2 (0.9) 71.6 (0.7) 62.9 (0.8) 80.0 (0.7) 50.2 (0.8) 54.4 (0.8) 53.2 (0.9)
Cyprus 62.7 (0.6) 76.9 (0.5) 60.7 (0.7) 73.7 (0.6) 54.6 (0.7) 55.8 (0.6) 62.0 (0.7)
Dominican Republic 63.1 (0.9) 68.9 (0.9) 65.9 (0.9) 66.5 (0.8) 54.5 (0.8) 52.5 (0.8) 46.0 (0.9)
Georgia 64.8 (0.9) 60.6 (1.0) 51.9 (0.8) 44.6 (0.9) 52.1 (0.8) 50.3 (0.7) 38.0 (0.8)
Hong Kong (China) 67.8 (0.9) 77.4 (0.7) 67.8 (0.8) 72.3 (0.9) 61.7 (0.9) 70.2 (0.9) 53.9 (1.0)
Indonesia 61.7 (1.3) 70.8 (1.2) 62.9 (1.1) 63.9 (1.2) 55.7 (1.2) 56.7 (0.8) 45.1 (1.1)
Jordan 84.1 (0.5) 74.1 (0.7) 73.7 (0.6) 75.8 (0.7) 66.6 (0.8) 63.1 (0.7) 73.6 (0.7)
Kazakhstan 62.8 (0.6) 66.7 (0.6) 58.6 (0.6) 49.4 (0.6) 66.6 (0.6) 62.7 (0.6) 51.8 (0.6)
Kosovo 64.5 (0.7) 69.2 (0.7) 62.2 (0.8) 64.5 (0.9) 59.1 (0.8) 56.7 (0.9) 41.4 (0.8)
Lebanon m m m m m m m m m m m m m m
Macao (China) 58.5 (0.7) 76.2 (0.6) 61.3 (0.8) 63.0 (0.8) 48.4 (0.8) 58.3 (0.8) 44.6 (0.8)
Malaysia 73.9 (0.8) 65.3 (0.8) 65.0 (0.8) 65.4 (0.8) 58.0 (0.8) 47.7 (0.9) 37.4 (0.9)
Malta 73.5 (0.9) 86.7 (0.5) 72.0 (0.9) 87.4 (0.6) 65.0 (0.9) 61.2 (0.9) 75.7 (0.7)
Moldova 51.1 (1.0) 70.8 (0.7) 55.8 (1.1) 70.1 (0.9) 47.5 (1.0) 53.8 (1.0) 41.9 (1.0)
Montenegro 54.8 (0.6) 63.3 (0.6) 59.1 (0.7) 67.6 (0.5) 53.7 (0.7) 57.9 (0.7) 59.0 (0.6)
Morocco 52.9 (1.0) 48.8 (0.9) 44.1 (0.8) 46.6 (0.8) 43.3 (0.8) 39.3 (0.8) 33.9 (0.8)
North Macedonia m m m m m m m m m m m m m m
Panama 60.2 (1.1) 66.0 (1.0) 63.3 (1.0) 71.5 (0.8) 58.6 (1.0) 48.5 (0.9) 39.8 (0.9)
Peru 45.9 (0.7) 65.4 (0.8) 54.4 (0.8) 68.5 (0.8) 47.7 (0.7) 45.1 (0.8) 26.0 (0.6)
Philippines 72.7 (0.7) 75.1 (0.7) 69.9 (0.6) 76.0 (0.7) 73.1 (0.5) 61.0 (0.7) 47.9 (0.7)
Qatar 67.9 (0.4) 67.0 (0.4) 64.8 (0.4) 61.7 (0.4) 58.1 (0.4) 58.5 (0.4) 59.1 (0.4)
Romania 60.6 (1.0) 62.7 (0.9) 57.8 (1.0) 65.5 (0.9) 48.6 (1.0) 44.2 (0.9) 39.7 (1.1)
Russia 55.7 (1.0) 61.7 (1.2) 58.3 (1.0) 57.9 (1.3) 56.7 (1.0) 59.0 (0.8) 47.9 (1.1)
Saudi Arabia 83.7 (0.6) 76.3 (0.7) 68.6 (0.7) 64.7 (0.7) 60.3 (0.9) 58.6 (0.7) 60.8 (0.7)
Serbia 54.2 (0.9) 60.0 (0.7) 56.8 (0.8) 71.2 (0.7) 52.6 (0.8) 50.5 (0.8) 55.3 (0.8)
Singapore 66.3 (0.6) 90.4 (0.4) 77.8 (0.5) 92.9 (0.4) 63.8 (0.6) 84.3 (0.5) 64.5 (0.7)
Chinese Taipei 78.3 (0.6) 84.2 (0.5) 76.8 (0.7) 81.6 (0.6) 73.7 (0.7) 74.4 (0.6) 62.7 (0.7)
Thailand 87.6 (0.5) 88.1 (0.4) 78.8 (0.6) 77.6 (0.7) 73.2 (0.6) 70.8 (0.6) 56.9 (0.8)
Ukraine 66.0 (1.0) 68.2 (0.9) 57.3 (0.9) 66.4 (0.8) 57.4 (0.9) 55.8 (0.8) 47.3 (1.0)
United Arab Emirates 66.6 (0.5) 73.3 (0.5) 68.4 (0.7) 70.9 (0.6) 60.8 (0.7) 65.0 (0.5) 62.9 (0.6)
Uruguay 45.4 (1.3) 73.3 (0.8) 65.3 (0.9) 70.0 (0.9) 47.2 (1.1) 43.5 (0.9) 30.4 (0.8)
Viet Nam 62.6 (1.5) 63.8 (1.1) 41.5 (1.3) 81.2 (0.9) 47.9 (1.2) 39.1 (1.0) 43.8 (1.1)
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. Countries which administered the paper-based form had no available data in this item: Argentina, Jordan, Lebanon, the Republic of Moldova, the Republic of North Macedonia,
Romania, Saudi Arabia, Ukraine and Viet Nam.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240636
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 165
Austria -4.4 (2.0) 2.9 (1.8) 4.1 (1.9) 3.9 (1.5) -3.9 (2.0) 12.5 (2.0) 4.4 (2.6) 44.9 (0.5)
Belgium 3.2 (2.0) 7.1 (1.7) 6.1 (1.7) 6.5 (1.6) -4.5 (1.9) 19.9 (1.8) 0.4 (2.1) 57.5 (0.4)
Canada 1.9 (1.4) 5.8 (1.1) 6.2 (1.2) 4.8 (1.1) -1.4 (1.4) 10.5 (1.2) 3.4 (1.7) 62.1 (0.3)
Chile -9.6 (2.1) -1.0 (2.1) -0.8 (2.3) 3.7 (1.8) -10.0 (2.1) 6.4 (1.8) -5.0 (1.9) 45.8 (0.4)
Colombia -7.0 (2.2) 8.5 (1.8) 6.0 (1.9) 6.9 (1.5) -1.7 (2.7) 8.9 (2.3) 2.1 (2.3) 27.4 (0.5)
Czech Republic -5.1 (2.5) 0.6 (2.2) -2.8 (2.3) 3.0 (2.2) -7.4 (2.1) 2.4 (2.1) 0.0 (2.4) 30.9 (0.2)
Denmark 1.5 (1.9) 4.9 (1.1) 6.9 (1.3) 2.2 (1.4) 2.1 (2.1) 17.6 (1.6) 3.9 (1.9) 60.2 (0.3)
Estonia -3.6 (1.9) 3.1 (1.7) 3.2 (1.9) 2.5 (1.9) -2.9 (2.3) 4.5 (2.0) 0.9 (2.3) 57.1 (0.3)
Finland -1.1 (1.6) 8.3 (1.6) 6.3 (1.7) -0.2 (1.5) -1.0 (1.8) 8.5 (2.0) 1.6 (1.8) 51.3 (0.4)
France 3.5 (1.9) 9.2 (1.8) 4.5 (1.7) 4.4 (1.7) -4.3 (1.8) 7.0 (2.0) -3.8 (1.6) 36.9 (0.4)
Germany -8.0 (2.3) 4.5 (2.3) -2.2 (2.0) -1.0 (2.0) -10.3 (2.3) 15.8 (2.2) -3.5 (2.0) 45.2 (0.5)
Greece -9.3 (2.0) -5.0 (1.9) -1.5 (2.0) 1.5 (1.7) -2.1 (1.9) 1.4 (2.0) 0.4 (2.0) 40.4 (0.6)
Hungary -12.2 (2.6) -10.9 (2.5) -12.4 (2.4) -0.7 (1.9) -13.5 (2.1) -5.9 (2.4) -2.6 (2.6) 43.0 (0.4)
Iceland -5.1 (2.3) 4.8 (2.1) 1.3 (2.3) 6.3 (2.2) -1.0 (2.4) 3.9 (2.7) -2.7 (2.7) 44.7 (0.3)
Ireland -5.9 (2.1) -1.3 (2.1) 0.0 (2.0) 2.8 (1.7) -4.3 (1.8) 7.3 (1.7) -1.9 (2.0) 57.6 (0.4)
Israel -10.7 (2.6) † 0.9 (2.3) † -5.6 (2.5) † 4.1 (1.8) † -8.9 (2.7) † 0.7 (2.6) † -12.5 (2.1) † 53.0 (0.6)
Italy -13.3 (2.0) -3.5 (2.2) -3.6 (2.0) 0.5 (2.0) -8.2 (2.1) 6.4 (2.1) -3.9 (2.2) 40.0 (0.4)
Japan 1.9 (1.6) 3.3 (1.2) 5.1 (1.9) 3.8 (1.4) 0.9 (2.1) 5.3 (1.7) -3.0 (2.4) 47.9 (0.5)
Korea 2.7 (1.9) 9.2 (1.9) 5.1 (1.9) 6.9 (1.9) 3.0 (2.0) 8.9 (1.9) 3.1 (1.8) 25.6 (0.4)
Latvia -6.2 (2.1) 0.0 (2.0) -4.2 (2.1) 1.6 (2.0) -3.3 (2.0) 5.6 (2.0) 3.7 (1.8) 39.8 (0.3)
Lithuania -5.4 (2.0) -1.2 (1.9) -0.1 (1.9) 4.6 (1.6) -6.7 (1.9) 1.4 (2.0) -3.1 (2.1) 39.6 (0.3)
Luxembourg 0.5 (1.8) 5.8 (1.7) 4.5 (2.0) 5.3 (1.7) -4.0 (1.9) 17.8 (1.8) 5.1 (1.8) 38.4 (0.2)
Mexico 0.9 (2.6) 6.9 (2.1) 11.5 (1.8) 7.0 (2.2) 2.1 (2.3) 9.8 (2.7) 12.1 (2.4) 34.8 (0.5)
Netherlands -1.6 (2.1) 6.7 (2.2) 8.7 (2.6) 0.1 (2.2) 0.2 (2.4) 10.9 (2.5) -1.3 (2.1) 62.9 (0.5)
New Zealand 0.5 (1.9) 3.4 (1.5) 1.1 (1.9) 5.8 (1.4) -6.6 (1.9) 12.5 (1.6) -0.1 (1.9) 60.7 (0.3)
Norway -0.4 (2.2) 7.2 (1.6) 5.7 (1.9) 3.1 (2.0) -1.9 (1.8) 11.8 (2.2) -0.4 (1.7) 36.6 (0.4)
Poland -6.3 (2.1) -4.8 (2.1) -9.3 (2.1) 4.0 (1.9) -8.1 (2.1) 4.4 (2.3) -0.8 (2.4) 46.8 (0.5)
Portugal -9.8 (2.4) -8.9 (2.2) -6.2 (1.9) -0.3 (1.5) -12.1 (2.2) -6.8 (2.3) -8.1 (2.1) 50.4 (0.4)
Slovak Republic -9.1 (2.6) 2.9 (2.4) -6.2 (2.5) -1.3 (2.2) -11.3 (2.6) 1.3 (2.3) 4.3 (2.2) 28.1 (0.3)
Slovenia -6.9 (2.1) 5.0 (2.0) 1.8 (1.9) 1.4 (1.8) -4.9 (1.8) 7.0 (2.0) -3.5 (2.4) 46.8 (0.2)
Spain -2.7 (1.4) -1.3 (1.1) -1.3 (1.2) 4.5 (0.9) -8.0 (1.2) 8.4 (1.2) -1.3 (1.3) 41.5 (0.3)
Sweden -6.6 (1.9) 3.7 (0.9) 4.0 (1.3) 2.0 (2.0) -5.0 (2.2) 15.1 (1.9) -5.3 (2.1) 54.1 (0.5)
Switzerland -5.9 (2.3) 2.3 (2.3) 1.1 (2.3) 1.8 (1.9) -9.2 (2.2) 8.4 (1.9) -1.3 (1.9) 43.6 (0.5)
Turkey -0.7 (2.8) 4.9 (2.6) 1.0 (2.9) 10.4 (2.8) 1.1 (2.8) 3.3 (2.4) 5.9 (2.7) 63.3 (0.4)
United Kingdom -0.3 (1.7) 2.4 (1.5) 1.5 (1.9) 4.0 (1.2) -2.8 (1.8) 13.8 (1.8) 4.4 (2.2) 65.2 (0.4)
United States 3.6 (2.4) 6.9 (1.6) 5.7 (2.0) 7.7 (1.8) -1.7 (2.1) 14.4 (2.0) 1.1 (2.3) 69.0 (0.6)
OECD average -3.7 (0.3) 2.7 (0.3) 1.4 (0.3) 3.6 (0.3) -4.3 (0.3) 7.9 (0.3) 0.0 (0.3) 47.4 (0.1)
Table B.2.6 [4/4] Frequency of opportunity to learn digital literacy skills at school
Results based on students' self-reports
Students reported that, during their entire school experience, they were taught the following:
Argentina 2.5 (1.9) 10.1 (2.0) 9.0 (1.9) 13.1 (2.0) -2.6 (2.0) 7.5 (1.9) 3.2 (1.8) m m
Baku (Azerbaijan) 10.2 (2.2) 7.1 (2.1) 8.4 (2.2) 9.7 (1.7) 6.0 (2.0) 6.9 (2.0) 13.1 (2.1) 21.5 (0.3)
Belarus -12.7 (2.5) -7.0 (2.0) -8.3 (2.1) -4.5 (1.9) -8.1 (2.1) -5.5 (2.0) -2.4 (2.2) 30.7 (0.4)
Bosnia and Herzegovina -3.3 (2.1) -0.1 (2.0) -3.5 (2.1) 3.6 (2.0) -0.6 (2.2) -0.3 (2.0) 3.3 (2.0) 22.1 (0.5)
Brazil 6.6 (1.8) 11.6 (1.8) 9.7 (1.6) 16.9 (1.8) 7.7 (1.4) 12.0 (1.7) 3.6 (1.4) 32.7 (0.3)
Brunei Darussalam 1.0 (1.5) 11.9 (1.5) 3.2 (1.7) 20.4 (1.4) 1.3 (1.9) 15.8 (1.7) 14.9 (1.6) 39.9 (0.1)
B-S-J-Z (China) 4.4 (2.0) 1.0 (1.9) 7.0 (2.2) 6.8 (1.8) 5.8 (1.9) 4.1 (1.9) 6.2 (2.1) 46.6 (0.5)
Bulgaria -12.5 (2.3) -4.4 (1.9) -7.3 (2.3) -2.6 (2.0) -6.2 (2.3) -2.4 (2.4) -1.0 (2.6) 29.9 (0.6)
Costa Rica 3.5 (2.1) 13.9 (1.8) 11.9 (1.8) 8.7 (1.6) 2.6 (2.4) 6.3 (2.1) 7.9 (1.9) 28.7 (0.4)
Croatia -4.4 (2.0) -4.6 (1.8) -3.8 (1.8) -0.2 (1.5) -1.8 (1.9) 3.4 (1.8) -0.6 (1.9) 33.1 (0.5)
Cyprus -1.7 (2.0) -0.4 (1.9) 1.2 (1.8) 3.0 (1.8) 0.3 (2.0) 8.8 (1.9) -1.5 (2.0) 32.1 (0.2)
Dominican Republic -2.1 (2.3) 8.9 (2.0) 3.0 (2.4) 11.9 (1.9) 0.2 (2.1) 0.5 (2.1) 1.2 (2.4) 20.4 (0.4)
Georgia -6.8 (1.7) -7.6 (1.7) -9.9 (2.2) -5.9 (1.9) -6.4 (2.1) -1.1 (2.1) -2.6 (2.2) 14.5 (0.3)
Hong Kong (China) 3.4 (2.3) 8.7 (1.6) 10.3 (2.2) 8.3 (1.9) 5.0 (1.9) 11.5 (2.0) 3.3 (2.0) 51.0 (0.5)
Indonesia 1.8 (2.8) 0.7 (3.0) 4.0 (2.5) 3.3 (2.8) 6.1 (2.7) 1.8 (2.3) 2.8 (2.4) 15.9 (0.3)
Jordan 9.3 (1.4) 13.1 (2.1) 7.6 (1.9) 13.3 (1.8) 6.2 (1.8) 10.0 (2.1) 10.4 (1.6) m m
Kazakhstan -2.7 (1.3) -2.3 (1.5) 3.3 (1.5) 1.2 (1.4) -2.3 (1.4) 0.0 (1.4) 0.8 (1.5) 21.7 (0.2)
Kosovo -5.5 (2.2) -1.2 (2.2) -4.1 (2.1) 3.0 (2.5) 0.2 (2.3) -1.3 (2.3) 3.8 (2.4) 13.3 (0.1)
Lebanon m m m m m m m m m m m m m m m m
Macao (China) 2.6 (2.6) 6.6 (2.1) 7.4 (2.4) 17.9 (2.2) 9.5 (2.4) 10.2 (2.4) 9.1 (2.4) 48.8 (0.2)
Malaysia -1.2 (2.4) -3.5 (1.8) -4.0 (2.2) 9.0 (2.0) -3.6 (2.0) -7.8 (1.9) -0.6 (1.7) 25.8 (0.4)
Malta -3.1 (2.1) -1.4 (1.6) -4.8 (2.7) 4.2 (1.6) -5.3 (2.4) 2.5 (2.9) -1.1 (2.0) 39.4 (0.3)
Moldova -13.3 (2.3) -4.7 (1.8) -7.7 (2.3) 7.8 (1.9) -8.6 (2.0) -2.2 (2.0) -3.3 (2.6) m m
Montenegro -5.4 (1.7) 0.5 (1.7) -3.9 (1.9) -0.8 (1.6) -3.7 (2.0) 2.3 (1.8) -0.6 (1.7) 21.0 (0.2)
Morocco 12.5 (2.3) 14.9 (2.0) 12.1 (2.0) 11.8 (1.8) 5.7 (1.8) 5.9 (2.1) 10.7 (1.6) 17.3 (0.4)
North Macedonia m m m m m m m m m m m m m m m m
Panama -4.5 (2.8) 5.4 (2.7) 0.0 (2.7) 13.3 (2.2) -2.6 (2.8) 0.8 (2.8) 3.7 (3.0) 19.1 (0.4)
Peru 2.2 (1.9) 16.7 (1.9) 14.8 (2.1) 22.0 (2.2) 2.7 (2.2) 13.1 (2.1) 11.4 (2.0) 23.9 (0.3)
Philippines 9.2 (1.8) 18.6 (1.7) 14.1 (1.6) 16.2 (1.5) 5.3 (1.5) 11.4 (1.5) 11.3 (1.7) 17.4 (0.5)
Qatar -1.4 (1.1) 9.6 (1.1) 4.2 (1.2) 11.7 (1.2) 0.7 (1.2) 8.9 (1.1) 2.3 (1.1) 39.5 (0.1)
Romania -0.2 (2.3) -5.3 (2.1) -5.0 (2.2) 2.8 (2.2) -2.7 (2.3) 1.6 (2.5) -4.0 (2.3) m m
Russia -2.4 (1.7) -2.9 (1.9) -1.0 (2.4) 0.1 (2.2) -2.0 (1.7) -3.2 (2.0) 1.4 (2.3) 42.8 (1.3)
Saudi Arabia 8.1 (1.5) 8.1 (1.5) 6.8 (1.8) 9.5 (1.8) 4.1 (1.9) 7.4 (1.7) 8.6 (1.8) m m
Serbia -1.9 (2.4) -3.4 (2.0) 0.8 (2.2) 0.0 (2.1) -3.5 (2.4) 4.4 (2.3) 2.7 (2.2) 29.8 (0.5)
Singapore 4.7 (1.6) 3.9 (0.9) 2.2 (1.2) 3.4 (1.0) -4.8 (1.6) 8.6 (1.5) 0.9 (1.7) 57.7 (0.2)
Chinese Taipei -2.9 (1.9) 2.3 (1.4) -0.8 (1.7) 1.5 (1.4) 0.1 (1.4) 6.7 (1.7) 1.4 (2.0) 38.1 (0.5)
Thailand -2.2 (1.3) 4.9 (1.0) 4.5 (1.8) 0.6 (1.8) 0.6 (1.7) 6.5 (1.9) -0.4 (2.2) 32.2 (0.5)
Ukraine -5.0 (2.3) 1.3 (1.9) -3.0 (2.2) -2.6 (1.9) -4.5 (2.2) -3.1 (2.1) -2.3 (2.1) m m
United Arab Emirates -5.7 (1.3) 4.9 (1.4) 2.8 (1.5) 6.6 (1.6) -3.3 (1.8) 6.5 (1.2) -8.4 (1.5) 41.5 (0.3)
Uruguay -5.7 (2.8) 1.4 (2.0) 4.0 (2.2) 5.1 (2.3) -9.1 (2.3) 4.7 (2.7) 3.0 (2.3) 31.4 (0.5)
Viet Nam -0.9 (2.8) 2.2 (3.0) -2.0 (2.6) 5.3 (2.0) -3.6 (2.8) -1.1 (1.9) 0.9 (2.4) m m
Austria 22.4 (1.5) 22.6 (1.3) 11.2 (0.9) 11.4 (1.1) 54.7 (1.5)
Belgium 26.6 (1.0) 19.9 (1.3) 11.2 (1.0) 13.5 (0.9) 55.3 (1.5)
Canada 25.8 (1.1) 26.1 (0.8) 14.8 (0.7) 12.5 (0.8) 46.6 (1.2)
Chile 25.3 (1.4) 13.8 (1.5) 9.8 (1.0) 14.3 (1.2) 62.1 (1.8)
Colombia 22.1 (1.5) 8.4 (0.9) 4.5 (0.6) 11.1 (1.1) 76.0 (1.4)
Czech Republic 27.3 (1.4) 20.5 (1.2) 9.5 (0.9) 12.6 (1.0) 57.4 (1.5)
Denmark 15.6 (1.4) 19.9 (1.4) 5.1 (0.7) 10.2 (1.0) 64.8 (1.6)
Estonia 23.7 (1.4) 22.8 (1.4) 11.4 (1.2) 10.1 (0.9) 55.6 (1.8)
Finland 20.1 (1.3) 25.9 (1.5) 12.4 (1.1) 9.6 (0.8) 52.0 (1.7)
France 28.1 (1.3) 20.0 (1.3) 9.5 (1.0) 11.9 (1.1) 58.6 (1.8)
Germany 21.8 (1.5) 21.6 (1.5) 9.8 (1.0) 12.1 (1.2) 56.5 (1.8)
Greece 18.3 (1.3) 13.6 (1.4) 7.8 (0.9) 13.0 (1.2) 65.6 (1.8)
Hungary 25.5 (1.5) 18.5 (1.3) 9.8 (0.9) 15.6 (1.3) 56.1 (1.6)
Iceland 30.7 (2.3) 14.4 (1.7) 6.8 (1.1) 10.3 (1.3) 68.5 (2.2)
Ireland 20.3 (1.3) 24.3 (1.3) 13.9 (1.2) 9.2 (0.9) 52.6 (1.5)
Israel 17.1 (1.3) 26.2 (1.5) 12.0 (1.4) 8.0 (0.8) 53.9 (1.6)
Italy 25.3 (1.3) 17.1 (1.3) 9.6 (1.1) 13.8 (1.1) 59.5 (1.6)
Japan 34.0 (1.4) 30.5 (1.6) 12.0 (0.9) 8.5 (1.0) 49.0 (1.7)
Korea 39.1 (1.2) 29.8 (1.3) 33.4 (1.3) 13.3 (1.0) 23.6 (1.2)
Latvia 18.3 (1.4) 23.0 (1.7) 6.4 (1.0) 9.4 (1.3) 61.1 (1.9)
Lithuania 22.3 (1.5) 20.9 (1.3) 9.2 (0.9) 10.3 (0.9) 59.6 (1.7)
Luxembourg 28.4 (1.8) 14.6 (1.4) 10.3 (1.2) 10.8 (1.2) 64.2 (1.9)
Mexico 15.3 (1.6) 13.2 (1.3) 7.1 (0.9) 12.9 (1.2) 66.8 (1.8)
Netherlands 22.2 (2.0) 23.8 (1.8) 15.3 (1.3) 14.0 (1.5) 46.8 (2.1)
New Zealand 29.5 (1.3) 23.0 (1.4) 17.4 (1.2) 16.0 (1.2) 43.6 (1.6)
Norway 16.3 (1.2) 18.5 (1.3) 6.0 (0.8) 7.8 (1.0) 67.8 (1.6)
Poland 28.5 (1.3) 21.2 (1.3) 14.6 (1.1) 16.6 (1.3) 47.6 (1.6)
Portugal 25.5 (1.6) 17.2 (1.3) 10.5 (1.1) 10.9 (1.2) 61.3 (1.9)
Slovak Republic 19.0 (1.2) 15.1 (1.4) 8.1 (0.9) 12.8 (1.2) 64.0 (1.9)
Slovenia 19.8 (1.6) 22.2 (1.6) 15.0 (1.3) 12.9 (1.3) 49.9 (1.9)
Spain¹ 22.6 (0.8) 13.0 (0.6) 10.7 (0.5) 18.2 (0.8) 58.1 (1.0)
Sweden 17.9 (1.3) 15.3 (1.2) 8.1 (0.9) 12.0 (1.0) 64.6 (1.7)
Switzerland 23.1 (1.5) 17.5 (1.4) 8.6 (1.1) 13.4 (1.4) 60.5 (2.2)
Turkey 35.6 (1.5) 14.9 (1.0) 13.9 (1.4) 15.4 (1.2) 55.8 (1.7)
United Kingdom 22.3 (1.2) 27.8 (1.5) 19.1 (1.0) 11.3 (1.0) 41.8 (1.8)
United States 29.7 (2.0) 26.5 (1.6) 15.4 (1.3) 10.4 (1.1) 47.7 (2.1)
OECD average 24.1 (0.2) 20.2 (0.2) 11.5 (0.2) 12.2 (0.2) 56.1 (0.3)
1. In 2018, some regions in Spain conducted their high-stakes exams for tenth-grade students earlier in the year than in the past, which resulted in the testing period for
these exams coinciding with the end of the PISA testing window. Because of this overlap, a number of students were negatively disposed towards the PISA test and did not
try their best to demonstrate their proficiency. Although the data of only a minority of students show clear signs of lack of engagement (see PISA 2018 Results Volume I,
Annex A9), the comparability of PISA 2018 data for Spain with those from earlier PISA assessments cannot be fully ensured.
Notes: The computation is based on 10 plausible values, with student weights and 80 replicate weights.
The process (log) data analysed in this report cover a total of 76 270 students from 70 countries and economies who responded to the “Rapa Nui” reading unit (CR551).
Their overall average reading performance score was 517 points, marginally higher than the 460-point average reading score of the full sample from the 70 countries and
economies.
Source: OECD, PISA 2018 Database.
https://doi.org/10.1787/888934240655
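The note above describes the standard PISA estimation machinery: each statistic is computed once per plausible value, the sampling variance comes from the 80 replicate weights (PISA 2018 uses Fay's balanced repeated replication with a factor of 0.5), and the imputation variance is combined across plausible values. A minimal sketch, with illustrative array layouts rather than the actual PISA database schema:

```python
import numpy as np

def weighted_mean(values, weights):
    # Weighted mean of one column of values.
    return np.average(values, weights=weights)

def pv_brr_estimate(pvs, w_final, w_reps, fay=0.5):
    """Sketch of a plausible-value estimate with a BRR standard error.
    pvs: (n_students, n_pv) plausible values; w_final: final student
    weights; w_reps: (n_students, n_reps) replicate weights.
    Returns (estimate, standard error)."""
    n_pv = pvs.shape[1]
    n_reps = w_reps.shape[1]
    # One estimate per plausible value; the final estimate is their mean.
    per_pv = np.array([weighted_mean(pvs[:, m], w_final) for m in range(n_pv)])
    estimate = per_pv.mean()
    # Sampling variance: Fay's BRR applied to the first plausible value.
    rep = np.array([weighted_mean(pvs[:, 0], w_reps[:, r]) for r in range(n_reps)])
    samp_var = ((rep - per_pv[0]) ** 2).sum() / (n_reps * (1.0 - fay) ** 2)
    # Imputation variance across plausible values (Rubin's rule).
    imp_var = per_pv.var(ddof=1)
    return estimate, float(np.sqrt(samp_var + (1 + 1 / n_pv) * imp_var))
```

This is a sketch under the stated assumptions, not the official PISA macro; the published tables apply the same logic statistic by statistic.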
Baku (Azerbaijan) 15.0 (1.5) 7.0 (1.2) 3.5 (0.9) 9.1 (1.5) 80.4 (2.4)
Belarus 27.4 (1.8) 22.2 (1.4) 9.7 (1.2) 14.3 (1.1) 53.8 (2.0)
Bosnia and Herzegovina 12.2 (1.6) 10.7 (1.2) 4.0 (0.9) 9.8 (1.3) 75.5 (1.8)
Brazil 14.5 (1.3) 13.8 (1.2) 6.4 (0.8) 6.3 (1.0) 73.5 (1.7)
Brunei Darussalam 26.3 (1.8) 21.0 (1.6) 14.7 (1.5) 11.6 (1.4) 52.7 (2.2)
B-S-J-Z (China) 39.4 (1.5) 30.5 (1.4) 31.2 (1.3) 8.8 (0.8) 29.5 (1.4)
Bulgaria 14.4 (1.7) 15.2 (1.6) 9.2 (1.1) 9.5 (1.2) 66.1 (1.8)
Costa Rica 21.1 (2.3) 15.3 (1.8) 7.1 (1.3) 9.3 (1.0) 68.3 (2.3)
Croatia 18.7 (1.3) 23.4 (1.3) 11.5 (1.2) 10.3 (1.0) 54.8 (1.8)
Cyprus 11.0 (1.2) 15.6 (1.3) 7.1 (1.1) 11.4 (1.5) 65.9 (1.7)
Dominican Republic 9.0 (2.0) 8.4 (2.1) 2.9 (1.0) 3.7 (1.0) 85.0 (2.5)
Georgia 15.4 (2.0) 11.3 (1.5) 4.9 (1.1) 8.8 (1.4) 75.0 (2.3)
Hong Kong (China) 34.3 (1.5) 36.3 (1.2) 21.8 (1.2) 8.2 (0.8) 33.7 (1.4)
Indonesia 13.8 (1.9) 13.5 (1.6) 10.6 (1.6) 12.3 (2.0) 63.5 (2.4)
Kazakhstan 20.2 (1.4) 11.9 (1.0) 7.4 (0.9) 11.1 (1.1) 69.6 (1.7)
Kosovo 6.7 (1.5) 6.2 (1.7) 2.8 (1.0) 8.7 (1.6) 82.3 (2.2)
Macao (China) 38.8 (2.1) 23.9 (1.7) 25.8 (1.6) 12.5 (1.1) 37.8 (2.0)
Malaysia 27.2 (1.9) 23.2 (1.8) 10.4 (1.5) 10.8 (1.6) 55.7 (2.3)
Malta 18.1 (1.5) 23.5 (1.9) 12.0 (1.4) 9.1 (1.3) 55.4 (2.2)
Montenegro 13.7 (1.1) 10.4 (1.0) 6.7 (0.7) 11.2 (1.1) 71.7 (1.4)
Morocco 5.9 (1.3) 5.8 (1.3) 2.6 (1.0) 5.3 (1.2) 86.2 (2.0)
Panama 16.6 (2.9) 10.8 (1.9) 2.5 (0.9) 9.6 (2.1) 77.1 (2.3)
Peru 18.4 (1.7) 7.1 (1.3) 5.9 (1.4) 15.9 (1.8) 71.0 (2.3)
Philippines 19.3 (2.8) 16.0 (2.1) 6.9 (1.3) 10.1 (1.6) 67.0 (2.9)
Qatar 22.8 (1.0) 16.5 (0.8) 12.1 (0.8) 10.6 (0.7) 60.8 (1.2)
Russia 22.1 (1.3) 28.0 (1.2) 8.1 (0.8) 8.9 (1.0) 55.1 (1.5)
Serbia 17.7 (1.6) 13.3 (1.2) 7.8 (1.0) 12.3 (1.1) 66.5 (1.8)
Singapore 38.4 (1.2) 32.1 (1.2) 31.9 (1.3) 9.5 (0.8) 26.5 (1.0)
Chinese Taipei 45.6 (1.4) 27.1 (1.2) 26.3 (1.4) 13.6 (1.0) 33.0 (1.5)
Thailand 23.9 (1.9) 14.5 (1.4) 14.1 (1.6) 13.0 (1.4) 58.4 (2.4)
United Arab Emirates 24.8 (1.2) 20.8 (1.1) 17.0 (1.2) 12.0 (1.0) 50.2 (1.4)
Uruguay 7.7 (1.2) 9.6 (1.4) 3.7 (0.8) 7.3 (1.2) 79.5 (1.7)
Overall average 22.3 (0.2) 18.8 (0.2) 11.1 (0.1) 11.2 (0.1) 59.0 (0.2)
OECD average 570 (1.1) 582 (1.5) 539 (1.4) 524 (0.7)
Baku (Azerbaijan) 465 (10.4) 505 (19.7) 457 (10.6) 434 (4.9)
Belarus 545 (6.1) 563 (10.6) 522 (7.2) 504 (4.0)
Bosnia and Herzegovina 502 (9.5) 507 (13.2) 468 (11.6) 457 (4.3)
Brazil 530 (7.8) 566 (11.1) 546 (14.2) 468 (4.0)
Brunei Darussalam 523 (8.1) 549 (8.0) 472 (11.4) 458 (5.1)
B-S-J-Z (China) 592 (4.1) 618 (4.4) 573 (7.4) 548 (3.5)
Bulgaria 536 (9.4) 532 (9.9) 512 (12.6) 483 (5.2)
Costa Rica 515 (10.2) 525 (13.1) 482 (16.5) 472 (6.5)
Croatia 558 (5.5) 560 (8.1) 543 (7.7) 507 (4.4)
Cyprus 520 (8.8) 556 (12.9) 508 (8.8) 475 (4.2)
Dominican Republic 518 (19.4) 528 (26.9) 467 (23.0) 406 (6.7)
Georgia 499 (13.4) 511 (16.3) 477 (14.6) 438 (5.4)
Hong Kong (China) 585 (3.8) 601 (4.5) 567 (9.2) 535 (5.5)
Indonesia 478 (8.6) 482 (9.9) 449 (11.0) 398 (5.1)
Kazakhstan 490 (7.5) 508 (11.5) 450 (8.8) 427 (4.1)
Kosovo 447 (15.7) 495 (22.8) 418 (13.2) 394 (4.9)
Macao (China) 585 (7.2) 599 (5.7) 536 (9.5) 538 (5.0)
Malaysia 501 (7.1) 520 (9.5) 461 (11.6) 444 (3.8)
Malta 561 (6.7) 572 (9.6) 540 (12.8) 504 (6.5)
Montenegro 514 (9.1) 540 (11.0) 490 (7.4) 470 (3.3)
Morocco 442 (16.1) 468 (27.2) 430 (22.7) 403 (4.5)
Panama 493 (16.6) 488 (26.8) 498 (18.0) 434 (10.6)
Peru 514 (19.7) 549 (14.4) 479 (10.9) 457 (6.7)
Philippines 478 (11.1) 466 (21.0) 454 (17.9) 396 (7.4)
Qatar 556 (4.9) 570 (6.9) 509 (7.6) 468 (2.9)
Russia 554 (4.4) 575 (8.8) 527 (10.0) 508 (4.2)
Serbia 545 (7.4) 568 (7.6) 496 (8.5) 482 (4.8)
Singapore 615 (3.7) 635 (3.8) 578 (6.9) 549 (5.2)
Chinese Taipei 565 (4.7) 589 (6.1) 548 (7.5) 515 (4.7)
Thailand 463 (9.1) 493 (7.7) 458 (10.9) 431 (4.7)
United Arab Emirates 569 (4.6) 571 (5.7) 511 (7.7) 477 (3.6)
Uruguay 532 (11.7) 540 (17.0) 511 (12.1) 483 (4.6)
Overall average 548 (1.0) 563 (1.4) 520 (1.3) 497 (0.6)
Austria -0.28 (0.02) 19 (0.9) 25.5 (0.7) 21.9 (0.6) 25.6 (0.6) 27.0 (0.7)
Belgium -0.44 (0.02) 19 (0.9) 17.6 (0.5) 24.6 (0.5) 29.4 (0.6) 28.4 (0.6)
Canada 0.01 (0.01) 28 (0.9) 20.5 (0.4) 31.5 (0.5) 30.3 (0.5) 17.6 (0.4)
Chile 0.04 (0.02) 20 (1.2) 16.6 (0.6) 22.2 (0.5) 38.1 (0.7) 23.1 (0.6)
Colombia 0.38 (0.01) 24 (1.4) 21.3 (0.6) 36.9 (0.6) 33.2 (0.8) 8.6 (0.4)
Czech Republic -0.05 (0.02) 18 (1.1) 21.5 (0.6) 29.2 (0.6) 28.5 (0.6) 20.8 (0.6)
Denmark -0.37 (0.02) 20 (1.4) 13.7 (0.5) 29.2 (0.7) 34.0 (0.7) 23.1 (0.6)
Estonia 0.00 (0.01) 26 (1.9) 21.7 (0.7) 34.9 (0.7) 29.7 (0.8) 13.7 (0.6)
Finland -0.25 (0.02) 29 (1.2) 17.3 (0.6) 32.1 (0.7) 30.4 (0.7) 20.3 (0.6)
France -0.12 (0.02) 24 (1.1) 27.1 (0.7) 29.9 (0.6) 22.5 (0.6) 20.6 (0.6)
Germany -0.29 (0.02) 19 (1.2) 25.4 (0.7) 24.2 (0.8) 26.4 (0.7) 23.9 (0.8)
Greece 0.11 (0.01) 26 (1.6) 18.6 (0.6) 33.6 (0.8) 36.8 (0.7) 11.0 (0.5)
Hungary 0.03 (0.02) 20 (1.2) 26.8 (0.8) 32.7 (0.7) 27.4 (0.8) 13.1 (0.5)
Iceland -0.22 (0.02) 28 (1.8) 14.5 (0.6) 29.0 (0.8) 34.6 (0.9) 22.0 (0.7)
Ireland -0.07 (0.02) 31 (1.1) 18.3 (0.6) 30.2 (0.7) 32.4 (0.8) 19.1 (0.6)
Israel 0.09 (0.02) 13 (1.7) 26.8 (0.7) 28.9 (0.6) 29.2 (0.8) 15.1 (0.5)
Italy 0.09 (0.02) 15 (1.1) 28.5 (0.8) 33.1 (0.9) 23.6 (0.6) 14.8 (0.5)
Japan 0.30 (0.02) 26 (0.9) 28.6 (0.6) 32.1 (0.6) 21.0 (0.5) 18.3 (0.5)
Korea 0.23 (0.01) 26 (1.7) 14.2 (0.4) 31.8 (0.7) 44.3 (0.8) 9.7 (0.4)
Latvia 0.02 (0.01) 25 (1.5) 20.7 (0.6) 34.7 (0.7) 29.8 (0.6) 14.7 (0.5)
Lithuania -0.11 (0.02) 11 (1.1) 20.0 (0.5) 20.5 (0.6) 35.5 (0.6) 24.0 (0.7)
Luxembourg -0.24 (0.01) 20 (1.3) 23.0 (0.6) 23.6 (0.6) 26.5 (0.6) 26.9 (0.6)
Mexico 0.35 (0.01) 22 (1.4) 19.7 (0.6) 33.6 (0.7) 37.7 (0.8) 9.1 (0.4)
Netherlands -0.57 (0.02) 23 (1.5) 12.5 (0.6) 24.6 (0.6) 34.1 (0.9) 28.8 (0.7)
New Zealand -0.08 (0.02) 29 (1.2) 18.2 (0.5) 29.6 (0.6) 32.2 (0.5) 20.0 (0.6)
Norway -0.51 (0.02) 21 (1.2) 14.3 (0.4) 27.7 (0.6) 30.7 (0.6) 27.2 (0.6)
Poland 0.18 (0.02) 28 (1.3) 27.3 (0.8) 33.2 (0.6) 26.8 (0.7) 12.7 (0.5)
Portugal 0.08 (0.02) 23 (1.4) 29.5 (0.7) 40.0 (0.8) 21.0 (0.6) 9.6 (0.5)
Slovak Republic 0.11 (0.02) 21 (1.3) 25.2 (0.7) 30.3 (0.7) 28.6 (0.7) 15.9 (0.6)
Slovenia -0.22 (0.02) 14 (1.3) 16.2 (0.6) 27.0 (0.7) 33.8 (0.7) 23.0 (0.6)
Spain² 0.08 (0.01) 22 (0.7) 27.2 (0.4) 29.8 (0.3) 23.6 (0.3) 19.4 (0.3)
Sweden -0.31 (0.02) 20 (1.6) 15.5 (0.6) 27.6 (0.8) 32.1 (0.7) 24.8 (0.7)
Switzerland -0.32 (0.02) 21 (1.3) 22.4 (0.7) 24.2 (0.7) 26.1 (0.6) 27.2 (0.8)
Turkey 0.68 (0.02) 17 (1.4) 39.8 (0.7) 36.2 (0.6) 17.7 (0.5) 6.3 (0.4)
United Kingdom -0.21 (0.02) 26 (1.2) 16.5 (0.5) 29.3 (0.6) 32.8 (0.6) 21.4 (0.6)
United States -0.07 (0.02) 25 (1.6) 15.2 (0.6) 29.8 (0.7) 33.5 (0.6) 21.4 (0.6)
OECD average -0.06 (0.00) 22 (0.2) 21.3 (0.1) 29.7 (0.1) 30.1 (0.1) 19.0 (0.1)
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Note 1 under Table B.3.9.
3. Jordan and Morocco have reliabilities lower than 0.60 on the index of enjoyment of reading.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger (†)
means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240674
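The "change in reading performance associated with a one-unit increase in the index of enjoyment of reading" reported in Table B.4.1 is the slope from a weighted linear regression of reading score on the index, controlling for students' and schools' socio-economic profile (ESCS) and gender. A rough sketch under assumed, illustrative variable names (the official estimates additionally loop over the 10 plausible values and 80 replicate weights as described earlier):

```python
import numpy as np

def enjoyment_slope(score, enjoy, escs, female, w):
    """Weighted least squares: reading score regressed on the enjoyment
    index, controlling for ESCS and gender. Returns the coefficient on
    the enjoyment index, i.e. the score-point change associated with a
    one-unit increase in the index."""
    X = np.column_stack([np.ones_like(enjoy), enjoy, escs, female])
    sw = np.sqrt(w)  # scale rows by sqrt(weight) to get weighted LS
    beta, *_ = np.linalg.lstsq(X * sw[:, None], score * sw, rcond=None)
    return beta[1]
```

For example, with synthetic data generated as score = 480 + 25·enjoy + 10·escs + 5·female, the function recovers a slope of 25.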
Table B.4.1 [2/6] Enjoyment of reading
Based on students' reports
Columns:
1. Index of enjoyment of reading (mean index, with S.E.)
2. Change in reading performance associated with a one-unit increase in the index of enjoyment of reading, after accounting for students' and schools' socio-economic profile¹ and gender (score dif., with S.E.)
3. Percentage of students who reported "I read only if I have to": strongly disagree, disagree, agree, strongly agree (each % with S.E.)
Albania 0.69 (0.01) 17 (1.6) 28.4 (0.7) 32.7 (0.6) 28.1 (0.6) 10.8 (0.4)
Partners
Argentina 0.02 (0.01) 21 (1.2) 16.9 (0.5) 21.4 (0.5) 42.5 (0.8) 19.3 (0.5)
Baku (Azerbaijan) 0.43 (0.01) 20 (1.3) 20.8 (0.6) 26.5 (0.6) 35.0 (0.7) 17.7 (0.5)
Belarus 0.30 (0.02) 23 (1.6) 16.1 (0.7) 38.0 (0.6) 38.9 (0.9) 7.1 (0.3)
Bosnia and Herzegovina -0.03 (0.02) 16 (1.3) 19.3 (0.7) 28.9 (0.7) 35.1 (0.7) 16.8 (0.6)
Brazil 0.37 (0.01) 22 (1.3) 33.5 (0.6) 42.2 (0.6) 17.9 (0.5) 6.4 (0.3)
Brunei Darussalam 0.25 (0.01) 26 (1.1) 14.4 (0.4) 26.5 (0.5) 43.2 (0.5) 16.0 (0.5)
B-S-J-Z (China) 0.97 (0.02) 22 (1.3) 46.1 (0.9) 42.0 (0.8) 9.6 (0.5) 2.3 (0.3)
Bulgaria 0.23 (0.02) 18 (1.7) 27.1 (0.8) 26.3 (0.7) 34.5 (0.8) 12.0 (0.5)
Costa Rica 0.10 (0.02) 14 (1.1) 18.3 (0.7) 24.3 (0.6) 37.1 (0.7) 20.3 (0.6)
Croatia -0.28 (0.02) 14 (1.3) 14.2 (0.5) 28.5 (0.7) 37.6 (0.6) 19.7 (0.6)
Cyprus -0.21 (0.01) 19 (1.2) 15.9 (0.5) 29.9 (0.6) 38.6 (0.8) 15.6 (0.5)
Dominican Republic 0.40 (0.01) 16 (1.6) 24.2 (0.6) 26.9 (0.7) 33.7 (0.7) 15.1 (0.7)
Georgia 0.43 (0.01) 28 (1.8) 25.9 (0.7) 36.0 (0.8) 31.3 (0.8) 6.8 (0.4)
Hong Kong (China) 0.29 (0.01) 26 (1.4) 16.2 (0.5) 37.7 (0.7) 34.9 (0.7) 11.2 (0.5)
Indonesia 0.50 (0.01) 16 (2.1) 19.3 (0.8) 48.9 (0.9) 27.7 (1.0) 4.1 (0.3)
Jordan 0.47 (0.01) 21 (1.5) 42.4 (0.7) 31.5 (0.7) 20.2 (0.6) 5.9 (0.3)
Kazakhstan 0.53 (0.01) 10 (1.3) 30.0 (0.6) 51.4 (0.5) 15.1 (0.4) 3.4 (0.1)
Kosovo 0.61 (0.01) 18 (1.6) 25.2 (0.8) 32.8 (0.8) 33.2 (0.8) 8.8 (0.4)
Lebanon m m m m m m m m m m m m
Macao (China) 0.26 (0.01) 31 (1.8) 15.4 (0.6) 38.0 (0.8) 35.2 (0.8) 11.5 (0.5)
Malaysia 0.40 (0.01) 24 (1.7) 21.5 (0.7) 46.7 (0.8) 26.7 (0.8) 5.1 (0.3)
Malta 0.04 (0.02) 30 (1.5) 23.6 (0.7) 30.7 (0.8) 30.8 (0.9) 14.9 (0.7)
Moldova 0.24 (0.01) 19 (1.6) 19.7 (0.6) 39.6 (0.8) 34.3 (0.8) 6.3 (0.4)
Montenegro 0.15 (0.01) 17 (1.1) 25.5 (0.6) 31.6 (0.6) 29.2 (0.6) 13.7 (0.4)
Morocco³ 0.38 (0.01) 9 (1.4) 29.1 (0.6) 40.4 (0.7) 24.1 (0.7) 6.3 (0.4)
North Macedonia m m m m m m m m m m m m
Panama 0.32 (0.02) 17 (1.3) 19.7 (0.7) 27.1 (0.7) 38.1 (0.8) 15.1 (0.5)
Peru 0.46 (0.01) 19 (1.5) 17.2 (0.6) 38.2 (0.8) 35.8 (0.8) 8.8 (0.4)
Philippines 0.53 (0.01) 27 (1.4) 9.7 (0.5) 27.3 (0.5) 53.9 (0.7) 9.2 (0.4)
Qatar 0.27 (0.01) 26 (0.9) 23.5 (0.4) 29.8 (0.4) 34.5 (0.5) 12.2 (0.3)
Romania 0.10 (0.03) 15 (1.5) 17.0 (0.7) 29.9 (1.0) 41.7 (1.2) 11.4 (0.6)
Russia 0.33 (0.01) 27 (1.3) 24.5 (0.7) 42.2 (0.6) 26.4 (0.7) 6.9 (0.4)
Saudi Arabia 0.33 (0.01) 18 (1.4) 24.8 (0.7) 27.5 (0.7) 39.3 (0.7) 8.3 (0.4)
Serbia -0.02 (0.02) 15 (1.5) 19.9 (0.6) 29.3 (0.6) 33.6 (0.8) 17.3 (0.5)
Singapore 0.19 (0.01) 26 (1.2) 20.4 (0.5) 33.9 (0.6) 31.3 (0.6) 14.4 (0.4)
Chinese Taipei 0.34 (0.02) 31 (1.4) 15.9 (0.6) 34.4 (0.6) 38.3 (0.7) 11.4 (0.5)
Thailand 0.27 (0.01) 19 (1.6) 10.2 (0.5) 34.7 (0.7) 47.1 (0.7) 8.1 (0.4)
Ukraine 0.28 (0.01) 24 (1.3) 26.1 (0.7) 47.3 (0.7) 20.9 (0.6) 5.8 (0.3)
United Arab Emirates 0.38 (0.01) 27 (1.1) 24.2 (0.5) 27.6 (0.3) 34.6 (0.5) 13.7 (0.4)
Uruguay 0.24 (0.02) 25 (1.4) 30.2 (0.8) 33.3 (0.7) 23.8 (0.7) 12.8 (0.5)
Viet Nam 0.49 (0.02) m m 22.0 (0.8) 52.5 (0.9) 23.2 (0.9) 2.4 (0.2)
Percentage of students who reported, for each of the statements "Reading is one of my favourite hobbies" and "I like talking about books with other people": strongly disagree, disagree, agree, strongly agree (each % with S.E.)
Australia 31.6 (0.5) 35.2 (0.5) 21.7 (0.4) 11.5 (0.3) 32.3 (0.5) 33.6 (0.5) 25.1 (0.5) 9.0 (0.3)
OECD
Austria 46.7 (0.8) 24.9 (0.6) 17.3 (0.4) 11.1 (0.5) 48.3 (0.8) 25.4 (0.7) 15.9 (0.5) 10.4 (0.5)
Belgium 47.8 (0.8) 30.6 (0.6) 14.8 (0.4) 6.8 (0.3) 44.9 (0.7) 29.6 (0.6) 20.6 (0.5) 5.0 (0.3)
Canada 26.9 (0.5) 36.5 (0.4) 24.0 (0.4) 12.6 (0.4) 27.8 (0.5) 33.0 (0.5) 29.4 (0.5) 9.8 (0.3)
Chile 27.5 (0.6) 34.5 (0.6) 24.1 (0.6) 13.9 (0.5) 26.6 (0.7) 31.0 (0.6) 28.9 (0.6) 13.5 (0.6)
Colombia 9.7 (0.4) 34.5 (0.6) 40.6 (0.7) 15.1 (0.6) 10.8 (0.5) 36.6 (0.8) 39.9 (0.8) 12.6 (0.4)
Czech Republic 28.9 (0.8) 34.3 (0.7) 24.1 (0.7) 12.7 (0.5) 28.3 (0.8) 32.8 (0.7) 28.8 (0.7) 10.0 (0.5)
Denmark 50.5 (0.7) 32.8 (0.7) 11.6 (0.5) 5.1 (0.3) 37.9 (0.8) 33.0 (0.7) 22.2 (0.7) 6.9 (0.4)
Estonia 28.3 (0.7) 40.3 (0.8) 23.0 (0.6) 8.4 (0.4) 25.4 (0.6) 35.0 (0.8) 31.6 (0.7) 7.9 (0.4)
Finland 37.6 (0.8) 37.0 (0.7) 18.1 (0.5) 7.3 (0.4) 35.3 (0.8) 34.0 (0.8) 23.1 (0.6) 7.5 (0.4)
France 40.1 (0.8) 29.3 (0.6) 20.1 (0.6) 10.5 (0.4) 38.3 (0.9) 26.3 (0.7) 26.6 (0.8) 8.8 (0.4)
Germany 46.8 (0.9) 26.7 (0.7) 16.0 (0.7) 10.5 (0.5) 47.9 (1.0) 28.1 (0.7) 14.8 (0.7) 9.1 (0.4)
Greece 21.6 (0.6) 42.8 (0.8) 27.3 (0.7) 8.4 (0.4) 25.8 (0.6) 36.1 (0.7) 28.1 (0.6) 10.0 (0.4)
Hungary 26.6 (0.8) 38.5 (0.7) 23.5 (0.7) 11.3 (0.5) 30.6 (0.8) 34.3 (0.7) 24.8 (0.7) 10.3 (0.5)
Iceland 37.9 (0.7) 38.4 (0.7) 16.5 (0.6) 7.3 (0.5) 35.7 (0.8) 29.1 (0.7) 26.1 (0.7) 9.1 (0.5)
Ireland 29.2 (0.7) 40.0 (0.6) 20.0 (0.5) 10.8 (0.5) 28.7 (0.7) 37.9 (0.7) 24.9 (0.7) 8.5 (0.4)
Israel 26.5 (0.7) 31.7 (0.7) 26.6 (0.8) 15.2 (0.5) 30.8 (0.7) 29.4 (0.6) 28.3 (0.7) 11.6 (0.5)
Italy 29.2 (0.7) 31.6 (0.6) 25.9 (0.6) 13.3 (0.5) 28.1 (0.7) 31.2 (0.8) 30.0 (0.7) 10.8 (0.5)
Japan 23.8 (0.7) 31.0 (0.6) 26.2 (0.6) 19.1 (0.6) 23.5 (0.7) 33.3 (0.7) 29.5 (0.6) 13.7 (0.5)
Korea 16.6 (0.6) 41.2 (0.7) 30.4 (0.6) 11.7 (0.4) 16.4 (0.6) 38.4 (0.7) 33.0 (0.7) 12.3 (0.4)
Latvia 24.1 (0.6) 42.6 (0.7) 23.4 (0.6) 9.9 (0.4) 21.0 (0.6) 39.0 (0.7) 30.9 (0.6) 9.1 (0.4)
Lithuania 36.7 (0.8) 29.1 (0.7) 23.8 (0.7) 10.4 (0.4) 32.3 (0.7) 27.0 (0.6) 27.3 (0.7) 13.5 (0.4)
Luxembourg 42.0 (0.7) 30.6 (0.7) 17.3 (0.5) 10.1 (0.4) 42.0 (0.6) 29.5 (0.7) 20.0 (0.5) 8.4 (0.4)
Mexico 10.4 (0.4) 33.0 (0.7) 41.0 (0.8) 15.6 (0.5) 12.2 (0.5) 36.3 (0.7) 38.2 (0.7) 13.4 (0.5)
Netherlands 52.1 (0.9) 29.7 (0.7) 12.8 (0.5) 5.4 (0.4) 51.6 (0.9) 29.5 (0.8) 15.2 (0.5) 3.8 (0.4)
New Zealand 30.0 (0.8) 36.2 (0.7) 21.8 (0.6) 11.9 (0.5) 31.4 (0.7) 35.0 (0.8) 24.4 (0.6) 9.2 (0.4)
Norway 49.8 (0.7) 29.9 (0.7) 13.2 (0.5) 7.1 (0.4) 48.3 (0.8) 28.1 (0.6) 17.6 (0.6) 6.0 (0.3)
Poland 22.4 (0.6) 37.0 (0.6) 26.7 (0.6) 13.8 (0.6) 19.2 (0.7) 36.1 (0.7) 33.4 (0.7) 11.3 (0.5)
Portugal 28.8 (0.7) 39.2 (0.8) 22.9 (0.6) 9.2 (0.5) 24.9 (0.7) 37.1 (0.7) 30.1 (0.7) 8.0 (0.5)
Slovak Republic 21.6 (0.7) 37.7 (0.7) 26.6 (0.7) 14.1 (0.5) 23.2 (0.7) 37.8 (0.7) 28.6 (0.7) 10.4 (0.5)
Slovenia 36.9 (0.7) 37.3 (0.8) 16.9 (0.6) 8.9 (0.5) 31.6 (0.7) 34.9 (0.7) 24.9 (0.7) 8.6 (0.4)
Spain2 31.2 (0.4) 33.0 (0.5) 22.3 (0.3) 13.5 (0.3) 29.5 (0.4) 29.3 (0.4) 28.5 (0.3) 12.7 (0.3)
Sweden 40.0 (0.9) 35.6 (0.7) 16.9 (0.6) 7.6 (0.5) 34.7 (0.8) 33.1 (0.7) 23.9 (0.7) 8.3 (0.4)
Switzerland 47.4 (1.0) 26.7 (0.8) 16.4 (0.5) 9.5 (0.5) 46.2 (0.9) 27.1 (0.7) 18.2 (0.7) 8.5 (0.5)
Turkey 10.6 (0.4) 25.1 (0.6) 40.3 (0.6) 23.9 (0.7) 10.9 (0.5) 26.0 (0.6) 39.7 (0.6) 23.4 (0.6)
United Kingdom 35.2 (0.9) 37.3 (0.8) 18.8 (0.5) 8.6 (0.3) 34.0 (0.8) 35.4 (0.6) 23.1 (0.6) 7.4 (0.4)
United States 28.9 (0.7) 38.1 (0.8) 22.3 (0.6) 10.7 (0.5) 24.6 (0.7) 35.4 (0.8) 30.4 (0.8) 9.7 (0.5)
OECD average 31.9 (0.1) 34.3 (0.1) 22.6 (0.1) 11.2 (0.1) 30.8 (0.1) 32.6 (0.1) 26.6 (0.1) 10.0 (0.1)
Reading is one of my favourite hobbies I like talking about books with other people
Strongly disagree Disagree Agree Strongly agree Strongly disagree Disagree Agree Strongly agree
% (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.)
Partners
Albania 4.5 (0.3) 18.7 (0.5) 50.6 (0.7) 26.2 (0.7) 5.5 (0.3) 18.1 (0.6) 53.5 (0.6) 22.9 (0.7)
Argentina 23.5 (0.5) 36.5 (0.7) 26.2 (0.6) 13.8 (0.4) 26.7 (0.7) 35.5 (0.8) 26.4 (0.6) 11.4 (0.5)
Baku (Azerbaijan) 10.7 (0.4) 20.5 (0.6) 45.5 (0.7) 23.3 (0.5) 12.3 (0.5) 24.6 (0.7) 41.6 (0.9) 21.6 (0.5)
Belarus 7.6 (0.4) 45.5 (0.9) 35.7 (0.7) 11.3 (0.5) 8.1 (0.4) 36.7 (0.9) 42.5 (0.7) 12.7 (0.6)
Bosnia and Herzegovina 26.1 (0.7) 39.5 (0.8) 25.1 (0.6) 9.3 (0.5) 24.9 (0.8) 33.7 (0.7) 31.6 (0.7) 9.8 (0.5)
Brazil 14.5 (0.4) 38.5 (0.6) 33.1 (0.6) 13.9 (0.4) 14.7 (0.4) 36.9 (0.5) 35.6 (0.5) 12.8 (0.4)
Brunei Darussalam 10.3 (0.4) 33.0 (0.6) 40.0 (0.6) 16.7 (0.5) 12.7 (0.4) 41.2 (0.6) 34.1 (0.6) 12.0 (0.4)
B-S-J-Z (China) 2.3 (0.2) 14.5 (0.6) 52.8 (0.9) 30.4 (0.7) 3.0 (0.2) 21.3 (0.5) 53.0 (0.8) 22.7 (0.8)
Bulgaria 16.4 (0.6) 35.4 (0.8) 32.8 (0.8) 15.5 (0.6) 20.1 (0.7) 32.3 (0.7) 33.6 (0.8) 14.0 (0.6)
Costa Rica 23.7 (0.6) 34.3 (0.6) 25.3 (0.5) 16.7 (0.6) 24.6 (0.7) 36.4 (0.6) 26.1 (0.5) 12.9 (0.5)
Croatia 36.7 (0.8) 40.8 (0.6) 16.0 (0.5) 6.6 (0.3) 35.1 (0.7) 35.8 (0.6) 22.0 (0.6) 7.1 (0.4)
Cyprus 38.3 (0.7) 38.4 (0.7) 17.2 (0.6) 6.1 (0.3) 32.9 (0.6) 32.0 (0.7) 26.4 (0.6) 8.6 (0.4)
Dominican Republic 12.0 (0.5) 26.1 (0.7) 40.9 (0.9) 21.0 (0.7) 13.1 (0.5) 28.3 (0.7) 41.3 (0.8) 17.4 (0.6)
Georgia 9.3 (0.5) 29.7 (0.7) 43.2 (0.7) 17.7 (0.6) 9.3 (0.5) 26.5 (0.6) 45.4 (0.6) 18.9 (0.6)
Hong Kong (China) 9.9 (0.5) 30.8 (0.6) 43.5 (0.8) 15.8 (0.5) 12.0 (0.5) 39.3 (0.6) 39.7 (0.6) 8.9 (0.4)
Indonesia 3.9 (0.3) 22.4 (0.8) 57.6 (0.8) 16.1 (0.8) 4.2 (0.4) 26.1 (0.9) 59.9 (0.9) 9.9 (0.5)
Jordan 11.9 (0.4) 21.7 (0.7) 44.5 (0.9) 21.8 (0.7) 14.2 (0.5) 27.1 (0.8) 41.5 (0.8) 17.3 (0.6)
Kazakhstan 5.7 (0.3) 19.9 (0.4) 57.4 (0.5) 17.0 (0.4) 7.4 (0.3) 30.7 (0.5) 49.6 (0.6) 12.2 (0.3)
Kosovo 7.0 (0.4) 18.2 (0.6) 53.7 (0.8) 21.2 (0.7) 7.2 (0.4) 19.6 (0.7) 53.0 (0.9) 20.2 (0.8)
Lebanon m m m m m m m m m m m m m m m m
Macao (China) 9.1 (0.5) 34.1 (0.8) 41.6 (0.7) 15.2 (0.5) 11.0 (0.5) 43.1 (0.8) 38.4 (0.8) 7.5 (0.4)
Malaysia 6.5 (0.3) 31.0 (0.7) 47.4 (0.8) 15.1 (0.6) 7.8 (0.3) 39.6 (0.9) 42.4 (0.9) 10.2 (0.5)
Malta 25.4 (0.8) 35.9 (0.8) 25.4 (0.8) 13.3 (0.6) 27.4 (0.8) 34.4 (0.8) 28.9 (0.8) 9.3 (0.5)
Moldova 11.2 (0.5) 41.8 (0.9) 35.6 (0.8) 11.4 (0.4) 11.6 (0.6) 37.6 (0.6) 40.0 (0.7) 10.8 (0.5)
Montenegro 22.4 (0.5) 37.9 (0.6) 28.7 (0.6) 11.0 (0.3) 21.2 (0.5) 31.5 (0.6) 35.0 (0.6) 12.2 (0.4)
Morocco3 14.5 (0.5) 24.5 (0.6) 43.3 (0.7) 17.7 (0.6) 15.8 (0.5) 27.6 (0.7) 43.5 (0.8) 13.2 (0.5)
North Macedonia m m m m m m m m m m m m m m m m
Panama 13.7 (0.6) 29.0 (0.6) 35.9 (0.8) 21.3 (0.6) 16.5 (0.8) 32.3 (0.8) 34.2 (0.9) 17.0 (0.7)
Peru 6.3 (0.4) 30.7 (0.7) 46.2 (0.7) 16.7 (0.7) 6.6 (0.4) 33.3 (0.8) 45.8 (0.6) 14.3 (0.5)
Philippines 4.8 (0.3) 22.3 (0.5) 51.6 (0.6) 21.3 (0.5) 5.9 (0.4) 26.8 (0.7) 50.8 (0.7) 16.5 (0.5)
Qatar 15.8 (0.3) 29.8 (0.4) 35.4 (0.4) 19.0 (0.4) 17.8 (0.3) 30.9 (0.4) 36.5 (0.4) 14.7 (0.3)
Romania 18.6 (0.7) 38.1 (0.9) 31.2 (0.8) 12.1 (0.7) 18.5 (0.8) 37.2 (0.9) 32.7 (1.0) 11.6 (0.6)
Russia 10.5 (0.5) 39.3 (0.7) 36.8 (0.8) 13.5 (0.4) 11.2 (0.5) 35.6 (0.6) 40.5 (0.8) 12.7 (0.5)
Saudi Arabia 14.4 (0.4) 25.6 (0.7) 39.3 (0.6) 20.7 (0.7) 18.0 (0.6) 28.0 (0.7) 36.7 (0.6) 17.3 (0.6)
Serbia 25.1 (0.7) 40.7 (0.7) 24.4 (0.7) 9.8 (0.4) 24.5 (0.7) 33.6 (0.7) 31.3 (0.7) 10.6 (0.4)
Singapore 16.1 (0.5) 35.0 (0.5) 33.0 (0.6) 15.9 (0.4) 20.6 (0.5) 38.2 (0.6) 29.2 (0.6) 11.9 (0.4)
Chinese Taipei 7.9 (0.4) 31.9 (0.6) 42.4 (0.7) 17.8 (0.6) 9.4 (0.4) 37.3 (0.7) 41.5 (0.6) 11.8 (0.5)
Thailand 4.8 (0.3) 28.6 (0.7) 55.0 (0.7) 11.6 (0.5) 7.3 (0.4) 38.2 (0.7) 48.1 (0.8) 6.3 (0.4)
Ukraine 13.0 (0.5) 43.1 (0.7) 33.2 (0.8) 10.7 (0.5) 12.8 (0.5) 32.2 (0.8) 42.6 (0.8) 12.4 (0.5)
United Arab Emirates 13.0 (0.3) 26.2 (0.4) 38.0 (0.5) 22.8 (0.4) 14.4 (0.3) 25.6 (0.6) 39.8 (0.6) 20.2 (0.4)
Uruguay 22.2 (0.7) 33.1 (0.7) 28.5 (0.7) 16.2 (0.7) 24.1 (0.6) 36.2 (0.7) 26.8 (0.7) 12.9 (0.6)
Viet Nam 3.2 (0.3) 24.3 (0.9) 58.9 (0.9) 13.5 (0.6) 6.3 (0.4) 43.3 (0.9) 41.2 (0.9) 9.1 (0.5)
For me, reading is a waste of time I read only to get information that I need
Strongly disagree Disagree Agree Strongly agree Strongly disagree Disagree Agree Strongly agree
% (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.)
OECD
Australia 30.3 (0.5) 38.2 (0.5) 19.1 (0.4) 12.4 (0.3) 16.5 (0.3) 30.3 (0.4) 35.0 (0.5) 18.1 (0.4)
Austria 38.9 (0.8) 26.0 (0.6) 20.0 (0.5) 15.1 (0.5) 23.8 (0.7) 22.7 (0.6) 28.3 (0.6) 25.2 (0.7)
Belgium 23.7 (0.6) 34.2 (0.6) 21.9 (0.6) 20.2 (0.6) 17.8 (0.4) 27.3 (0.6) 38.4 (0.6) 16.5 (0.5)
Canada 32.9 (0.5) 40.7 (0.4) 16.5 (0.4) 10.0 (0.3) 19.0 (0.4) 34.0 (0.5) 31.4 (0.5) 15.6 (0.4)
Chile 40.3 (0.9) 39.9 (0.7) 14.0 (0.5) 5.8 (0.4) 16.5 (0.6) 25.1 (0.5) 39.1 (0.7) 19.3 (0.5)
Colombia 40.0 (0.8) 47.6 (0.7) 9.5 (0.5) 3.0 (0.2) 17.5 (0.6) 36.4 (0.8) 36.2 (0.8) 9.9 (0.4)
Czech Republic 27.2 (0.7) 34.5 (0.6) 21.2 (0.6) 17.2 (0.6) 19.9 (0.5) 34.1 (0.8) 35.3 (0.7) 10.8 (0.4)
Denmark 26.5 (0.6) 42.5 (0.7) 18.4 (0.5) 12.6 (0.5) 13.9 (0.5) 31.9 (0.8) 36.5 (0.7) 17.8 (0.6)
Estonia 31.8 (0.7) 41.2 (0.7) 17.0 (0.6) 10.0 (0.5) 16.0 (0.5) 37.5 (0.7) 36.8 (0.7) 9.7 (0.5)
Finland 25.7 (0.7) 36.7 (0.6) 21.7 (0.6) 16.0 (0.6) 15.0 (0.5) 33.1 (0.7) 36.7 (0.7) 15.2 (0.6)
France 32.4 (0.8) 36.0 (0.6) 17.7 (0.5) 14.0 (0.5) 23.0 (0.7) 29.0 (0.7) 31.9 (0.7) 16.1 (0.6)
Germany 37.4 (0.9) 28.3 (0.7) 19.6 (0.6) 14.7 (0.6) 22.2 (0.7) 23.3 (0.7) 30.3 (0.7) 24.1 (0.7)
Greece 40.1 (0.8) 43.9 (0.6) 10.6 (0.6) 5.4 (0.3) 21.0 (0.7) 41.8 (0.8) 27.5 (0.7) 9.7 (0.4)
Hungary 33.5 (0.9) 36.6 (0.8) 20.2 (0.8) 9.8 (0.5) 19.8 (0.6) 31.5 (0.7) 34.7 (0.8) 14.0 (0.6)
Iceland 27.3 (0.8) 42.0 (0.9) 19.1 (0.8) 11.5 (0.5) 16.4 (0.7) 32.6 (0.8) 34.7 (0.8) 16.3 (0.7)
Ireland 32.5 (0.7) 40.7 (0.7) 17.7 (0.6) 9.1 (0.4) 15.1 (0.5) 32.9 (0.8) 37.5 (0.8) 14.5 (0.5)
Israel 37.4 (0.8) 35.6 (0.5) 17.4 (0.5) 9.6 (0.4) 22.8 (0.7) 28.2 (0.6) 33.1 (0.8) 15.9 (0.5)
Italy 36.8 (0.7) 37.1 (0.8) 16.3 (0.6) 9.7 (0.4) 18.8 (0.6) 33.6 (0.6) 34.4 (0.7) 13.2 (0.5)
Japan 46.5 (0.8) 37.9 (0.7) 9.7 (0.4) 5.9 (0.3) 30.9 (0.7) 41.1 (0.6) 20.0 (0.5) 7.9 (0.3)
Korea 39.0 (0.8) 45.5 (0.7) 12.6 (0.5) 3.0 (0.2) 23.6 (0.6) 41.8 (0.7) 28.3 (0.6) 6.3 (0.3)
Latvia 29.2 (0.6) 43.7 (0.8) 17.1 (0.6) 10.0 (0.5) 13.5 (0.5) 35.1 (0.7) 38.1 (0.7) 13.3 (0.5)
Lithuania 37.6 (0.7) 28.1 (0.7) 21.5 (0.5) 12.9 (0.5) 18.8 (0.6) 22.5 (0.6) 39.5 (0.6) 19.3 (0.6)
Luxembourg 33.3 (0.6) 33.0 (0.6) 18.6 (0.6) 15.0 (0.5) 23.1 (0.6) 27.4 (0.6) 30.3 (0.5) 19.1 (0.5)
Mexico 43.0 (0.8) 44.1 (0.8) 10.1 (0.5) 2.8 (0.2) 15.6 (0.5) 30.7 (0.8) 40.6 (0.8) 13.1 (0.5)
Netherlands 21.2 (0.6) 36.3 (0.8) 22.1 (0.6) 20.4 (0.6) 13.6 (0.5) 27.2 (0.7) 41.4 (1.0) 17.9 (0.6)
New Zealand 32.6 (0.6) 39.6 (0.6) 17.8 (0.5) 10.0 (0.5) 16.1 (0.5) 32.0 (0.6) 33.8 (0.6) 18.1 (0.5)
Norway 23.1 (0.7) 36.4 (0.8) 20.6 (0.5) 19.9 (0.6) 12.3 (0.4) 26.0 (0.6) 37.3 (0.7) 24.4 (0.6)
Poland 35.0 (0.8) 40.0 (0.7) 16.3 (0.5) 8.7 (0.4) 18.9 (0.6) 35.8 (0.7) 35.7 (0.7) 9.7 (0.5)
Portugal 35.1 (0.8) 43.3 (0.7) 14.7 (0.6) 6.9 (0.4) 17.9 (0.6) 35.7 (0.8) 34.2 (0.7) 12.3 (0.6)
Slovak Republic 32.8 (0.8) 39.3 (0.7) 18.1 (0.7) 9.8 (0.5) 19.7 (0.6) 33.4 (0.7) 35.9 (0.8) 10.9 (0.4)
Slovenia 25.9 (0.8) 37.4 (0.8) 21.5 (0.6) 15.2 (0.5) 16.5 (0.6) 30.5 (0.7) 38.2 (0.8) 14.8 (0.5)
Spain2 39.4 (0.4) 38.1 (0.4) 14.0 (0.3) 8.5 (0.2) 23.0 (0.4) 31.7 (0.4) 32.5 (0.4) 12.8 (0.2)
Sweden 24.7 (0.8) 37.5 (0.7) 21.4 (0.6) 16.3 (0.5) 14.6 (0.5) 28.3 (0.6) 37.0 (0.8) 20.0 (0.7)
Switzerland 34.5 (0.9) 30.0 (0.6) 19.5 (0.6) 16.1 (0.6) 21.7 (0.6) 25.8 (0.7) 30.5 (0.8) 22.0 (0.7)
Turkey 55.7 (0.8) 32.1 (0.6) 7.7 (0.4) 4.5 (0.3) 30.2 (0.7) 36.2 (0.6) 23.5 (0.6) 10.1 (0.4)
United Kingdom 28.8 (0.7) 40.4 (0.6) 18.9 (0.5) 11.9 (0.5) 14.9 (0.5) 28.3 (0.6) 37.9 (0.6) 18.9 (0.6)
United States 30.2 (0.7) 41.8 (0.7) 18.3 (0.6) 9.7 (0.4) 14.6 (0.6) 31.9 (0.7) 36.4 (0.7) 17.1 (0.6)
OECD average 33.6 (0.1) 38.0 (0.1) 17.3 (0.1) 11.2 (0.1) 18.8 (0.1) 31.5 (0.1) 34.3 (0.1) 15.4 (0.1)
For me, reading is a waste of time I read only to get information that I need
Strongly disagree Disagree Agree Strongly agree Strongly disagree Disagree Agree Strongly agree
% (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.) % (S.E.)
Partners
Albania 57.0 (0.7) 29.4 (0.6) 9.2 (0.4) 4.3 (0.3) 19.6 (0.6) 35.1 (0.8) 31.5 (0.7) 13.8 (0.4)
Argentina 37.4 (0.7) 41.8 (0.7) 14.0 (0.4) 6.8 (0.4) 15.1 (0.5) 20.8 (0.7) 42.4 (0.8) 21.8 (0.6)
Baku (Azerbaijan) 45.5 (0.6) 34.6 (0.7) 13.0 (0.5) 6.9 (0.3) 19.3 (0.5) 27.9 (0.7) 34.2 (0.6) 18.6 (0.5)
Belarus 36.6 (1.0) 49.5 (0.9) 11.0 (0.4) 2.9 (0.2) 10.2 (0.5) 34.8 (0.7) 42.5 (0.8) 12.5 (0.5)
Bosnia and Herzegovina 28.4 (0.7) 42.6 (0.8) 18.5 (0.5) 10.5 (0.5) 16.0 (0.6) 27.7 (0.6) 41.0 (0.6) 15.3 (0.5)
Brazil 44.3 (0.7) 45.5 (0.6) 7.0 (0.3) 3.2 (0.2) 16.1 (0.4) 32.2 (0.5) 38.1 (0.5) 13.5 (0.4)
Brunei Darussalam 43.8 (0.5) 43.3 (0.5) 10.3 (0.3) 2.5 (0.2) 10.6 (0.4) 24.4 (0.4) 40.9 (0.5) 24.1 (0.5)
B-S-J-Z (China) 63.2 (0.9) 31.5 (0.8) 3.9 (0.3) 1.4 (0.2) 26.1 (0.7) 46.6 (0.6) 22.3 (0.6) 5.1 (0.3)
Bulgaria 37.8 (0.9) 38.2 (0.9) 15.5 (0.6) 8.5 (0.5) 18.6 (0.7) 27.3 (0.7) 37.9 (0.8) 16.2 (0.6)
Costa Rica 38.8 (1.0) 42.6 (0.8) 12.6 (0.5) 5.9 (0.3) 18.4 (0.5) 25.9 (0.8) 36.8 (0.7) 18.9 (0.7)
Croatia 21.8 (0.7) 39.8 (0.6) 24.1 (0.6) 14.3 (0.5) 13.8 (0.5) 27.8 (0.5) 42.0 (0.7) 16.5 (0.5)
Cyprus 24.0 (0.6) 45.3 (0.6) 18.3 (0.5) 12.4 (0.4) 13.5 (0.5) 31.4 (0.6) 39.3 (0.7) 15.8 (0.5)
Dominican Republic 43.4 (0.8) 41.3 (0.8) 10.2 (0.5) 5.1 (0.3) 18.8 (0.6) 26.7 (0.7) 37.3 (0.7) 17.2 (0.5)
Georgia 38.8 (0.8) 43.7 (0.8) 12.2 (0.5) 5.3 (0.4) 11.7 (0.5) 30.6 (0.7) 41.3 (0.9) 16.3 (0.6)
Hong Kong (China) 35.1 (0.6) 43.7 (0.7) 15.7 (0.5) 5.5 (0.4) 15.3 (0.5) 38.1 (0.7) 35.9 (0.8) 10.7 (0.4)
Indonesia 39.8 (0.9) 46.0 (0.8) 11.6 (0.8) 2.6 (0.3) 8.9 (0.5) 35.5 (0.9) 44.8 (1.0) 10.8 (0.6)
Jordan 42.8 (0.8) 33.5 (0.6) 15.9 (0.6) 7.7 (0.4) 15.5 (0.4) 28.1 (0.7) 37.9 (0.7) 18.5 (0.5)
Kazakhstan 37.1 (0.6) 51.3 (0.6) 8.7 (0.3) 2.8 (0.1) 10.9 (0.3) 34.5 (0.5) 41.9 (0.5) 12.7 (0.3)
Kosovo 56.4 (0.8) 31.1 (0.7) 9.4 (0.5) 3.1 (0.2) 17.9 (0.6) 37.3 (0.8) 33.1 (0.8) 11.7 (0.5)
Lebanon m m m m m m m m m m m m m m m m
Macao (China) 34.5 (0.7) 47.1 (0.8) 13.9 (0.5) 4.5 (0.3) 13.2 (0.6) 36.4 (0.9) 36.4 (0.7) 13.9 (0.6)
Malaysia 51.1 (0.9) 39.1 (0.8) 7.7 (0.4) 2.1 (0.3) 6.8 (0.4) 26.2 (0.8) 45.7 (0.8) 21.4 (0.6)
Malta 35.5 (0.8) 37.9 (0.7) 16.4 (0.6) 10.2 (0.6) 17.7 (0.7) 32.9 (0.8) 33.0 (0.8) 16.4 (0.7)
Moldova 31.8 (0.7) 47.0 (0.9) 17.5 (0.7) 3.6 (0.3) 11.1 (0.5) 29.6 (0.8) 44.6 (0.8) 14.7 (0.6)
Montenegro 39.1 (0.6) 39.3 (0.6) 12.9 (0.4) 8.7 (0.3) 18.1 (0.5) 32.1 (0.6) 35.4 (0.6) 14.4 (0.5)
Morocco3 37.8 (0.7) 49.1 (0.7) 9.5 (0.4) 3.6 (0.3) 13.9 (0.5) 30.8 (0.7) 40.0 (0.7) 15.3 (0.5)
North Macedonia m m m m m m m m m m m m m m m m
Panama 40.1 (0.8) 44.5 (0.8) 11.1 (0.6) 4.3 (0.3) 16.9 (0.6) 26.6 (0.8) 37.7 (0.8) 18.7 (0.6)
Peru 42.0 (0.8) 49.3 (0.8) 6.7 (0.4) 1.9 (0.2) 15.1 (0.6) 38.2 (0.6) 37.7 (0.7) 9.0 (0.4)
Philippines 35.1 (0.8) 40.6 (0.7) 20.2 (0.7) 4.2 (0.3) 9.0 (0.4) 28.6 (0.6) 48.7 (0.8) 13.7 (0.5)
Qatar 36.3 (0.5) 36.7 (0.4) 19.1 (0.3) 7.9 (0.2) 17.3 (0.4) 30.6 (0.5) 35.1 (0.5) 17.0 (0.3)
Romania 32.6 (1.1) 39.9 (0.8) 20.0 (0.8) 7.5 (0.5) 12.4 (0.6) 26.7 (1.0) 42.3 (1.0) 18.7 (0.7)
Russia 36.2 (0.9) 45.7 (0.8) 13.2 (0.5) 4.9 (0.3) 12.2 (0.4) 35.2 (0.7) 38.3 (0.7) 14.3 (0.4)
Saudi Arabia 45.8 (0.8) 31.1 (0.7) 16.4 (0.5) 6.7 (0.4) 16.2 (0.6) 24.4 (0.6) 39.0 (0.7) 20.4 (0.6)
Serbia 30.1 (0.6) 40.3 (0.7) 18.9 (0.6) 10.7 (0.4) 14.5 (0.5) 28.5 (0.6) 39.8 (0.6) 17.2 (0.5)
Singapore 37.0 (0.6) 41.0 (0.5) 16.1 (0.4) 5.9 (0.3) 17.8 (0.5) 34.3 (0.6) 29.3 (0.6) 18.5 (0.5)
Chinese Taipei 33.6 (0.8) 43.8 (0.8) 17.4 (0.6) 5.2 (0.3) 16.8 (0.7) 35.0 (0.7) 36.9 (0.6) 11.3 (0.5)
Thailand 26.6 (0.8) 48.9 (0.8) 21.0 (0.8) 3.5 (0.3) 9.4 (0.4) 32.4 (0.7) 46.6 (0.7) 11.7 (0.4)
Ukraine 35.1 (0.8) 49.0 (0.6) 11.5 (0.5) 4.5 (0.3) 11.1 (0.4) 29.4 (0.8) 42.5 (0.7) 17.1 (0.5)
United Arab Emirates 41.2 (0.5) 33.6 (0.4) 17.0 (0.4) 8.2 (0.3) 17.9 (0.3) 29.8 (0.4) 33.6 (0.5) 18.8 (0.4)
Uruguay 44.8 (0.9) 39.7 (0.7) 10.1 (0.4) 5.4 (0.4) 22.9 (0.8) 29.8 (0.7) 30.3 (0.8) 17.0 (0.6)
Viet Nam 42.3 (1.2) 50.0 (1.0) 5.8 (0.5) 1.8 (0.2) 12.6 (0.6) 41.2 (0.9) 37.9 (0.9) 8.3 (0.5)
Table B.4.16 [1/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Average time of reading for enjoyment (in hours)1, by the format of reading
OECD average 0.9 (0.01) 5.2 (0.02) 4.3 (0.04) 6.6 (0.04)
1. Students responded using the categories "no time, I do not read for enjoyment", "30 minutes or less a day", "more than 30 minutes to less than 60 minutes a day", "1 to 2 hours a day" and "more than 2 hours a day". The responses were converted to the average number of minutes in each interval (0, 15.5, 45.5, 90.5, 180.5), multiplied by 7 and divided by 60 to reflect the total number of hours a week spent reading for enjoyment.
2. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
3. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240674
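The minutes-to-hours conversion described in Note 1 can be sketched as follows; the category labels and midpoint minutes come from the note, while the function and dictionary names are illustrative, not from the PISA codebase.

```python
# Midpoint minutes per day for each response category (from Note 1 of Table B.4.16).
MINUTES_PER_CATEGORY = {
    "no time, I do not read for enjoyment": 0,
    "30 minutes or less a day": 15.5,
    "more than 30 minutes to less than 60 minutes a day": 45.5,
    "1 to 2 hours a day": 90.5,
    "more than 2 hours a day": 180.5,
}

def hours_per_week(category: str) -> float:
    """Convert a daily-reading response category to hours of reading per week:
    minutes per day, multiplied by 7 days, divided by 60 minutes."""
    return MINUTES_PER_CATEGORY[category] * 7 / 60

print(round(hours_per_week("1 to 2 hours a day"), 2))  # 90.5 * 7 / 60 = 10.56
```

For example, a student answering "1 to 2 hours a day" is assigned 90.5 minutes per day, i.e. roughly 10.6 hours per week.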
Table B.4.16 [2/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Average time of reading for enjoyment (in hours)1, by the format of reading
Table B.4.16 [3/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Difference between students who read books in the following way and those who "rarely or never read books"
Before accounting for students' and school's socio-economic profile2 and gender
After accounting for students' and school's socio-economic profile and gender
Austria 4.3 (0.16) 3.6 (0.23) 5.3 (0.25) 3.9 (0.17) 3.4 (0.23) 4.9 (0.24)
Belgium 3.9 (0.11) 4.0 (0.29) 6.0 (0.30) 3.7 (0.11) 3.8 (0.28) 5.8 (0.29)
Canada 4.2 (0.10) 4.1 (0.17) 5.8 (0.15) 3.9 (0.10) 4.0 (0.16) 5.5 (0.15)
Chile 4.0 (0.16) 4.9 (0.19) 6.5 (0.22) 3.6 (0.16) 4.5 (0.19) 6.0 (0.21)
Colombia 3.7 (0.15) 3.3 (0.17) 4.5 (0.23) 3.5 (0.15) 3.0 (0.17) 4.1 (0.22)
Czech Republic 4.8 (0.14) 2.9 (0.24) 6.1 (0.23) 4.1 (0.17) 2.5 (0.25) 5.5 (0.24)
Denmark 2.8 (0.12) 2.2 (0.16) 3.4 (0.22) 2.6 (0.12) 2.1 (0.16) 3.2 (0.22)
Estonia 4.7 (0.16) 3.9 (0.29) 5.5 (0.26) 4.0 (0.17) 3.6 (0.28) 4.9 (0.26)
Finland 4.1 (0.14) 2.6 (0.21) 6.3 (0.33) 3.7 (0.15) 2.6 (0.21) 6.0 (0.33)
France 4.5 (0.15) 3.9 (0.25) 6.6 (0.29) 4.3 (0.15) 3.7 (0.25) 6.3 (0.30)
Germany 4.5 (0.16) 3.0 (0.30) 5.2 (0.30) 4.2 (0.17) 2.9 (0.29) 4.9 (0.30)
Greece 4.4 (0.22) 3.0 (0.21) 5.1 (0.24) 3.8 (0.23) 2.8 (0.21) 4.5 (0.25)
Hungary 6.3 (0.20) 3.7 (0.21) 6.6 (0.28) 5.4 (0.21) 3.3 (0.21) 5.9 (0.27)
Iceland 2.7 (0.15) 2.6 (0.26) 4.4 (0.31) 2.6 (0.15) 2.6 (0.26) 4.3 (0.31)
Ireland 4.3 (0.14) 3.5 (0.21) 5.9 (0.27) 4.2 (0.15) 3.4 (0.20) 5.8 (0.26)
Israel 5.4 (0.17) 3.1 (0.27) 6.2 (0.32) 5.2 (0.17) 3.0 (0.27) 6.0 (0.33)
Italy 5.5 (0.14) 3.8 (0.27) 7.9 (0.36) 4.8 (0.15) 3.5 (0.26) 7.2 (0.36)
Japan 4.3 (0.12) 2.5 (0.23) 6.6 (0.25) 4.4 (0.12) 2.4 (0.23) 6.6 (0.25)
Korea 4.5 (0.13) 3.3 (0.14) 6.6 (0.28) 4.4 (0.12) 3.2 (0.14) 6.4 (0.28)
Latvia 5.6 (0.21) 4.0 (0.29) 6.3 (0.28) 4.8 (0.24) 3.6 (0.30) 5.6 (0.29)
Lithuania 4.3 (0.15) 3.2 (0.28) 4.5 (0.22) 3.7 (0.15) 3.0 (0.27) 4.0 (0.23)
Luxembourg 4.0 (0.16) 3.9 (0.28) 5.5 (0.26) 3.6 (0.16) 3.8 (0.27) 5.1 (0.24)
Mexico 3.6 (0.16) 3.6 (0.20) 5.7 (0.23) 3.5 (0.16) 3.4 (0.19) 5.2 (0.22)
Netherlands 3.4 (0.12) 4.0 (0.34) 4.4 (0.31) 3.1 (0.15) 3.8 (0.32) 4.0 (0.31)
New Zealand 4.1 (0.11) 4.4 (0.20) 6.1 (0.25) 3.9 (0.13) 4.3 (0.20) 5.8 (0.25)
Norway 3.2 (0.12) 2.2 (0.16) 4.8 (0.31) 3.0 (0.12) 2.1 (0.17) 4.6 (0.30)
Poland 6.3 (0.16) 3.7 (0.25) 6.6 (0.28) 5.5 (0.16) 3.4 (0.24) 5.8 (0.28)
Portugal 4.2 (0.15) 3.9 (0.27) 5.8 (0.26) 3.5 (0.17) 3.6 (0.25) 5.1 (0.25)
Slovak Republic 5.9 (0.18) 3.2 (0.25) 5.8 (0.26) 5.2 (0.20) 3.0 (0.24) 5.3 (0.26)
Slovenia 3.9 (0.14) 2.7 (0.25) 4.6 (0.31) 3.3 (0.15) 2.5 (0.25) 4.2 (0.30)
Spain3 4.4 (0.09) 4.1 (0.13) 6.6 (0.13) 4.1 (0.09) 3.8 (0.13) 6.2 (0.13)
Sweden 3.2 (0.12) 2.9 (0.21) 4.4 (0.26) 3.0 (0.12) 2.8 (0.20) 4.2 (0.26)
Switzerland 3.6 (0.12) 3.2 (0.32) 5.5 (0.33) 3.3 (0.13) 3.0 (0.32) 5.2 (0.32)
Turkey 4.4 (0.16) 3.3 (0.25) 6.2 (0.31) 3.6 (0.16) 3.0 (0.23) 5.3 (0.27)
United Kingdom 3.7 (0.12) 3.2 (0.20) 5.5 (0.23) 3.6 (0.13) 3.0 (0.19) 5.3 (0.23)
United States 4.0 (0.19) 4.1 (0.27) 5.4 (0.20) 3.8 (0.18) 3.9 (0.26) 5.2 (0.20)
OECD average 4.3 (0.02) 3.5 (0.04) 5.7 (0.04) 3.9 (0.03) 3.3 (0.04) 5.3 (0.04)
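The "before/after accounting for socio-economic profile and gender" columns in these tables contrast a raw group difference with the same difference estimated from a regression that adds the covariates. The following is a minimal sketch of that covariate-adjustment idea using synthetic data; all variable names and values are illustrative, and it deliberately omits the plausible values and replicate weights used in the actual PISA estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
escs = rng.normal(size=n)                  # synthetic socio-economic index
girl = rng.integers(0, 2, size=n)          # synthetic gender indicator
# Group membership (e.g. a reading-format group) correlated with ESCS.
group = (escs + rng.normal(size=n) > 0).astype(float)
# Synthetic score with a true 20-point group effect plus covariate effects.
score = 500 + 20 * group + 30 * escs + 10 * girl + rng.normal(scale=50, size=n)

# Before adjustment: raw difference in mean score between the two groups.
raw_diff = score[group == 1].mean() - score[group == 0].mean()

# After adjustment: coefficient on the group indicator in an OLS regression
# that also includes ESCS and gender.
X = np.column_stack([np.ones(n), group, escs, girl])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adj_diff = beta[1]

# The adjusted gap should sit near the simulated 20-point effect, while the
# raw gap also absorbs the ESCS difference between the groups.
print(raw_diff, adj_diff)
```

Because ESCS here drives both group membership and scores, the raw difference overstates the group effect, which is exactly why the tables report both columns.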
Table B.4.16 [4/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Difference between students who read books in the following way and those who "rarely or never read books"
Before accounting for students' and school's socio-economic profile2 and gender
After accounting for students' and school's socio-economic profile and gender
Argentina 4.9 (0.24) 4.7 (0.26) 6.3 (0.23) 4.6 (0.24) 4.5 (0.25) 5.8 (0.23)
Baku (Azerbaijan) 2.8 (0.30) 2.6 (0.34) 3.5 (0.30) 2.7 (0.31) 2.4 (0.34) 3.4 (0.31)
Belarus 6.7 (0.21) 4.6 (0.22) 7.1 (0.21) 5.4 (0.19) 3.8 (0.20) 5.9 (0.20)
Bosnia and Herzegovina 5.5 (0.20) 3.0 (0.16) 5.7 (0.19) 4.9 (0.20) 2.7 (0.16) 5.1 (0.18)
Brazil 5.3 (0.16) 3.8 (0.15) 6.9 (0.21) 4.7 (0.15) 3.6 (0.15) 6.3 (0.21)
Brunei Darussalam 3.8 (0.23) 4.7 (0.21) 6.5 (0.22) 3.0 (0.23) 3.7 (0.22) 5.0 (0.22)
B-S-J-Z (China) 3.7 (0.32) 4.9 (0.38) 5.4 (0.35) 4.0 (0.35) 5.0 (0.39) 5.7 (0.37)
Bulgaria 5.9 (0.20) 3.6 (0.21) 7.1 (0.34) 5.0 (0.19) 3.2 (0.22) 6.2 (0.31)
Costa Rica 5.2 (0.25) 5.0 (0.21) 6.9 (0.23) 4.8 (0.23) 4.7 (0.20) 6.3 (0.23)
Croatia 4.5 (0.13) 2.7 (0.13) 5.3 (0.20) 3.8 (0.12) 2.4 (0.14) 4.8 (0.20)
Cyprus 3.6 (0.17) 3.3 (0.20) 5.8 (0.29) 3.3 (0.18) 3.2 (0.20) 5.5 (0.30)
Dominican Republic 2.3 (0.26) 4.0 (0.27) 4.1 (0.31) 2.5 (0.26) 3.6 (0.26) 3.8 (0.30)
Georgia 6.5 (0.24) 2.9 (0.35) 5.9 (0.28) 5.2 (0.25) 2.5 (0.33) 4.7 (0.27)
Hong Kong (China) 4.4 (0.21) 5.2 (0.21) 7.0 (0.29) 4.3 (0.21) 5.0 (0.21) 6.8 (0.28)
Indonesia 2.4 (0.33) 2.1 (0.27) 3.5 (0.33) 2.2 (0.32) 1.6 (0.28) 3.2 (0.33)
Jordan 2.5 (0.19) 2.4 (0.23) 4.9 (0.28) 2.2 (0.18) 2.1 (0.22) 4.5 (0.27)
Kazakhstan 4.9 (0.16) 3.4 (0.18) 5.2 (0.18) 4.2 (0.16) 2.8 (0.17) 4.5 (0.18)
Kosovo 5.6 (0.36) 3.8 (0.39) 6.5 (0.36) 4.7 (0.37) 3.2 (0.39) 5.5 (0.35)
Lebanon m m m m m m m m m m m m
Macao (China) 4.6 (0.35) 5.8 (0.30) 6.5 (0.25) 4.6 (0.35) 5.7 (0.31) 6.3 (0.26)
Malaysia 4.2 (0.25) 4.0 (0.24) 5.8 (0.28) 3.5 (0.24) 3.2 (0.26) 4.7 (0.29)
Malta 4.3 (0.16) 3.2 (0.20) 5.6 (0.30) 4.0 (0.16) 3.0 (0.19) 5.3 (0.30)
Moldova 5.2 (0.23) 3.5 (0.24) 6.4 (0.22) 4.2 (0.21) 2.8 (0.24) 5.0 (0.25)
Montenegro 5.8 (0.17) 3.1 (0.19) 5.7 (0.21) 5.1 (0.16) 2.7 (0.20) 5.0 (0.20)
Morocco 2.4 (0.17) 2.2 (0.18) 3.0 (0.25) 2.2 (0.18) 1.9 (0.17) 2.7 (0.23)
North Macedonia m m m m m m m m m m m m
Panama 3.8 (0.23) 4.6 (0.24) 6.6 (0.28) 3.7 (0.23) 4.4 (0.24) 6.3 (0.27)
Peru 3.2 (0.19) 3.4 (0.21) 4.7 (0.21) 3.1 (0.18) 3.3 (0.21) 4.3 (0.21)
Philippines 2.0 (0.27) 4.0 (0.24) 4.8 (0.24) 1.6 (0.28) 2.8 (0.25) 3.5 (0.23)
Qatar 4.0 (0.12) 3.4 (0.14) 5.4 (0.16) 3.7 (0.12) 3.3 (0.15) 5.1 (0.16)
Romania 6.3 (0.22) 3.2 (0.25) 6.9 (0.25) 5.2 (0.23) 2.6 (0.25) 5.8 (0.26)
Russia 6.2 (0.21) 5.0 (0.27) 6.5 (0.24) 5.3 (0.23) 4.5 (0.26) 5.7 (0.25)
Saudi Arabia 3.7 (0.20) 3.2 (0.19) 4.3 (0.26) 3.3 (0.21) 2.8 (0.20) 3.7 (0.27)
Serbia 5.3 (0.16) 3.1 (0.20) 6.2 (0.24) 4.7 (0.15) 2.9 (0.20) 5.6 (0.23)
Singapore 3.7 (0.15) 4.3 (0.16) 5.5 (0.19) 3.4 (0.15) 4.1 (0.16) 5.1 (0.20)
Chinese Taipei 6.2 (0.27) 3.9 (0.21) 6.7 (0.25) 5.8 (0.25) 3.7 (0.20) 6.3 (0.26)
Thailand 2.5 (0.26) 3.8 (0.22) 5.4 (0.27) 2.2 (0.26) 3.0 (0.21) 4.2 (0.27)
Ukraine 6.1 (0.17) 4.3 (0.17) 6.2 (0.21) 5.1 (0.18) 3.7 (0.17) 5.1 (0.22)
United Arab Emirates 4.1 (0.19) 3.5 (0.17) 5.7 (0.15) 3.7 (0.18) 3.3 (0.17) 5.2 (0.15)
Uruguay 5.7 (0.23) 5.0 (0.26) 7.1 (0.26) 5.1 (0.23) 4.6 (0.26) 6.4 (0.26)
Viet Nam 3.8 (0.18) 3.9 (0.15) 5.8 (0.25) 3.5 (0.18) 3.4 (0.16) 5.2 (0.25)
Table B.4.16 [5/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Difference between students who read books in the following way and those who "rarely or never read books"
Reading performance
Before accounting for students' and school's socio-economic profile2 and gender
After accounting for students' and school's socio-economic profile and gender
Table B.4.16 [6/6] Average time of reading for enjoyment, reading performance, and enjoyment of reading, by the format of reading
Difference between students who read books in the following way and those who "rarely or never read books"
Reading performance
Before accounting for students' and school's socio-economic profile2 and gender
After accounting for students' and school's socio-economic profile and gender
Austria -0.35 (0.02) 54.9 (0.8) 33.9 (0.8) 8.7 (0.4) 2.5 (0.3)
Belgium 0.16 (0.01) 26.2 (0.5) 54.0 (0.7) 16.8 (0.5) 3.0 (0.2)
Canada -0.14 (0.01) 38.7 (0.5) 47.3 (0.4) 12.0 (0.3) 2.1 (0.1)
Chile 0.17 (0.02) 28.0 (0.7) 45.2 (0.8) 23.2 (0.8) 3.7 (0.3)
Colombia 0.59 (0.02) 13.4 (0.6) 45.2 (0.8) 37.2 (0.8) 4.1 (0.3)
Czech Republic 0.30 (0.02) 26.3 (0.8) 55.9 (0.8) 14.9 (0.6) 2.9 (0.3)
Denmark -0.35 (0.01) 44.0 (0.8) 47.2 (0.8) 7.4 (0.4) 1.4 (0.2)
Estonia -0.09 (0.01) 37.1 (0.7) 51.7 (0.7) 10.0 (0.4) 1.2 (0.2)
Finland -0.25 (0.02) 44.0 (0.8) 44.8 (0.8) 9.3 (0.4) 1.9 (0.2)
France 0.13 (0.02) 33.3 (0.8) 41.8 (0.7) 20.8 (0.6) 4.1 (0.3)
Germany -0.36 (0.02) 51.3 (0.9) 38.6 (0.8) 8.3 (0.4) 1.8 (0.2)
Greece 0.13 (0.02) 34.8 (0.7) 46.7 (0.7) 16.4 (0.7) 2.1 (0.2)
Hungary -0.07 (0.02) 37.1 (0.8) 47.5 (0.9) 13.8 (0.6) 1.6 (0.2)
Iceland 0.04 (0.02) 35.9 (0.8) 43.3 (0.8) 16.8 (0.6) 4.0 (0.4)
Ireland -0.02 (0.02) 32.6 (0.7) 49.0 (0.6) 15.9 (0.5) 2.5 (0.2)
Israel 0.05 (0.02) 38.7 (0.9) 41.0 (0.6) 17.0 (0.6) 3.3 (0.2)
Italy -0.13 (0.02) 40.0 (0.7) 46.3 (0.7) 11.4 (0.5) 2.3 (0.2)
Japan 0.32 (0.02) 24.4 (0.8) 51.7 (0.7) 19.2 (0.5) 4.8 (0.3)
Korea 0.08 (0.02) 36.0 (0.9) 43.6 (0.8) 17.4 (0.6) 3.0 (0.2)
Latvia 0.08 (0.01) 31.4 (0.7) 55.5 (0.7) 11.4 (0.5) 1.7 (0.2)
Lithuania -0.32 (0.01) 49.2 (0.6) 31.1 (0.7) 16.4 (0.5) 3.3 (0.2)
Luxembourg -0.18 (0.01) 44.9 (0.7) 39.9 (0.7) 12.3 (0.5) 2.9 (0.2)
Mexico 0.33 (0.02) 19.2 (0.7) 49.1 (0.8) 28.6 (0.8) 3.0 (0.2)
Netherlands -0.07 (0.02) 36.0 (0.8) 50.3 (0.8) 11.6 (0.6) 2.1 (0.2)
New Zealand 0.01 (0.01) 33.5 (0.6) 47.2 (0.6) 16.8 (0.5) 2.6 (0.2)
Norway 0.01 (0.02) 37.2 (0.8) 47.2 (0.7) 12.4 (0.5) 3.2 (0.3)
Poland 0.16 (0.02) 29.4 (0.7) 52.8 (0.7) 15.7 (0.5) 2.0 (0.2)
Portugal 0.05 (0.02) 34.5 (0.9) 46.9 (0.8) 16.9 (0.8) 1.6 (0.2)
Slovak Republic 0.24 (0.02) 29.3 (0.8) 52.6 (0.8) 15.1 (0.5) 2.9 (0.2)
Slovenia -0.02 (0.02) 34.1 (0.7) 52.4 (0.8) 11.3 (0.5) 2.1 (0.2)
Spain -0.06 (0.01) 35.8 (0.5) 46.2 (0.5) 15.4 (0.3) 2.7 (0.1)
Sweden 0.10 (0.02) 30.9 (0.8) 49.6 (0.8) 16.3 (0.6) 3.2 (0.3)
Switzerland -0.08 (0.02) 39.1 (1.0) 44.1 (0.8) 14.1 (0.7) 2.7 (0.2)
Turkey 0.21 (0.02) 27.1 (0.7) 46.1 (0.7) 22.3 (0.6) 4.5 (0.3)
United Kingdom -0.01 (0.02) 35.1 (0.8) 44.8 (0.7) 16.7 (0.5) 3.3 (0.2)
United States -0.06 (0.02) 34.5 (0.9) 48.0 (0.7) 15.1 (0.7) 2.3 (0.3)
OECD average 0.01 (0.00) 35.0 (0.1) 46.7 (0.1) 15.7 (0.1) 2.7 (0.0)
StatLink: https://doi.org/10.1787/888934240693
Argentina 0.36 (0.02) 26.5 (0.8) 39.0 (0.6) 27.3 (0.7) 7.2 (0.4)
Baku (Azerbaijan) 0.38 (0.02) 25.1 (0.9) 45.0 (0.8) 24.4 (0.8) 5.5 (0.4)
Belarus 0.07 (0.02) 24.9 (0.9) 61.3 (0.8) 12.8 (0.6) 1.0 (0.1)
Bosnia and Herzegovina 0.17 (0.02) 31.0 (0.8) 43.1 (0.7) 22.4 (0.6) 3.5 (0.3)
Brazil 0.44 (0.01) 19.6 (0.5) 45.9 (0.5) 31.3 (0.6) 3.2 (0.2)
Brunei Darussalam 0.75 (0.01) 10.4 (0.3) 42.0 (0.6) 40.5 (0.6) 7.0 (0.3)
B-S-J-Z (China) 0.13 (0.02) 24.0 (0.8) 55.3 (0.7) 19.4 (0.7) 1.4 (0.2)
Bulgaria -0.15 (0.03) 51.1 (1.0) 32.8 (0.9) 11.4 (0.7) 4.7 (0.4)
Costa Rica 0.32 (0.02) 22.9 (0.8) 42.3 (0.7) 29.8 (0.8) 5.1 (0.3)
Croatia 0.14 (0.01) 28.1 (0.5) 52.9 (0.7) 16.0 (0.5) 3.0 (0.2)
Cyprus 0.31 (0.02) 25.6 (0.6) 47.9 (0.6) 21.7 (0.6) 4.9 (0.3)
Dominican Republic 0.48 (0.02) † 20.5 (0.6) 38.4 (1.0) 35.4 (0.8) 5.6 (0.4)
Georgia 0.12 (0.02) 31.1 (0.8) 50.9 (0.8) 15.4 (0.6) 2.5 (0.2)
Hong Kong (China) 0.24 (0.02) 21.1 (0.6) 60.0 (0.7) 16.7 (0.6) 2.2 (0.2)
Indonesia 0.83 (0.02) 8.5 (0.5) 41.3 (0.8) 47.1 (0.8) 3.1 (0.3)
Jordan 0.48 (0.02) 26.2 (0.8) 38.1 (0.7) 30.3 (0.7) 5.5 (0.3)
Kazakhstan 0.34 (0.01) 18.5 (0.4) 55.6 (0.6) 23.8 (0.5) 2.2 (0.1)
Kosovo 0.27 (0.02) 24.3 (0.7) 46.1 (0.7) 26.5 (0.8) 3.1 (0.3)
Lebanon m m m m m m m m m m
Macao (China) 0.45 (0.02) 15.9 (0.6) 55.0 (0.8) 24.8 (0.7) 4.3 (0.3)
Malaysia 0.73 (0.02) 12.1 (0.5) 48.6 (0.8) 35.7 (0.9) 3.6 (0.2)
Malta 0.05 (0.02) 32.6 (0.8) 47.5 (0.8) 16.8 (0.6) 3.1 (0.3)
Moldova 0.21 (0.02) 23.6 (0.8) 47.1 (0.8) 26.5 (0.8) 2.8 (0.2)
Montenegro 0.00 (0.01) 39.4 (0.6) 41.6 (0.6) 15.7 (0.5) 3.4 (0.2)
Morocco m m 20.3 (0.6) 36.1 (0.7) 38.7 (0.8) 4.8 (0.3)
North Macedonia m m m m m m m m m m
Panama 0.61 (0.02) 16.0 (0.7) 36.4 (0.9) 40.8 (1.0) 6.8 (0.4)
Peru 0.45 (0.02) 14.2 (0.6) 48.0 (0.7) 35.7 (0.7) 2.2 (0.2)
Philippines 0.87 (0.02) 9.2 (0.5) 38.9 (0.7) 47.7 (0.8) 4.3 (0.3)
Qatar 0.42 (0.01) 24.3 (0.4) 36.8 (0.5) 33.6 (0.4) 5.3 (0.2)
Romania 0.26 (0.03) 28.0 (1.0) 45.2 (0.8) 24.5 (1.1) 2.3 (0.3)
Russia -0.06 (0.02) 37.8 (0.9) 49.5 (0.7) 10.9 (0.5) 1.8 (0.2)
Saudi Arabia 0.51 (0.02) 25.0 (0.7) 32.5 (0.7) 34.2 (0.7) 8.3 (0.4)
Serbia 0.02 (0.02) 36.6 (0.5) 44.7 (0.7) 16.0 (0.6) 2.8 (0.2)
Singapore 0.18 (0.01) 27.0 (0.6) 48.5 (0.7) 20.7 (0.5) 3.8 (0.2)
Chinese Taipei 0.33 (0.02) 25.6 (0.7) 52.3 (0.7) 19.9 (0.6) 2.2 (0.2)
Thailand 1.04 (0.01) 7.2 (0.4) 31.7 (0.7) 56.8 (0.8) 4.2 (0.3)
Ukraine -0.05 (0.02) 35.0 (0.9) 52.3 (0.8) 10.6 (0.5) 2.1 (0.2)
United Arab Emirates 0.36 (0.01) 25.9 (0.5) 40.7 (0.6) 28.9 (0.5) 4.6 (0.2)
Uruguay 0.25 (0.02) 25.7 (0.7) 45.3 (0.7) 25.0 (0.7) 3.9 (0.3)
Viet Nam 0.90 (0.02) 7.8 (0.5) 43.9 (0.9) 44.7 (1.0) 3.6 (0.3)
StatLink: https://doi.org/10.1787/888934240693
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 185
Annex B Results for countries and economies
Many texts were too difficult for me | I was lost when I had to navigate between different pages
Strongly disagree | Disagree | Agree | Strongly agree | Strongly disagree | Disagree | Agree | Strongly agree
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Australia 35.7 (0.5) 50.8 (0.5) 11.3 (0.3) 2.2 (0.2) 39.8 (0.5) 44.0 (0.5) 13.6 (0.3) 2.6 (0.1)
OECD
Austria 51.0 (0.8) 37.9 (0.9) 9.1 (0.4) 2.0 (0.2) 52.7 (0.7) 30.2 (0.7) 13.7 (0.5) 3.4 (0.3)
Belgium 26.2 (0.6) 57.4 (0.6) 13.6 (0.5) 2.8 (0.2) 28.9 (0.6) 48.0 (0.6) 19.5 (0.6) 3.6 (0.2)
Canada 39.5 (0.5) 49.3 (0.4) 9.4 (0.3) 1.8 (0.1) 41.5 (0.5) 43.0 (0.5) 13.0 (0.3) 2.6 (0.2)
Chile 27.9 (0.7) 52.2 (0.7) 17.2 (0.6) 2.6 (0.2) 33.0 (0.7) 46.5 (0.6) 17.4 (0.6) 3.0 (0.2)
Colombia 12.2 (0.5) 53.4 (0.8) 31.2 (0.9) 3.2 (0.3) 18.1 (0.6) 55.2 (0.9) 23.2 (0.7) 3.6 (0.3)
Czech Republic 19.5 (0.6) 55.4 (0.7) 21.6 (0.6) 3.5 (0.3) 21.9 (0.7) 50.0 (0.7) 23.5 (0.5) 4.6 (0.3)
Denmark 40.1 (0.8) 51.3 (0.8) 7.6 (0.4) 1.0 (0.2) 63.7 (0.7) 30.4 (0.7) 4.3 (0.3) 1.6 (0.2)
Estonia 32.7 (0.7) 55.1 (0.8) 11.1 (0.5) 1.1 (0.2) 39.3 (0.7) 43.5 (0.7) 15.3 (0.5) 1.8 (0.2)
Finland 40.5 (0.7) 47.0 (0.7) 10.1 (0.5) 2.4 (0.2) 54.2 (0.7) 38.3 (0.7) 5.9 (0.4) 1.7 (0.2)
France 31.5 (0.7) 45.3 (0.7) 18.8 (0.6) 4.4 (0.3) 38.0 (0.8) 39.5 (0.8) 17.6 (0.5) 5.0 (0.3)
Germany 46.5 (0.9) 42.5 (0.8) 9.4 (0.5) 1.6 (0.2) 57.2 (0.9) 30.7 (0.8) 10.0 (0.5) 2.2 (0.2)
Greece 28.0 (0.7) 52.4 (0.7) 17.3 (0.7) 2.3 (0.2) 28.7 (0.6) 42.4 (0.6) 24.0 (0.6) 4.9 (0.3)
Hungary 34.3 (0.8) 50.7 (0.9) 13.2 (0.6) 1.8 (0.2) 41.0 (0.8) 45.1 (0.8) 11.9 (0.5) 2.0 (0.2)
Iceland 35.2 (0.8) 43.7 (0.8) 16.8 (0.7) 4.3 (0.3) 41.2 (0.8) 39.3 (0.9) 15.4 (0.7) 4.0 (0.3)
Ireland 32.5 (0.7) 54.4 (0.6) 11.4 (0.4) 1.7 (0.1) 37.4 (0.8) 48.0 (0.7) 12.5 (0.5) 2.1 (0.2)
Israel 34.1 (0.8) 43.3 (0.6) 18.2 (0.6) 4.4 (0.3) 40.4 (0.8) 36.8 (0.6) 18.0 (0.6) 4.7 (0.3)
Italy 37.1 (0.7) 50.2 (0.6) 10.7 (0.5) 2.1 (0.2) 41.2 (0.9) 44.3 (0.8) 12.1 (0.5) 2.4 (0.3)
Japan 21.9 (0.7) 48.2 (0.7) 23.9 (0.6) 6.1 (0.3) 31.6 (0.8) 46.2 (0.6) 17.0 (0.6) 5.2 (0.3)
Korea 32.1 (0.9) 45.5 (0.7) 19.7 (0.6) 2.8 (0.2) 33.4 (0.9) 46.4 (0.8) 17.8 (0.6) 2.4 (0.2)
Latvia 27.0 (0.7) 57.9 (0.7) 13.1 (0.5) 1.9 (0.2) 28.1 (0.6) 48.8 (0.6) 20.1 (0.5) 2.9 (0.2)
Lithuania 47.1 (0.7) 35.6 (0.7) 14.3 (0.5) 3.1 (0.3) 76.3 (0.5) 13.5 (0.4) 7.1 (0.4) 3.1 (0.3)
Luxembourg 43.5 (0.7) 42.7 (0.7) 11.5 (0.4) 2.3 (0.2) 45.6 (0.7) 35.9 (0.6) 15.1 (0.5) 3.4 (0.2)
Mexico 20.8 (0.7) 58.2 (0.7) 18.7 (0.6) 2.3 (0.2) 23.8 (0.7) 53.0 (0.7) 19.8 (0.7) 3.4 (0.3)
Netherlands 33.3 (1.0) 54.9 (1.0) 9.4 (0.5) 2.3 (0.2) 39.1 (0.7) 44.3 (0.8) 14.4 (0.5) 2.2 (0.2)
New Zealand 32.5 (0.7) 52.9 (0.7) 12.6 (0.5) 2.0 (0.2) 37.4 (0.7) 45.6 (0.7) 14.8 (0.5) 2.2 (0.2)
Norway 34.1 (0.7) 48.9 (0.7) 13.7 (0.5) 3.2 (0.2) 35.7 (0.7) 42.7 (0.7) 17.5 (0.5) 4.1 (0.3)
Poland 25.2 (0.7) 57.8 (0.7) 15.1 (0.6) 1.9 (0.2) 25.7 (0.7) 49.3 (0.7) 21.2 (0.7) 3.8 (0.3)
Portugal 28.9 (1.0) 53.3 (0.9) 16.1 (0.7) 1.7 (0.2) 33.8 (0.9) 48.9 (0.8) 15.2 (0.6) 2.1 (0.2)
Slovak Republic 22.0 (0.7) 52.2 (0.6) 22.7 (0.7) 3.2 (0.2) 26.6 (0.7) 49.5 (0.7) 19.4 (0.6) 4.6 (0.3)
Slovenia 33.2 (0.7) 54.9 (0.7) 9.9 (0.4) 1.9 (0.2) 32.7 (0.9) 48.5 (0.8) 15.9 (0.6) 2.9 (0.3)
Spain 34.4 (0.5) 51.0 (0.4) 12.3 (0.3) 2.3 (0.1) 43.0 (0.5) 44.1 (0.5) 10.7 (0.3) 2.2 (0.1)
Sweden 30.8 (0.9) 52.8 (0.9) 13.4 (0.6) 3.0 (0.3) 31.3 (0.8) 45.1 (0.8) 19.2 (0.6) 4.5 (0.4)
Switzerland 37.7 (0.9) 47.0 (0.9) 12.9 (0.6) 2.5 (0.3) 43.2 (0.9) 37.1 (0.7) 16.3 (0.6) 3.4 (0.3)
Turkey 23.2 (0.6) 52.1 (0.8) 21.4 (0.7) 3.3 (0.2) 39.1 (0.7) 45.1 (0.5) 12.5 (0.5) 3.3 (0.2)
United Kingdom 34.3 (0.8) 49.8 (0.7) 13.1 (0.4) 2.8 (0.2) 40.2 (0.8) 43.4 (0.6) 13.2 (0.5) 3.1 (0.2)
United States 36.5 (1.0) 50.8 (0.8) 11.0 (0.6) 1.7 (0.2) 38.5 (1.0) 45.7 (0.9) 13.2 (0.6) 2.6 (0.3)
OECD average 32.5 (0.1) 50.2 (0.1) 14.7 (0.1) 2.6 (0.0) 38.5 (0.1) 42.9 (0.1) 15.4 (0.1) 3.2 (0.0)
StatLink: https://doi.org/10.1787/888934240693
Many texts were too difficult for me | I was lost when I had to navigate between different pages
Strongly disagree | Disagree | Agree | Strongly agree | Strongly disagree | Disagree | Agree | Strongly agree
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Albania 22.2 (0.7) 49.6 (0.8) 24.4 (0.6) 3.8 (0.3) 31.4 (0.6) 45.6 (0.7) 18.2 (0.6) 4.7 (0.3)
Partners
Argentina 24.6 (0.8) 44.7 (0.8) 24.6 (0.8) 6.1 (0.4) 31.8 (0.8) 40.5 (0.7) 21.5 (0.6) 6.2 (0.3)
Baku (Azerbaijan) 22.3 (0.7) 48.8 (0.7) 23.3 (0.7) 5.6 (0.4) 26.9 (0.7) 42.7 (0.7) 22.7 (0.6) 7.6 (0.4)
Belarus 23.0 (0.8) 62.5 (0.8) 13.5 (0.6) 1.1 (0.1) 30.0 (0.9) 59.2 (0.8) 9.5 (0.5) 1.3 (0.1)
Bosnia and Herzegovina 27.1 (0.7) 52.0 (0.7) 17.5 (0.6) 3.5 (0.3) 31.1 (0.8) 47.6 (0.8) 17.3 (0.5) 4.0 (0.3)
Brazil 17.6 (0.5) 54.1 (0.6) 25.7 (0.6) 2.6 (0.2) 21.2 (0.6) 50.6 (0.6) 24.7 (0.6) 3.4 (0.2)
Brunei Darussalam 11.1 (0.4) 51.1 (0.5) 33.3 (0.5) 4.4 (0.2) 13.5 (0.4) 45.9 (0.5) 35.1 (0.5) 5.5 (0.3)
B-S-J-Z (China) 25.4 (0.7) 57.9 (0.6) 15.3 (0.5) 1.4 (0.2) 29.3 (0.8) 56.1 (0.7) 13.3 (0.6) 1.3 (0.2)
Bulgaria 40.9 (1.1) 40.0 (0.8) 14.6 (0.8) 4.5 (0.4) 48.9 (1.0) 32.3 (0.8) 13.0 (0.6) 5.7 (0.4)
Costa Rica 24.7 (0.8) 51.6 (0.6) 19.7 (0.6) 4.0 (0.3) 29.1 (0.8) 46.0 (0.6) 20.4 (0.6) 4.5 (0.2)
Croatia 24.3 (0.5) 54.2 (0.6) 18.6 (0.6) 2.9 (0.2) 34.3 (0.6) 49.6 (0.6) 13.4 (0.4) 2.7 (0.2)
Cyprus 22.7 (0.6) 52.7 (0.8) 20.6 (0.6) 4.0 (0.2) 26.9 (0.7) 46.2 (0.7) 22.0 (0.6) 5.0 (0.3)
Dominican Republic 20.4 (0.7) 48.7 (0.9) 25.7 (0.8) 5.2 (0.3) 24.2 (0.7) † 47.6 (0.9) † 22.3 (0.8) † 5.8 (0.4) †
Georgia 26.3 (0.8) 53.2 (0.8) 17.8 (0.6) 2.7 (0.3) 30.3 (0.8) 48.1 (0.9) 18.2 (0.6) 3.4 (0.3)
Hong Kong (China) 20.3 (0.6) 60.1 (0.7) 16.8 (0.6) 2.7 (0.2) 24.5 (0.6) 56.0 (0.8) 16.5 (0.6) 2.9 (0.2)
Indonesia 7.9 (0.6) 51.0 (0.9) 37.8 (0.9) 3.3 (0.3) 9.0 (0.5) 44.1 (1.0) 41.4 (0.9) 5.5 (0.4)
Jordan 21.3 (0.7) 46.6 (0.7) 26.3 (0.7) 5.8 (0.3) 24.4 (0.7) 36.5 (0.7) 29.5 (0.7) 9.6 (0.4)
Kazakhstan 17.6 (0.4) 61.1 (0.5) 19.6 (0.4) 1.7 (0.1) 20.2 (0.4) 56.4 (0.5) 21.1 (0.5) 2.3 (0.1)
Kosovo 24.2 (0.7) 52.3 (0.8) 20.6 (0.7) 2.9 (0.3) 28.6 (0.8) 48.4 (0.8) 18.9 (0.7) 4.1 (0.4)
Lebanon m m m m m m m m m m m m m m m m
Macao (China) 15.5 (0.6) 59.5 (0.8) 21.4 (0.7) 3.6 (0.3) 18.2 (0.6) 54.4 (0.7) 23.7 (0.7) 3.7 (0.3)
Malaysia 9.5 (0.5) 48.1 (0.8) 38.8 (0.8) 3.5 (0.2) 13.1 (0.6) 45.9 (0.8) 35.2 (0.8) 5.8 (0.4)
Malta 33.3 (0.8) 50.7 (0.9) 13.2 (0.6) 2.8 (0.3) 35.3 (0.9) 43.7 (0.9) 17.3 (0.7) 3.7 (0.3)
Moldova 25.1 (0.9) 54.8 (0.7) 18.1 (0.6) 1.9 (0.2) 29.7 (1.0) 50.1 (0.9) 18.1 (0.7) 2.1 (0.2)
Montenegro 34.4 (0.7) 48.3 (0.6) 14.1 (0.4) 3.1 (0.2) 36.4 (0.7) 41.9 (0.6) 17.5 (0.5) 4.2 (0.3)
Morocco3 17.9 (0.6) 46.2 (0.9) 31.0 (0.8) 5.0 (0.4) 20.8 (0.7) 46.2 (0.8) 27.6 (0.7) 5.4 (0.4)
North Macedonia m m m m m m m m m m m m m m m m
Panama 17.1 (0.6) 47.4 (0.8) 30.1 (0.8) 5.4 (0.4) 22.0 (0.7) 44.5 (1.0) 27.3 (0.8) 6.2 (0.4)
Peru 15.3 (0.6) 61.5 (0.6) 21.4 (0.7) 1.8 (0.2) 19.7 (0.6) 55.4 (0.7) 22.3 (0.7) 2.7 (0.3)
Philippines 8.6 (0.4) 42.8 (0.8) 44.1 (0.9) 4.4 (0.3) 10.7 (0.5) 43.5 (0.7) 41.1 (0.7) 4.7 (0.3)
Qatar 23.8 (0.4) 47.2 (0.4) 24.5 (0.4) 4.5 (0.2) 27.3 (0.4) 40.0 (0.4) 26.1 (0.4) 6.7 (0.2)
Romania 25.8 (1.0) 51.7 (0.9) 20.4 (0.9) 2.1 (0.2) 26.3 (0.9) 40.8 (0.8) 27.7 (1.0) 5.2 (0.4)
Russia 31.1 (0.7) 51.9 (0.7) 15.4 (0.4) 1.7 (0.2) 39.4 (0.9) 48.2 (0.8) 10.6 (0.4) 1.8 (0.2)
Saudi Arabia 24.2 (0.7) 42.0 (0.7) 26.7 (0.7) 7.1 (0.4) 26.7 (0.7) 34.4 (0.7) 29.6 (0.8) 9.3 (0.4)
Serbia 32.3 (0.6) 50.8 (0.8) 14.0 (0.6) 2.9 (0.3) 37.2 (0.8) 42.9 (0.8) 15.6 (0.6) 4.2 (0.3)
Singapore 27.7 (0.6) 53.3 (0.6) 16.1 (0.4) 2.9 (0.2) 30.8 (0.7) 46.9 (0.7) 18.7 (0.5) 3.7 (0.2)
Chinese Taipei 20.8 (0.6) 47.0 (0.6) 28.6 (0.7) 3.7 (0.3) 25.8 (0.6) 49.5 (0.7) 21.1 (0.5) 3.7 (0.2)
Thailand 6.8 (0.3) 34.3 (0.7) 53.6 (0.7) 5.2 (0.2) 8.9 (0.4) 37.5 (0.7) 46.8 (0.7) 6.8 (0.3)
Ukraine 32.2 (0.9) 54.4 (0.6) 11.5 (0.6) 1.9 (0.2) 35.9 (0.8) 49.0 (0.7) 12.8 (0.6) 2.3 (0.2)
United Arab Emirates 26.8 (0.5) 46.4 (0.6) 22.5 (0.5) 4.4 (0.2) 26.9 (0.4) 36.8 (0.5) 29.0 (0.4) 7.4 (0.3)
Uruguay 24.4 (0.7) 53.4 (0.8) 19.4 (0.7) 2.8 (0.3) 27.6 (0.7) 48.9 (0.8) 19.8 (0.7) 3.7 (0.3)
Viet Nam 5.7 (0.4) 39.7 (0.9) 50.3 (1.0) 4.3 (0.4) 9.9 (0.6) 49.0 (0.9) 36.6 (0.9) 4.4 (0.3)
StatLink: https://doi.org/10.1787/888934240693
Table B.5.11 [1/4] Student’s knowledge of reading strategies for assessing the credibility of sources
Based on students' reports
Index of knowledge of reading strategies for assessing the credibility of sources | Percentage of students who reported on the usefulness of the following strategies for assessing credibility:
Answer the email and ask for more information about the smartphone: Not very appropriate | Somewhat appropriate | Very appropriate
Check the sender's email address: Not very appropriate | Somewhat appropriate | Very appropriate
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Australia 0.14 (0.01) 58.9 (0.6) 26.6 (0.5) 14.6 (0.3) 18.1 (0.5) 30.6 (0.5) 51.2 (0.5)
OECD
Austria 0.15 (0.02) 49.0 (0.7) 33.2 (0.7) 17.8 (0.6) 15.6 (0.5) 30.5 (0.7) 53.8 (0.8)
Belgium 0.04 (0.02) 46.6 (0.7) 34.4 (0.6) 19.0 (0.5) 19.0 (0.5) 34.0 (0.6) 46.9 (0.7)
Canada 0.02 (0.01) 46.9 (0.5) 32.7 (0.5) 20.4 (0.5) 17.6 (0.4) 32.1 (0.5) 50.4 (0.6)
Chile -0.37 (0.02) 42.1 (0.6) 32.4 (0.7) 25.5 (0.7) 25.5 (0.7) 31.0 (0.6) 43.5 (0.8)
Colombia -0.29 (0.02) 49.2 (0.8) 31.7 (0.7) 19.0 (0.7) 28.8 (0.9) 32.9 (0.6) 38.4 (1.0)
Czech Republic -0.02 (0.02) 47.5 (0.8) 32.6 (0.6) 19.9 (0.6) 26.4 (0.7) 33.9 (0.7) 39.7 (0.8)
Denmark 0.21 (0.02) 51.3 (0.9) 30.9 (0.8) 17.8 (0.6) 15.2 (0.6) 31.5 (0.7) 53.3 (0.8)
Estonia 0.05 (0.02) 37.1 (0.7) 39.0 (0.7) 23.9 (0.6) 12.0 (0.5) 33.1 (0.8) 54.9 (0.8)
Finland 0.19 (0.02) 53.4 (0.7) 31.6 (0.7) 15.0 (0.5) 13.4 (0.5) 33.8 (0.6) 52.8 (0.7)
France 0.07 (0.02) 54.6 (0.7) 26.1 (0.6) 19.3 (0.5) 21.3 (0.6) 27.4 (0.6) 51.3 (0.7)
Germany 0.24 (0.02) 51.4 (0.9) 31.4 (0.8) 17.2 (0.7) 16.4 (0.6) 30.0 (0.8) 53.5 (1.0)
Greece 0.10 (0.02) 56.6 (0.8) 27.3 (0.6) 16.1 (0.6) 25.4 (0.7) 30.5 (0.6) 44.1 (0.8)
Hungary -0.27 (0.02) 28.7 (0.7) 43.5 (0.8) 27.9 (0.7) 14.6 (0.6) 38.1 (0.6) 47.2 (0.7)
Iceland -0.20 (0.02) 35.0 (0.8) 35.6 (0.9) 29.4 (0.9) 18.8 (0.7) 30.1 (0.9) 51.1 (0.9)
Ireland 0.21 (0.01) 49.7 (0.7) 28.6 (0.6) 21.7 (0.6) 15.2 (0.5) 26.7 (0.6) 58.1 (0.8)
Israel -0.20 (0.02) † 48.5 (0.7) 28.1 (0.7) 23.4 (0.7) 25.7 (0.7) 31.0 (0.6) 43.3 (1.0)
Italy -0.05 (0.02) 48.1 (0.7) 30.9 (0.6) 21.0 (0.6) 22.0 (0.7) 31.1 (0.7) 47.0 (0.7)
Japan 0.28 (0.02) 58.2 (0.7) 26.8 (0.7) 15.0 (0.4) 18.6 (0.5) 31.9 (0.7) 49.5 (0.8)
Korea -0.30 (0.02) 24.4 (0.5) 46.1 (0.6) 29.5 (0.7) 11.0 (0.4) 42.0 (0.7) 47.0 (0.8)
Latvia 0.03 (0.01) 42.2 (0.8) 37.2 (0.7) 20.5 (0.6) 20.6 (0.6) 34.3 (0.7) 45.1 (0.8)
Lithuania -0.09 (0.02) 42.1 (0.7) 31.8 (0.6) 26.0 (0.6) 20.3 (0.6) 30.7 (0.6) 48.9 (0.7)
Luxembourg -0.10 (0.01) 48.7 (0.8) 33.2 (0.7) 18.1 (0.5) 23.7 (0.6) 33.1 (0.7) 43.2 (0.6)
Mexico -0.40 (0.02) 42.5 (0.8) 34.9 (0.8) 22.6 (0.6) 24.1 (0.8) 33.1 (0.6) 42.7 (1.0)
Netherlands 0.21 (0.02) 45.1 (0.8) 35.7 (0.8) 19.1 (0.7) 12.5 (0.6) 30.4 (0.7) 57.0 (0.9)
New Zealand 0.12 (0.02) 49.9 (0.9) 31.2 (0.7) 18.8 (0.5) 16.1 (0.5) 30.2 (0.8) 53.7 (0.8)
Norway -0.03 (0.02) 53.8 (0.8) 30.3 (0.6) 15.9 (0.5) 22.6 (0.6) 32.0 (0.7) 45.5 (0.7)
Poland -0.03 (0.02) 42.4 (0.6) 35.4 (0.7) 22.2 (0.6) 21.0 (0.7) 36.3 (0.8) 42.7 (0.9)
Portugal 0.03 (0.02) 43.1 (0.7) 32.0 (0.6) 24.9 (0.7) 13.4 (0.6) 30.3 (0.8) 56.2 (0.9)
Slovak Republic -0.20 (0.02) 42.5 (0.7) 34.6 (0.8) 22.9 (0.5) 24.0 (0.7) 34.2 (0.6) 41.7 (0.9)
Slovenia -0.02 (0.01) 46.3 (0.7) 36.3 (0.7) 17.3 (0.6) 18.1 (0.6) 38.6 (0.8) 43.3 (0.9)
Spain -0.01 (0.01) 42.4 (0.4) 34.8 (0.4) 22.8 (0.4) 17.1 (0.4) 33.0 (0.4) 49.9 (0.5)
Sweden 0.07 (0.02) 54.7 (0.8) 30.1 (0.8) 15.2 (0.6) 19.5 (0.7) 32.8 (0.7) 47.7 (1.0)
Switzerland 0.04 (0.02) 44.3 (1.0) 33.6 (0.8) 22.0 (0.7) 16.3 (0.6) 32.5 (0.8) 51.2 (1.0)
Turkey -0.23 (0.02) 47.7 (0.7) 27.3 (0.6) 25.0 (0.5) 28.2 (0.7) 28.0 (0.6) 43.8 (0.8)
United Kingdom 0.29 (0.02) 56.8 (0.6) 26.3 (0.5) 16.9 (0.5) 17.4 (0.5) 27.8 (0.6) 54.9 (0.7)
United States 0.01 (0.02) 51.5 (0.9) 29.3 (0.6) 19.2 (0.7) 20.1 (0.9) 28.7 (0.7) 51.2 (1.2)
OECD average -0.01 (0.00) 46.8 (0.1) 32.5 (0.1) 20.6 (0.1) 19.3 (0.1) 32.1 (0.1) 48.5 (0.1)
StatLink: https://doi.org/10.1787/888934240693
Table B.5.11 [2/4] Student’s knowledge of reading strategies for assessing the credibility of sources
Based on students' reports
Index of knowledge of reading strategies for assessing the credibility of sources | Percentage of students who reported on the usefulness of the following strategies for assessing credibility:
Answer the email and ask for more information about the smartphone: Not very appropriate | Somewhat appropriate | Very appropriate
Check the sender's email address: Not very appropriate | Somewhat appropriate | Very appropriate
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Albania -0.63 (0.01) 40.6 (0.7) 27.4 (0.7) 32.0 (0.8) 28.2 (0.7) 26.6 (0.6) 45.2 (0.8)
Partners
Argentina -0.10 (0.02) 46.9 (0.8) 24.0 (0.6) 29.1 (0.7) 25.2 (0.8) 25.2 (0.6) 49.6 (0.8)
Baku (Azerbaijan) -0.70 (0.01) † 41.1 (0.6) † 27.8 (0.6) † 31.1 (0.7) † 36.0 (0.7) † 29.7 (0.7) † 34.3 (0.7) †
Belarus -0.14 (0.02) 39.4 (0.7) 36.7 (0.7) 23.9 (0.5) 20.8 (0.7) 36.0 (0.8) 43.1 (0.9)
Bosnia and Herzegovina -0.47 (0.02) 34.4 (0.8) 34.3 (0.6) 31.3 (0.7) 23.0 (0.7) 33.1 (0.6) 43.9 (0.8)
Brazil -0.38 (0.01) 54.3 (0.6) 21.6 (0.5) 24.1 (0.5) 35.5 (0.7) 22.7 (0.5) 41.7 (0.8)
Brunei Darussalam -0.26 (0.01) 46.9 (0.6) 33.8 (0.6) 19.3 (0.5) 30.3 (0.5) 30.2 (0.5) 39.5 (0.6)
B-S-J-Z (China) 0.05 (0.02) 51.5 (0.8) 26.6 (0.6) 21.9 (0.6) 17.5 (0.6) 24.8 (0.7) 57.7 (0.8)
Bulgaria -0.49 (0.02) 38.1 (0.8) 30.5 (0.7) 31.4 (0.8) 27.2 (1.0) 32.5 (0.8) 40.3 (1.0)
Costa Rica -0.27 (0.03) 51.4 (0.7) 27.1 (0.6) 21.5 (0.6) 30.1 (0.9) 28.7 (0.6) 41.3 (0.9)
Croatia -0.18 (0.02) 37.3 (0.6) 37.1 (0.6) 25.6 (0.6) 17.4 (0.5) 33.1 (0.7) 49.5 (0.7)
Cyprus -0.16 (0.01) 46.8 (0.8) 29.6 (0.7) 23.7 (0.7) 22.5 (0.6) 31.1 (0.7) 46.4 (0.7)
Dominican Republic -0.56 (0.02) † 45.3 (0.9) † 23.9 (0.8) † 30.8 (0.7) † 35.5 (1.1) † 25.0 (0.8) † 39.5 (1.0) †
Georgia -0.48 (0.01) 44.0 (0.9) 26.9 (0.7) 29.1 (0.7) 31.3 (0.8) 30.3 (0.7) 38.5 (0.8)
Hong Kong (China) -0.15 (0.02) 44.3 (0.8) 40.5 (0.7) 15.2 (0.5) 18.6 (0.6) 42.0 (0.9) 39.3 (0.9)
Indonesia -0.71 (0.02) 51.3 (0.8) 21.7 (0.7) 27.0 (0.7) 40.9 (0.9) 22.8 (0.6) 36.3 (0.8)
Jordan -0.25 (0.02) 45.8 (0.8) 23.7 (0.5) 30.6 (0.7) 23.3 (0.7) 23.2 (0.6) 53.5 (0.9)
Kazakhstan -0.65 (0.01) 36.2 (0.5) 32.9 (0.4) 30.8 (0.5) 34.1 (0.4) 34.3 (0.4) 31.6 (0.5)
Kosovo -0.59 (0.01) 50.1 (0.9) 25.3 (0.8) 24.5 (0.7) 38.0 (0.9) 26.5 (0.8) 35.5 (0.9)
Lebanon m m m m m m m m m m m m m m
Macao (China) -0.13 (0.02) 42.6 (0.7) 40.9 (0.7) 16.5 (0.6) 21.7 (0.6) 39.7 (0.7) 38.5 (0.9)
Malaysia -0.47 (0.02) 41.6 (0.7) 33.6 (0.7) 24.7 (0.7) 32.8 (0.8) 32.8 (0.6) 34.4 (0.9)
Malta -0.19 (0.02) 36.7 (0.9) 33.0 (0.9) 30.3 (0.9) 15.7 (0.6) 28.0 (0.8) 56.3 (0.8)
Moldova -0.11 (0.02) 40.3 (0.9) 29.5 (0.7) 30.2 (0.7) 22.9 (0.7) 21.7 (0.7) 55.4 (0.9)
Montenegro -0.47 (0.01) 40.8 (0.6) 33.4 (0.7) 25.8 (0.6) 24.5 (0.8) 34.3 (0.7) 41.3 (0.7)
Morocco -0.44 (0.01) † 23.0 (0.7) 37.1 (0.6) 39.8 (0.6) 14.8 (0.7) 29.8 (0.7) 55.4 (1.0)
North Macedonia m m m m m m m m m m m m m m
Panama -0.40 (0.02) † 44.6 (0.8) † 27.8 (0.8) † 27.6 (0.7) † 30.8 (0.9) † 29.4 (0.9) † 39.8 (1.0) †
Peru -0.44 (0.02) 50.5 (0.8) 30.0 (0.7) 19.5 (0.6) 36.1 (1.0) 30.7 (0.7) 33.2 (0.9)
Philippines -0.65 (0.01) 48.1 (0.9) 29.0 (0.8) 22.9 (0.6) 40.3 (1.0) 29.4 (0.7) 30.3 (0.8)
Qatar -0.26 (0.01) 45.6 (0.5) 29.7 (0.5) 24.7 (0.4) 24.5 (0.4) 29.9 (0.4) 45.5 (0.5)
Romania -0.14 (0.03) 36.9 (0.8) 30.2 (0.7) 32.9 (0.8) 19.0 (0.9) 26.2 (0.8) 54.8 (1.0)
Russia -0.10 (0.02) 47.1 (0.6) 32.8 (0.6) 20.1 (0.5) 25.9 (0.6) 35.7 (0.5) 38.4 (0.6)
Saudi Arabia -0.15 (0.02) 45.4 (0.8) 28.4 (0.7) 26.2 (0.7) 20.6 (0.7) 24.9 (0.6) 54.5 (0.9)
Serbia -0.33 (0.02) 39.5 (0.8) 34.1 (0.6) 26.4 (0.7) 21.1 (0.9) 31.6 (0.6) 47.3 (1.0)
Singapore 0.16 (0.01) 48.1 (0.6) 33.3 (0.6) 18.7 (0.5) 12.0 (0.4) 30.7 (0.5) 57.3 (0.5)
Chinese Taipei -0.35 (0.02) 44.6 (0.8) 35.1 (0.7) 20.4 (0.5) 24.6 (0.6) 37.3 (0.7) 38.1 (0.8)
Thailand -0.71 (0.01) 30.4 (0.6) 41.9 (0.6) 27.7 (0.7) 24.1 (0.8) 43.5 (0.7) 32.5 (1.0)
Ukraine 0.04 (0.02) 40.6 (0.7) 31.4 (0.7) 28.0 (0.6) 18.4 (0.6) 29.8 (0.7) 51.8 (0.7)
United Arab Emirates -0.26 (0.01) 42.8 (0.5) 28.9 (0.5) 28.3 (0.5) 20.4 (0.4) 27.3 (0.4) 52.3 (0.6)
Uruguay -0.27 (0.02) † 51.8 (0.9) 28.6 (0.7) 19.6 (0.7) 30.9 (0.9) 31.2 (0.7) 37.9 (0.8)
Viet Nam -0.15 (0.02) 30.7 (1.1) 36.2 (0.9) 33.1 (0.9) 14.0 (0.9) 30.4 (1.0) 55.6 (1.3)
StatLink: https://doi.org/10.1787/888934240693
Table B.5.11 [3/4] Student’s knowledge of reading strategies for assessing the credibility of sources
Based on students' reports
Percentage of students who reported on the usefulness of the following strategies for assessing credibility:
Not very appropriate | Somewhat appropriate | Very appropriate | Not very appropriate | Somewhat appropriate | Very appropriate | Not very appropriate | Somewhat appropriate | Very appropriate
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Australia 69.2 (0.6) 21.7 (0.5) 9.1 (0.3) 29.7 (0.5) 25.1 (0.5) 45.2 (0.6) 31.0 (0.5) 31.9 (0.6) 37.1 (0.6)
OECD
Austria 65.4 (0.9) 25.7 (0.7) 8.9 (0.5) 38.5 (0.7) 29.8 (0.6) 31.7 (0.8) 17.8 (0.6) 33.1 (0.7) 49.0 (0.8)
Belgium 60.3 (0.7) 28.5 (0.6) 11.2 (0.4) 37.6 (0.6) 29.6 (0.7) 32.8 (0.7) 23.9 (0.5) 34.9 (0.6) 41.1 (0.8)
Canada 61.8 (0.6) 26.2 (0.5) 12.0 (0.4) 34.2 (0.5) 29.5 (0.5) 36.3 (0.6) 24.8 (0.4) 34.2 (0.5) 41.0 (0.6)
Chile 50.5 (0.8) 31.1 (0.6) 18.4 (0.6) 48.2 (0.7) 30.3 (0.5) 21.5 (0.6) 30.0 (0.7) 28.7 (0.6) 41.2 (0.9)
Colombia 50.6 (0.8) 33.1 (0.7) 16.3 (0.8) 47.8 (0.8) 32.5 (0.7) 19.7 (0.7) 31.3 (0.9) 30.5 (0.6) 38.2 (1.0)
Czech Republic 61.5 (0.8) 28.9 (0.7) 9.5 (0.5) 33.6 (0.7) 32.1 (0.8) 34.3 (0.7) 21.8 (0.6) 33.9 (0.8) 44.2 (0.8)
Denmark 68.8 (0.8) 21.9 (0.7) 9.3 (0.4) 29.6 (0.8) 26.2 (0.7) 44.2 (0.8) 19.3 (0.7) 29.0 (0.7) 51.7 (0.8)
Estonia 61.2 (0.8) 30.1 (0.7) 8.7 (0.5) 36.1 (0.8) 36.4 (0.7) 27.5 (0.7) 21.3 (0.6) 31.7 (0.7) 47.0 (0.7)
Finland 70.0 (0.7) 23.9 (0.6) 6.0 (0.3) 40.1 (0.7) 30.9 (0.6) 29.0 (0.7) 15.9 (0.5) 32.3 (0.6) 51.8 (0.7)
France 63.2 (0.8) 21.6 (0.7) 15.2 (0.5) 39.5 (0.6) 22.8 (0.5) 37.7 (0.7) 26.8 (0.6) 27.9 (0.6) 45.3 (0.7)
Germany 68.9 (1.0) 22.6 (0.9) 8.6 (0.5) 36.4 (0.9) 26.4 (0.7) 37.2 (0.9) 19.8 (0.6) 30.3 (0.7) 49.9 (0.9)
Greece 61.3 (0.9) 26.2 (0.8) 12.5 (0.5) 35.8 (0.6) 29.5 (0.7) 34.7 (0.8) 27.8 (0.7) 29.5 (0.7) 42.7 (0.9)
Hungary 39.8 (0.8) 44.2 (0.8) 15.9 (0.6) 43.0 (0.8) 37.3 (0.7) 19.7 (0.6) 17.8 (0.6) 38.4 (0.8) 43.8 (0.7)
Iceland 60.7 (0.9) 30.8 (0.9) 8.5 (0.5) 46.2 (1.1) 34.8 (1.0) 19.1 (0.7) 20.4 (0.7) 29.4 (0.9) 50.2 (0.9)
Ireland 65.9 (0.7) 23.6 (0.7) 10.5 (0.4) 40.5 (0.7) 25.9 (0.6) 33.7 (0.8) 17.9 (0.5) 26.0 (0.6) 56.1 (0.8)
Israel 53.5 (0.9) 29.9 (0.7) 16.6 (0.7) 38.5 (0.7) 32.1 (0.6) 29.4 (0.7) 30.1 (0.8) 26.2 (0.7) 43.8 (0.9)
Italy 54.7 (0.9) 30.8 (0.8) 14.5 (0.6) 46.5 (0.8) 25.8 (0.7) 27.8 (0.7) 24.1 (0.7) 31.1 (0.7) 44.8 (0.8)
Japan 75.6 (0.7) 18.6 (0.6) 5.8 (0.3) 29.3 (0.8) 28.7 (0.6) 42.1 (0.8) 19.3 (0.6) 26.6 (0.7) 54.1 (1.0)
Korea 47.1 (0.8) 37.9 (0.8) 15.0 (0.5) 34.7 (0.6) 41.7 (0.7) 23.6 (0.5) 11.7 (0.4) 40.3 (0.8) 48.0 (0.9)
Latvia 57.6 (0.8) 32.1 (0.7) 10.3 (0.4) 38.2 (0.6) 33.7 (0.6) 28.2 (0.6) 22.2 (0.6) 30.8 (0.7) 47.0 (0.7)
Lithuania 61.2 (0.7) 27.0 (0.6) 11.8 (0.4) 40.7 (0.6) 30.9 (0.7) 28.4 (0.6) 24.4 (0.6) 25.2 (0.6) 50.4 (0.7)
Luxembourg 59.9 (0.7) 29.3 (0.7) 10.8 (0.4) 39.3 (0.7) 28.4 (0.6) 32.2 (0.6) 29.0 (0.7) 34.0 (0.6) 37.0 (0.7)
Mexico 39.7 (0.9) 38.7 (0.9) 21.6 (0.7) 46.6 (0.7) 32.6 (0.8) 20.8 (0.6) 30.3 (0.7) 31.8 (0.7) 38.0 (0.8)
Netherlands 66.0 (0.9) 26.4 (0.9) 7.6 (0.4) 35.4 (0.8) 28.9 (0.9) 35.8 (0.9) 17.1 (0.8) 32.3 (0.8) 50.5 (1.1)
New Zealand 67.1 (0.8) 25.0 (0.7) 7.9 (0.4) 35.4 (0.6) 29.2 (0.6) 35.3 (0.7) 24.7 (0.6) 32.1 (0.7) 43.2 (0.7)
Norway 65.5 (0.8) 26.3 (0.6) 8.1 (0.4) 39.9 (0.8) 27.6 (0.6) 32.5 (0.8) 28.7 (0.7) 31.1 (0.7) 40.2 (0.8)
Poland 56.2 (0.9) 32.1 (0.8) 11.8 (0.5) 35.2 (0.7) 34.4 (0.7) 30.4 (0.8) 19.5 (0.5) 32.6 (0.8) 48.0 (0.9)
Portugal 59.5 (0.9) 28.8 (0.8) 11.7 (0.5) 38.9 (0.7) 31.3 (0.8) 29.8 (0.7) 17.8 (0.6) 28.3 (0.7) 54.0 (0.9)
Slovak Republic 53.6 (0.8) 34.4 (0.8) 12.0 (0.5) 49.7 (0.6) 33.2 (0.7) 17.0 (0.5) 25.2 (0.6) 32.8 (0.6) 42.0 (0.8)
Slovenia 56.5 (0.6) 33.6 (0.7) 10.0 (0.5) 43.3 (0.8) 33.2 (0.7) 23.5 (0.6) 21.3 (0.6) 33.7 (0.7) 45.0 (0.8)
Spain 55.5 (0.5) 31.5 (0.4) 13.0 (0.3) 43.7 (0.4) 30.6 (0.4) 25.6 (0.3) 20.3 (0.4) 30.0 (0.4) 49.7 (0.4)
Sweden 69.8 (0.9) 22.6 (0.8) 7.6 (0.4) 38.0 (0.8) 26.9 (0.7) 35.1 (0.7) 27.8 (0.8) 31.5 (0.6) 40.7 (0.9)
Switzerland 58.4 (1.1) 28.3 (0.8) 13.3 (0.6) 37.0 (0.8) 28.4 (0.8) 34.7 (0.8) 20.7 (0.7) 33.8 (0.8) 45.5 (0.9)
Turkey 57.4 (0.8) 27.1 (0.7) 15.5 (0.5) 54.9 (0.7) 25.3 (0.5) 19.8 (0.6) 28.2 (0.8) 26.0 (0.6) 45.9 (0.8)
United Kingdom 72.6 (0.6) 19.5 (0.5) 8.0 (0.4) 32.2 (0.8) 26.8 (0.7) 41.0 (0.8) 23.2 (0.5) 28.9 (0.7) 47.9 (0.7)
United States 63.5 (0.9) 25.6 (0.7) 10.8 (0.5) 36.2 (0.9) 27.7 (0.8) 36.1 (0.9) 30.3 (0.8) 31.8 (0.7) 37.9 (0.8)
OECD average 60.3 (0.1) 28.3 (0.1) 11.5 (0.1) 39.2 (0.1) 30.2 (0.1) 30.6 (0.1) 23.3 (0.1) 31.2 (0.1) 45.5 (0.1)
StatLink: https://doi.org/10.1787/888934240693
Table B.5.11 [4/4] Student’s knowledge of reading strategies for assessing the credibility of sources
Based on students' reports
Percentage of students who reported on the usefulness of the following strategies for assessing credibility:
Not very appropriate | Somewhat appropriate | Very appropriate | Not very appropriate | Somewhat appropriate | Very appropriate | Not very appropriate | Somewhat appropriate | Very appropriate
% S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s % S.E. s
Albania 34.1 (0.7) 32.3 (0.6) 33.5 (0.9) 61.4 (0.8) 22.5 (0.6) 16.1 (0.6) 29.8 (0.7) 29.0 (0.6) 41.2 (0.8)
Partners
Argentina 59.8 (0.8) 24.1 (0.6) 16.1 (0.7) 37.3 (0.8) 29.6 (0.6) 33.1 (0.8) 28.8 (0.7) 23.2 (0.7) 48.0 (0.8)
Baku (Azerbaijan) 40.0 (0.7) † 34.1 (0.6) † 25.8 (0.5) † 48.4 (0.8) † 29.6 (0.6) † 22.0 (0.6) † 31.1 (0.6) † 30.8 (0.7) † 38.1 (0.7) †
Belarus 56.2 (0.8) 30.4 (0.7) 13.3 (0.5) 41.2 (0.7) 33.8 (0.6) 25.1 (0.6) 22.9 (0.7) 28.4 (0.7) 48.8 (0.9)
Bosnia and Herzegovina 38.5 (0.6) 38.6 (0.7) 22.9 (0.6) 49.4 (0.7) 32.1 (0.6) 18.5 (0.5) 21.5 (0.7) 33.2 (0.6) 45.4 (0.8)
Brazil 59.6 (0.6) 23.2 (0.6) 17.2 (0.4) 59.8 (0.6) 22.1 (0.5) 18.1 (0.5) 37.1 (0.7) 19.9 (0.5) 43.0 (0.8)
Brunei Darussalam 51.5 (0.6) 33.8 (0.6) 14.7 (0.5) 52.0 (0.6) 29.1 (0.6) 18.9 (0.5) 28.3 (0.6) 34.6 (0.6) 37.1 (0.5)
B-S-J-Z (China) 74.9 (0.8) 14.6 (0.5) 10.5 (0.4) 43.1 (0.8) 25.8 (0.5) 31.1 (0.7) 38.2 (0.8) 19.0 (0.5) 42.7 (0.7)
Bulgaria 36.8 (1.0) 37.7 (0.9) 25.5 (0.7) 38.5 (0.8) 33.8 (0.8) 27.7 (0.8) 23.7 (0.8) 32.8 (0.6) 43.4 (0.8)
Costa Rica 58.0 (0.8) 26.1 (0.6) 15.9 (0.6) 50.8 (0.8) 26.8 (0.5) 22.4 (0.7) 33.4 (0.8) 24.4 (0.6) 42.2 (0.9)
Croatia 51.7 (0.8) 33.5 (0.7) 14.8 (0.5) 39.8 (0.7) 34.1 (0.6) 26.1 (0.6) 20.9 (0.5) 34.2 (0.6) 44.9 (0.6)
Cyprus 49.5 (0.8) 32.8 (0.6) 17.7 (0.5) 35.6 (0.7) 33.1 (0.7) 31.3 (0.7) 27.2 (0.7) 29.9 (0.7) 42.9 (0.7)
Dominican Republic 43.3 (0.9) † 28.2 (0.9) † 28.5 (0.9) † 54.1 (0.9) † 24.8 (0.8) † 21.1 (0.8) † 33.0 (0.9) † 22.3 (0.7) † 44.7 (0.9) †
Georgia 41.9 (0.8) 34.1 (0.8) 24.0 (0.6) 49.2 (0.8) 30.3 (0.8) 20.5 (0.8) 28.1 (0.7) 27.6 (0.7) 44.3 (0.9)
Hong Kong (China) 56.2 (0.9) 35.1 (0.9) 8.7 (0.4) 32.2 (0.8) 39.4 (0.7) 28.4 (0.8) 24.3 (0.6) 36.4 (0.8) 39.3 (0.8)
Indonesia 43.6 (0.8) 26.0 (0.6) 30.4 (0.7) 56.2 (1.0) 23.4 (0.7) 20.4 (0.7) 41.7 (1.0) 20.4 (0.7) 37.9 (0.8)
Jordan 34.4 (0.9) 35.2 (0.8) 30.5 (0.8) 34.8 (0.6) 34.9 (0.7) 30.3 (0.6) 21.0 (0.6) 27.7 (0.6) 51.4 (0.7)
Kazakhstan 41.1 (0.6) 36.4 (0.6) 22.5 (0.4) 47.5 (0.5) 31.2 (0.5) 21.3 (0.4) 39.9 (0.5) 26.4 (0.4) 33.7 (0.5)
Kosovo 44.5 (0.9) 32.4 (0.8) 23.1 (0.7) 67.8 (0.8) 19.9 (0.6) 12.2 (0.5) 38.2 (0.8) 27.0 (0.7) 34.8 (0.8)
Lebanon m m m m m m m m m m m m m m m m m m
Macao (China) 56.8 (0.8) 34.0 (0.8) 9.3 (0.4) 41.8 (0.8) 35.2 (0.7) 23.0 (0.7) 27.3 (0.7) 31.7 (0.7) 41.0 (0.9)
Malaysia 43.6 (0.7) 37.7 (0.7) 18.7 (0.6) 50.4 (0.7) 33.1 (0.6) 16.6 (0.6) 27.1 (0.8) 33.0 (0.7) 39.9 (0.9)
Malta 47.2 (0.9) 33.8 (0.9) 19.0 (0.6) 47.4 (1.0) 29.1 (0.8) 23.5 (0.8) 21.3 (0.7) 31.9 (0.8) 46.8 (0.8)
Moldova 52.8 (0.8) 28.1 (0.7) 19.1 (0.6) 50.5 (0.6) 28.1 (0.7) 21.3 (0.6) 20.2 (0.6) 20.1 (0.6) 59.6 (0.8)
Montenegro 37.6 (0.6) 38.9 (0.7) 23.5 (0.6) 47.6 (0.7) 31.1 (0.6) 21.4 (0.6) 28.7 (0.6) 34.4 (0.7) 36.8 (0.6)
Morocco 20.7 (0.6) 45.3 (0.8) 34.0 (0.6) 25.8 (0.6) 51.0 (0.8) 23.2 (0.7) 13.8 (0.6) 31.6 (0.7) 54.6 (0.8)
North Macedonia m m m m m m m m m m m m m m m m m m
Panama 42.2 (1.0) † 32.0 (0.7) † 25.8 (0.9) † 44.5 (0.9) † 30.0 (0.8) † 25.6 (0.7) † 29.9 (0.8) † 27.4 (0.7) † 42.6 (0.9) †
Peru 51.1 (0.7) 33.3 (0.6) 15.6 (0.5) 56.9 (0.8) 28.7 (0.7) 14.4 (0.5) 36.2 (0.8) 27.1 (0.7) 36.7 (0.8)
Philippines 44.3 (0.9) 33.2 (0.7) 22.4 (0.7) 62.5 (0.6) 25.6 (0.5) 11.8 (0.4) 37.6 (0.8) 31.1 (0.7) 31.3 (0.8)
Qatar 45.1 (0.4) 34.2 (0.5) 20.7 (0.4) 40.0 (0.5) 33.9 (0.5) 26.0 (0.4) 25.9 (0.4) 31.1 (0.4) 42.9 (0.5)
Romania 48.8 (1.2) 29.5 (0.8) 21.7 (0.9) 47.9 (1.0) 27.9 (0.8) 24.2 (0.8) 20.2 (0.7) 25.8 (0.7) 54.1 (1.1)
Russia 55.6 (0.8) 31.4 (0.7) 13.0 (0.4) 35.1 (0.7) 33.5 (0.6) 31.4 (0.7) 23.3 (0.6) 28.7 (0.7) 47.9 (0.8)
Saudi Arabia 41.8 (0.8) 34.0 (0.6) 24.2 (0.7) 40.8 (0.7) 34.3 (0.7) 24.9 (0.7) 22.4 (0.7) 30.3 (0.7) 47.3 (0.7)
Serbia 44.1 (0.9) 35.4 (0.8) 20.5 (0.7) 41.6 (0.8) 34.0 (0.7) 24.4 (0.7) 22.7 (0.8) 32.4 (0.7) 44.9 (1.0)
Singapore 67.2 (0.7) 24.3 (0.5) 8.5 (0.3) 31.2 (0.7) 34.3 (0.6) 34.6 (0.6) 18.3 (0.4) 30.6 (0.6) 51.1 (0.6)
Chinese Taipei 55.4 (0.7) 32.2 (0.7) 12.4 (0.4) 27.7 (0.5) 34.2 (0.7) 38.1 (0.7) 41.1 (0.7) 31.2 (0.7) 27.7 (0.7)
Thailand 33.3 (0.8) 45.3 (0.7) 21.4 (0.6) 45.0 (0.8) 39.1 (0.6) 15.8 (0.5) 30.7 (0.7) 40.5 (0.7) 28.8 (0.7)
Ukraine 57.0 (0.9) 27.8 (0.7) 15.2 (0.6) 39.8 (0.7) 28.6 (0.6) 31.6 (0.7) 18.9 (0.7) 19.8 (0.7) 61.3 (0.9)
United Arab Emirates 42.8 (0.6) 32.1 (0.4) 25.1 (0.5) 38.5 (0.6) 31.4 (0.4) 30.2 (0.6) 23.7 (0.4) 31.2 (0.5) 45.1 (0.5)
Uruguay 53.2 (0.8) 33.0 (0.8) 13.8 (0.6) 49.0 (0.9) 31.2 (0.7) 19.8 (0.7) 30.6 (1.0) 29.2 (0.7) 40.2 (1.0)
Viet Nam 43.2 (1.2) 35.7 (0.9) 21.1 (0.8) 47.4 (1.0) 34.8 (0.9) 17.8 (0.9) 12.8 (0.7) 26.2 (0.9) 61.0 (1.2)
Note: Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one
dagger (†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240693
PISA 21st-Century Readers: Developing literacy skills in a digital world » © OECD 2021 191
Annex B Results for countries and economies
Table B.6.11a [1/4] Reading performance by the length of text read for school
Based on students' reports
Reading mean score
All students | (A) 10 pages or less | (B) Between 11 and 100 pages | (C) 101 pages or more | Difference (B - A) | Difference (C - A) | Explained variance in reading performance (r-squared x 100)
Mean score (S.E.) | Mean score (S.E.) | Mean score (S.E.) | Mean score (S.E.) | Score dif. (S.E.) | Score dif. (S.E.) | % (S.E.)
OECD
Australia 503 (1.6) 445 (2.9) 469 (2.9) 533 (2.0) 24 (3.8) 88 (3.1) 11.9 (0.7)
Austria 484 (2.7) 460 (2.8) 472 (4.9) 522 (3.0) 12 (4.4) 62 (3.2) 9.2 (0.9)
Belgium 493 (2.3) 465 (3.3) 449 (5.7) 533 (2.5) -15 (5.1) 68 (3.6) 13.6 (1.2)
Canada 520 (1.8) 477 (2.8) 484 (2.8) 542 (1.7) 8 (3.7) 66 (2.8) 8.0 (0.5)
Chile 452 (2.6) 407 (4.5) 419 (3.1) 476 (2.8) 13 (4.5) 69 (4.4) 10.6 (1.1)
Colombia 412 (3.3) 384 (3.3) 414 (3.4) 458 (5.0) 30 (3.6) 74 (5.3) 11.0 (1.4)
Czech Republic 490 (2.5) 470 (3.4) 493 (3.7) 530 (2.9) 22 (4.5) 60 (3.8) 7.3 (0.9)
Denmark 501 (1.8) 484 (6.8) 486 (3.4) 514 (2.0) 2 (7.4) 30 (7.0) 2.0 (0.5)
Estonia 523 (1.8) 497 (3.0) 500 (2.9) 550 (2.2) 3 (3.9) 53 (3.4) 7.9 (0.8)
Finland 520 (2.3) 464 (4.5) 474 (4.7) 543 (2.0) 9 (5.2) 79 (4.4) 11.5 (1.0)
France 493 (2.3) 443 (3.5) 446 (3.8) 531 (2.8) 4 (4.7) 87 (4.5) 18.0 (1.3)
Germany 498 (3.0) 473 (3.9) 515 (5.0) 536 (4.4) 42 (5.1) 62 (4.9) 7.4 (1.1)
Greece 457 (3.6) 463 (3.3) 439 (5.7) 461 (10.5) -24 (4.8) -2 (9.9) 0.8 (0.3)
Hungary 476 (2.3) 440 (3.2) 492 (3.5) 525 (3.8) 52 (4.3) 85 (4.9) 13.7 (1.4)
Iceland 474 (1.7) 451 (4.5) 477 (3.1) 494 (3.0) 27 (5.4) 43 (5.9) 2.3 (0.6)
Ireland 518 (2.2) 495 (2.6) 507 (3.4) 546 (2.6) 12 (3.3) 52 (2.8) 7.1 (0.7)
Israel 470 (3.7) 460 (3.8) 468 (5.7) 513 (7.0) 9 (6.0) 53 (7.4) 3.0 (0.9)
Italy 476 (2.4) 456 (2.9) 483 (3.6) 516 (3.9) 27 (4.2) 60 (4.7) 7.0 (1.0)
Japan 504 (2.7) 502 (2.8) 511 (3.6) 499 (10.0) 9 (3.5) -4 (9.7) 0.2 (0.2)
Korea 514 (2.9) 511 (3.1) 519 (3.8) 516 (8.9) 7 (3.9) 5 (8.7) 0.1 (0.1)
Latvia 479 (1.6) 462 (2.6) 484 (2.1) 495 (3.1) 22 (3.3) 32 (3.9) 2.0 (0.5)
Lithuania 476 (1.5) 434 (2.5) 454 (3.0) 506 (1.9) 20 (3.8) 72 (3.4) 11.0 (0.9)
Luxembourg 470 (1.1) 448 (2.2) 440 (3.2) 502 (2.2) -8 (4.2) 55 (3.4) 7.3 (0.7)
Mexico 420 (2.7) 415 (3.2) 415 (3.4) 466 (5.5) 1 (3.7) 51 (5.9) 4.6 (1.0)
Netherlands 485 (2.7) 466 (3.8) 463 (6.0) 531 (3.1) -2 (5.9) 65 (4.7) 10.7 (1.4)
New Zealand 506 (2.0) 461 (3.4) 473 (3.7) 540 (2.0) 11 (4.6) 78 (3.2) 12.3 (0.7)
Norway 499 (2.2) 495 (3.0) 503 (2.8) 517 (3.4) 8 (3.5) 23 (4.1) 0.8 (0.3)
Poland 512 (2.7) 459 (3.1) 494 (4.8) 538 (3.1) 35 (5.5) 80 (4.8) 12.5 (1.3)
Portugal 492 (2.4) 489 (2.7) 484 (4.1) 523 (3.7) -5 (3.7) 34 (4.3) 2.2 (0.5)
Slovak Republic 458 (2.2) 442 (2.6) 478 (5.6) 504 (3.8) 37 (5.5) 62 (4.9) 6.9 (1.0)
Slovenia 495 (1.2) 452 (3.1) 467 (3.7) 522 (1.6) 15 (4.8) 71 (3.7) 11.6 (1.0)
Spain² 477 (1.6) 458 (2.0) 457 (2.6) 498 (1.6) -1 (2.4) 40 (2.0) 4.7 (0.4)
Sweden 506 (3.0) 495 (3.4) 498 (4.0) 528 (3.4) 3 (4.0) 33 (4.2) 2.2 (0.5)
Switzerland 484 (3.1) 455 (4.0) 474 (5.1) 508 (3.7) 20 (5.1) 54 (5.3) 5.4 (1.0)
Turkey 466 (2.2) 451 (2.8) 441 (3.6) 490 (3.0) -10 (3.5) 39 (4.1) 6.0 (1.0)
United Kingdom 504 (2.6) 418 (9.3) 489 (2.9) 518 (2.8) 70 (10.0) 98 (9.4) 4.7 (0.7)
United States 505 (3.6) 465 (4.4) 482 (4.8) 538 (3.9) 17 (5.2) 72 (5.2) 9.1 (1.2)
OECD average c (0.4) 460 (0.6) 473 (0.7) 515 (0.7) 14 (0.8) 55 (0.9) 7.2 (0.1)
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
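The last column of Table B.6.11a, "Explained variance in reading performance (r-squared x 100)", is the r-squared of a regression of reading score on the reported text-length category, multiplied by 100. With a single categorical regressor, the r-squared equals the between-group share of the total sum of squares (one-way ANOVA). The sketch below illustrates that identity with made-up scores; the data and grouping are illustrative, not PISA values, and this is not the official PISA estimator:

```python
# Illustrative only: hypothetical reading scores grouped by the three
# text-length categories (A) 10 pages or less, (B) 11-100 pages,
# (C) 101 pages or more. Values are made up.
from statistics import mean

groups = {
    "A": [440, 455, 448, 470, 430],
    "B": [460, 475, 468, 490],
    "C": [505, 520, 498, 530, 515],
}

scores = [s for g in groups.values() for s in g]
grand = mean(scores)

# Total sum of squares, and the part explained by group membership.
ss_total = sum((s - grand) ** 2 for s in scores)
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())

# With one categorical regressor, regression r-squared = SS_between / SS_total.
r_squared = ss_between / ss_total
print(round(100 * r_squared, 1))  # "explained variance" as reported: r-squared x 100
```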
Table B.6.11a [2/4] Reading performance by the length of text read for school
Based on students' reports
Reading mean score
All students | (A) 10 pages or less | (B) Between 11 and 100 pages | (C) 101 pages or more | Difference (B - A) | Difference (C - A) | Explained variance in reading performance (r-squared x 100)
Mean score (S.E.) | Mean score (S.E.) | Mean score (S.E.) | Mean score (S.E.) | Score dif. (S.E.) | Score dif. (S.E.) | % (S.E.)
Partners
Albania 405 (1.9) 407 (2.1) 395 (2.5) 421 (3.5) -13 (2.8) 14 (3.6) 1.6 (0.4)
Argentina 402 (3.0) 369 (3.9) 405 (3.2) 448 (4.1) 35 (4.3) 78 (4.9) 9.6 (1.2)
Baku (Azerbaijan) 389 (2.5) 390 (2.3) 393 (2.9) 404 (4.5) 3 (2.6) 14 (3.9) 0.5 (0.3)
Belarus 474 (2.4) 394 (5.3) 440 (2.8) 498 (2.6) 46 (5.7) 104 (5.8) 14.5 (1.1)
Bosnia and Herzegovina 403 (2.9) 373 (2.8) 391 (3.3) 434 (3.2) 18 (3.3) 61 (3.2) 12.1 (1.1)
Brazil 413 (2.1) 413 (2.0) 407 (3.3) 477 (6.9) -6 (3.4) 63 (7.0) 3.9 (0.9)
Brunei Darussalam 408 (0.9) 417 (1.2) 386 (2.0) 437 (4.0) -30 (2.5) 20 (4.2) 2.8 (0.4)
B-S-J-Z (China) 555 (2.7) 544 (2.9) 560 (3.8) 578 (4.0) 16 (3.0) 34 (4.2) 2.3 (0.5)
Bulgaria 420 (3.9) 384 (4.1) 451 (4.1) 449 (6.9) 67 (4.4) 65 (7.3) 10.3 (1.2)
Costa Rica 426 (3.4) 396 (3.2) 427 (3.4) 465 (6.7) 31 (3.8) 68 (7.0) 9.3 (1.9)
Croatia 479 (2.7) 428 (3.5) 454 (3.4) 509 (2.5) 27 (3.9) 81 (3.5) 15.7 (1.0)
Cyprus 424 (1.4) 422 (1.8) 439 (2.7) 428 (3.9) 17 (3.2) 5 (4.3) 0.6 (0.2)
Dominican Republic 342 (2.9) 337 (2.9) 350 (3.8) 388 (8.2) 13 (3.5) 51 (8.5) 3.8 (1.3)
Georgia 380 (2.2) 358 (2.2) 387 (3.4) 406 (3.1) 29 (3.5) 48 (3.3) 5.8 (0.7)
Hong Kong (China) 524 (2.7) 522 (3.9) 510 (4.9) 539 (2.9) -12 (4.9) 18 (4.4) 1.3 (0.4)
Indonesia 371 (2.6) 352 (2.6) 378 (3.1) 398 (4.7) 26 (3.1) 46 (4.7) 5.0 (0.8)
Jordan 419 (2.9) 432 (2.5) 390 (4.2) 408 (12.1) -42 (3.9) -24 (12.2) 3.5 (0.7)
Kazakhstan 387 (1.5) 355 (1.5) 391 (1.4) 433 (3.4) 35 (1.8) 78 (3.7) 12.8 (1.0)
Kosovo 353 (1.1) 357 (1.3) 343 (2.5) 357 (4.5) -13 (2.8) 0 (4.6) 0.6 (0.3)
Lebanon 353 (4.3) m m m m m m m m m m m m
Macao (China) 525 (1.2) 535 (1.8) 506 (3.2) 515 (4.2) -29 (4.0) -20 (4.6) 2.0 (0.5)
Malaysia 415 (2.9) 408 (3.0) 411 (3.3) 443 (3.8) 3 (2.6) 36 (3.6) 2.4 (0.5)
Malta 448 (1.7) 447 (2.7) 444 (3.8) 486 (4.8) -2 (4.5) 40 (6.1) 2.1 (0.6)
Moldova 424 (2.4) 393 (2.7) 438 (3.0) 485 (5.0) 45 (3.8) 92 (5.6) 15.5 (1.6)
Montenegro 421 (1.1) 398 (1.8) 406 (2.7) 454 (1.6) 8 (3.0) 56 (2.7) 9.6 (0.8)
Morocco 359 (3.1) 369 (3.3) 338 (3.4) 332 (5.4) -31 (2.8) -38 (4.5) 3.5 (0.5)
North Macedonia 393 (1.1) m m m m m m m m m m m m
Panama 377 (3.0) 369 (2.5) 389 (4.2) 427 (7.3) 20 (4.0) 58 (7.4) 5.0 (1.3)
Peru 401 (3.0) 392 (3.0) 396 (2.8) 443 (5.5) 4 (2.9) 51 (5.8) 4.7 (1.0)
Philippines 340 (3.3) 338 (3.8) 342 (3.4) 342 (5.3) 4 (2.6) 4 (5.3) 0.1 (0.1)
Qatar 407 (0.8) 406 (1.2) 404 (1.8) 459 (2.9) -1 (2.4) 54 (3.2) 3.3 (0.4)
Romania 428 (5.1) 376 (4.1) 408 (4.9) 476 (4.7) 32 (4.4) 100 (5.5) 21.9 (1.8)
Russia 479 (3.1) 419 (6.0) 462 (2.8) 509 (2.7) 43 (5.0) 89 (5.6) 11.4 (1.1)
Saudi Arabia 399 (3.0) 410 (2.8) 372 (3.9) 350 (7.0) -39 (3.5) -60 (6.4) 4.5 (0.6)
Serbia 439 (3.3) 396 (3.3) 433 (4.2) 477 (3.0) 37 (3.8) 80 (3.6) 12.8 (1.0)
Singapore 549 (1.6) 543 (1.7) 527 (3.7) 604 (4.4) -17 (4.2) 61 (4.7) 4.9 (0.7)
Chinese Taipei 503 (2.8) 495 (3.2) 494 (3.1) 529 (4.6) -1 (3.2) 34 (5.0) 2.2 (0.6)
Thailand 393 (3.2) 385 (2.9) 405 (4.1) 401 (6.9) 20 (3.1) 16 (6.3) 1.6 (0.5)
Ukraine 466 (3.5) 384 (5.0) 448 (3.4) 505 (3.1) 65 (4.5) 122 (5.4) 19.4 (1.4)
United Arab Emirates 432 (2.3) 431 (3.5) 430 (2.2) 464 (3.7) -1 (3.6) 34 (5.2) 1.3 (0.3)
Uruguay 427 (2.8) 409 (3.1) 454 (3.7) 471 (6.8) 45 (4.2) 62 (7.6) 6.8 (1.2)
Viet Nam m m m m m m m m m m m m m m
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.11a [3/4] Reading performance by the length of text read for school
Based on students' reports
Reading mean score
OECD average 5 (0.7) 31 (0.7) 23.8 (0.3) 6 (0.7) 31 (0.7) 25.5 (0.3)
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.11a [4/4] Reading performance by the length of text read for school
Based on students' reports
Reading mean score
1. The socio-economic profile is measured by the PISA index of economic, social and cultural status (ESCS).
2. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.15 [1/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons AND
outside classroom lessons
Based on students' reports
Percentage of students who reported that during the last month a digital device has been used for learning and teaching during test language lessons:
Yes, both the teacher and students used it | Yes, but only students used it | Yes, but only the teacher used it | No
% (S.E.) | % (S.E.) | % (S.E.) | % (S.E.)
OECD
Australia 69.1 (0.8) 9.3 (0.3) 16.0 (0.6) 5.6 (0.3)
OECD average 37.4 (0.2) 11.5 (0.1) 24.6 (0.2) 26.5 (0.2)
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students' and schools' socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
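Footnote 1 above describes how the weekly device-use time in Table B.6.15 is derived: each interval response is recoded to the average number of minutes in that interval (0, 15.5, 45.5, 90.5), and the in-lesson (IC150) and out-of-lesson (IC151) times are summed, with students who do not study the subject excluded. A minimal sketch of that recoding follows; the response labels are paraphrased for readability, not the official PISA response codes:

```python
# Midpoint minutes for each response interval, as described in footnote 1.
# The string labels below are paraphrases, not official PISA codes.
MINUTES = {
    "no time": 0.0,
    "1-30 minutes a week": 15.5,
    "31-60 minutes a week": 45.5,
    "more than 60 minutes a week": 90.5,
}

def total_weekly_minutes(ic150, ic151):
    """Total weekly minutes using digital devices for test-language lessons.

    ic150 is the response for use during classroom lessons, ic151 for use
    outside classroom lessons. Students answering "I do not study this
    subject" (or any non-interval response) are excluded: returns None.
    """
    if ic150 not in MINUTES or ic151 not in MINUTES:
        return None  # excluded from the analysis
    return MINUTES[ic150] + MINUTES[ic151]

# Example: 1-30 minutes during lessons plus 31-60 minutes outside lessons.
print(total_weekly_minutes("1-30 minutes a week", "31-60 minutes a week"))  # 61.0
```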
Table B.6.15 [2/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons AND outside classroom lessons
Based on students' reports
Percentage of students who reported that during the last month a digital device has been used for learning and teaching during test language lessons:
Yes, both the teacher and students used it | Yes, but only students used it | Yes, but only the teacher used it | No
% (S.E.) | % (S.E.) | % (S.E.) | % (S.E.)
Partners
Albania 33.9 (0.7) 12.5 (0.5) 9.8 (0.5) 43.8 (0.9)
Argentina m m m m m m m m
Baku (Azerbaijan) m m m m m m m m
Belarus m m m m m m m m
Bosnia and Herzegovina m m m m m m m m
Brazil 27.2 (0.5) † 12.5 (0.5) † 15.2 (0.6) † 45.2 (0.8) †
Brunei Darussalam 28.2 (0.4) 9.5 (0.4) 32.7 (0.5) 29.6 (0.5)
B-S-J-Z (China) m m m m m m m m
Bulgaria 29.8 (0.8) † 19.4 (0.7) † 16.2 (0.8) † 34.6 (1.3) †
Costa Rica 30.2 (1.2) 13.8 (0.6) 12.8 (0.7) 43.2 (1.4)
Croatia 26.2 (0.7) 10.9 (0.5) 28.9 (0.8) 34.1 (1.1)
Cyprus m m m m m m m m
Dominican Republic 32.4 (1.2) 16.1 (0.6) 11.1 (0.5) 40.4 (1.3)
Georgia 26.7 (0.8) † 15.3 (0.6) † 6.2 (0.5) † 51.9 (1.0) †
Hong Kong (China) 30.8 (1.6) 4.4 (0.5) 43.4 (1.4) 21.4 (1.0)
Indonesia m m m m m m m m
Jordan m m m m m m m m
Kazakhstan 43.0 (0.5) 21.1 (0.4) 13.4 (0.4) 22.4 (0.4)
Kosovo m m m m m m m m
Lebanon m m m m m m m m
Macao (China) 27.8 (0.6) 4.7 (0.3) 55.5 (0.8) 12.0 (0.5)
Malaysia m m m m m m m m
Malta 21.9 (0.8) 6.9 (0.5) 53.8 (0.8) 17.5 (0.7)
Moldova m m m m m m m m
Montenegro m m m m m m m m
Morocco 28.1 (0.8) 9.1 (0.5) 8.9 (0.4) 53.9 (1.2)
North Macedonia m m m m m m m m
Panama 23.3 (0.8) † 12.0 (0.6) † 11.0 (0.8) † 53.7 (1.2) †
Peru m m m m m m m m
Philippines m m m m m m m m
Qatar m m m m m m m m
Romania m m m m m m m m
Russia 29.7 (0.7) 22.4 (0.5) 18.9 (0.5) 28.9 (0.9)
Saudi Arabia m m m m m m m m
Serbia 24.8 (0.9) † 14.4 (0.6) † 14.0 (0.7) † 46.8 (1.3) †
Singapore 39.9 (1.0) 10.1 (0.5) 29.6 (0.6) 20.4 (0.6)
Chinese Taipei 25.5 (0.7) 3.3 (0.2) 36.7 (1.0) 34.5 (1.1)
Thailand 37.2 (0.8) 19.0 (0.7) 22.7 (0.9) 21.1 (0.9)
Ukraine m m m m m m m m
United Arab Emirates m m m m m m m m
Uruguay 29.6 (1.0) † 26.5 (1.2) † 8.7 (0.7) † 35.3 (1.2) †
Viet Nam m m m m m m m m
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students’ and schools’ socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.15 [3/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons
AND outside classroom lessons
Based on students' reports
Total time a week using digital devices for school during classroom and outside of classroom lessons¹ | Change in reading performance associated with a one-hour increase in the total time a week using digital devices for school²
Minutes (S.E.) | Score dif. (S.E.)
OECD
Australia 79.5 (1.0) 9 (1.5)
Austria m m m m
Belgium 32.8 (0.7) -4 (2.0)
Canada m m m m
Chile 39.9 (0.9) -5 (1.7)
Colombia m m m m
Czech Republic 29.8 (0.7) -8 (1.7)
Denmark 122.6 (0.9) 9 (1.8)
Estonia 30.2 (0.6) -23 (2.4)
Finland 37.4 (1.2) -3 (2.6)
France 26.7 (0.7) -11 (2.3)
Germany 25.7 (0.8) -27 (2.7)
Greece 24.7 (0.7) -21 (2.3)
Hungary 26.2 (0.7) -12 (2.5)
Iceland 47.9 (0.8) -1 (2.1)
Ireland 25.9 (1.2) -1 (1.9)
Israel 31.1 (1.3) † -16 (3.4) †
Italy 42.1 (1.1) -6 (2.0)
Japan 10.2 (0.7) 0 (3.6)
Korea 36.6 (1.2) 9 (2.4)
Latvia 31.4 (0.8) -10 (2.1)
Lithuania 36.7 (0.7) -17 (1.5)
Luxembourg 28.9 (0.6) -20 (2.4)
Mexico 36.9 (0.8) -2 (1.8)
Netherlands m m m m
New Zealand 83.5 (1.6) 12 (1.8)
Norway m m m m
Poland 36.0 (0.6) -5 (2.4)
Portugal m m m m
Slovak Republic 31.6 (0.7) -17 (2.3)
Slovenia 24.3 (0.5) -15 (2.2)
Spain⁵ 31.3 (0.9) -5 (1.4)
Sweden 87.2 (2.4) 1 (2.2)
Switzerland 26.4 (0.9) -20 (3.3)
Turkey 38.8 (1.1) -5 (1.8)
United Kingdom 36.6 (0.8) -1 (2.0)
United States 62.3 (2.1) 9 (1.9)
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students' and schools' socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.15 [4/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons AND
outside classroom lessons
Based on students' reports
Total time a week using digital devices for school during classroom and outside of classroom lessons¹ | Change in reading performance associated with a one-hour increase in the total time a week using digital devices for school²
Argentina m m m m
Baku (Azerbaijan) m m m m
Belarus m m m m
Bosnia and Herzegovina m m m m
Brazil 30.5 (0.6) † -6 (2.1) †
Brunei Darussalam 34.7 (0.5) -11 (1.6)
B-S-J-Z (China) m m m m
Bulgaria 39.7 (1.0) † -14 (2.3) †
Costa Rica 37.3 (1.0) -2 (2.0)
Croatia 27.2 (0.6) -12 (2.0)
Cyprus m m m m
Dominican Republic 34.1 (0.8) -10 (1.5)
Georgia 24.7 (0.9) † -13 (2.7) †
Hong Kong (China) 26.8 (1.6) -13 (2.7)
Indonesia m m m m
Jordan m m m m
Kazakhstan 56.2 (0.7) -11 (1.0)
Kosovo m m m m
Lebanon m m m m
Macao (China) 43.6 (0.8) 3 (2.1)
Malaysia m m m m
Malta 36.1 (0.6) -6 (2.2)
Moldova m m m m
Montenegro m m m m
Morocco 30.8 (0.9) -21 (1.3)
North Macedonia m m m m
Panama 30.6 (0.8) † -11 (1.8) †
Peru m m m m
Philippines m m m m
Qatar m m m m
Romania m m m m
Russia 47.3 (0.9) -4 (1.9)
Saudi Arabia m m m m
Serbia 31.9 (0.7) † -13 (1.9) †
Singapore 37.6 (0.7) -8 (2.0)
Chinese Taipei 23.0 (0.7) -8 (2.7)
Thailand 43.5 (0.8) -10 (1.3)
Ukraine m m m m
United Arab Emirates m m m m
Uruguay 35.5 (0.9) † 0 (2.2) †
Viet Nam m m m m
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students' and schools' socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.15 [5/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons
AND outside classroom lessons
Based on students' reports
Total time a week using digital devices for school during classroom and outside of classroom lessons, by programme orientation¹ | Change in reading performance associated with a one-hour increase in the total time a week using digital devices for school²
General³ Minutes (S.E.) | Vocational⁴ Minutes (S.E.) | Difference (S.E.) | General³ Score dif. (S.E.) | Vocational⁴ Score dif. (S.E.) | Difference (S.E.)
Austria m m m m m m m m m m m m
Belgium 31.8 (0.9) 34.4 (1.4) † 2.5 (1.7) † 0 (2.1) -6 (2.8) † -6 (3.5) †
Canada m m m m m m m m m m m m
Chile 40.0 (0.9) 33.7 (4.0) -6.3 (4.0) -5 (1.7) -4 (10.4) 1 (10.6)
Colombia m m m m m m m m m m m m
Czech Republic 27.3 (0.9) 34.9 (1.6) 7.6 (2.0) -8 (2.3) -8 (3.2) 0 (4.1)
Denmark 122.6 (0.9) m m m m 9 (1.8) m m m m
Estonia 30.2 (0.6) c c c c -23 (2.4) m m m m
Finland 37.4 (1.2) m m m m -3 (2.6) m m m m
France 26.3 (0.7) 28.3 (1.5) 2.1 (1.7) -10 (2.6) -13 (4.0) -3 (4.8)
Germany 25.4 (0.8) 37.9 (6.9) † 12.4 (7.0) † -27 (2.8) -9 (8.6) † 18 (9.3) †
Greece 23.8 (0.7) 33.1 (2.3) † 9.3 (2.4) † -20 (2.5) -11 (4.6) † 9 (5.0) †
Hungary 24.7 (0.7) 34.1 (2.4) 9.4 (2.6) -12 (3.1) -7 (4.2) 5 (5.2)
Iceland 47.9 (0.8) m m m m -1 (2.1) m m m m
Ireland 25.8 (1.2) m m m m -1 (2.0) m m m m
Israel 31.1 (1.3) † m m m m -16 (3.4) † m m m m
Italy 40.0 (1.5) 44.4 (1.6) 4.4 (2.2) -2 (2.7) -9 (2.7) -7 (3.7)
Japan 11.0 (0.8) 7.6 (1.2) -3.4 (1.5) 3 (4.1) -14 (6.8) -16 (7.8)
Korea 38.1 (1.3) 28.9 (2.8) -9.2 (2.9) 10 (2.5) 1 (5.3) -9 (5.9)
Latvia 31.4 (0.7) 33.8 (10.0) 2.4 (9.9) -10 (2.1) -10 (32.7) 0 (32.8)
Lithuania 36.7 (0.7) 35.5 (3.9) † -1.2 (4.0) † -17 (1.5) -9 (9.2) † 8 (9.1) †
Luxembourg 28.2 (0.7) 24.0 (1.4) -4.1 (1.6) -18 (2.4) -6 (5.4) 11 (5.6)
Mexico 36.2 (1.0) 38.5 (1.1) 2.2 (1.5) -1 (2.5) -5 (2.7) -4 (3.8)
Netherlands m m m m m m m m m m m m
New Zealand 83.5 (1.6) m m m m 12 (1.8) m m m m
Norway m m m m m m m m m m m m
Poland 36.1 (0.6) c c c c -5 (2.4) c c c c
Portugal m m m m m m m m m m m m
Slovak Republic 30.1 (0.9) 42.1 (2.8) † 12.1 (2.9) † -17 (2.8) -8 (8.2) † 9 (8.5) †
Slovenia 22.8 (0.9) 30.9 (1.4) 8.1 (1.8) -3 (4.5) -20 (3.5) -17 (6.0)
Spain⁵ 31.3 (0.9) m m m m -5 (1.4) m m m m
Sweden 87.2 (2.4) m m m m 1 (2.2) m m m m
Switzerland 26.8 (1.0) 22.8 (1.9) -4.0 (2.2) -20 (3.6) -11 (7.9) 9 (8.8)
Turkey 40.2 (1.4) 36.1 (1.1) -4.1 (1.8) -4 (1.9) -8 (3.0) -4 (3.4)
United Kingdom 36.7 (0.8) 33.2 (5.1) -3.5 (5.1) -1 (2.0) 26 (17.2) 27 (17.4)
United States 62.3 (2.1) m m m m 9 (1.9) m m m m
OECD average 40.4 (0.2) 34.3 (0.9) 1.4 (0.9) -6 (0.5) -9 (2.9) -2 (2.9)
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students' and schools' socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Table B.6.15 [6/6] Frequency of use and time using digital devices for teaching and learning during classroom lessons
AND outside classroom lessons
Based on students' reports
Total time a week using digital devices for school during classroom and outside of classroom lessons, by programme orientation¹ | Change in reading performance associated with a one-hour increase in the total time a week using digital devices for school²
General³ Minutes (S.E.) | Vocational⁴ Minutes (S.E.) | Difference (S.E.) | General³ Score dif. (S.E.) | Vocational⁴ Score dif. (S.E.) | Difference (S.E.)
Argentina m m m m m m m m m m m m
Baku (Azerbaijan) m m m m m m m m m m m m
Belarus m m m m m m m m m m m m
Bosnia and Herzegovina m m m m m m m m m m m m
Brazil 30.7 (0.6) † 28.8 (1.2) -1.9 (1.4) † -7 (2.0) † 7 (8.6) 14 (8.7) †
Brunei Darussalam 34.6 (0.5) m m m m -12 (1.6) m m m m
B-S-J-Z (China) m m m m m m m m m m m m
Bulgaria 37.0 (1.3) † 43.2 (1.5) † 6.3 (2.0) † -11 (3.0) † -13 (2.8) † -2 (4.1) †
Costa Rica 37.8 (1.1) 33.7 (1.5) -4.1 (1.7) -1 (2.1) -6 (4.7) -5 (5.0)
Croatia 25.0 (0.9) 28.4 (0.8) 3.4 (1.3) -9 (3.4) -11 (2.1) -2 (3.8)
Cyprus m m m m m m m m m m m m
Dominican Republic 33.5 (0.9) † 37.5 (2.1) 4.0 (2.3) † -12 (1.5) † -4 (4.4) 8 (4.6) †
Georgia 24.7 (0.9) † m m m m -13 (2.7) † m m m m
Hong Kong (China) 26.8 (1.6) m m m m -13 (2.7) m m m m
Indonesia m m m m m m m m m m m m
Jordan m m m m m m m m m m m m
Kazakhstan 54.0 (0.8) 65.6 (1.4) 11.6 (1.6) -12 (1.1) -6 (1.8) 5 (2.1)
Kosovo m m m m m m m m m m m m
Lebanon m m m m m m m m m m m m
Macao (China) 43.6 (0.8) m m m m 3 (2.2) m m m m
Malaysia m m m m m m m m m m m m
Malta 36.1 (0.6) m m m m -6 (2.2) m m m m
Moldova m m m m m m m m m m m m
Montenegro m m m m m m m m m m m m
Morocco 30.8 (0.9) m m m m -21 (1.3) m m m m
North Macedonia m m m m m m m m m m m m
Panama 30.8 (0.9) † 35.3 (7.6) 4.5 (7.6) † -12 (2.2) † -16 (14.4) -4 (14.6) †
Peru m m m m m m m m m m m m
Philippines m m m m m m m m m m m m
Qatar m m m m m m m m m m m m
Romania m m m m m m m m m m m m
Russia 46.9 (1.0) 57.1 (4.5) 10.2 (5.0) -4 (1.9) -11 (6.8) -7 (7.3)
Saudi Arabia m m m m m m m m m m m m
Serbia 31.5 (1.4) 32.1 (0.8) † 0.6 (1.6) † -14 (4.9) -12 (1.8) † 1 (5.2) †
Singapore 37.6 (0.7) m m m m -8 (2.0) m m m m
Chinese Taipei 22.6 (0.9) 23.8 (1.1) 1.2 (1.4) -6 (3.9) -9 (3.0) -3 (4.8)
Thailand 41.4 (0.9) 51.5 (2.0) 10.0 (2.3) -9 (1.4) -8 (2.5) 1 (2.9)
Ukraine m m m m m m m m m m m m
United Arab Emirates m m m m m m m m m m m m
Uruguay 35.4 (0.9) † c c c c 0 (2.4) † c c c c
Viet Nam m m m m m m m m m m m m
1. Students were allowed to respond in intervals of no time, between 1-30 minutes a week, between 31-60 minutes a week, more than 60 minutes a week, and I do not study this subject. The subject selected was 'Test language lessons', and students who do not study the subject were excluded from the analysis. The remaining responses were converted to the average number of minutes in the interval (0, 15.5, 45.5, 90.5). The response times of items IC150 (during classroom lessons) and IC151 (outside of classroom lessons) were summed to reflect the total time a week using digital devices for school during classroom and outside of classroom lessons.
2. Association after accounting for students’ and schools’ socio-economic profile, measured by the PISA index of economic, social and cultural status (ESCS).
3. Students enrolled in a programme whose curriculum is general.
4. Students enrolled in a programme whose curriculum is vocational.
5. For the comparability of Spain’s data see Note 1 under Table B.3.9.
Notes: Values that are statistically significant are indicated in bold.
Information regarding the proportion of the sample covered is shown next to the standard error. No symbol means at least 75% of the population was covered; one dagger
(†) means at least 50% but less than 75%; and one double-dagger (‡) means less than 50% was covered.
https://doi.org/10.1787/888934240712
Chapter 3 Dynamic Navigation in PISA 2018 Reading Assessment: Read, Explore and Interact
https://doi.org/10.1787/888934240655
WEB Table B.3.1 Nonresponse rate and navigation behaviours in Rapa Nui unit
WEB Table B.3.2 Navigation quantity in single- and multiple-source items and association with reading performance within each country/economy
WEB Table B.3.3 Cross tabulation between number of nonresponse items and navigation behaviours
WEB Table B.3.4 Median time spent on initial pages in single- and multiple-source items within each country/economy
WEB Table B.3.5 Average ratio of time spent on initial pages in single- and multiple-source items within each country/economy
WEB Table B.3.6 Ratio of effective page transitions in single- and multiple-source items within each country/economy
WEB Table B.3.7 Descriptive statistics of four clusters derived from page and time sequence cluster analysis in multiple-source item CR551Q11
WEB Table B.3.8 Overall navigation quantity in single- and multiple-source items, clicking hyperlinks and using COPY/PASTE in the reading unit of Rapa Nui
WEB Table B.3.10 Comparisons between Rapa Nui sample and the whole sample in gender, student socio-economic status, and reading
performance
Chapter 4 The interplay between digital devices, enjoyment, and reading performance
https://doi.org/10.1787/888934240674
WEB Table B.4.2 Enjoyment of reading, by student characteristics
WEB Table B.4.3 Enjoyment of reading, by school’s characteristics
WEB Table B.4.4a Change between 2009 and 2018 in enjoyment of reading
WEB Table B.4.4b Change between 2000 and 2018 in enjoyment of reading
WEB Table B.4.5 Average time reading for enjoyment
WEB Table B.4.6 Average time reading for enjoyment, by student characteristics
WEB Table B.4.7 Average time reading for enjoyment, by school’s characteristics
WEB Table B.4.8 Change between 2000, 2009 and 2018 in time spent reading for enjoyment
WEB Table B.4.9 Students' reading habits towards reading
WEB Table B.4.10 Change between 2009 and 2018 in what students read
WEB Table B.4.11 Percentage of students by the format of reading
WEB Table B.4.12 Percentage of students who read books more often on digital devices, by student characteristics
WEB Table B.4.13 Percentage of students who read books more often on digital devices, by school’s characteristics
WEB Table B.4.14a Change between 2000, 2009 and 2018 in number of books in the student's home [part 1/2]
WEB Table B.4.14b Change between 2000, 2009 and 2018 in number of books in the student's home [part 2/2]
WEB Table B.4.15 Percentage of students who read at least 1 hour a day, by the format of reading and by gender
WEB Table B.4.17 Relationship between enjoyment of reading and format of reading
WEB Table B.4.18 Average time of reading for enjoyment and reading performance, by the way of reading the news
WEB Table B.4.19 Relationship between students' and parents' enjoyment of reading, and students' characteristics
WEB Table B.4.20a Enjoyment of reading of students who expect to work in the following science-related occupations: Science and engineering
professionals
WEB Table B.4.20b Enjoyment of reading of students who expect to work in the following science-related occupations: Health professionals
WEB Table B.4.20c Enjoyment of reading of students who expect to work in the following science-related occupations: ICT professionals
WEB Table B.4.20d Enjoyment of reading of students who expect to work in the following science-related occupations: Science-related technicians
and associate professionals
WEB Table B.4.21a Enjoyment of reading of students who expect to work in the following science-related occupations: Science and engineering
professionals, by gender
WEB Table B.4.21b Enjoyment of reading of students who expect to work in the following science-related occupations: Health professionals, by
gender
WEB Table B.4.21c Enjoyment of reading of students who expect to work in the following science-related occupations: ICT professionals, by gender
WEB Table B.4.21d Enjoyment of reading of students who expect to work in the following science-related occupations: Science-related technicians
and associate professionals, by gender
WEB Table B.4.22a Students' reading habits towards reading, for students who expect to work as science and engineering professionals
WEB Table B.4.22b Students' reading habits towards reading, for students who expect to work as health professionals
WEB Table B.4.22c Students' reading habits towards reading, for students who expect to work as ICT professionals
WEB Table B.4.22d Students' reading habits towards reading, for students who expect to work as science technicians and associate professionals
WEB Table B.4.23a Percentage of students who expect to work as science and engineering professionals, by the format of reading
WEB Table B.4.23b Percentage of students who expect to work as health professionals, by the format of reading
WEB Table B.4.23c Percentage of students who expect to work as ICT professionals, by the format of reading
WEB Table B.4.23d Percentage of students who expect to work as science technicians and associate professionals, by the format of reading
WEB Table B.4.24a Percentage of students who expect to work as science and engineering professionals by the way of reading the news
WEB Table B.4.24b Percentage of students who expect to work as health professionals by the way of reading the news
WEB Table B.4.24c Percentage of students who expect to work as ICT professionals by the way of reading the news
WEB Table B.4.24d Percentage of students who expect to work as science technicians and associate professionals by the way of reading the news
WEB Table B.4.25a Frequency of school activities done on digital devices, by students who expect to work as science and engineering professionals
WEB Table B.4.25b Frequency of school activities done on digital devices, by students who expect to work as health professionals
WEB Table B.4.25c Frequency of school activities done on digital devices, by students who expect to work as ICT professionals
WEB Table B.4.25d Frequency of school activities done on digital devices, by students who expect to work as science technicians and associate
professionals
ANNEX C
Technical notes on analysis in Chapter 3
All figures and tables in Annex C are available online
Annex C1: Navigation activities and time allocation across the Rapa Nui reading unit
https://doi.org/10.1787/888934240731
Annex C2: Algorithms for computing sequence distance by Dynamic Time Warping (DTW) Method
https://doi.org/10.1787/888934240750
Annex C3: Consistency analysis for students’ navigation behaviours in reading units within
the same testlets with the Rapa Nui Unit
https://doi.org/10.1787/888934240769
ANNEX C1
Navigation activities and time allocation across the Rapa Nui reading unit
Figure C1.1 illustrates how students typically navigated through the whole Rapa Nui unit. Limited navigation is observed in the
first two single-source items. As the second and third pages were activated in items 3 and 4, the number of navigations
increased correspondingly. A first peak formed at item 4, especially for students above Level 2. The number of navigations
temporarily dropped at item 5 (see Note 1) and then rose sharply in the last two items, where navigation to other pages
was compulsory to complete the task. The curves of students at higher performance levels (Level 5 and Level 6) show large
fluctuations through the unit, signalling active navigation across the whole reading and navigation process. In contrast, the curves
of students at lower performance levels (Level 2, and Level 1a and below) tend to be flat through all seven items, indicating that few
effective navigations were executed in these groups and the number of pages visited remained at about 1.
Figure C1.1 Distribution of the number of pages visited across the seven sequential items of the Rapa Nui unit
[Line chart: average number of visited pages (0.0 to 3.5) for Items 1 to 7, with curves for the OECD average, the overall average, Level 6, Level 5 and Level 2.]
Figure C1.2 illustrates how students distributed their time across the seven items of the Rapa Nui unit, by reading
performance level. Students at Level 5 and Level 6 – the upper reading performance levels – spent the greatest amount of
time, on average, on almost every item compared to their peers. Students at these levels spent particularly long on
the last item, an open-ended item with multiple-source requirements. Students at Level 6 spent, on average, over
three minutes on the last item, twice as long as students at Level 3 and three times as long as students at Level 1a and below.
These times suggest a high level of persistent engagement throughout the reading and navigation process. In contrast,
students at lower performance levels (Level 1a and below) spent much less time on all the items. The time they spent on items with
multiple-source requirements was even shorter than on single-source items, suggesting a lack of motivation and
unfamiliarity with multiple-source environments. These students might have become lost in the navigation, or felt the items were
too difficult and skipped the items towards the end of the test (see Note 2).
Figure C1.2 Distribution of time spent on the seven sequential items in the Rapa Nui unit
[Line chart: average time per item in seconds (0 to 200) for Items 1 to 7, with curves for the OECD average, the overall average, Level 6, Level 5, Level 2, and Level 1a and below.]
Notes
1. Item 5 in Rapa Nui is a single-source item, so navigation is optional. The multiple-source texts are all activated in item 4, where more
navigation activity is therefore expected. As most students may already have navigated to the new pages in item 4, navigation activity was expected
to decrease in item 5.
2. It is noted that the unit CR551 is the second unit within the testlet; a higher rate of non-response was therefore expected towards the end
of the test.
ANNEX C2
Algorithms for computing sequence distance by Dynamic Time Warping (DTW) Method
Given sequences X = {x1, x2, …, xN} and Y = {y1, y2, …, yM} of the same or different lengths, a warping path W is an alignment between
X and Y that allows one-to-many mappings between pairs of elements. The cost of a warping path is the sum of the
costs of its mapping pairs. A warping path must satisfy three constraints: (1) endpoint constraint: the alignment starts
at pair (1, 1) and ends at pair (N, M); (2) monotonicity constraint: the elements of both X and Y must appear in the path in their
original order; (3) step-size constraint: the indices for both X and Y may differ by at most one step between two adjacent
pairs in the path. In other words, pair (xi, yj) can be followed only by one of three
possible pairs: (xi+1, yj), (xi, yj+1) or (xi+1, yj+1).
DTW is a distance measure that searches for the optimal warping path between two series. First, a cost
matrix C is constructed, where each element C(i, j) is the cost of the pair (xi, yj), specified using the Euclidean, Manhattan or another distance function.
DTW is then calculated by dynamic programming. The initial step and the recursion of the DTW algorithm are defined as:
DTW(1, 1) = C(1, 1) (1)
DTW(i, j) = C(i, j) + min{wh · DTW(i, j–1), wv · DTW(i–1, j), wd · DTW(i–1, j–1)} (2)
where (wh, wv, wd) are the weights for the horizontal, vertical and diagonal directions, respectively. DTW(i, j) denotes the distance or
cost between the two sub-sequences {x1, x2, …, xi} and {y1, y2, …, yj}, and DTW(N, M) indicates the total cost of the optimal warping path.
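The DTW recursion described above can be sketched in code. The following minimal Python illustration is not the implementation used for the analysis: it assumes an absolute-difference (Manhattan) pair cost, and the function name `dtw_distance` is illustrative.

```python
def dtw_distance(x, y, weights=(1.0, 1.0, 1.0)):
    """Weighted DTW distance between numeric sequences x and y.

    weights = (wh, wv, wd) are the weights for horizontal, vertical and
    diagonal steps; the pair cost C(i, j) is the absolute difference.
    Returns DTW(N, M), the total cost of the optimal warping path.
    """
    wh, wv, wd = weights
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            cost = abs(x[i] - y[j])            # C(i, j)
            if i == 0 and j == 0:
                D[i][j] = cost                 # endpoint constraint: start at (1, 1)
                continue
            best = INF
            if j > 0:
                best = min(best, wh * D[i][j - 1])        # horizontal step
            if i > 0:
                best = min(best, wv * D[i - 1][j])        # vertical step
            if i > 0 and j > 0:
                best = min(best, wd * D[i - 1][j - 1])    # diagonal step
            D[i][j] = cost + best
    return D[n - 1][m - 1]
```

With unit weights this reduces to standard DTW: identical sequences have distance zero, and an inserted duplicate element adds no cost because the extra pair it creates has zero pair cost.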
ANNEX C3
Consistency analysis for students’ navigation behaviours in reading units within the same
testlets with the Rapa Nui Unit
To investigate whether students employed similar navigation strategies across multiple reading units, a consistency
analysis was conducted on students’ navigation quality using two reading units (CR543 – Alfred Nobel and CR544 – Nikola Tesla)
that appear in the same testlets (H21 and H25) as the Rapa Nui unit. Among the subsample (N = 76 270) of students assigned to the
unit Rapa Nui, 45.7% were routed to the H21 testlet (CR543 and CR551) while 54.3% were routed to the
H25 testlet (CR544 and CR551).
Both reading units (CR543 and CR544) involve two sources of reading material. Within the reading unit CR543, four items are
set in the multiple-source environment: two items instruct students to refer to a single source, while the other two
instruct them to use information from both sources to solve the task. Similarly, within the CR544 reading unit, four items
are also set in the multiple-source environment, but only the first item instructs students to refer to a single source, while the
other three instruct them to refer to information from both sources. Note that both CR543 and CR544 belong to the
high-difficulty testlets of the multistage adaptive testing in PISA 2018 (see Note 1).
The consistency analysis examined the confusion matrix between two units via the four categories of navigation behaviour,
i.e., (1) actively explorative navigation, (2) strictly focused navigation, (3) limited navigation, and (4) no navigation. The goal was to
identify the commonalities between the two units (CR543 and CR551; CR544 and CR551); that is, whether students employed
the same or different navigation strategies to solve the reading tasks in the two units of the same testlet. Higher values along
the diagonal of the confusion matrix indicate that more students shared the same navigation behaviour between the two
reading units.
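A confusion matrix of this kind amounts to a column-normalised crosstab of the two units’ category labels. The sketch below is illustrative, not the analysis code used for the report; the function name `consistency_matrix` is hypothetical, and each column is normalised so that the diagonal gives the percentage of students who kept the same strategy.

```python
from collections import Counter, defaultdict

# The four navigation behaviour groups used in the chapter.
CATEGORIES = ["Actively Explorative", "Strictly Focused",
              "Limited Navigation", "No Navigation"]

def consistency_matrix(first_unit, second_unit):
    """Column-normalised confusion matrix (in percentages).

    first_unit, second_unit: per-student category labels for the first
    reading unit of the testlet (e.g. CR543) and for CR551, aligned by
    student. Returns matrix[row][col] = percentage of students with
    CR551 category `col` who showed category `row` in the first unit.
    """
    col_totals = Counter(second_unit)
    counts = defaultdict(Counter)
    for row_cat, col_cat in zip(first_unit, second_unit):
        counts[col_cat][row_cat] += 1
    return {row: {col: 100.0 * counts[col][row] / col_totals[col]
                  for col in col_totals}
            for row in CATEGORIES}
```

Reading down a column gives the distribution of first-unit strategies for one CR551 group, so the diagonal entries correspond directly to the consistency percentages discussed below.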
As shown in Table C3.1, the shares of students using exactly the same navigation strategy in CR543 and CR551 were 21% for actively explorative,
33% for strictly focused, 0% for limited navigation and 74% for no navigation. The highest consistency was found
in the no navigation group: 74% of students who did not navigate at all in CR551 showed no navigation behaviour
in CR543 either. The high consistency in the no navigation group might reflect the similar proficiency levels of CR543 and
CR551. No students employed the limited navigation strategy in CR543. Among the students who employed the strictly focused
strategy in the Rapa Nui unit, 33% employed the same strategy in the Alfred Nobel reading unit, while a large proportion showed no
navigation in CR543. More detailed country-level consistency results between CR543 and CR551 are shown in Table C3.2.
Table C3.1 Confusion matrix of overall average percentage of students in four navigation categories in CR543 and CR544,
given their behaviour in the Rapa Nui unit (CR551)

                                                        CR551: navigation behaviour group
                                     Actively Explorative   Strictly Focused   Limited Navigation   No Navigation
                                        %      S.E.            %      S.E.        %      S.E.          %     S.E.
CR543: navigation behaviour group
  Actively Explorative                 21.1   (1.1)          12.9   (0.6)       15.2   (0.8)          5.9   (0.2)
  Strictly Focused                     34.3   (1.2)          33.0   (0.8)       28.4   (1.0)         20.5   (0.3)
  Limited Navigation                    0.0   (0.0)           0.0   (0.0)        0.0   (0.0)          0.0   (0.0)
CR544: navigation behaviour group
  Actively Explorative                 18.0   (0.7)          13.6   (0.6)       14.5   (0.8)          6.4   (0.2)
  Strictly Focused                     64.3   (1.1)          60.7   (0.9)       52.1   (1.1)         30.9   (0.4)
  Limited Navigation                    3.0   (0.4)           3.5   (0.3)        7.1   (0.6)          4.0   (0.2)
Analogously, a consistency analysis was conducted between CR544 and CR551. Interestingly, students who
employed the strictly focused strategy in the Rapa Nui reading unit (CR551) showed the highest consistency in using the same
strategy in CR544. The actively explorative and limited navigation groups in CR551 did not show high consistency in using the
same strategies in CR544; instead, over 60% of students who used these two strategies in CR551 switched to
the strictly focused group. One possible reason is that there was only one single-source item in CR544, so
less variation in navigation strategy was expected from students in this reading unit. Another reason might be the lower
proficiency level of CR544 compared with CR551: the lower-difficulty items might have made navigation easier for students.
Students who employed no navigation in the Rapa Nui unit also showed high consistency in the Nikola Tesla reading unit
(CR544), indicating that students who did not use any navigation strategy were more likely not to navigate
in other reading units either. This phenomenon is more pronounced in low-performing countries than in high-performing countries.
More detailed country-level consistency results between CR544 and CR551 are shown in Table C3.3.
Note that the order of reading units within a testlet was fixed. CR543 and CR544 are the first reading unit in the
H21 and H25 testlets, respectively, while CR551 is always the second reading unit in both testlets. Considering the item-location
effect, CR543 and CR544 might have triggered slightly different navigation behaviours from CR551, as students may
have explored a little more in the first reading unit within a testlet to gain experience and familiarity for the subsequent reading tasks.
• Table C3.3 Crosstabs between the percentage of students in four categories of navigation strategies used in CR544 given
their behaviour in the Rapa Nui unit (CR551)
Note
1. Each reading unit is composed of items of different difficulties. For instance, in CR543, the proficiency levels of the four items are Level 3,
Level 4, Level 3 and Level 6, respectively. For further information about item difficulty levels and parameters, refer to Annex A of the PISA 2018
Technical Report (OECD, forthcoming[1]), https://www.oecd.org/pisa/data/pisa2018technicalreport/PISA2018%20TechRep_Final-AnnexA.xlsx.
Reference
OECD (forthcoming), PISA 2018 Technical Report, OECD Publishing, Paris. [1]
21st-Century Readers
DEVELOPING LITERACY SKILLS IN A DIGITAL WORLD
Literacy in the 21st century is about constructing and validating knowledge. Digital technologies have enabled
the spread of all kinds of information, displacing traditional formats of usually more carefully curated information
such as encyclopaedias and newspapers. The massive information flow of the digital era demands that readers
be able to distinguish between fact and opinion. Readers must learn strategies to detect biased information and
malicious content like fake news and phishing emails.
This report examines how students’ access to digital technologies and training on how to use them vary between
and within countries. It also explores how 15-year-old students are developing reading skills to navigate
the technology-rich 21st century. It sheds light on potential ways to strengthen students’ capacity to navigate
the new world of information.