
About the study

The International Computer and Information Literacy Study (ICILS) studied the extent to which young people have developed computer and information literacy (CIL) to support their capacity to participate in the digital age. Computer and information literacy is defined as “an individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society” (Fraillon, Schulz, & Ainley, 2013, p. 17).

ICILS is a response to the increasing use of information and communication technology (ICT) in modern society and the need for citizens to develop relevant skills in order to participate effectively in the digital age. It also addresses the necessity for policymakers and education systems to have a better understanding of the contexts and outcomes of CIL-related education programs in their countries. ICILS is the first crossnational study commissioned by the International Association for the Evaluation of Educational Achievement (IEA) to collect student achievement data on computer and information literacy.

ICILS used purpose-designed software for the computer-based student assessment and questionnaire. These instruments were administered primarily by way of USB drives attached to school computers. Although the software could have been delivered via the internet, the USB delivery ensured a uniform assessment environment for students regardless of the quality of internet connections in participating schools. Data were either uploaded to a server or delivered to the ICILS research center in each country.

ICILS systematically investigated differences among the participating countries in CIL outcomes and how participating countries were providing CIL-related education. The ICILS team also explored differences within and across countries with respect to relationships between CIL education outcomes and student characteristics and school contexts.

ICILS was based around four research questions focused on the following:

  1. Variations in CIL within and across countries;

  2. Aspects of schools, education systems, and teaching associated with student achievement in CIL;

  3. The extent to which students’ access to, familiarity with, and self-reported proficiency in using computers is associated with student achievement in CIL; and

  4. Aspects of students’ personal and social backgrounds associated with CIL.

The publication presenting the ICILS assessment framework (Fraillon et al., 2013) describes the development of these questions. The publication also provides more details relating to the questions and outlines the variables necessary for analyses pertaining to them.

Data

ICILS gathered data from almost 60,000 Grade 8 (or equivalent) students in more than 3,300 schools from 21 countries or education systems within countries. These student data were augmented by data from almost 35,000 teachers in those schools and by contextual data collected from school ICT-coordinators, school principals, and the ICILS national research centers.

The main ICILS survey took place in the 21 participating countries between February and December 2013. The survey was carried out in countries with a Northern Hemisphere school calendar between February and June 2013 and in those with a Southern Hemisphere school calendar between October and December 2013.

Students completed a computer-based test of CIL that consisted of questions and tasks presented in four 30-minute modules. Each student completed two modules randomly allocated from the set of four so that the total assessment time for each student was one hour.

After completing the two test modules, students answered (again on computer) a 30-minute international student questionnaire. It included questions relating to students’ background characteristics, their experience and use of computers and ICT to complete a range of different tasks in school and out of school, and their attitudes toward using computers and ICT.

The three instruments designed to gather information from and about teachers and schools could be completed on computer (over the internet) or on paper. These instruments were:

  • A 30-minute teacher questionnaire: This asked teachers several basic background questions followed by questions relating to teachers’ reported use of ICT in teaching, their attitudes about the use of ICT in teaching, and their participation in professional learning activities relating to pedagogical use of ICT.

  • A 10-minute ICT-coordinator questionnaire: This asked ICT-coordinators about the resources available in the school to support the use of ICT in teaching and learning. The questionnaire addressed both technological (e.g., infrastructure, hardware, and software) as well as pedagogical support (such as through professional learning).

  • A 10-minute principal questionnaire: This instrument asked school principals to provide information about school characteristics as well as school approaches to providing CIL-related teaching and incorporating ICT in teaching and learning.

ICILS national research coordinators (NRCs) coordinated information procured from national experts via an online national contexts survey. Experts included education ministry or departmental staff, relevant nongovernmental organizations, specialist organizations concerned with educational technologies, and teacher associations. The information sought concerned the structure of the respective country’s education system, plans and policies for using ICT in education, ICT and student learning at lower-secondary level, ICT and teacher development, and ICT-based learning and administrative management systems.

Computer and information literacy

The construct

The CIL construct was conceptualized in terms of two strands that framed the skills and knowledge addressed by the CIL instruments. Each strand was made up of several aspects, each of which referenced specific content.

Strand 1 of the framework, titled collecting and managing information, focuses on the receptive and organizational elements of information processing and management. It incorporates three aspects:

  • Knowing about and understanding computer use: This refers to a person’s declarative and procedural knowledge of the generic characteristics and functions of computers. It focuses on the basic technical knowledge and skills that underpin our use of computers in order to work with information.

  • Accessing and evaluating information: This refers to the investigative processes that enable a person to find, retrieve, and make judgments about the relevance, integrity, and usefulness of computer-based information.

  • Managing information: This aspect refers to the capacity of individuals to work with computer-based information. The process includes ability to adopt and adapt information-classification and information-organization schemes in order to arrange and store information so that it can be used or reused efficiently.

Strand 2 of the construct, titled producing and exchanging information, focuses on using computers as productive tools for thinking, creating, and communicating. The strand has four aspects:

  • Transforming information: This refers to a person’s ability to use computers to change how information is presented so that it is clearer for specific audiences and purposes.

  • Creating information: This aspect refers to a person’s ability to use computers to design and generate information products for specified purposes and audiences. These original products may be entirely new or they may build on a given set of information in order to generate new understandings.

  • Sharing information: This aspect refers to a person’s understanding of how computers are and can be used as well as his or her ability to use computers to communicate and exchange information with others.

  • Using information safely and securely: This refers to a person’s understanding of the legal and ethical issues of computer-based communication from the perspectives of both the publisher and the consumer of that information.

Assessing computer and information literacy

The student assessment was based on four modules, each of which consisted of a set of questions and tasks based on a realistic theme and following a linear narrative structure. The tasks in the modules comprised a series of small discrete tasks (typically taking less than a minute to complete) followed by a large task that typically took 15 to 20 minutes to complete. Taken together, the modules contained a total of 62 tasks and questions corresponding to 81 score points.

When students began each module, they were presented with an overview of the theme and purpose of the tasks in it. The overview also included a basic description of the content of the large task and what completing it would involve. The narrative of each module typically positioned the smaller discrete tasks as a mix of skill-execution and information-management tasks in preparation for completion of the large task. Students were required to complete the tasks in the allocated sequence and could not return to completed tasks in order to review them.

The four modules were:

  • After School Exercise: Students set up an online collaborative workspace to share information and then selected and adapted information to create an advertising poster for an after-school exercise program.

  • Band Competition: Students planned a website, edited an image, and used a simple website builder to create a webpage containing information about a school band competition.

  • Breathing: Students managed files and collected and evaluated information needed to create a presentation explaining the process of breathing to eight- or nine-year-old students.

  • School Trip: Students helped plan a school trip using online database tools. The task required students to select and adapt information in order to produce an information sheet about the trip for their peers. Students were told that their information sheet had to include a map that they could create using an online mapping tool.

Each test completed by a student consisted of two of the four modules. There were 12 different possible combinations of module pairs altogether. Each module appeared in six of the combinations—three times as the first and three times as the second module when paired with each of the other three. The module combinations were randomly allocated to students.

This test design made it possible to assess a larger amount of content than could be completed by any individual student and was necessary to ensure broad coverage of the content of the ICILS assessment framework. The design also controlled for the influence of item position on difficulty across the sampled students and provided a variety of contexts for the assessment of CIL.
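The balanced rotation described above can be checked with a short sketch: all ordered pairs of the four modules yield 12 combinations, and each module appears six times, three times in first position and three times in second (the module names stand in for the ICILS instruments; the allocation logic here is illustrative, not the operational ICILS software).

```python
from itertools import permutations

# The four assessment modules (named as in the study).
modules = ["After School Exercise", "Band Competition", "Breathing", "School Trip"]

# All ordered pairs of distinct modules: the 12 booklet combinations.
combinations = list(permutations(modules, 2))
print(len(combinations))  # 12

# Each module appears in 6 combinations: 3 times first, 3 times second.
for m in modules:
    as_first = sum(1 for a, _ in combinations if a == m)
    as_second = sum(1 for _, b in combinations if b == m)
    print(m, as_first, as_second)  # 3 3 for every module
```

Because position within the pair is counterbalanced, averaging over students cancels any item-position effect on difficulty.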

The computer and information literacy scale

We used the Rasch item response theory (IRT) model to derive the cognitive scale from the data collected from the 62 test questions and tasks corresponding to 81 score points. Most questions and tasks each corresponded to one item. However, raters scored each ICILS large task against a set of criteria (each criterion with its own unique set of scores) relating to the properties of the task. Each large-task assessment criterion was therefore also an item in ICILS.

We set the final reporting scale to a metric that had a mean of 500 (the ICILS average score) and a standard deviation of 100 for the equally weighted national samples. We used plausible value methodology with full conditioning to derive summary student achievement statistics.
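The rescaling to the reporting metric can be illustrated with a minimal sketch: a linear transformation of Rasch logit scores so that the pooled distribution has mean 500 and standard deviation 100. This is a simplification, assuming unweighted scores; the actual ICILS procedure equally weighted the national samples and worked with plausible values.

```python
import statistics

def to_reporting_scale(logits, target_mean=500.0, target_sd=100.0):
    """Linearly rescale Rasch logit scores to a reporting metric
    with the given mean and standard deviation (unweighted sketch)."""
    mu = statistics.mean(logits)
    sigma = statistics.pstdev(logits)
    return [target_mean + target_sd * (x - mu) / sigma for x in logits]

# Hypothetical logit scores for five students.
scores = to_reporting_scale([-1.2, -0.3, 0.0, 0.4, 1.1])
```

After the transformation the scores sit on the familiar 500/100 metric, which makes national means directly comparable.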

The ICILS described scale of CIL achievement is based on the content and scaled difficulties of the assessment items. The ICILS research team wrote descriptors for each item. The descriptors designate the CIL knowledge, skills, and understandings demonstrated by a student correctly responding to each item.

Pairing the scaled difficulty of each item with the item descriptors made it possible to order the items from least to most difficult, a process that produced an item map. Analyses of the item map and student achievement data were then used to establish proficiency levels that had a width of 85 scale points. Student scores below 407 scale points indicate CIL proficiency below the lowest level targeted by the assessment instrument.
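The 85-point level bands starting at 407 scale points imply a simple mapping from scores to described levels (the cut points below follow the bands reported for Levels 1 through 4; the function itself is an illustrative helper, not part of the ICILS toolkit):

```python
def cil_level(score):
    """Map a CIL scale score to its described proficiency level.
    Levels are 85 scale points wide, starting at 407; scores below
    407 fall below the lowest level targeted by the instrument."""
    if score < 407:
        return "Below Level 1"
    if score <= 491:
        return "Level 1"
    if score <= 576:
        return "Level 2"
    if score <= 661:
        return "Level 3"
    return "Level 4"

print(cil_level(500))  # Level 2
```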

The scale description comprises syntheses of the common elements of CIL knowledge, skills, and understanding at each proficiency level. It also describes the typical ways in which students working at a level demonstrate their proficiency. Each level of the scale references the characteristics of students’ use of computers to access and use information and to communicate with others.

The scale thus reflects a broad range of development, extending from students’ application of software commands under direction, through their increasing independence in selecting and using information to communicate with others, and on to their ability to independently and purposefully select information and use a range of software resources in a controlled manner in order to communicate with others. Included in this development is students’ knowledge and understanding of issues relating to online safety and to ethical use of electronic information. This understanding encompasses knowledge of information types and security procedures through to demonstrable awareness of the social, ethical, and legal consequences of a broad range of known and unknown users (potentially) accessing electronic information.

The four described levels of the CIL scale were summarized as follows:

  • Level 4 (above 661 scale points): Students working at Level 4 select the most relevant information to use for communicative purposes. They evaluate usefulness of information based on criteria associated with need and evaluate the reliability of information based on its content and probable origin. These students create information products that demonstrate a consideration of audience and communicative purpose. They also use appropriate software features to restructure and present information in a manner that is consistent with presentation conventions, and they adapt that information to suit the needs of an audience. Students working at Level 4 also demonstrate awareness of problems that can arise with respect to the use of proprietary information on the internet.

  • Level 3 (577 to 661 scale points): Students working at Level 3 demonstrate the capacity to work independently when using computers as information-gathering and information-management tools. These students select the most appropriate information source to meet a specified purpose, retrieve information from given electronic sources to answer concrete questions, and follow instructions to use conventionally recognized software commands to edit, add content to, and reformat information products. They recognize that the credibility of web-based information can be influenced by the identity, expertise, and motives of the creators of that information.

  • Level 2 (492 to 576 scale points): Students working at Level 2 use computers to complete basic and explicit information-gathering and information-management tasks. They locate explicit information from within given electronic sources. These students make basic edits and add content to existing information products in response to specific instructions. They create simple information products that show consistency of design and adherence to layout conventions. Students working at Level 2 demonstrate awareness of mechanisms for protecting personal information. They also demonstrate awareness of some of the consequences of public access to personal information.

  • Level 1 (407 to 491 scale points): Students working at Level 1 demonstrate a functional working knowledge of computers as tools and a basic understanding of the consequences of computers being accessed by multiple users. They apply conventional software commands to perform basic communication tasks and add simple content to information products. They demonstrate familiarity with the basic layout conventions of electronic documents.

The scale is hierarchical in the sense that CIL proficiency becomes more sophisticated as student achievement progresses up the scale. We can therefore assume that a student located at a particular place on the scale because of his or her achievement score will be able to undertake and successfully accomplish tasks up to that level of achievement.

Variations in student achievement on the CIL scale

Variations across countries

Student CIL varied considerably across ICILS countries. The average national scores on the scale ranged from 361 to 553 scale points, a span that extends from below Level 1 to a standard of proficiency within Level 3. This range was equivalent to almost two standard deviations. However, the distribution of country CIL means was skewed because the means of three countries were significantly below the ICILS 2013 average and the means of 12 other countries were significantly above the ICILS 2013 average. Eighty-one percent of students achieved scores that placed them within CIL Levels 1, 2, and 3. In all but two countries, Turkey and Thailand, the highest percentage of students was in Level 2.

Factors associated with variations in CIL

Higher socioeconomic status was associated with higher CIL proficiency both within and across countries. Female students had higher CIL scale scores in all but two countries. Students who spoke the language of the CIL assessment (which was also the language of instruction) likewise performed better on it. Multiple regression techniques showed that the following variables had statistically significant positive associations with CIL in most countries: students’ gender (female compared to male), students’ expected educational attainment, parental educational attainment, parental occupational status, number of books in the home, and ICT home resources.

Student experience of computer use and their frequency of computer use at home were positively associated with CIL scores in most countries. Student access to a home internet connection and the number of computers students had at home had statistically significant associations with CIL scores in about half of the participating education systems. However, the association between number of home computers and CIL scores disappeared after we had controlled for the effect of socioeconomic background. In addition, student reports of having learned about ICT at school were associated with CIL achievement in eight education systems.

CIL achievement was also positively associated with basic ICT self-efficacy but not with advanced ICT self-efficacy. This finding is consistent with the nature of the CIL assessment construct, which is made up of information literacy and communication skills that are not necessarily related to advanced computer skills such as programming or database management. Even though CIL is computer based, in the sense that students demonstrate CIL in the context of computer use, the CIL construct itself does not emphasize high-level computer-based technical skills. Greater interest in and enjoyment of ICT use was associated with higher CIL scores in nine of the 14 countries that met the ICILS sampling requirements.

We observed statistically significant effects of ICT-related school-level factors on CIL achievement in only a few countries. In several education systems, we recorded evidence that the school average of students’ home computer use, and the extent to which students reported learning about ICT-related tasks at school, had effects on CIL. These findings deserve further analysis in future research. The notion that school learning is an important aspect of developing CIL is a particularly important consideration and therefore worth investigating in greater detail.

Multilevel analyses confirmed that students’ experience with computers as well as regular home-based use of computers had significant positive effects on CIL even after we had controlled for the influence of personal and social context. However, ICT resources, particularly the number of computers at home, no longer had effects once we took socioeconomic background into account. A number of the associations between school-level factors and CIL were not significant after we controlled for the effect of the school’s socioeconomic context.

Student use of ICT

Almost all ICILS students reported that they were experienced users of computers and had access to them at home and at school. On average across the ICILS countries, more than one third of the Grade 8 students said they had been using computers for seven or more years, with a further 29 percent reporting that they had been using computers for between five and seven years. Ninety-four percent of the students on average crossnationally reported having at least one computer (desktop, laptop, notebook, or tablet device) at home, while 48 percent reported having three or more computers at home. Ninety-two percent of students stated that they had some form of internet connection at home.

Students across the ICILS countries reported using computers more frequently at home than elsewhere. On average, 87 percent said they used a computer at home at least once a week, whereas 54 percent and 13 percent reported this same frequency of computer use at school and at other places respectively.

Computer use outside school

ICILS 2013 data indicated that students were making widespread and frequent use of digital technologies when outside school. Students tended to use the internet for social communication and exchanging information, computers for recreation, and computer utilities for school work and other purposes.

On average across ICILS countries, three quarters of the students said they communicated with others by way of messaging or social networks at least weekly. Just over half said that they used the internet for “searching for information for study or school work” at least once a week, and almost half indicated that they engaged in “posting comments to online profiles or blogs” at least once each week. On average, there was evidence of slightly more frequent use of the internet for social communication and exchanging information among females than among males.

Students were also frequently using computers for recreation. On average across the ICILS countries, 82 percent of students reported “listening to music” on a computer at least once a week, 68 percent reported “watching downloaded or streamed video (e.g., movies, TV shows, or clips)” on a weekly basis, and 62 percent said they used the internet to “get news about things of interest,” also on a weekly basis. Just over half of all the ICILS students were “playing games” once a week or more. Overall, males reported slightly higher frequencies of using computers for recreation than did females.

Students also reported using computer utilities (applications) outside school. Generally across the ICILS countries, the most extensive weekly use of computer utilities involved “creating or editing documents” (28% of students). Use of most other utilities was much less frequent. For example, only 18 percent of the students were “using education software designed to help with school study.” We found no significant difference between female and male students with respect to using computer utilities outside school.

Use of ICT for school work

Crossnationally, just under half (45%) of the ICILS students, on average, were using computers to “prepare reports or essays” at least once a week. We recorded a similar extent of use for “preparing presentations” (44%). Forty percent of students reported using ICT when working with other students from their own school at least weekly, and 39 percent of students reported using a computer once a week or more to complete worksheets or exercises.

Two school-related uses of computers were reported by less than one fifth of the students. These were “writing about one’s own learning,” which referred to using a learning log, and “working with other students from other schools.” Nineteen percent of students said they used a computer for the first of these tasks; 13 percent said they used a computer for the second.

The subject area in which computers were most frequently being used was, not surprisingly, information technology or computer studies (56%). On average, about one fifth of the students studying (natural) sciences said they used computers in most or all lessons. The same proportion reported using computers in most or all of their human sciences/humanities lessons. In language arts (the test language) and language arts (foreign languages), students were using computers a little less frequently: about one sixth of the students reported computer use in most or all such lessons. Approximately one in seven students studying mathematics reported computer use in most or all mathematics lessons. Of the students studying creative arts, just a little more than one in 10 reported computer use in most or all lessons.

Teacher and school use of ICT

Teacher use of ICT

ICILS teachers were making extensive use of ICT in their schools. Across the ICILS countries, three out of every five teachers said they used computers at least once a week when teaching, and four out of five reported using computers on a weekly basis for other work at their schools. Teachers in most countries were experienced users of ICT. Four out of every five of them said they had been using computers for two years or more when teaching.

In general, teachers were confident about their ability to use a variety of computer applications; two thirds of them expressed confidence in their ability to use these for assessing and monitoring student progress. We observed differences, however, among countries in the level of confidence that teachers expressed with regard to using computer technologies. We also noted that younger teachers tended to be more confident ICT users than their older colleagues.

Teachers recognized the positive aspects of using ICT in teaching and learning at school, especially with respect to accessing and managing information. On balance, teachers reported generally positive attitudes toward the use of ICT, although many were aware that ICT use could have some detrimental aspects.

As already indicated, a substantial majority of the ICILS teachers were using ICT in their teaching. This use was greatest among teachers who were confident about their ICT expertise and who were working in school environments where staff collaborated on and planned ICT use, and where there were fewer resource limitations to that use. These were also the conditions that supported the teaching of CIL. These findings suggest that if schools are to develop students’ CIL to the greatest extent possible, then teacher expertise in ICT use needs to be augmented (lack of teacher expertise in computing is considered to be a substantial obstacle to ICT use), and ICT use needs to be supported by collaborative environments that incorporate institutional planning.

According to the ICILS teachers, the utilities most frequently used in their respective reference classes were those concerned with word processing, presentations, and computer-based information resources, such as websites, wikis, and encyclopedias. Overall, teachers appeared to be using ICT most frequently for relatively simple tasks and less often for more complex tasks.

School-based ICT provision and use

There were substantial differences across countries in the number of students per available computer in a school. National averages for this ratio ranged from two (Norway) and three (Australia) through to 22 (Chile) and 26 (Croatia). Turkey had a very high ratio of students per computer (80). Students from countries with greater access to computers in schools tended to have stronger CIL skills.

Computers in schools were most often located in computer laboratories and libraries. However, there were differences among countries as to whether schools had portable class-sets of computers on offer or whether students brought their own computers to class.

ICT-coordinators reported a range of impediments to teaching and learning ICT. In general, the coordinators rated personnel and teaching support issues as more problematic than resource issues. However, there was considerable variation in the types of limitation arising from resource inadequacy.

Teachers and principals provided perspectives on the range of professional development activities relevant to pedagogical use of ICT. According to principals, teachers were most likely to participate in school-provided courses on pedagogical use of ICT, to talk about this type of use when they were within groups of teachers, and to discuss ICT use in education as a regular item during meetings of teaching staff. From the teachers’ perspective, the most common professional development activities available included observing other teachers using ICT in their teaching, introductory courses on general applications, and sharing and evaluating digital resources with others via a collaborative workspace.

Conclusion

ICILS has provided a description of the competencies underpinning CIL that incorporates the notions of being able to safely and responsibly access and use digital information as well as produce and develop digital products. ICILS has also provided educational stakeholders with an empirically derived scale and description of CIL learning that they can reference when deliberating about CIL education. This framework and associated measurement scale furthermore provide a basis for understanding variation in CIL at present and for monitoring change in the CIL that results from developments in policy and practice over time.

The CIL construct combines information literacy, critical thinking, technical skills, and communication skills applied across a range of contexts and for a range of purposes. The variations in CIL proficiency show that while some of the young people participating in ICILS were independent and critical users of ICT, there were many who were not. As the volume of computer-based information available to young people continues to increase, so too will the onus on societies to critically evaluate the credibility and value of that information.

Changing technologies (such as social media and mobile technologies) are increasing the ability of young people to communicate with one another and to publish information to a worldwide audience in real time. This facility obliges individuals to consider what is ethically appropriate and to determine how to maximize the communicative efficacy of information products.

ICILS results suggest that the knowledge, skills, and understandings described in the CIL scale can and should be taught. To some extent, this conclusion challenges perspectives of young people as digital natives with a self-developed capacity to use digital technology. Even though we can discern within the ICILS findings high levels of access to ICT and high levels of use of these technologies by young people in and (especially) outside school, we need to remain aware of the large variations in CIL proficiency within and across the ICILS countries. Regardless of whether or not we consider young people to be digital natives, we would be naive to expect them to develop CIL in the absence of coherent learning programs.

The ICILS data furthermore showed that emphases relating to CIL outcomes were most frequently being addressed in technology or computer studies classes, the (natural) sciences, and human sciences or humanities. Queries remain, however, about how schools can and should maintain the continuity, completeness, and coherence of their CIL education programs.

Teachers’ ICT use was greatest when the teachers were confident about their expertise and were working in school environments that collaborated on and planned ICT use and had few resource limitations hindering that use. These were also the conditions that supported teachers’ ability to teach CIL. We therefore suggest that system- and school-level planning should focus on increasing teacher expertise in ICT use. We also consider that schools should endeavor to implement supportive collaborative environments that incorporate institutional planning focused on using ICT and teaching CIL in schools.

ICILS has provided a baseline study for future measurement of CIL and CIL education across countries. A future cycle of ICILS could be developed to support measurement of trends in CIL as well as maintain the study’s relevance to innovations in software, hardware, and delivery technologies. Some possibilities for future iterations of ICILS could include internet delivery of the assessment, accommodation of “bring your own device” in schools, adapting a version for use on tablet devices, and incorporating contemporary and relevant software environments, such as multimedia and gaming. The key to the future of such research is to maintain a strong link to the core elements of the construct while accommodating the new contexts in which CIL achievement can be demonstrated.