Digital divide: A critical context for digitally based assessments

Authors

  • Kadriye Ercikan, University of British Columbia
  • Mustafa Asil, University of Otago
  • Raman Grover, Ministry of Education, British Columbia, Canada

DOI:

https://doi.org/10.14507/epaa.26.3817

Keywords:

digital divide, digital assessments, ICILS

Abstract

Student learning is increasingly taking place in digital environments, both within and outside schooling contexts. Educational assessments are following suit, both to take advantage of the conveniences and opportunities that digital environments provide and to reflect the media through which learning increasingly occurs in societies around the world. A social context relevant to learning and assessment in the digital age is the great difference in access to and competence with technology among students from different segments of society. Access to and competence with technology therefore become critical contexts for evaluations that rely on digitally based assessments. This article examines the digital divide between students from different segments of society and discusses strategies for minimizing the effects of the digital divide on assessments of student learning. The research focuses on two demographic groupings that have been highlighted in research on the digital divide: gender and socioeconomic status (SES). The research draws on data from IEA's International Computer and Information Literacy Study (ICILS) 2013, administered to Grade 8 students in 21 jurisdictions around the world. It thus provides an international perspective on the digital divide as an important context for international assessments, as well as for assessments within jurisdictions, such as Mexico, that conduct assessments in digitally based environments.


Author Biographies

Kadriye Ercikan, University of British Columbia

Kadriye Ercikan is Professor of Education in the Faculty of Education at the University of British Columbia and Vice President of Statistical Analysis, Data Analysis, and Psychometric Research (SADA&PR) at ETS. Her scholarship focuses on design, analysis, interpretation, and validity issues in large-scale assessments of educational outcomes and on research methods in education. She has conducted research on translation, language, and cultural issues in measurement; validating score meaning using response processes; assessment of historical thinking; and the contribution of different research paradigms to creating knowledge and making generalizations in education research. In 2000, she received an Early Career Award from the University of British Columbia, and in 2010 she received the AERA Division D Significant Contributions to Educational Measurement and Research Methodology Award for her co-edited volume Generalizing from Educational Research: Beyond Qualitative and Quantitative Polarization. She has been a member of the National Academy of Education Committee on Foundations of Educational Measurement and has served as an elected member of the National Council on Measurement in Education Board of Directors. She is currently a Fellow of the International Academy of Education and Vice-President of AERA's Division D.

Mustafa Asil, University of Otago

Mustafa Asil is a research fellow at the Educational Assessment Research Unit (EARU), University of Otago, New Zealand. He is responsible for providing research and psychometric support for the execution of the National Monitoring Study of Student Achievement (NMSSA) project. Mustafa is a psychometrician and quantitative data analyst with strong research interests in the comparability of large-scale assessments across languages and cultures and in measurement equivalence.

Raman Grover, Ministry of Education, British Columbia, Canada

Raman Grover is a psychometrician at the Ministry of Education for the province of British Columbia, Canada. He provides psychometric expertise for large-scale provincial assessment programs. Raman's research interests focus on issues of fairness and bias for disadvantaged populations taking multilingual assessments, including score comparability and differential item functioning in heterogeneous testing populations.

Published

2018-04-16

How to Cite

Ercikan, K., Asil, M., & Grover, R. (2018). Digital divide: A critical context for digitally based assessments. Education Policy Analysis Archives, 26(51). https://doi.org/10.14507/epaa.26.3817

Section

Historical and Contemporary Perspectives on Educational Evaluation