Evaluating the Validity of Portfolio Assessments for Licensure Decisions

Authors

  • Mark Wilson, University of California, Berkeley
  • P J Hallam, California Department of Education
  • Raymond L. Pecheone, Stanford University
  • Pamela A. Moss, University of Michigan

DOI:

https://doi.org/10.14507/epaa.v22n6.2014

Keywords:

Teacher portfolio assessments, teacher standardized tests, correlational evidence of validity

Abstract

This study examines one part of a validity argument for portfolio assessments of teaching practice used as an indicator of teaching quality to inform a licensure decision. We investigate the relationships among portfolio assessment scores, tests of teacher knowledge (ETS's Praxis I and II), and changes in student achievement (on Touchstone's Degrees of Reading Power [DRP] test). The key questions are the extent to which the assessments of teaching practice (a) predict gains in students' achievement and (b) contribute unique information to this prediction beyond what is contributed by the tests of teacher knowledge. The venue for our study is the Connecticut State Department of Education's (CSDE) support and licensure system for beginning teachers, the Beginning Educator Support and Training (BEST) program, as it was implemented at the time of our study. We investigated whether elementary teachers' mean effects on their students' reading achievement support the use of BEST elementary literacy portfolio scores as a measure of teaching quality for licensure, using a data set gathered from state sources and from two urban school districts. The hierarchical linear modeling (HLM) findings indicate that BEST portfolio scores do indeed distinguish between teachers who were more and less successful in enhancing their students' achievement. An additional analysis indicated that the BEST portfolios add information that is not contained in the Praxis tests and are more powerful predictors of teachers' contributions to student achievement gains.
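
To make the analytic logic concrete, here is a minimal sketch of the kind of two-level hierarchical linear model the abstract describes; the outcome, covariates, and notation shown are illustrative assumptions, not the authors' exact specification. Level 1 models each student's DRP outcome within a teacher's classroom, and Level 2 lets the teacher-level intercept depend on the BEST portfolio and Praxis scores:

    % Illustrative two-level HLM (assumed form, not the published specification)
    % Level 1: student i in teacher j's classroom, with prior DRP score as covariate
    Y_{ij} = \beta_{0j} + \beta_{1}\,\mathrm{PriorDRP}_{ij} + r_{ij},
        \qquad r_{ij} \sim \mathcal{N}(0, \sigma^2)
    % Level 2: teacher-level intercepts predicted by the licensure measures
    \beta_{0j} = \gamma_{00} + \gamma_{01}\,\mathrm{BEST}_{j} + \gamma_{02}\,\mathrm{Praxis}_{j} + u_{0j},
        \qquad u_{0j} \sim \mathcal{N}(0, \tau_{00})

In a model of this form, a positive estimate of \gamma_{01} with the Praxis term included would correspond to the abstract's claim that portfolio scores carry predictive information about student achievement gains beyond what the Praxis tests contain.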

Author Biographies

Mark Wilson, University of California, Berkeley

Mark Wilson is a professor of Education at UC Berkeley. He received his PhD from the University of Chicago in 1984. His interests focus on measurement and applied statistics, and he has published just over 100 refereed articles in those areas. He was recently elected president of the Psychometric Society, became a member of the US National Academy of Education, and was made a Fellow of the American Educational Research Association. In the past few years he has published three books: Constructing measures: An item response modeling approach (Routledge Academic), an introduction to modern measurement; Explanatory item response models: A generalized linear and nonlinear approach (with Paul De Boeck of Ohio State University; Springer-Verlag), which introduces an overarching framework for the statistical modeling of measurements; and Towards coherence between classroom assessment and accountability (University of Chicago Press, for the National Society for the Study of Education), which examines the relationships between large-scale assessment and classroom-level assessment. He also recently co-chaired a US National Research Council committee on the assessment of science achievement, Developing Assessments for the Next Generation Science Standards.

P J Hallam, California Department of Education

Dr. Hallam is an Education Program Consultant in the Professional Learning Support Division of the California Department of Education. Prior to 2011, she was a Research and Dissemination Monitor for Title II, Part A, Improving Teacher Quality grants at the California Postsecondary Education Commission. From 2002 to 2006, she was a postdoctoral researcher at the Berkeley Evaluation and Assessment Research (BEAR) Center. Her passion for learning more about the validity of large-scale educational assessments evolved during her fifteen years as a public school teacher in low socio-economic communities.

Raymond L. Pecheone, Stanford University

Raymond Pecheone is a Professor of Practice in the Graduate School of Education at Stanford University. Over the course of his career, Dr. Pecheone has been a leader in high-stakes educational reform, conducting assessment, research, and policy work that has shaped district and state policies in curriculum and assessment by building broad-based grassroots support for strategic new approaches to assessment and learning. He has had national impact in educational assessment through the development of nationally available assessments of teaching (edTPA) and student learning (Smarter Balanced Performance Assessment).

Pamela A. Moss, University of Michigan

Pamela Moss is a Professor of Education at the University of Michigan. Her work lies at the intersections of educational assessment, philosophy of social science, and interpretive or qualitative research methods. Two edited books illustrate these intersections: Evidence and Decision Making (2007), which illuminates the crucial roles that teachers, administrators, and other education professionals play in constructing and using evidence to make decisions that support learning, and Assessment, Equity, and Opportunity to Learn (2008), which explores the synergies and disjunctions between psychometric and sociocultural orientations to opportunity to learn and assessment. Her current research agenda focuses on validity theory in educational assessment, assessment as a social practice, and the assessment of teaching. She is a Fellow of the American Educational Research Association. She was a member of the AERA/APA/NCME committee that revised the 1999 Standards for Educational and Psychological Testing, a member of the National Research Council's Committee on Assessment and Teacher Quality, and chair of AERA's Task Force on Developing Standards for Reporting on Empirical Social Science Research.

Published

2014-02-10

How to Cite

Wilson, M., Hallam, P. J., Pecheone, R. L., & Moss, P. A. (2014). Evaluating the validity of portfolio assessments for licensure decisions. Education Policy Analysis Archives, 22(6). https://doi.org/10.14507/epaa.v22n6.2014

Section

Articles