Using a constructed-response instrument to explore the effects of item position and item features on the assessment of students’ written scientific explanations

Meghan Rector Federer, Ross H. Nehm, John E. Opfer, Dennis Pearl

Research output: Contribution to journal › Article

12 Scopus citations

Abstract

A large body of work has been devoted to reducing assessment biases that distort inferences about students’ science understanding, particularly in multiple-choice instruments (MCIs). Constructed-response instruments (CRIs), however, have invited much less scrutiny, perhaps because of their reputation for avoiding many of the documented biases of MCIs. In this study we explored whether known biases of MCIs—specifically item sequencing and surface feature effects—were also apparent in a CRI designed to assess students’ understanding of evolutionary change using written explanation (Assessment of COntextual Reasoning about Natural Selection [ACORNS]). We used three versions of the ACORNS CRI to investigate different aspects of assessment structure and their corresponding effect on inferences about student understanding. Our results identified several sources of (and solutions to) assessment bias in this practice-focused CRI. First, within an instrument’s item sequence, runs of items with similar surface features produced greater sequencing effects than runs of items with dissimilar surface features. Second, a counterbalanced design (i.e., Latin Square) mitigated this bias at the population level of analysis. Third, ACORNS response scores were highly correlated with student verbosity, despite verbosity being an intrinsically trivial aspect of explanation quality. Our results suggest that as assessments in science education shift toward the measurement of scientific practices (e.g., explanation), it is critical that biases inherent in these types of assessments be investigated empirically.
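The counterbalanced (Latin Square) design mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual protocol: the four item labels are hypothetical, and a simple cyclic shift is used to build the square. The key property is that every item appears exactly once in every serial position, so position effects average out across groups of students.

```python
# Minimal sketch of Latin Square counterbalancing for item ordering.
# The item labels ("A".."D") are hypothetical placeholders, not ACORNS items.

def latin_square_orders(items):
    """Return a cyclic Latin square: row i is the item list shifted by i,
    so each item occupies every serial position exactly once."""
    n = len(items)
    return [[items[(i + j) % n] for j in range(n)] for i in range(n)]

orders = latin_square_orders(["A", "B", "C", "D"])
for group, order in enumerate(orders, start=1):
    print(f"Group {group}: {order}")
```

Administering each row to a different group distributes every item across all positions; note that a simple cyclic square balances position but not order-of-adjacency (carryover) effects, for which a Williams-type design would be needed.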

Original language: English (US)
Article number: A003
Pages (from-to): 527-553
Number of pages: 27
Journal: Research in Science Education
Volume: 45
Issue number: 4
DOIs
State: Published - Aug 1 2015

All Science Journal Classification (ASJC) codes

  • Education
